WWW Data Source/Proxy issue

I've seen a similar post, but I wanted to see if anyone has encountered either of these errors while trying to crawl external websites into the KD:
Job Operation #1 failed: This crawl could not be launched because the location from which it was supposed to start, http://www.cnn.com could not be found or was inaccessible. When the crawler attempted to visit this location it received the following message: Exception of type com.plumtree.openkernel.exceptions.OKAssertError was thrown.(282610)
Job Operation #1 failed: This crawl could not be launched because the location from which it was supposed to start, http://www.cnn.com/ could not be found or was inaccessible. When the crawler attempted to visit this location it received the following message: -2147203840 - Error in function PTWebCrawlProvider.GetChildNodes (pSafeArrayDocuments == [email protected], pSafeArrayContainers == [email protected]): -2147203840 - CPTWebCrawlProvider::GetMIMEType, could not open : http://www.cnn.com/(282610)
We have the proxy information configured in the WWW data source, and serverconfig.xml looks correct, but we are still getting HTTP 407 (Proxy Authentication Required) responses.
Thanks,
Daryl
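
A quick way to rule the proxy in or out is to test the same URL with the same credentials from the crawl server, outside the portal. The sketch below (plain Java; the proxy host, port, and credentials are placeholders, not values from this thread) should print 200 if the proxy accepts the credentials; a 407 here too means the problem is the proxy account itself rather than the WWW data source or serverconfig.xml.

    import java.io.IOException;
    import java.net.Authenticator;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;
    import java.net.URL;

    public class ProxyCheck {
        public static void main(String[] args) throws IOException {
            // Hypothetical proxy credentials -- substitute your own.
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    return new PasswordAuthentication("proxyUser", "proxyPass".toCharArray());
                }
            });
            // Hypothetical proxy host and port.
            Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress("proxy.example.com", 8080));
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("http://www.cnn.com/").openConnection(proxy);
            conn.setRequestMethod("HEAD");
            // 200 = proxy accepted the credentials; 407 = rejected or not sent.
            System.out.println("HTTP status: " + conn.getResponseCode());
        }
    }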

In addition to Mti's post:
You said you had enhanced the DataSource. Have you written code in CMOD to populate the data into the fields?
If you have already written it, check the tick mark "Field only known in Exit" so the system knows that the data is being populated by the function module.

Similar Messages

  • Sharepoint 2013 Excel External Data Source Refresh Issue

    I have been facing this issue for quite some time now. I created an Excel sheet in Excel 2013 and imported data from an external data source (SQL Server 2012).
    Everything works fine with the Excel sheet on the desktop: the data refreshes every time I open the file and also at the regular intervals I configured in the data source properties.
    The problem begins when I save that Excel sheet to my SharePoint server. The issues I am facing are:
    1. Changes made to the original data source are not reflected immediately in the Excel sheet in the browser; they only show up after 5-10 minutes.
    2. The data doesn't refresh automatically. After I update my data in the SQL Server table, I have to manually trigger a refresh of the data connection when viewing the Excel sheet in the browser, even though I have checked "Refresh when opening
    the file" and set a refresh interval of 1 minute in the Excel sheet. Any solutions?
    This issue has been troubling me a lot, and I am looking for a quick solution. Any help here?

    I finally found the solution myself.
    Issue 1: It takes at least 5 minutes to refresh the data connection, which is generally not a big time span.
    Issue 2:
    --> Set your connection to refresh every time the file is opened. Then, in Internet Explorer, go to File -> Internet Options -> General -> Browsing History -> Settings -> "Check for newer versions of stored pages" and select 'Every time I visit the
    webpage'.
    Now, every time I update the original data source, I wait 5-10 minutes and refresh the web page containing the Excel sheet, and its contents are updated as desired.

  • SSRS 2008 R2 data source authentication issue

    Hello,
    End users with Browser permissions for the necessary SSRS folders and reports are getting "The permissions granted to user [Windows user] are insufficient for performing this operation (rsAccessDenied)", even though the data source for the report is configured
    to use "credentials stored securely in the report server" and "Use as Windows credentials when connecting to the data source" is checked. Users who are members of the local Administrators group on the machine hosting the Report Manager site don't get the error. But users with full
    control permissions to the SSRS directories, including the config files, do get the error. Obviously, the solution shouldn't entail giving end users local admin or even full control permissions to the SQL Server SSRS folders. Moreover, the error only
    occurs when using Windows credentials, not when using a database-only credential, i.e. non-Windows credentials. That workaround doesn't work for me since I'm also using a data source connection to SSAS, which doesn't appear to use database logins
    - only Windows credentials that are added to roles.
    Please advise.
    Thanks,
    Ben Lezin

    Hi Ben,
    Generally, the error "The permissions granted to user [Windows user] are insufficient for performing this operation (rsAccessDenied)" is caused
    by one of the following reasons:
    1. User Access Control (UAC). Windows Vista, Windows 7 and Windows Server 2008 limit the overuse of elevated permissions by removing administrator permissions
    when accessing applications. Because the operating system removes permissions, members of the local Administrators group run most applications as if they are using the Standard User account.
    If this issue is caused by UAC, please run Internet Explorer as administrator. From the Start menu, click All Programs, right-click Internet Explorer,
    and select Run as administrator. For more information about UAC, please refer to this document:
    http://msdn.microsoft.com/en-us/library/bb630430.aspx
    2. User has insufficient permissions to perform the operation. Please check the following items:
    a. End users belong to the "Browser" role. On the Report Manager Home page, click "Folder Settings" and make sure the end users are in the security list with the
    Browser role permission.
    b. The "Browser" role has permissions to perform the operation. You can use SSMS to connect to the report server, expand Security/Roles, double-click "Browser",
    and then check the task list.
    If there is anything unclear, please feel free to ask.
    Thanks,
    Albert Ye

  • ODP Data Source Delta issue 2lis_11_VAHDR 2lis_11_VAKON

    Hi Expert,
    I am pulling data from SAP using ODP data sources: 2LIS_11_VAHDR, 2LIS_11_VAITM, and 2LIS_11_VAKON.
    For all of these data sources I run init loads. For the 2LIS_11_VAITM data source, the delta is enabled after the init load.
    For the remaining data sources, even if I try to pull the delta, a full load runs.
    The steps I am following:
    LBWG: delete the setup tables.
    OLI7BW: fill the setup tables.
    Run the Data Services job that pulls the initial load. It loads all the data.
    Change the ODP data source setting "Init load" to No (capture only delta).
    In SMQ1 all the delta records are captured.
    Run the jobs that push the delta records to ODQMON; from there, running the job in Data Services should pull the delta records.
    This happens for 2LIS_11_VAITM but not for the other data sources, 2LIS_11_VAHDR and 2LIS_11_VAKON.
    Even if I try to pull only the delta records, it loads all the records.
    Can someone throw some light on this?
    Regards,
    Murali.

    In RSMO, go to the Details tab and check at which stage the load failed. If the issue is with extraction, go to Environment > Job Overview in the source system and check whether that job is complete.
    I assume you have already cleared the delta queue, so now go and check whether any entry is displayed for 2LIS_11_VAHDR.

  • JDBC Data Sources: Potential Issue with JDeveloper 10.1.3.4

    I think I found a bug or issue with the latest JDeveloper 10.1.3.4 release when using JDBC data sources on the embedded OC4J container.
    To state the issue bluntly: if I use a JDBC data source in an ADF Faces application and create a simple page using a Form layout for database data, I get the following error on the screen when I run the application:
    [http://cs.uwindsor.ca/~ruston7/jdbcError.jpg]
    Or if I use a simple drag and drop ADF Faces Table:
    javax.faces.el.PropertyNotFoundException: Error testing property '<<FIRST_FIELD_ON_THE_PAGE>>' in bean of type null
        at com.sun.faces.el.PropertyResolverImpl.isReadOnly(PropertyResolverImpl.java:274)
        at oracle.adfinternal.view.faces.model.FacesPropertyResolver.isReadOnly(FacesPropertyResolver.java:124)
        at com.sun.faces.el.impl.ArraySuffix.isReadOnly(ArraySuffix.java:236)
        at com.sun.faces.el.impl.ComplexValue.isReadOnly(ComplexValue.java:209)
        at com.sun.faces.el.ValueBindingImpl.isReadOnly(ValueBindingImpl.java:266)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.EditableValueRenderer.getReadOnly(EditableValueRenderer.java:211)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.FormElementRenderer.renderAsElement(FormElementRenderer.java:155)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.InputLabelAndMessageRenderer.getLabelFor(InputLabelAndMessageRenderer.java:53)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.LabelAndMessageRenderer$Label.getForId(LabelAndMessageRenderer.java:500)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.OutputLabelRenderer.encodeAll(OutputLabelRenderer.java:69)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.delegateRenderer(CoreRenderer.java:281)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.LabelAndMessageRenderer.encodeAll(LabelAndMessageRenderer.java:123)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.InputLabelAndMessageRenderer.encodeAll(InputLabelAndMessageRenderer.java:94)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.encodeEnd(CoreRenderer.java:169)
        at oracle.adf.view.faces.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:624)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.encodeChild(CoreRenderer.java:246)
    When I change my Application Module connection to a JDBC URL, this all works perfectly. Also, everything works fine when I deploy to our Oracle AS 10.1.3 application servers.
    I also tried this on a different computer using a fresh install of JDeveloper just to make sure that the copy of JDeveloper that I downloaded didn't have a fluke in it.
    Thanks!
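
    For readers comparing the two connection styles discussed here, the sketch below shows the difference between a container-managed JDBC data source (looked up via JNDI) and a direct JDBC URL, which is the style the poster reports as working. The JNDI name, connection URL, and credentials are placeholders, not values from this thread.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import javax.naming.InitialContext;
        import javax.sql.DataSource;

        public class ConnectionStyles {
            // Container-managed data source: the container (OC4J here) owns the
            // pool, and the app only knows the JNDI name it was registered under.
            static Connection fromDataSource() throws Exception {
                InitialContext ctx = new InitialContext();
                DataSource ds = (DataSource) ctx.lookup("jdbc/HRDS");
                return ds.getConnection();
            }

            // Direct JDBC URL: bypasses the container's data source entirely.
            static Connection fromUrl() throws Exception {
                return DriverManager.getConnection(
                        "jdbc:oracle:thin:@localhost:1521:XE", "hr", "hr");
            }
        }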

    M. Ruston,
    It must be something on your side. I just tried the same thing using the EMPLOYEES table from the HR sample schema (it has a date column), and it works with both a JDBC URL and a data source.
    Just out of curiosity: if you look at the properties for your Business Components project, in the Business Components section, what does it show for the SQL Flavor and Type Map?
    John

  • CRM Data source 0CRM_OPPT_H  issue with full update

    Hi Gurus!
    When I tried to start a full upload via the 0CRM_OPPT_H data source on the BW side, I ran into this issue:
    No data available
    Diagnosis
    The data request was a full update.
    In this case, the corresponding table in the source system does not
    contain any data.
    System Response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.
    I have checked the data source via RSA3 at CRM side and can extract 11 entries. The table CRMD_ORDER_INDEX has 39 entries.
    What I have done:
    1. Activated the data source on the CRM side in RSA5; I can see it in RSA6.
    2. Activated the BW Adapter metadata in BWA5.
    3. Tried to start a full upload on the BW side but got the error (No data available).
    4. Initialized the delta upload; 0CRM_OPPT_H shows as active in transaction BWA7 on the CRM side (Delta active = TRUE, Initial upload = empty).
    5. There are no authorization issues (the user has SAP_ALL in both systems).
    6. The RFC connection between the systems is OK; extraction via 0CRM_OPPT_I worked.
    Do you have any ideas how to start a full upload to BW via 0CRM_OPPT_H?

    Please look above:
    "I have checked the data source via RSA3 at CRM side and can extract 11 entries. The table CRMD_ORDER_INDEX has 39 entries."

  • Data Source Assignment issue in BW

    Hi
    I have replicated 2 data sources in the production BW system, but they are not automatically assigned to the corresponding
    InfoSources. Since it is a production system, I can't make any changes there. It seems to me that I need to transport these 2 InfoSources again into the production BW system.
    Let me know your views. What needs to be done here to assign the data sources to the InfoSources?
    Regards
    Atul

    Hi,
    Before you transport a data source / InfoSource to the production system, make sure that in the target system (in your case, the production system) you have maintained the setting 'Mapping of Source System Names'. You can find this in RSA1 --> Tools --> Mapping of Source System Names.
    There you define that the source system for the data source should change to the XXX (production) source system instead of the earlier XXX (development) source system.
    Regards,
    Akshay

  • *** Transporting Generated Export Data Source *** urgent issue

    Hi All,
    I have generated an export data source from the InfoObject Company Code
    and am trying to transport that export data source, i.e. 8Company_code, to the QA system.
    The log shows that it imported successfully, but I can't see the object in QA.
    I am currently working on a BW 3.5 system.
    Please help.
    Thanks,
    Nisha

    Hi Luis,
    Thanks for this.
    Yes, I went into the Transport Connection >> chose the object >> Data flow before >>
    ticked the data sources >> changed the package from $TMP to ZXXX >> generated the transport request >> released it >> transported it.
    But I still can't see it....
    Cheers,
    Nisha

  • Generic Data Source - Delta Issue

    Hi,
    I have created a generic delta data source (transactions) with the following options:
    - using the extract structure
    - generic delta enabled, with billing creation date as the delta-specific field
    - calendar day selected
    - safety interval upper limit = 1 calendar day
    - additive delta
    I then carried out the following steps in BW:
    - Full update from the above generic data source (all records)
    - Init upload without data transfer (1 record)
    - Delta enabled
    Now, even though I edit/modify records in my own extract structure on the R/3 side, the changes are not reflected in RSA7 (the delta queue).
    P.S.: I modified one of the records from the full update.
    Ideally the modified record should show up in RSA7, but it does not.
    Can you please help me?
    Thanks
    Raj
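
    For background, here is a minimal sketch (plain Java, with a hypothetical run date) of how a generic delta's "safety interval upper limit = 1 calendar day" is commonly understood to shift the selection window. The caveat in the comments is an assumption to verify, not SAP documentation: a delta keyed on a creation date generally only selects records whose delta field falls into the new window, so pure modifications that leave the creation date untouched may never be re-selected.

        import java.time.LocalDate;

        public class DeltaWindow {
            public static void main(String[] args) {
                // Hypothetical run date; the upper safety interval pulls the
                // selection's upper bound back by one calendar day.
                LocalDate runDate = LocalDate.of(2024, 5, 10);
                LocalDate upperBound = runDate.minusDays(1);
                System.out.println("Delta selects billing creation dates <= " + upperBound);
                // Records dated on the run day itself are deferred to the next
                // delta run -- delayed, not lost. Separately, a record whose
                // billing creation date stays inside an already-extracted window
                // is not re-selected when it is merely modified, which would
                // match the behaviour described in this thread.
            }
        }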

    Yes,
    I thought of the same point, but the records modified/edited yesterday are not appearing in today's delta, and I am wondering why.
    Does it have anything to do with the FULL / INIT without data transfer / DELTA procedure?
    Regards
    Rajiv

  • 10g Reports issue with XML Data Source

    Hi,
    Has anybody ever encountered an issue with an Oracle 10g report using XML as the data source? What happens is that some of the values in the XML are printed in the wrong column.
    One of the elements in our XML file is a complex type with 10 elements under it. The first 5 are picked up properly, but the remaining ones are not. Elements #6 to #9 have a minimum occurrence of 0. What happens is that when element #6 is not present but #7 is, the value for element #7 is placed in element #6's column.
    The XSD and XSL files are both valid, since the reports were working when we were still using 9i. There is no hidden logic in the report that might cause this issue; the report just picks up the values from the XML and prints them in the appropriate columns.
    Any help will be greatly appreciated.

    XSD used
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">
            <!-- trade instructions detail & trailer -->
            <xs:element name="TradeDetail">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="TradeType"/>
                                    <xs:element ref="TradeID"/>
                                    <xs:element ref="TradeDate"/>
                                    <xs:element ref="FundID"/>
                                    <xs:element ref="FundName"/>
                                    <xs:element ref="DollarValue" minOccurs="0"/>
                                    <xs:element ref="UnitValue" minOccurs="0"/>
                                    <xs:element ref="PercentageValue" minOccurs="0"/>
                                    <xs:element ref="OriginalTradeID" minOccurs="0"/>
                                    <xs:element ref="CancellationFlag"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <xs:element name="Instruction">
                    <xs:complexType>
                            <xs:sequence minOccurs="0">
                                    <xs:element ref="TradeDetail" maxOccurs="unbounded"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- overall trade instruction message -->
            <xs:element name="InterchangeHeader">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="Instruction"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- definition of simple elements -->
            <xs:element name="FundID" type="xs:string"/>
            <xs:element name="TradeType" type="xs:string"/>
            <xs:element name="TradeID" type="xs:string"/>
            <xs:element name="TradeDate" type="xs:string"/>
            <xs:element name="FundName" type="xs:string"/>
            <xs:element name="DollarValue" type="xs:decimal"/>
            <xs:element name="UnitValue" type="xs:decimal"/>
            <xs:element name="PercentageValue" type="xs:decimal"/>
            <xs:element name="OriginalTradeID" type="xs:string"/>
            <xs:element name="CancellationFlag" type="xs:string"/>
    </xs:schema>
    XML used
    <?xml version = '1.0' encoding = 'UTF-8'?>
    <InterchangeHeader xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="TradeInstruction.xsd">
       <Instruction>
          <TradeDetail>
             <TradeType>Purchase</TradeType>
             <TradeID>M000038290</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>ABN Fund</FundName>
             <DollarValue>2111.53</DollarValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>Redemption</TradeType>
             <TradeID>M000038292</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>AMRO Equity Fund</FundName>
             <UnitValue>104881.270200</UnitValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>ISPurchase</TradeType>
             <TradeID>M000038312</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>MLC0011AU</FundID>
             <FundName>Cash Fund</FundName>
             <OriginalTradeID>M000038311</OriginalTradeID>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
       </Instruction>
    </InterchangeHeader>
    XSLT used
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
            <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
            <xsl:template match="/">
                    <InterchangeHeader>
                            <xsl:for-each select="InterchangeHeader/Instruction/TradeDetail">
                            <xsl:sort select="FundName"/>
                            <xsl:sort select="TradeDate"/>
                                    <TradeDetail>
                                            <TradeType><xsl:value-of select="TradeType"/></TradeType>
                                            <TradeID><xsl:value-of select="TradeID"/></TradeID>
                                            <TradeDate><xsl:value-of select="TradeDate"/></TradeDate>
                                            <FundID><xsl:value-of select="FundID"/></FundID>
                                            <FundName><xsl:value-of select="FundName"/></FundName>
                                            <DollarValue><xsl:value-of select="DollarValue"/></DollarValue>
                                            <UnitValue><xsl:value-of select="UnitValue"/></UnitValue>
                                            <PercentageValue><xsl:value-of select="PercentageValue"/></PercentageValue>
                                            <OriginalTradeID><xsl:value-of select="OriginalTradeID"/></OriginalTradeID>
                                            <CancellationFlag><xsl:value-of select="CancellationFlag"/></CancellationFlag>
                                    </TradeDetail>
                            </xsl:for-each>
                    </InterchangeHeader>
            </xsl:template>
    </xsl:stylesheet>
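
    If it helps to verify the workaround outside of Reports, the sketch below applies the stylesheet above using the standard Java TrAX API (the file names are placeholders). Because the XSLT always emits all ten child elements per TradeDetail, missing optional values come through as empty elements instead of letting a later value shift into the wrong column.

        import javax.xml.transform.Transformer;
        import javax.xml.transform.TransformerFactory;
        import javax.xml.transform.stream.StreamResult;
        import javax.xml.transform.stream.StreamSource;

        public class NormalizeTrades {
            public static void main(String[] args) throws Exception {
                // Compile the stylesheet shown above (placeholder file name).
                Transformer t = TransformerFactory.newInstance()
                        .newTransformer(new StreamSource("TradeInstruction.xsl"));
                // Transform the source XML; inspect the output to confirm each
                // TradeDetail now carries all ten child elements.
                t.transform(new StreamSource("trades.xml"),
                            new StreamResult("trades_normalized.xml"));
            }
        }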

  • CO_PA DATA SOURCE ISSUE

    Hi,
    I am trying to display a CO-PA data source. I have given the proper operating concern name and tried both costing-based and account-based, and it throws the error message below:
    Table entry missing for data source.
    When I look into the details, it shows the following information:
    An attempt was made to extract data using this data source.
    This is not possible due to missing control entries in the system tables.
    System Response:
    The process was terminated.
    Procedure:
    If the data source was transported into this system, check the import logs for errors.
    I tried to debug it from KEB2, and there I get the same error message.
    Please help me overcome this issue.

    Hello,
    The administration of the delta method for CO-PA DataSources occurs in part in the OLTP system. In particular, the time up until which the data has already been extracted is stored in the control tables of the DataSource. Since the control tables for the delta method for the extractor are managed in the OLTP system, certain restrictions apply.
    There can only ever be one valid initial package for a DataSource. If, for the same DataSource, a separate initialization is scheduled for different selections, for example, and data is posted to the operating concern between the individual initializations, data inconsistencies could occur between SAP BW and OLTP. The reason for this is that, with each initialization, the time stamp of the DataSource in the OLTP system is set to the current value. Consequently, records from a previous selection are no longer selected with the next delta upload if they were posted with a different selection prior to the last initial run.
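    To make the single-timestamp point concrete, here is a conceptual sketch in plain Java (not SAP code; the class and method names are invented for illustration) of why a second initialization with a different selection can silently skip records posted in between:

        import java.time.Instant;

        // Conceptual model only: the OLTP side keeps ONE extraction timestamp
        // per DataSource, shared by every init selection.
        public class CopaDeltaModel {
            static Instant dataSourceTimestamp = Instant.EPOCH;

            // Any initialization, whatever its selection, advances the stamp.
            static void initialize(String selection) {
                dataSourceTimestamp = Instant.now();
            }

            // A delta run only sees records posted after the shared stamp.
            static boolean pickedUpByDelta(Instant postedAt) {
                return postedAt.isAfter(dataSourceTimestamp);
            }

            public static void main(String[] args) throws InterruptedException {
                initialize("selection A");           // first init
                Instant posted = Instant.now();      // record posted for selection B
                Thread.sleep(10);
                initialize("selection B");           // second init moves the stamp
                // The record now sits behind the timestamp and is never
                // selected by a delta -> BW and OLTP drift apart.
                System.out.println(pickedUpByDelta(posted)); // prints false
            }
        }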
    See this doc for more info [How to Connect Between CO-PA and SAP BW for a Replication Model|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6]
    Also see
    [How to Connect Between CO-PA and SAP BW for Data Retraction|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1910ab90-0201-0010-eea3-c4ac84080806]
    [SAP Network Blog: Best Practices for Profitability Management with SAP Business Profitability Management and CO-PA|/people/community.user/blog/2007/09/10/best-practices-for-profitability-management-with-sap-business-profitability-management-and-co-pa]
    [SAP Network Blog: Configuration Characteristics in Profitability Analysis|/people/udo.werner/blog/2007/05/10/configuration-characteristics-in-profitability-analysis]
    Thanks
    Chandran

  • Issue with data source after deploying

    We are experiencing an issue with our data source after deployment of a cube. In the data source properties in Visual Studio 2012, we have the max connections set to 0 before the deployment. Once the cube is deployed, I can navigate to the <name>.0.ds.xml
    file, open it, and see that <MaxActiveConnections>0</MaxActiveConnections> is indeed set to 0. At some point over the next couple of days, a process of the cube or some other action causes that value to be updated to a number too large to
    be converted to an int, which makes the data source invalid. At that point we cannot view the data source properties in SSMS, we cannot open the cube project in Visual Studio, and we've even had failures when trying to process the cube. Is there a config
    somewhere that would cause this value to get overwritten, or some other behind-the-scenes process that we can look at?
    Our server information is:
    Microsoft SQL Server 2012 (SP1) - 11.0.3153.0 (X64)
                    Jul 22 2014 15:26:36
                    Copyright (c) Microsoft Corporation
                    Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
    Chad Dotzenrod SWC | TECHNOLOGY PARTNERS 1420 Kensington Road, Suite 110 Oak Brook, Illinois 60523-2144 http://www.swc.com

    Typically you would import the metadata from the source location and either use that location as the data source (and so not need to redeploy), or deploy it to a separate target location.
    The replace action is destructive as you've found, and effectively performs a drop table followed by create table. Hence any data in the table is lost.
    If you just want the Control Center Manager to correctly display that the table is deployed, try setting the action to "Upgrade". This will try to upgrade the deployed object to match the definition in OWB, but as the two are identical this will result in no changes. However, it will update the deployment records to indicate that the object is deployed.
    Nigel.

  • Issue with table ROOSPRMSF entries for data source 0FI_AP_4

    Hi Experts,
    I am facing an issue where we found inconsistencies in table ROOSPRMSF in the R/3 system.
    In BW, we have done initializations based on fiscal period selections (none of the selections overlap) for data source 0FI_AP_4.
    We have done 7 initializations in total, so in the BW system table RSSDLINITSEL we have 7 initialization requests.
    But in the R/3 system we have 49 records for data source 0FI_AP_4 in the ROOSPRMSF table, of which 42 are invalid.
    I suspect that these 42 invalid records were created by executing program RSSM_OLTP_INIT_DELTA_UPDATE while ROOSPRMSF was already holding the 7 initialization request entries. As a result, every initialization request got linked to all of the other initialization requests, ending with 49 records in the ROOSPRMSF table.
    Our data loads are now running fine, but a short dump is raised daily: in the daily loads, the BW init records in RSSDLINITSEL are compared with the ROOSPRMSF entries, and the 42 invalid records are written to the system log and a short dump is raised.
    To fix these inconsistencies I checked OSS note 852443 (point 3 in the note).
    It says to delete the delta queue for data source 0FI_AP_4 in RSA7 and then execute the program RSSM_OLTP_INIT_DELTA_UPDATE, so that the ROOSPRMSF table is reconstructed from the valid records in RSSDLINITSEL.
    From OSS note 852443 point 3
    "3. If the RSSDLINIT table in the BW system already contains entries, check the requests listed there in the RNR column in the monitor (transaction RSRQ). Compare these entries with the entries in the ROOSPRMSF and ROOSPRMSC tables with the INITRNR field. If, in the ROOSPRMSF and ROOSPRMSC tables for your DataSource source system combination, there are more entries with different INITRNR numbers, use transaction RSA7 in an OLTP source system to delete all entries and then use the RSSM_OLTP_INIT_DELTA_UPDATE report mentioned in the next section. For a DataMart source system, delete the entries that you cannot find in the RSSDLINIT table using the procedure described above."
    My question: if we delete the delta queue in RSA7, then all the tables in R/3 (ROOSPRMSF, ROOSPRMSC, the timestamp table) and in BW (RSSDLINITSEL and the initialization requests) will be cleared. How, then, will the program RSSM_OLTP_INIT_DELTA_UPDATE copy entries into the ROOSPRMSF table in R/3?
    Could anyone please clarify this?
    Thanks
    Regards,
    Jeswanth

    Hi Amarnath,
    Did you unhide the new field in RSA6 and regenerate the DataSource?
    Often SAP will populate newly added fields (belonging to the same set of tables used for extraction) automatically (e.g. SAP uses MOVE-CORRESPONDING in its extractor code, or, in this case, reads all fields from the DDIC via FM BWFIU_TRANSFORM_FIELDLIST).
    If the DataSource looks fine to you and the field is still not populated in RSA3, you can't go without a user exit.
    Grtx,
    Marco

  • Performance issue with Oracle data source

    Hi all,
    I've got a rather strange problem that I'm stuck on and need some assistance with.
    I have a rules file which pulls data in via a SQL data source that is an Oracle server. If I cut/paste the three sections "select", "from" and "where" into SQL Developer and run the query, it takes less than 1 second to complete. When I run "load data" with this rules file, or even use "Retrieve" while editing the rules file, it takes up to an hour to complete/retrieve the data.
    The table in question has millions of rows, and I'm using one of the indexed fields to retrieve the data. It's as if Essbase/the rules file is ignoring the index, or I have a config issue with the ODBC settings on the server that is causing the problem.
    The ODBC.INI file entry for the Oracle server is as follows (any sensitive info changed to xxx or 999):
    [XXX]
    Driver=/opt/data01/hyperion/common/ODBC-64/Merant/5.2/lib/ARora22.so
    Description=DataDirect 5.2 Oracle Wire Protocol
    AlternateServers=
    ApplicationUsingThreads=1
    ArraySize=60000
    CachedCursorLimit=32
    CachedDescLimit=0
    CatalogIncludesSynonyms=1
    CatalogOptions=0
    ConnectionRetryCount=0
    ConnectionRetryDelay=3
    DefaultLongDataBuffLen=1024
    DescribeAtPrepare=0
    EnableDescribeParam=0
    EnableNcharSupport=0
    EnableScrollableCursors=1
    EnableStaticCursorsForLongData=0
    EnableTimestampWithTimeZone=0
    HostName=999.999.999.999
    LoadBalancing=0
    LocalTimeZoneOffset=
    LockTimeOut=-1
    LogonID=xxx
    Password=xxx
    PortNumber=1521
    ProcedureRetResults=0
    ReportCodePageConversionErrors=0
    ServiceType=0
    ServiceName=xxx
    SID=
    TimeEscapeMapping=0
    UseCurrentSchema=1
    Can anyone please advise on this lack of performance?
    Thanks in advance
    Bagpuss

    One other thing I've seen is that if your Oracle data source and Essbase server are in different geographic locations, you can get some delay when retrieving data over the WAN. I suspect there is some handshaking going on when passing the data from Oracle to Essbase (either per record or per group of records) that slows WAY down over the WAN.
    Our solution was to take the query out of the load rule, run it via SQL*Plus on a command line at the geographic location where the Oracle database is, then ftp the resulting file to where the Essbase server is.
    With upwards of 6 million records being retrieved, it took around 4 hours through the load rule, but running the query from the command line took 10 minutes, and the ftp took less than 5.

  • Data load issue with export data source - BW 3.5

    Hi,
    We are facing issues loading data through an export data source.
    We created an export data source for the 0PCA_C01 cube. With the help of this export data source, we load data into another custom cube. The scenario works fine on the development server.
    But after transporting the objects to the quality server, data is not getting loaded into the custom target cube.
    It extracts zero records. All transports are OK, and we generated the export data source in quality before the transports. We also regenerated the export data source after the transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but no data is extracted.
    RSA3 for the 80PCA_C01 data source isn't extracting any records in quality; records are extracted in development. We are on BW 3.5 with patch level 19.
    Please guide us in resolving the issue.
    Thanks,
    Aditya

    Hi
    Make sure that you have the relevant role & authorization in Quality/PRS.
    You have to transport the source cube first and then generate the export data source in QAS. Then replicate the data sources for the BW QAS source system and make sure the replicated data source is present in QAS. Only then can you transport the new update rules for the second cube.
    Hope it helps and is clear.
