Issue with Data Load from R/3

Hi,
I have created a generic DataSource in R/3 based on the option "Extraction from SAP Query". The SAP Query (InfoSet) created in transaction SQ02 is based on the tables BKPF and BSEG. I generated a DataSource on this and successfully replicated it into BW, but when I trigger the load using an InfoPackage it immediately fails and gives the following messages:
"If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request."
"Job terminated in source system --> Request set to red"
Can you point out where the issue is? I have tested the source system connection and it is fine; other extractors are working and data can be pulled.
Thanks
Rashmi.

Hi Rashmi,
Try the following:
- RSA3 in source system -> test extraction OK?
- Shortdump (ST22) found in BW or R/3?
- Locate the job of the extraction in R/3 (SM37) and look at the joblog...
Hope this leads you to the cause...
Grtx
Marco

Similar Messages

  • Issue with a load from R3 to BW 3.5

    Hi Guys,
    We are having an issue here with a load from R/3 to BW 3.5 into an ODS and
    a transactional InfoCube.
    On a daily basis we run loads to BW from R/3 InfoSets, and all of them
    but one load fine.
    The one that is having problems is actually the one that loads the most
    data, so the InfoPackage is divided into two InfoPackages.
    In the update rule of the first ODS we run a start routine in which we
    call RSDRD_SEL_DELETION (selective deletion) on the data to be loaded to
    both the ODS and the InfoCube, and it is actually here that we get the
    core dump.
    Our first assumption was that there might be a yellow request in the
    transactional InfoCube blocking the selective deletion, but we have
    ruled this out.
    We think that, since the only load failing is the one split into two
    InfoPackages, the problem might be that while the first one is trying
    to delete, the second one is loading data into the ODS, and there we
    get the dump.
    Could this be the problem? How could we work around it if this is the
    case?
    Thanks for any suggestion.

    Hi,
    Be careful with ODS activation: if you're using two InfoPackages and/or deleting data from an ODS that is locked for activation, you'll get a dump.
    Check SM12 / ST22 / SM21 and the job logs in SM37.
    And if the problem still occurs for a load, debug it in the start routine.
    Regards,
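
    For reference, a selective deletion from a start routine usually looks something like the sketch below. It is only a minimal sketch: the InfoCube name ZCUBE01 and the 0FISCYEAR restriction are placeholders, and the exact parameter list of RSDRD_SEL_DELETION should be verified in SE37 on your release.

    " Minimal sketch: selective deletion of one fiscal year (placeholders).
    DATA: l_thx_sel TYPE rsdrd_thx_sel,
          l_sx_sel  TYPE rsdrd_sx_sel,
          l_s_range TYPE rsdrd_s_range,
          l_t_msg   TYPE rs_t_msg.

    " Restrict the deletion, e.g. to fiscal year 2009.
    l_s_range-sign   = 'I'.
    l_s_range-option = 'EQ'.
    l_s_range-low    = '2009'.
    APPEND l_s_range TO l_sx_sel-t_range.
    l_sx_sel-iobjnm = '0FISCYEAR'.
    INSERT l_sx_sel INTO TABLE l_thx_sel.

    " I_MODE 'C' deletes directly; messages come back in L_T_MSG.
    CALL FUNCTION 'RSDRD_SEL_DELETION'
      EXPORTING
        i_datatarget      = 'ZCUBE01'
        i_thx_sel         = l_thx_sel
        i_authority_check = ' '
        i_mode            = 'C'
      CHANGING
        c_t_msg           = l_t_msg.

    If two InfoPackages run at the same time, a call like this competes with the parallel load for the same data target, which matches the dump described above; serializing the two loads (e.g. in a process chain) avoids the collision.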

  • Issue with Data Loading

    Hi Experts,
    Need your kind help in overcoming the following issue.
    We are doing a daily full load to an ODS from a flat file system. Due to a change request the update rule was changed in between, and later on we found out that the change was not correct. The update rule has since been corrected and is now working fine. But there are still some requests where the data was updated in the ODS with the faulty update rule. All the data fields in the update rules for the ODS use the overwrite functionality. The problem is that we are not sure whether the faulty records in the ODS have been overwritten by subsequent requests, as we do not know the exact nature of the incoming data.
    From this ODS one InfoCube and one more ODS are updated. The key figure calculations in the cube use 'maximum', 'minimum' and 'addition'.
    And the load from ODS to cube is delta.
    Kindly let me know what we can do to close the above issue.
    I was planning to delete the affected requests from the ODS and reconstruct them. But that way I can correct the data in the ODS only; I am not sure whether the changes made to the ODS data will be relayed to the cube properly. Please let me know if this approach is correct.
    All the flat file data is still residing in the PSA. Also, kindly let me know if there is any method to check the PSA for the requests spanning the whole period, apart from going to each request and checking it one by one.
    Thanks in advance.

    Hi Experts,
    My concern is this: say I delete the requests from the day the data was loaded with the wrong update rule. I have the data sitting in the PSA and I can reconstruct them, so I understand that the data can be corrected in the ODS.
    But in the cube, some key figures use the update method 'addition'. So if I delete the content of the cube and reload it from the ODS all over again, won't the key figures get corrupted yet again?
    Also, say we delete the respective requests from the cube as well and reconstruct the data in the ODS. Can I then fire the delta request again to capture the changed records for the cube? To be exact, will all the data from the date of the corruption (as I am going to delete the requests from the cube from that day as well) be loaded to the cube?
    Or would it be a better idea to reconstruct one request in the ODS, run the delta package for the cube, and continue like that?
    Kindly suggest..
    Thanks in advance !!!

  • Issue with Data Load to InfoCube with Nav Attributes Turned on in it

    Hi,
    I am having an issue with loading data to an InfoCube. When I turn
    on the navigational attributes in the cube, the data load fails
    and just says "PROCESSED WITH ERRORS". When I turn them off,
    the data load runs fine. I have run an RSRV test on both
    the InfoObject and the cube, and it shows no errors. What
    could be the issue, and how do I solve it?
    Thanks
    Rashmi.

    Hi,
    To activate a navigation attribute in the cube, the data need not be dropped from the cube;
    you can always activate a navigation attribute while the cube contains data.
    I think you may have tried to activate it in the master data and then in the cube at the same time, or something like that.
    Follow the correct procedure and try again.
    Thanks
    Ajeet

  • Issue with delta load from R/3 to BW

    Hi frnds,
    There is a standard DataSource 2LIS_05_ITEM in R/3. Every day we extract data from R/3 to BW based on this standard DataSource with a delta load. There are two fields, ZX and ZY, in BW; sometimes the data for these two fields is not extracted to BW with the delta update, but at other times these two fields are extracted to BW properly with the same delta update.
    Why does it happen like this? Please give some inputs on this.
    Regards,
    Satya.

    Hello,
    If it is a standard field, then it is getting populated in the correct way.
    If it is a custom field, then you need to analyze the records for which it is populated and the ones for which it is not.
    It is quite possible that some customization in CMOD results in this behaviour.
    Also, check the underlying tables to see the correct values.
    Regards
    Ajeet
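
    To illustrate the CMOD point: custom fields on an LO DataSource are usually filled in the transaction-data user exit (include ZXRSAU01, called from EXIT_SAPLRSAP_001). The sketch below is only an assumption of what such an enhancement might look like; the extract structure MC05Q00ITM, the lookup table ZQM_ATTR and the key field QMNUM are hypothetical placeholders, not the actual objects in the system.

    " Include ZXRSAU01 - sketch only, all object names are assumptions.
    DATA: ls_data TYPE mc05q00itm.   " extract structure of 2LIS_05_ITEM (assumed)

    CASE i_datasource.
      WHEN '2LIS_05_ITEM'.
        LOOP AT c_t_data INTO ls_data.
          " If this lookup finds nothing at extraction time (e.g. the
          " attribute record is only created after the delta is written),
          " ZX and ZY arrive empty in BW.
          SELECT SINGLE zx zy FROM zqm_attr
            INTO (ls_data-zx, ls_data-zy)
            WHERE qmnum = ls_data-qmnum.
          MODIFY c_t_data FROM ls_data.
        ENDLOOP.
    ENDCASE.

    A timing dependency like the one in the comment, where the lookup data exists for some delta runs but not for others, would produce exactly the intermittent behaviour described above.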

  • Issue with Data Load Table

    Hi All,
    I am facing an issue with APEX 4.2.4, using the Data Load Table concept. In the lookup I used the
    Where Clause option, but the where clause does not seem to be working. Please help me with this.

    Hi all,
    It looks like the where clause does not filter out the 'N' data. Please help me solve this.

  • Issue with Data Load

    HI All,
    I am trying to load data from a CRM system to an ODS. I am coming across the warning
    "Processing (data packet): No data" (yellow color).
    Everything else is fine: Requests (messages): Everything OK; Extraction (messages): Everything OK;
    Transfer (IDocs and TRFC): Everything OK.
    I checked in the CRM box in RSA3 and I can see around 30 records for this extractor.
    I checked the IDocs in BD87 and everything looks fine. Can someone help with this issue?
    Thanks,
    Abraham

    In RSMO I don't see any data records being pulled. I tried loading data to the PSA, but it gives back the same warning messages in the Details tab:
    Requests (messages): Everything OK
    Extraction (messages): Everything OK
    Transfer (IDocs and TRFC): Everything OK
    Processing (data packet): No data (this is in yellow color)
    But it says that the data load is successful even though it pulls 0 records.

  • Issue with Data Extraction from OAGIS 9 User Area Tags ...

    We are working on displaying the data coming from OAGIS 9 in an Excel sheet. We created a custom schema for the Excel sheet and are able to display all the data in OAGIS 9 except for the elements in UserArea. We are facing an issue extracting data from tags present in the UserArea of the OAGIS 9 schema and transforming it into our custom-built schema.
    Can you please suggest a way in which we can extract data from UserArea?
    Thanks in advance ~

    Here is an example with instructions; we have just done this and it is working.
    1. You need to have the custom schema that comes inside <UserArea>/<any>.
    For example, this custom schema is like the one shown below, and you are using the CustomerPartyMaster of OAGIS:
    <?xml version="1.0" encoding="utf-8"?>
    <xs:schema xmlns="http://www.userarea.com/mdm/oagisextensions" elementFormDefault="qualified" targetNamespace="http://www.userarea.com/mdm/oagisextensions" xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="Customer" type="CustomerType" />
    <xs:complexType name="CustomerType">
    <xs:sequence>
    <xs:element minOccurs="0" name="Gender" type="xs:string" />
    <xs:element minOccurs="0" name="DOB" type="xs:date" />
    <xs:element minOccurs="0" name="SSN" type="xs:string" />
    <xs:element minOccurs="0" name="PAN" type="xs:string" />
    </xs:sequence>
    </xs:complexType>
    </xs:schema>
    and the input data snippet inside CustomerPartyMaster will look like this:
    <tnsa:UserArea>
         <tnso:Customer>
              <tnso:Gender>string</tnso:Gender>
              <tnso:DOB>1980-10-13</tnso:DOB>
              <tnso:SSN>string</tnso:SSN>
              <tnso:PAN>string</tnso:PAN>
         </tnso:Customer>
    </tnsa:UserArea>
    where xmlns:tnsa="http://www.openapplications.org/oagis/9" and xmlns:tnso="http://www.userarea.com/mdm/oagisextensions"
    Let's assume your target schema is like this, where only DOB comes from the UserArea:
    <CreateRequest>
    <record>
    <firstname></firstname>
    <lastname></lastname>
    <dob></dob>
    </record>
    </CreateRequest>
    2. Declare the namespace of the incoming UserArea custom schema in your XSL file like this:
    xmlns:nsext="http://www.userarea.com/mdm/oagisextensions"
    3. Here is the XSL code that copies the element from CustomerPartyMaster/UserArea:
    <xsl:template match="/">
    <imp1:CreateRequest>
    <xsl:for-each select="/inp1:ESBProcessCustomerRequest/body/ns4:ProcessCustomerPartyMaster/ns4:DataArea/ns4:CustomerPartyMaster">
         <imp1:record>
    <xsl:for-each select="ns4:UserArea/nsext:Customer/nsext:DOB">
              <imp1:dob>
              <xsl:value-of select="."/>
              </imp1:dob>
    </xsl:for-each>
         </imp1:record>
    </xsl:for-each>
    </imp1:CreateRequest>
    </xsl:template>
    where imp1 is the target schema namespace
    and ns4 is the OAGIS standard namespace.
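
    For completeness, given the input snippet above, the transform should produce output roughly like this (namespace declarations omitted):
    <imp1:CreateRequest>
    <imp1:record>
    <imp1:dob>1980-10-13</imp1:dob>
    </imp1:record>
    </imp1:CreateRequest>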

  • Latest PowerQuery issues with data load to data models built with older version + issue when query is changed

    We have a tool built in Excel + Power Query version 2.18.3874.242, 32-bit (no Power Pivot), using load to the data model (not to the workbook). Data filters linked to Excel cells are inserted into the OData query before data is pulled.
    The Excel tool uses organisational credentials to authenticate.
    System config: Win 8.1, Office 2013 (32-bit).
    The tool runs for all users as long as they do not upgrade to PowerQuery_2.20.3945.242 (32-bit).
    Once upgraded, users can no longer get the data to load to the model. Data still loads to the workbook, but the model breaks down. Resetting load-to-data-model erases all measures.
    Here are the exact errors users get:
    1. [DataSource.Error] Cannot parse OData response result. Error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    2. The Data Model table could not be refreshed: There isn't enough memory to complete this action. Try using less data or closing other applications. To increase memory available, consider ......

    Hi Nitin,
    Is this still an issue? If so, can you kindly provide the details that Hadeel has asked for?
    Regards,
    Michael Amadi

  • Issue: No Data loaded from Query - MDX, SQL issue

    Hello,
    I'm having a problem with a report based on a query with a hierarchy (0GLACCEXT hierarchy, data in cube 0FIGL_V10). The first version of the query had 3 filters, but I also tried to load the data without filters, thinking they caused the problem; that didn't fix the issue.
    If I put some characteristics into the report, the fields are empty. I did "browse data" in the database and found that only the data for the "year" was fetched by the SQL expression, even though I have some similar reports with hierarchies based on 0GL_ACCOUNT where the data is fetched correctly.
    I tried MDXTEST on the SQL expression
    "SELECT {[Measures].[D5LABJ7EYP982LUFOJAXZ3Z3E]} ON COLUMNS,  NON EMPTY [0GLACCEXT                     CVIS].MEMBERS DIMENSION PROPERTIES [0GLACCEXT                     CVIS].[40GLACCEXT] ON ROWS FROM [0FIGL_V10/ZEN_FIGL_V10_Q0001]"
    There the data is loaded.
    What else can I try?
    PS: I just found out that data is not fetched whenever I assign a hierarchy, no matter which one (could the issue be due to the size of the hierarchy?).

    Please re-post to the Data Connectivity - Crystal Reports forum if this is still an issue, or purchase a support case and have a dedicated support engineer work with you directly.

  • Issue with Data Loading between 2 Cubes

    Hi All
    I have a cube A which holds a huge amount of data, around 7 years' worth. This cube is on BWA. In order to free up space in this cube we have created a new cube B.
    We have now started to load data from cube A to cube B based on 'created on', but we are facing a lot of memory issues and cannot even load a week's worth of data. At the moment we are loading one date at a time, which is not practical, as it will take a lot of time to load 4 years of data.
    Can you propose some alternate way to make this data transfer between the two cubes faster? I thought of loading cube B from the DSO under cube A, but that's not possible, as the DSO does not have data that old.
    Please share your thoughts.
    Thanks
    Prateek

    Hi SUV / All,
    I have tried running with parallel processes, as there are 4 in my system. There are no routines between the cubes. There is already a MultiProvider on this cube. I just want to shift 4 years of data from this cube into another.
    1) Data packet size 10,000 - 8 out of some 488 packets failed
    2) Data packet size 20,000 - 4 out of some 244 packets failed
    3) Data packet size 50,000 - waited for some 50 minutes with no extraction, so I killed the job.
    Error: Dump: Internal session terminated with runtime error DBIF_RSQL_SQL_ERROR (see ST22)
    In ST22:
    Database error text: "ORA-00060: deadlock detected while waiting for resource"
    Can you help resolve this issue or give some pointers?
    Thanks
    Prateek

  • Problem with Data load from file

    Hi,
    If I try to load data from a comma-separated file into Oracle, I get an error that the page cannot be displayed, right after the dialog where I specify the file and the separation method.
    My browser does not try to open the page for long, approx. 1 sec.
    Does anyone have an idea about this?
    P.S. I know my English is horrible, sorry.

    A known bug. See below for a solution that sets the timeout. Remember to reboot the PC for the changes to take effect.
    See Re: Problem with importing HTML DB applications

  • Error in data loading from 3rd party source system with DBConnect

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system. But we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in SM37, the job finished with the following log:
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no  message related to element NOAUTHORITYCHECK. So, I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    Errors like this usually come down to one of the following:
    1. The RFC connection failed.
    2. A problem in the source system.
    3. The Oracle consultants may be filling the loads at the same time; ask them to stop.
    4. IDoc processing problems.
    5. Memory issues.
    6. A changed DataSource: change it, reactivate it, and then run the load.
    Also check the RFC connection in SM59. If it is OK, then
    check SAP Note 692195 for the authorization issue.
    Santosh

  • Problem with date format from Oracle DB

    Hi,
    I am facing a problem with date fields from Oracle DB sources. The field in the DB table has 'date base type DATE and DDIC type DATS'.
    I mapped the date fields to date characteristics in BI. Now the data that arrives in the PSA is in a weird format; it shows like -0.PR.09-A.
    I have tried changing the field settings in the DataSource to internal and external, and I have also tried mapping these date fields to text fields, without luck. All deliver the same format.
    I have also tried using conversion routines like CONVERSION_EXIT_IDATE_INPUT to change the format. They also deliver the same old result.
    If any of you have suggestions or have experienced such problems, please share your experience with me.
    Thanks in advance.
    Regards
    Varada

    Thanks for all your replies. The only solutions I can see involve creating a view in the database; I want a solution that can be done in BI. I would appreciate it if some of you have an idea.
    The issue again in detail:
    I am facing an issue with date fields from Oracle data. The data that is sent from Oracle is in the format -0.AR.04-M. I am able to convert this date in BI with a conversion routine into the format 04-MAR-0.
    The problem is that I am getting data of length 10 (output format) in the format -0.AR.04-M, where the month is not numeric. Since it is text, it takes one character more.
    I have tried converting it in different ways and increased the length in BI; the result is the same. I am wondering if we can change the date format in the database.
    I am puzzled by this date format. I have checked the date fields of other Oracle DB connections in BI; they receive data in the format 20.081.031, which can be converted in BI. Only the system I am working on has this problem.
    Regards
    Varada
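
    If the external value can first be normalized to something like 04-MAR-2008, a small transfer-rule routine can turn it into the internal DATS format. The following is only a sketch under that assumption; the source field name /BIC/ZDATE is a hypothetical placeholder, and the exact external format must be adapted to the actual data.

    " Sketch of a transfer-rule routine converting an external date such
    " as '04-MAR-2008' into DATS ('YYYYMMDD'). Field names are assumed.
    CONSTANTS lc_months(36) TYPE c VALUE 'JANFEBMARAPRMAYJUNJULAUGSEPOCTNOVDEC'.
    DATA: lv_day(2)      TYPE c,
          lv_month(3)    TYPE c,
          lv_year(4)     TYPE c,
          lv_off         TYPE i,
          lv_monthnum(2) TYPE n.

    SPLIT tran_structure-/bic/zdate AT '-' INTO lv_day lv_month lv_year.

    " 'MAR' sits at offset 6 in lc_months, so month number = 6 / 3 + 1 = 03.
    FIND lv_month IN lc_months MATCH OFFSET lv_off.
    lv_monthnum = lv_off DIV 3 + 1.

    CONCATENATE lv_year lv_monthnum lv_day INTO result.

    Whether this helps depends on recovering the month text from the garbled PSA value in the first place, so checking the DB connect type mapping (as in the other, working connections) is still the more robust fix.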

  • How to find the data loaded from R/3 to BW

    hi
    How can I verify that the data loaded from R/3 to BW is correct? I am not able to find which field in the query is connected to which field in R/3, i.e. where the data comes from in R/3. Is there any way to find out which field and table the data comes from? Please help.
    Thanks in advance to you all.

    Hi Veda ... the mapping between R/3 fields and BW InfoObjects takes place in the transfer rules. Other transformations can take place in the update rules.
    So you could proceed this way: look at the InfoProvider data model and see whether the query performs any calculations (even with virtual key figures / characteristics). Then go back to the update rules and search for other calculations / transformations. Finally there are the transfer rules and possibly DataSource / extraction enhancements.
    As you can easily see, there are many points you have to look at; it is quite complex work, but very useful.
    Once you have identified all the mappings / transformations, check whether the BW data matches R/3 (taking the calculations into account).
    Good job
    GFV
