Data Load: DSO to DSO serially

Hi,
I want to load data from one DSO to another DSO period by period.
I have one DSO (DSO1) with posting documents, live in the system, which is delta-enabled.
Now I am building logic for another DSO (DSO2) with an opening balance and a closing balance.
The opening balance I am uploading to DSO2 from a flat file.
Now, from DSO1, I want to upload each period one by one, thereby calculating the closing balance.
Can anybody suggest what options are possible?
Thanking you.
Ahmed.

Hi BI Quest,
If the load to the ODS is a delta,
please see the thread "RSM1_CHECK_FOR_DELTAUPD",
or try to create another InfoPackage and restrict the selections.
Hope this helps...
Tharanath
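
For the period-by-period loads, one option is an ABAP selection routine on the fiscal period in the InfoPackage. Below is a minimal sketch of the routine body only, assuming the selection field is 0FISCPER; the surrounding FORM frame with l_t_range (structure RSSDLRANGE) and p_subrc is generated by the system:

  DATA l_idx LIKE sy-tabix.
  READ TABLE l_t_range WITH KEY fieldname = 'FISCPER'.
  l_idx = sy-tabix.
  l_t_range-sign   = 'I'.
  l_t_range-option = 'EQ'.
  l_t_range-low    = '2008001'.   " one fiscal period per load run
  MODIFY l_t_range INDEX l_idx.
  p_subrc = 0.

Run the load once per period, moving l_t_range-low forward each time, so DSO2 can accumulate the closing balance period by period.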

Similar Messages

  • Data load stuck from DSO to Master data Infoobject

    Hello Experts,
    We have an issue where a data load is stuck between a DSO and a master data InfoObject.
    Data is uploaded from a standard DSO to the master data InfoObject.
    This InfoObject has display and navigational attributes, which are mapped from the DSO to the InfoObject.
    We have now added a new InfoObject as an attribute of the master data InfoObject and made it a navigational attribute.
    Now, when we run a full load via DTP, the load is stuck and does not process.
    Earlier it took only 5 minutes to complete the full load.
    Please advise on what could be the reason and cause behind this.
    Regards,
    santhosh.

    Hello guys,
    Thanks for the quick response.
    But nothing is proceeding any further.
    The request is still running.
    Earlier this same data loaded in 5 minutes.
    Please find the screenshot.
    Master data for the InfoObjects is loaded as well.
    In SM50 I can see that the process is sitting on the P table of the InfoObject.
    Please advise.
    Please find the details:
    Updating attributes for InfoObject YCVGUID
    Start of Master Data Update
    Check Duplicate Key Values
    Check Data Values
    Process Time-Dependent Attributes - green
    No Message: Process Time-Dependent Attributes - yellow
    No Message: Generates Navigation Data - yellow
    No Message: Update Master Data Attributes - yellow
    No Message: End of Master Data Update - yellow
    And nothing is going further in SM37.
    Thanks,
    Santhosh.

  • Data Load from DS -> DSO -> Cube

    Hello Guys,
    I have an issue with a data load.
    I use the 0FI_GL_14 extractor.
    When I did the init with data transfer, I had around 450K records.
    When I loaded from the DataSource to the DSO by DTP, it loaded the same 450K.
    When I tried to load the data from the DSO to the cube, it selected only 14.5K records.
    It says 14.5K records transferred and 14.5K records added.
    Even if it aggregates at the cube level, shouldn't it say 450K records transferred and 14.5K records added?
    I don't have any filters in the DTP; I checked. I deleted the DTP and created a new one, with the same result.
    From the DS to the DSO it is a delta DTP, and from the DSO to the cube it is also a delta DTP.
    I am not sure why it filters and selects only 14.5K records out of 450K.
    Have you encountered this type of situation? Let me know if I am doing something wrong.
    Thanks again for your input. It is really appreciated .
    Senthil

    Are you using an InfoSource in between? If so, the InfoSource aggregates the data before it reaches the transformation, in which case you will see fewer transferred records. If you mark all the fields as key fields in the InfoSource, the same number of records will be transferred to the transformation, and the data will be aggregated on its way into the cube.
    thanks.
    Wond

  • Check data load performance for DSO

    Hi,
    Can anyone please provide the details on how to check the data load performance for a particular DSO?
    For example, how much time it took to load a particular number of records (e.g., 200,000) into the DSO from the R/3 system. The DSO data flow is on BW 3.x.
    Thanks,
    Manjunatha.

    Hi Manju,
    You can use BW statistics and its standard content for this.
    Regards,
    Rambabu

  • Data Load from a DSO to a Cube

    Hello,
    I am facing a problem.
    Case:
    There is a DSO with the following characteristics and key figure:

    Characteristic      Record 1   Record 2   Record 3
    A (key)             A1         A2         A3
    B (key)             B1         B2         B3
    C                   C1         C1         C1
    D                   D1         D1         D1
    E                   E1         E1         E1
    F                   F1         F1         F1
    KF1 (key figure)    10         20         30

    I loaded this data from the DSO into a cube (structure and resulting data below):

    Dimension           Value
    C                   C1
    D                   D1
    E                   E1
    F                   F1
    KF1 (key figure)    50
    My question is: why am I not getting 60 as the KF1 value?
    When I tried again including the key fields of the DSO, I got all the records in the cube.
    Where is my mistake?
    Can someone correct me, please?
    Thanks.
    Avinash.

    Below are the basic steps we follow in any BI 2004s system:
    1) Create the DataSource. Here you can set/check the source system fields.
    2) Create a transformation for that DataSource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DataSource, it will ask you for the data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> Data Target
    Now, if you want to load data into a data target from a source system DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it will only allow you to update up to the PSA; all other options appear disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    If you are loading data from one data target to another, there is no need to use the PSA; you can execute the DTP directly in that case.
    DataSource -> Transformation (IP/DTP) -> Data Target 1 -> DTP -> Data Target 2
    Use the link below for a detailed example:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
    InfoSources are no longer mandatory with BI 7.0; below is a link to scenarios where InfoSources are used:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
    Full or delta depends on your requirement...
    Check the thread "difference between the various loads" to understand this better.
    Hope it helps.
    Message was edited by:
            sriram viswanathan

  • Full Data Loads with a DSO as source

    I am using a DSO as the source object for a DTP full load. When I look at the SQL being executed, it performs a SELECT statement grouping by all of the characteristics and summing the key figures.
    Is there any way to change the extraction so that it reads the source data record by record (as it does in 3.x)?
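
    Roughly, it issues something like the following Open SQL equivalent (an illustration only; the table /BIC/AZSALES00 and all field names are invented):

      TYPES: BEGIN OF ty_grouped,
               comp_code TYPE c LENGTH 4,
               customer  TYPE c LENGTH 10,
               amount    TYPE p LENGTH 9 DECIMALS 2,
             END OF ty_grouped.
      DATA lt_grouped TYPE STANDARD TABLE OF ty_grouped.

      " All characteristics go into the GROUP BY; key figures are summed.
      SELECT comp_code customer SUM( amount )
        FROM /bic/azsales00
        INTO TABLE lt_grouped
        GROUP BY comp_code customer.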

    50M records is quite large. Can you break the data down into smaller chunks? I personally would make the chunks as close to 3-5M as I could. You may be able to increase this amount if you are not looking up data and have no complex update rules; if everything is mapped 1:1 (no transformations) between the two DSOs, then you can increase the size a lot (maybe 10M).
    Unless you have a large amount of memory, you will not be able to load this much data in one shot.
    Back in the 2.x days, it wasn't recommended to load more than a million records into a data target, so feel lucky that this limit has increased a lot!
    Brian

  • Data Loading Issue to DSO.

    hi experts,
    I am trying to load data to a DSO, say XBUG.
    It contains fields like PO number, item, company code, G/L account, and the key figure amount.
    The system behaves differently in each case.
    Note: there is a routine at the field level.
    1) I don't have any problem when I load record by record.
    2) If I load a chunk of the same records, it fails.
    3) Sometimes, when the chunk load fails, I break it up, e.g., 10 POs at a time, and then it goes fine.
    4) If it still fails, I apply a sort order and retry.
    But none of these cases is consistent; each day it behaves differently. One day it goes fine, the next time it fails.
    Please help me.
    regards
    Laxman

    First of all, I would like to know which error messages you received during the failures.
    When you load record by record, is it successful for all the records?
    My guess is that a few records are invalid and are not being identified by the system.
    Maybe you have not maintained a semantic key in the DTP, so no erroneous records are shown in the error stack?
    Also check that you have sufficient batch processes continuously available on your system to process all data packages together.
    I would also suggest you debug the load and find out where exactly it fails.

  • Data loads from 2 DSOs to 1 InfoCube

    Hi all,
    I am trying to update data from DSO1 (source 1: transaction data) to an InfoCube (target).
    In the transformation start routine, I have to read DSO2 (source 2: master data) for some fields.
    DSO1 has CUSTOMER as part of its key.
    DSO2 has CUSTOMER (key) and other fields: FIELD1, FIELD2, FIELD3.
    The InfoCube is to be updated with FIELD1, FIELD2, and FIELD3 by reading DSO2,
    where the DSO1 CUSTOMER matches the DSO2 CUSTOMER.
    Also, data should NOT be loaded into the InfoCube if FIELD1 in DSO2 is NULL.
    Please give me the ABAP code for the above logic.
    Appreciate any help in this regard.
    Thanks.
    Edited by: Aryaman Krishna on Aug 28, 2011 3:55 PM

    Hi,
    You will have to read the data of DSO2 in the start routine:
    1) Create an internal table matching the row type of DSO2's active table and load the data into it.

        " it_dso2/wa_dso2 should be declared in the global declaration part
        " of the routine class so that field routines can see them as well.
        " /BIC/ADSO200 stands for the active table of DSO2 (name assumed).
        IF source_package IS NOT INITIAL.  " FOR ALL ENTRIES on an empty table would select everything
          SELECT customer field1 field2 field3
            FROM /bic/adso200
            INTO TABLE it_dso2
            FOR ALL ENTRIES IN source_package
            WHERE customer = source_package-customer.
        ENDIF.

    Now you have all the data available in the internal table it_dso2, and you can use it in a field routine to skip records.
    In the field routine, simply check the required condition and raise the exception CX_RSROUT_SKIP_RECORD to skip a particular record.
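    A minimal sketch of such a field routine body for FIELD1 (the method frame and the source_fields/result parameters are generated by BW; table and field names follow the question):

        READ TABLE it_dso2 INTO wa_dso2
             WITH KEY customer = source_fields-customer.
        IF sy-subrc <> 0 OR wa_dso2-field1 IS INITIAL.
          " no lookup entry, or FIELD1 is empty: drop the whole record
          RAISE EXCEPTION TYPE cx_rsrout_skip_record.
        ENDIF.
        result = wa_dso2-field1.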
    Regards,
    Durgesh.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube, which should be loaded from the DSO, is not getting loaded.
    The DSO was loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    all data packages have a green light, and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has a blank 0CURRENCY for all these records.
    I tried to assign USD to them; the changes were saved, and I tried to execute again, but then the message says that the request ID must be repaired or deleted before execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before changing the records and then try to save the changes, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad....
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the "Update" tab, you will find the option "Error DTP". If it has not been created yet, you will see the option "Create Error DTP"; click there and execute the error DTP. The error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come in again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution.
    Regards,
    Debjani......

  • Data loading from DSO to Cube

    Hi,
    I have a question,
    In the book TBW10 I read about the data load from a DSO to an InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10, so if we send 10, -10, and 30 as the delta, shouldn't the total be 40 instead of 30?
    Can someone please explain?
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value, 30, will be added as the after image.
    So it will be: 10 - 10 + 30 = 30.
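    To illustrate with the change log records themselves (0RECORDMODE shown, keys omitted):

    Request   0RECORDMODE          Amount
    REQ1      N (new image)         10
    REQ2      X (before image)     -10
    REQ2      ' ' (after image)     30

    The cube simply adds up the delta: 10 - 10 + 30 = 30.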
    Thank-You.
    Regards,
    Vinod

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a data flow from 3.5 to 7.0 in development and moved it to production. So until now, the data loads in production happened using InfoPackages:
    a. InfoPackage 1 from the DataSource to the ODS, and
    b. InfoPackage 2 from the ODS to the cube.
    Now, after transporting the migrated data flow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from the PSA to the DSO.
    3. DTP2 to load from the DSO to the cube.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I try step b (above), it loads the cube fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load succeeds. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Data load to DSO

    Hi guys...
    Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume that I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
    1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID. How can I see the data request by request rather than all together?
    2. Suppose the DSO gets data as below:
    Fields in the DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]

    Data fed by DataSource 1:   X1  X2  Y1
                                a   b   10
    Data fed by DataSource 2:   X1  X2  Y2  Y3
                                a   b   20  30

    So when I load, I load the data in two requests, and these are the only two requests in my DSO. How will the data look in the DSO? Does it get stored in two separate rows or a single row? How is it shown in a query result?
    If the keys are not matched, how will the data be shown for key figures that were not loaded by that request?
    3. I know that in a DSO we have two options, overwrite/addition. How will the data load behave in the following situation?
    DataSource 1 feeds this in request 1:
    X1  X2  Y1
    a   b   10
    DataSource 2 feeds this in request 2:
    X1  X2  Y1  Y2  Y3
    a   b   30  40  50
    How will the result be shown for our two options, addition and overwrite? Will request 2 overwrite or add up the data in Y1?
    Thanks.

    Answering the questions in order:
    1. The request ID is only part of the new data table; after activation, the request information is lost. If you want to see what is happening, load your data request by request and activate the data after each request.
    2. If the keys are equal, you will have only one record in your DSO; if the keys do not match, you will have two records.
    3. If you choose overwrite, you will get 30; if you choose addition, you will get 40.
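    For illustration, the active table after both requests are activated (same key a/b in both):

    Mode        X1  X2  Y1  Y2  Y3
    Overwrite   a   b   30  40  50
    Addition    a   b   40  40  50

    With addition, Y1 = 10 + 30 = 40; Y2 and Y3 were initial before request 2, so they end up the same in both modes.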

  • Partial data loading from DSO to cube

    Dear All,
    I am loading data from a DSO to a cube through a DTP. The problem is that some records get added to the cube through data package 1 (around 50 crore records), while through data package 2 records are not getting added.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data package 1, and after that no records are added through data package 2; the request remains in yellow state only.
    Please suggest.

    Nidhuk,
    Data loads transfer package by package. Your description sounds like the load got stuck in the second package or something similar. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out over too many packages.
    Regards, Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Data load to DSO takes long time to finish

    Dear All,
    We have a data load from a DataSource to a standard DSO. The load takes 5 hours to complete 6,000 records in a single data package, which is a very long time.
    The process monitor shows yellow status for a long time at one step, "No message: Transformation End", and after approximately 5 hours it completes successfully.
    Please find the snapshot of the process monitor (attached file: Process monitor.png).
    There is an end routine, and the transformation has direct mapping except for one target object, exchange rate, which is a master data lookup on a DSO (attached file: Transformation rule.png).
    The lookup DSO /BI0/AFIGL_DS00 in the code below has DOCNUM as a primary key, but not POSKY. Since one of the fields is not a primary key, a secondary index was created on the lookup DSO. Still, it takes a huge amount of time to finish the last step mentioned in the snapshot.
    The setting for parallel processes is 1.
    DTP -> Update tab -> Error handling -> "No update, no reporting". There is also an error DTP present, which I believe is of no use when "No update, no reporting" is chosen.
    Can you please suggest the reason for such a long runtime, and also how to find the exact place where it consumes so much time?
    End routine Logic:
        IF NOT RESULT_PACKAGE IS INITIAL.
          REFRESH IT_FIG.
          SELECT DOCNUM POSKY DEBCRE LOCC
            FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE DOCNUM = RESULT_PACKAGE-BILNO
              AND POSKY = '02'.
          LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
            READ TABLE IT_FIG INTO WA_FIG WITH KEY
                       DOCNUM = <RESULT_FIELDS>-BILNO.
            IF SY-SUBRC EQ 0.
              <RESULT_FIELDS>-DEB = WA_FIG-DEBCRE.
              <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
            ENDIF.
          ENDLOOP.
        ENDIF.
    Thanks in advance
    Regards
    Pradeep

    Hi,
    Check the code below and try the load again. The main change is to sort the lookup table once and use a binary search in the READ, so the lookup no longer scans the whole internal table for every record:

        IF RESULT_PACKAGE IS NOT INITIAL.
          SELECT DOCNUM
                 POSKY
                 DEBCRE
                 LOCC
            FROM /BI0/AFIGL_DS00 INTO TABLE IT_FIG
            FOR ALL ENTRIES IN RESULT_PACKAGE
            WHERE DOCNUM = RESULT_PACKAGE-BILNO
              AND POSKY = '02'.
          " sort once so the READ below can use BINARY SEARCH
          SORT IT_FIG BY DOCNUM.
          LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
            READ TABLE IT_FIG INTO WA_FIG
                 WITH KEY DOCNUM = <RESULT_FIELDS>-BILNO
                 BINARY SEARCH.
            IF SY-SUBRC EQ 0.
              <RESULT_FIELDS>-DEB = WA_FIG-DEBCRE.
              <RESULT_FIELDS>-LOC_CURRC2 = WA_FIG-LOCC.
            ENDIF.
          ENDLOOP.
        ENDIF.

    If you still get an error, please let us know. Also:
    1. Decrease the data package size in the DTP to something like 10,000 or 20,000.
    2. Increase the number of parallel processes at the DTP level.
    Thanks,
    Phani.

  • Long time to load data from PSA to DSO - Sequential read on RSBKDATA_V

    Hi ,
    It is taking a long time to load data from the PSA to a DSO. It is doing a sequential read on RSBKDATA_V, and the table contains no data.
    We are at SAPKW70105. It started yesterday. There have been no changes to the system parameters.
    Please advise.
    Thanks
    Nilesh

    Hi Nilesh,
    I guess the following SAP Note will help you in this situation:
    SAP Note 1476842 - Performance: read RSBKDATA only when needed (https://websmp107.sap-ag.de/sap/support/notes/1476842)
    Just note that the reference to support packages in the note is wrong; the fix is also included in SAP_BW 701 SP 06.

  • Data load failed while loading data from one DSO to another DSO

    Hi,
    A data load failed on SID generation while loading data from a source DSO to a target DSO.
    The following error occurred:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I do not understand why the load succeeded for one DSO (the source) but failed for the other (the target).
    While analyzing, I found that "SID Generation upon Activation" is checked in the source DSO but not in the target DSO. Is that the reason it failed?
    Please explain.
    Thanks,
    Sneha

    Hi,
    I hope your data flow has been designed so that the first DSO acts as a staging layer, all transformation rules and routines are maintained between the first and second DSO, and "SID Generation upon Activation" is set on the second DSO. That way the data in your first DSO is the same as the source system data (since no transformation rules or routines are applied there), which helps avoid data load failures.
    Please analyze the following:
    Have you loaded master data before the transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether "SID Generation upon Activation" is maintained there (I guess it may not be).
    Go to the properties of the second DSO and check whether "SID Generation upon Activation" is maintained there (I expect it is).
    This may be the reason.
    Also check whether any special characters are involved in your transaction data (even lowercase letters).
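    If special characters turn out to be the cause, a field routine along these lines can clean the value before activation (a sketch only: the source field name is made up, and the permitted set shown is the usual RSKC default plus digits and capital letters):

        CONSTANTS lc_allowed TYPE string
          VALUE ` !"%&'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ`.
        DATA: lv_value(60) TYPE c,
              lv_len       TYPE i,
              lv_off       TYPE i.

        lv_value = source_fields-/bic/zextref.   " made-up source field name
        TRANSLATE lv_value TO UPPER CASE.        " lowercase is not allowed by default
        lv_len = strlen( lv_value ).
        lv_off = 0.
        DO lv_len TIMES.
          IF lv_value+lv_off(1) NA lc_allowed.   " character outside the permitted set
            lv_value+lv_off(1) = ' '.            " blank it out
          ENDIF.
          lv_off = lv_off + 1.
        ENDDO.
        result = lv_value.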
    Regards
    BVR
