Delta loading procedure from Write-Optimized DSO to InfoCube

Hi All,
We are using a Write-Optimized DSO in our project, to which I am loading data from the Standard DSO 0FI_GL_12.
From the Write-Optimized DSO, we are loading delta records into an InfoCube. Please provide your inputs on the following questions:
1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a Write-Optimized DSO, as there is no image concept in a Write-Optimized DSO.
Ex: If I am using a Standard DSO, we have the change log table, and the image concept allows the updated value to reach the cube.
Let us assume:
Active Table
111            50
111            70 (overwrite)
Change Log Table
111           -50    (X -- before image)
111            70    (' ' -- after image; the symbol for the after image is space)
So if we load this as a delta to the target, the above two records from the change log table get loaded to the CUBE, and the cube will have 70 as the updated value.
If I am using a Write-Optimized DSO:
Active Table
111            50
111            70 (overwrite)
When these records are loaded to the cube, since an InfoCube is always additive, the total value will be 50 + 70 = 120, which would be wrong.
Please correct me: what feature will work here to get the updated value of 70 into the cube from the Write-Optimized DSO?
2) As the DataSource is delta-capable and has an 'ADDITIVE' delta process, are only the delta records (based on request ID) loaded into the InfoCube, with the updated key figure value?
Thanks for your inputs; much appreciated.
Regards,
Madhu

Hi Madhu,
As a best practice, we use a WODSO in the initial layer and then a Standard DSO, purely for mass data loading/staging purposes.
Best practice: DataSource ----> WODSO ---> Std. DSO
Your case: DataSource ----> Std. DSO -----> WODSO
In both cases, if the data loads are not designed carefully, your cube will have incorrect entries.
For example: today 9 am: 111, 50 (in the active table).
Data load to the cube the same day at 11 am: the cube will then have 111  50.
Same day at 1 pm, the value changes in the Std. DSO: 111  70 (overwrite at the active table).
If you load data to the cube the same day or the next day, it will have two records, one with value 50 and the other with 70. To avoid such scenarios, we should plan the loads carefully, or else change your DTP setting to 'Source table: Change log table'.
Coming to your case:
After the load to the Std. DSO, load the data to the WODSO with the DTP setting 'Delta Init. Extraction from': Change Log.
Now the data in the WODSO comes from the change log table, and you can load it to the cube in delta mode.
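
To see why this yields the correct value in an additive cube: the change log carries the before image (-50) and the after image (+70), which net against the 50 already loaded into the cube. A minimal sketch of that arithmetic (a hypothetical standalone report, not actual BW transformation code):

REPORT zdemo_changelog_delta.

* Hypothetical sketch: the before/after images from the change log
* net to the correct value in a purely additive InfoCube.
DATA: lv_cube   TYPE p DECIMALS 2 VALUE '50.00',  "loaded earlier from the first delta
      lv_before TYPE p DECIMALS 2 VALUE '-50.00', "before image, record mode 'X'
      lv_after  TYPE p DECIMALS 2 VALUE '70.00'.  "after image, record mode ' '

* The cube simply adds every incoming key figure value.
lv_cube = lv_cube + lv_before + lv_after.

WRITE: / 'Cube value for key 111:', lv_cube.  "-> 70.00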

Similar Messages

  • Unable to delete request from write-optimized DSO (Error during rollback)

    Hi Gurus,
    I am trying to delete a delta request from a Write-Optimized DSO. This request was uploaded with a DTP from another Write-optimized DSO.
    The current overall status of the request is RED, and the description of that status is now: 'Error during rollback of request DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR; only rollback allowed'.
    I checked the log of all request operations in the DataStore (from the same line where the red request is now), and I see several of my attempts to delete this request under a RED radio button with the title Rollback. The details for this error are the following:
    Could not delete request data from active table
    Message no. RSODSO_ROLLBACK114
    Diagnosis
    The system could not delete the request data from the active table of a write-optimized DataStore object.
    System Response
    Write-optimized DataStore object: DTFISO02
    Active table: /BIC/ADTFISO0200
    Request: DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR
    Procedure
    Search for Notes containing the key words "Delete write-optimized DSO PSA"
    I am relatively new to SAP BI 7.0 and I do not know how to delete this request. Any help will be highly appreciated!
    Leticia

    Hi Leticia:
    Take a look at the SAP Notes below.
    Note 1111065 - "701: Delta consistency check for write-optimized DSOs"
    Note 1263877 - "70SP20: Delta consistency check for write-optimized DSOs"
    Note 1125025 - "P17:PSA:DSO:ODSR missing in PSA process for write-opt. DSO"
    Additionally, some ideas from the alternative presented in the blog by KMR might help you:
    "How to generate a selective deletion program for info provider"
    Regards,
    Francisco Mílán.

  • Problem loading data into write-optimized DSO

    Hi,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I have changed a normal DSO into a Write-Optimized DSO. I have 2 DataSources to be loaded into the write-optimized DSO: one for Demand and one for Inventory.
    The loading of the Demand data from the PSA to the DSO works fine without any error.
    Now, while loading the Inventory data from the PSA to the DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, transformation, and DTP and tried to load the data again into the write-optimized DSO, but no luck; I always get the above error message.
    Can someone please suggest what could be done to avoid the above error messages and load the data successfully?
    Thanks in advance.

    Hi,
    Check the transformations: are there any routines written there?
    Check the data structures of the cube and the DataSource as well.
    Are there any changes in the structure?
    Data will load up to the PSA normally; if there are any changes in the structure of the DSO, then this error may occur. Just check it.
    Check the blog below:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I do a data load for a Write-Optimized DSO, I am getting the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant & Billing Item as the semantic key in the DSO.
    For this DSO, I am getting data from a test ECC system, in which most of the Sales Document Number column is blank for this DataSource.
    When I go into the error stack of the DSO, all the rows that have a blank Sales Document Number are displayed. For all these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in other threads that a Write-Optimized DSO doesn't care about the key values; it loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, the data gets aggregated depending on the key figure setting in the transformation. The two options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document Number is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the SDSO it might not throw an error.
    Check the KF value in the SDSO for that Sales Document Number and Item Number and try to find out what the value is. It may be the sum of all the common records or the KF value of the last common record.
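
    For illustration, 'add up' behaves like ABAP's COLLECT on an internal table, while 'replace' behaves like MODIFY TABLE. Here is a minimal sketch (a hypothetical report with invented field names, not the actual transformation logic):

    REPORT zdemo_kf_aggregation.

    * Hypothetical sketch: two records sharing a blank document number and
    * item 10 are aggregated according to the key figure option.
    TYPES: BEGIN OF ty_rec,
             docno  TYPE c LENGTH 10,  "key field (blank in the error rows)
             itemno TYPE n LENGTH 6,   "key field (10 for all rows)
             amount TYPE p DECIMALS 2, "key figure
           END OF ty_rec.

    DATA: lt_data TYPE STANDARD TABLE OF ty_rec,
          ls_rec  TYPE ty_rec.

    ls_rec-docno  = ''.
    ls_rec-itemno = '000010'.
    ls_rec-amount = '50.00'.
    COLLECT ls_rec INTO lt_data.       "option 1 'add up': amount = 50
    ls_rec-amount = '70.00'.
    COLLECT ls_rec INTO lt_data.       "same key, so amount becomes 120

    * Option 2 'replace' corresponds to overwriting instead of summing:
    ls_rec-amount = '70.00'.
    MODIFY TABLE lt_data FROM ls_rec.  "amount for this key becomes 70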
    Regards
    Raj Rai

  • Init from Write optimized DSO

    Hello,
    I have a requirement wherein a WDSO (Write-Optimized DSO) gets each day's delta, on which we apply all business rules. We then need to load this delta into a cube in the analytical layer, which is in a different system.
    Questions:
    1. Can we do an init without data transfer from the WDSO to the cubes while using update rules from the export DataSource of this WDSO to the cubes, and then run delta loads subsequently?
    2. Can we do an init without data transfer from the WDSO to the cubes while using a DTP from this export DataSource to the cubes in a different system, and run delta loads subsequently?
    As mentioned, please be aware that the source WDSO and the cubes are in two different systems.
    thanks in advance for the help...
    Kumar

    Hello Chetan,
    Really appreciate your help on this.
    I guess I am failing to explain my requirement.
    The process you suggested works fine if it is a full load from the source WDSO to the cubes in the other system, but for that I have to drop and reload the delta in the WDSO every day.
    My actual requirement is to keep adding new delta requests into the WDSO every day, and then pull ONLY the DELTA records (the requests loaded each day) into the cubes across systems.
    So in order to achieve this, we have to load from the WDSO to the PSA in the other system (for doing the init or delta), and then pull the data into the cubes using a DTP.
    As said earlier, while doing this the init from the WDSO to the PSA using an InfoPackage works fine, but no subsequent deltas are getting pulled, even though new records are added to the WDSO.
    Hope it makes sense...
    This makes me wonder whether we can load deltas from a WDSO at all. I appreciate your help.
    thanks
    Kumar

  • Deletion of requests from Write optimized DSO

    Hi all,
    I tried using the program RSSM_DELETE_WO_DSO_REQUESTS via a process chain to delete requests older than 60 days.
    But it is not removing all requests older than 60 days. Quite a few are not getting deleted, and even on a second run the program deletes only one request at a time. I am not sure why it behaves this way.
    Is there any other alternative, such as 'Deletion of WDSO (Write-Optimized DataStore object) load requests and active table data without deleting it from the target'?
    This also seems to work...?
    Also, how should I ensure uniform retention timelines for the PSA and the change log of this DSO flow? Is retaining the PSA for 15 days and the change log for 90 days fine? I want to ensure that no data that was deleted in the WDSO comes in again.

    Can someone suggest, please?

  • Request Deletion from Write-optimized DSO

    Hello,
    With the new write-optimized technology, it is possible to manually delete "older" requests from a W-O DSO.
    Could any of you think of an automated process to delete "old" requests from a W-O DSO (not the entire content; the most recent requests should still be available)?
    For instance: delete, every day, the requests older than 7 days.
    Solutions already checked:
    - Selective deletion at the administration level of the DSO -> cannot be repeatedly scheduled
    - Copying the generated selective deletion program to make one's own program and scheduling it (the system cannot "remember" the generated program)
    - Various SAP function modules -> do not work for this scenario (like RSSM_DELETE_REQUEST, which is only for cubes, or RSSM_PROCESS_REQUDEL_ODSO, where you need to specify the request number)
    - We do not want to include a delete job in a routine at the transformation level.
    - We do not want to complicate the data model by creating a new intermediate DSO allowing us to flush the DSO at each load.
    Any other ideas??
    We are on Version 7.0, SP 13.
    Many thanks!
    amanda

    Hi,
    We have worked on a similar business requirement. We wrote a report program in SE38 and run it via a process chain.
    If you want, I can help you write the code for it.
    It is a two-step process:
    1. Delete the request from RSICCONT so that it disappears from the Manage tab.
    2. Delete the data from the active table of that WDSO.
    Code for the program:
    DATA: v_time TYPE c LENGTH 17,
          v_date LIKE sy-datum.

    * N = 7: put the number of days here; records older than this get deleted
    v_date = sy-datum.
    SUBTRACT 7 FROM v_date.
    CONCATENATE v_date sy-uzeit INTO v_time.

    * Step 1: remove the request entries (replace wdso_name with your WDSO's technical name)
    DELETE FROM rsiccont WHERE icube = wdso_name AND timestamp LT v_time.
    * Step 2: remove the data from the active table (replace with your active table's name)
    DELETE FROM wdso_active_table_name WHERE rstt_tsmp LT v_time.
    It will do the job.
    Regards ,
    Jaya

  • Error while loading data from write optimized ODS to cube

    Hi All,
    I am loading data from a write-optimized ODS to a cube.
    I have done Generate Export DataSource
    and scheduled the InfoPackage with one selection for a full load.
    It then gave me the following errors in Transfer IDocs & tRFC:
    Info IDOC 1: IDOC with errors added
    Info IDOC 2: IDOC with errors added
    Info IDOC 3: IDOC with errors added
    Info IDOC 4: IDOC with errors added
    Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
    Processing below is green
    and shows an update of 4 new records to data package 1.
    Please provide inputs for the resolution
    Thanks & Regards,
    Rashmi.

    Please let me know: what more details do you need?
    If I press F1 for the error details, I get the following message:
    Messages from source system
    see also Processing Steps Request
    These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
    From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
    Thanks & Regards,
    Rashmi.

  • Duplication Error while loading data in write optimized DSO

    Hi Experts,
    I have an issue. In BI 7 I'm trying to load data into a WRITE-OPTIMIZED ODS from the Controlling DataSource 0CO_OM_CCA_10. I'm getting the data properly in the PSA, but while loading it into my WODSO I'm getting the duplication error, although my key fields and data fields are properly placed in my data target (WODSO).
    Please let me know the solution to load it successfully.
    Thanks in Advance.
    Amit

    Hi,
    Thanks for your reply.
    I'm getting this error message:
    Diagnosis
        During loading, there was a key violation. You tried to save more than
        one data record with the same semantic key.
        The problematic (newly loaded) data record has the following properties:
        o   DataStore object: GWFDSR02
        o   Request: DTPR_4BA3GF8JMQFVQ8YUENNZX3VG5
        o   Data package: 000006
        o   Data record number: 101
    Although I have selected the key fields that identify a unique record, I'm still getting the duplication error.
    I have even referred to the BI Content for this DataSource and found that it has the same key fields as mine.
    Debjani: I need unique records without duplication, and I'm doing a full load in the DTP.
    What is to be done? Please help.
    Thanks in advance.
    Edited by: Amit Kotwani on Sep 26, 2008 10:56 AM

  • Write-optimized DSO to Standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow whose first layer is a write-optimized DSO, from which we load the data into a Standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    Suppose I have 2 requests that need to be updated to the Standard DSO from the Write-Optimized DSO; it updates them in a single request. We are on SAP BW 701, patch level 005.
    Please suggest.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13:
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, then once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not set the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
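
    Conceptually, the indicator turns one DTP execution into a loop over the source's outstanding requests. A rough, purely illustrative sketch (a hypothetical report; the string table merely stands in for the source's request queue, and none of this is a real BW API):

    REPORT zdemo_request_by_request.

    DATA: lt_new_requests TYPE STANDARD TABLE OF string,
          lv_request      TYPE string.

    APPEND 'REQU_1' TO lt_new_requests.
    APPEND 'REQU_2' TO lt_new_requests.  "arrived while REQU_1 was processed

    * With the indicator set, the DTP keeps generating requests
    * until no new source request is left.
    WHILE lines( lt_new_requests ) > 0.
      READ TABLE lt_new_requests INDEX 1 INTO lv_request.
      DELETE lt_new_requests INDEX 1.
      WRITE: / 'Generated one DTP request for source request', lv_request.
    ENDWHILE.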
    Regards,
    Sudheer.

  • Unable to delete data target contents of Write-Optimized DSO in Process Chain

    Hi Experts,
    We are using SAP NetWeaver BW version 7.01, and we need to delete the entire data target contents of a Write-Optimized DSO in the process chain before the next data load.
    I included this step in the process chain, but it keeps failing with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
    This process type worked in BW version 7.0 but not in BW 7.01.
    However, I found that we can use the program RSSM_DELETE_WO_DSO_REQUESTS to delete old requests in a Write-Optimized DSO as of BW 7.01 SP07, per SAP Note 1437407. But it is still not working even after implementing this program, as the prerequisite for deleting a request is that the data mart status be updated, which is not happening for the program.
    There is a process type option to delete requests from a Write-Optimized DSO directly in BW 7.3, but it is still not available in version 7.01.
    Could you please suggest how to resolve this issue in BW 7.01?
    Many thanks for your help in advance.
    Regards,
    Madhu

    Create an ABAP program along the lines of the attached code.
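
    A minimal sketch of such a program, in the spirit of Jaya's report above (the table and field names are assumptions and must be verified in your system):

    REPORT zdel_wdso_requests.

    * Sketch only: delete the request administration entries and the
    * active-table data older than p_days for one write-optimized DSO.
    PARAMETERS: p_dso(30) TYPE c,            "technical name of the WDSO
                p_days    TYPE i DEFAULT 7.  "retention period in days

    DATA: v_date TYPE sy-datum,
          v_time TYPE c LENGTH 17,
          v_tab  TYPE c LENGTH 30.

    v_date = sy-datum - p_days.
    CONCATENATE v_date sy-uzeit INTO v_time.
    CONCATENATE '/BIC/A' p_dso '00' INTO v_tab.  "name of the active table

    * Step 1: remove the request entries so they vanish from the Manage tab
    DELETE FROM rsiccont WHERE icube = p_dso AND timestamp LT v_time.
    * Step 2: remove the data from the active table (dynamic table name)
    DELETE FROM (v_tab) WHERE rstt_tsmp LT v_time.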
    Then you can use that ABAP program in process chains through an ABAP variant.
    The ABAP variant should have the following properties:
    Select the call mode as Synchronous, call from Local, and Program.
    Give your ABAP program name in "program name" and create one program variant for each write-optimized DSO.
    Please refer to the how-to guide on using ABAP programs in process chains for further details.
    Hope this helps

  • Load Data with 7.0 DataSource from Flat File to Write Optimized DSO

    Hi all,
    We have a problem loading data from a flat file using a 7.0 DataSource.
    We have to load a (monthly) flat file into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads the data in delta mode from the DataSource into the WO DSO.
    When I load the second file into the DataSource, the DTP loads all the data present in the DataSource and not only the new data, as expected with delta mode.
    Does anyone have any tips to help me?
    Thank you for help.
    Regards
    Emiliano

    Hi,
    I am facing a similar problem.
    I am using a Write-Optimized DSO and have got only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
    When I do a delta load from the PSA to the DSO, I expect to see only that 1 request get loaded into the DSO.
    But it is picking up the data from 3 other requests and doubling the records...
    Can you please help me: how did you manage to get out of that issue?
    Cheers,
    Nisha

  • Testing Delta Consistency for write-optimized DSO

    Hi,
    Please let me know how to carry out testing of the delta consistency for a write-optimized DSO.
    Is it just checking whether the "Check Delta Consistency" checkbox is set in the settings, or is there more to it?
    Please do let me know.
    Thanks & Regards,
    Lavanya.

    Hi,
    The delta consistency flag for a W-O DSO is used to ensure data consistency between the W-O DSO and any target it is propagated to. For example, you have the data flow ZWDSO1 -> ZSDSO1. ZWDSO1 has the flag 'check delta consistency' set to ON. You load data from ZWDSO1 to ZSDSO1 via request 1.
    When the load is successful, if you try to delete request 1 from ZWDSO1, the deletion should not be possible. You will also not be able to change the flag to OFF, as the delta has already been propagated, and if you changed the flag, delta consistency might be lost!
    Now, if you delete request 1 from ZSDSO1 first and then try to delete request 1 from ZWDSO1, the deletion should be possible.
    Hope the explanation helps!
    Regards,
    Rakesh

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the DTP functionality "all new records", but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have been changed: if you use the option to delete the content of a PSA table via the process chain, it will fail when the DataSource is changed, due to a newly generated PSA table ID.
    Regards,
    Harald

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
    I am using a write-optimized DSO, ZMYDSO, to store data from several sources (two DataSources and one DSO).
    I have disabled the uniqueness check in the DSO, but I defined a semantic key on the fields ZCLIENT, ZGUID, ZSOURCE, and ZPOSID, which are used in a non-unique index.
    In the given case, I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abstract coding:
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
    " some other logic [...]
      DELETE FROM /BIC/AZMYDSO00
        WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
          AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
          AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
    ENDLOOP.
    COMMIT WORK AND WAIT.
    During the loading (after the transformation step, in the update step), I get the following messages (not every time):
    1. Error while writing the data (RSAODS131)
    2. Could not save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027).
    Diagnosis: DataPackage XY could not be saved. Reasons could be a violation of key uniqueness (duplicate data) or a general database error.
    3. Error in the substep of updating the DataStore.
    I have checked the system log (SM21) and the system dumps (ST22), but I could not find an exact error description.
    I guess I am creating some inconsistencies or locks (I also checked SM12), so that the load process gets interrupted. But I also tried serial updating within the DTP (I reduced the number of batch processes to 1), with no success.
    Perhaps the loading of one specific package takes longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against deleting rows within the end routine?
    Regards,
    Philipp

    Hi,
    Is ZMYDSO the name of the DSO?
    And is this the end routine of the transformation that loads the same DSO?
    If so, we never do such a thing.
    You are comparing the DSO with the data that is flowing in and then deleting the data from the DSO...
    This doesn't actually make any sense, because while loading data to a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against any modification of its data. You can only read data from it.
    If your requirement is that existing duplicate records should not arrive in the DSO again, then you can delete the data from SOURCE_PACKAGE in the start routine, like below:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE WHERE <CONDITION>.
    LOOP AT INTERNAL_TABLE.
      DELETE SOURCE_PACKAGE
        WHERE /BIC/ZCLIENT = INTERNAL_TABLE-/BIC/ZCLIENT
          AND /BIC/ZGUID   = INTERNAL_TABLE-/BIC/ZGUID
          AND /BIC/ZSOURCE = INTERNAL_TABLE-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = INTERNAL_TABLE-/BIC/ZPOSID.
    ENDLOOP.
    Or, if your requirement is that you need to replace the old data from the DSO for the same keys that are newly arriving, in order to load the new data into the DSO, you could do something like this in the start routine:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /BIC/ZCLIENT = SOURCE_PACKAGE-/BIC/ZCLIENT
        AND /BIC/ZGUID   = SOURCE_PACKAGE-/BIC/ZGUID
        AND /BIC/ZSOURCE = SOURCE_PACKAGE-/BIC/ZSOURCE
        AND /BIC/ZPOSID  = SOURCE_PACKAGE-/BIC/ZPOSID.
    * Now update the new values you want to write in the loop.
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
    " code for manipulation of WORK_AREA
    * Write a MODIFY statement to update the package (this is a start
    * routine, so the package to change is SOURCE_PACKAGE).
      MODIFY SOURCE_PACKAGE FROM WORK_AREA TRANSPORTING FIELDS
        WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
          AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
          AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
    ENDLOOP.
    hope it helps,
    Regards,
    Joe
