Can a write-optimized DSO be used for Delta upload

Hi,
Can anyone please answer the following:
1. Can a write-optimized DSO be used for delta upload?
2. Is industry-based content available in BI Content?
Thanks & Regards,
Satya

Hi,
A write-optimized DataStore object does not support image-based delta; it supports request-level delta, and you get a brand-new delta request for each data load.
Since write-optimized DataStore objects do not have a change log, the system does not create a delta in the sense of a before image and an after image. When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
So: a write-optimized DataStore object supports request-level delta. To capture before- and after-image delta, you must post the latest request into further targets such as a standard DataStore object or an InfoCube.
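For context, here is what that means at the table level: a write-optimized DSO stores its data in a single active table (name pattern /BIC/A<name>00) whose technical key is the request, data package and record number (0REQUEST, 0DATAPAKID, 0RECORD); the semantic key is not part of the table key. A minimal sketch of what request-level delta amounts to, with an illustrative table name:
" Request-level delta: a delta DTP reads only the requests that have
" not yet been posted on - there are no before/after images.
DATA: lv_request TYPE /bic/azwodso0100-request,        " ID of the new load request
      lt_delta   TYPE STANDARD TABLE OF /bic/azwodso0100.
SELECT * FROM /bic/azwodso0100                         " illustrative table name
  INTO TABLE lt_delta
  WHERE request = lv_request.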

Similar Messages

  • How can I write the below code using "For all entries"

    Hi,
    How can we write the below code using "for all entries", avoiding the joins?
    Please help.
    SELECT a~aufnr a~objnr a~auart a~txjcd a~pspel
           a~gstrp a~werks c~arbpl c~werks
      INTO TABLE t_caufv
      FROM caufv AS a
      INNER JOIN afih AS b
        ON a~aufnr = b~aufnr
      INNER JOIN crhd AS c
        ON b~gewrk = c~objid
       AND c~objty = 'D'
      WHERE ( a~pspel = space
           OR a~txjcd = space
           OR NOT a~objnr IN
              ( SELECT objnr FROM cobrb AS e
                 WHERE e~objnr = a~objnr ) )
        AND a~werks IN s_plant
        AND a~auart IN s_wtype
        AND NOT a~objnr IN
            ( SELECT objnr FROM jest AS d
               WHERE d~objnr = a~objnr
                 AND ( d~stat = 'A0081' OR d~stat = 'A0018' )
                 AND d~inact <> 'X' ).
    Reward points for all helpful answers.
    Thanks
    Ammi.

    Hi,
    SELECT objnr objid aufnr
      FROM afih
      INTO TABLE t_afih.

    SELECT objnr
      FROM jest
      INTO TABLE t_jest
      WHERE ( stat = 'A0045'
           OR stat = 'A0046' )
        AND inact <> 'X'.

    SELECT objnr
      FROM cobrb
      INTO TABLE t_cobrb.

    SELECT arbpl werks objid objty
      FROM crhd
      INTO TABLE it_crhd
      FOR ALL ENTRIES IN t_afih
      WHERE objty EQ 'D'
        AND gewrk = t_afih-objid.

    SELECT aufnr objnr auart txjcd pspel gstrp werks
      FROM caufv
      INTO TABLE t_caufv
      FOR ALL ENTRIES IN t_afih
      WHERE aufnr = t_afih-aufnr
        AND pspel = ' '
        AND txjcd = ' '
        AND auart IN s_wtype
        AND werks IN s_plant.

    Don't put NE conditions against other internal tables in the SELECT statements; that is not valid in Open SQL and may affect performance as well. Instead, filter with IF statements inside a loop:
    LOOP AT t_caufv.
      READ TABLE t_cobrb WITH KEY objnr = t_caufv-objnr.
      IF sy-subrc <> 0.
        READ TABLE t_jest WITH KEY objnr = t_caufv-objnr.
        IF sy-subrc <> 0.
          " proceed further - the order is in neither COBRB nor JEST
        ENDIF.
      ENDIF.
    ENDLOOP.
    Hope this helps.
    Reward if useful.
    Regards,
    Anu
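    One general FOR ALL ENTRIES caveat worth adding (standard ABAP behaviour, not specific to this report): if the driver table is empty, FOR ALL ENTRIES selects every row of the database table, and duplicate rows are removed from the result set. A minimal guard, reusing the names above:
    IF t_afih[] IS NOT INITIAL.
      " Only run FOR ALL ENTRIES with a filled driver table
      SELECT arbpl werks objid objty
        FROM crhd
        INTO TABLE it_crhd
        FOR ALL ENTRIES IN t_afih
        WHERE objty = 'D'
          AND gewrk = t_afih-objid.
    ENDIF.
    SORT it_crhd BY objid.   " enables READ TABLE ... BINARY SEARCH in the loop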

  • Duplicate Semantic Key in Write Optimized DSO

    Gurus
    When a write-optimized DSO is used, the semantic key fields only get a unique index (KEY) if the system checks the uniqueness of the data (i.e. if the 'Do not check uniqueness of data' indicator is not set).
    See help https://help.sap.com/saphelp_crm60/helpdata/en/a6/1205406640c442e10000000a1550b0/frameset.htm
    With that indicator set, the DSO can contain duplicate records.
    My question is: what happens to these duplicates when a request-level delta update is done to a standard DSO or an InfoCube?
    Do duplicates end up in the error stack? Or are they simply aggregated in further loads? That would be a problem for reporting (double counting).
    thanks
    tony

    Hi Tony,
    It will aggregate the data in an undesired way: if the write-optimized DSO holds the same record twice, a further load (or a BEx query on the DSO) simply adds the key figures up, so a value of 50 stored twice reports as 100.
    Read on...
    https://help.sap.com/saphelp_crm60/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    If you want to use write-optimized DataStore objects in BEx queries, we recommend that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
    Hope it helps...
    Regards,
    Ashish

  • Problem loading data into write optimized dso.....

    Hi,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I have changed a normal DSO into a write-optimized DSO. I have 2 DataSources to be loaded into the write-optimized DSO: one for Demand and one for Inventory.
    The loading of Demand data from PSA to DSO works fine without any error.
    Now while loading the Inventory data from PSA to DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, transformation and DTP and tried to load the data again into the write-optimized DSO, but no luck; I always get the above error message.
    Can someone please suggest what could be done to avoid the above error messages and load the data successfully?
    Thanks in advance.

    Hi,
    Check the transformation: are any routines written in it?
    Check the data structure of the target and of the DataSource as well: have there been any changes to the structure?
    Data will load normally up to the PSA; if the structure of the DSO has changed since the last activation, this error can occur. Just check it.
    Check the below blog:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data from the standard DSO fed by 0FI_GL_12.
    From the write-optimized DSO we are loading delta records into an InfoCube. Please provide your inputs on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a write-optimized DSO, as there is no image concept in a write-optimized DSO.
    Example: with a standard DSO we have a change log table, and the image concept delivers the updated value to the cube.
    Let us assume:
    Active table
    111   50
    111   70 (overwrite)
    Change log table
    111   -50   ('X' -- before image)
    111    70   (' ' -- after image; the symbol for the after image is space)
    So if we load this to the target as a delta, the above two records from the change log table are loaded to the cube, and the cube ends up with 70 as the updated value.
    If I am using a write-optimized DSO:
    Active table
    111   50
    111   70 (overwrite)
    When these records are loaded to the cube, the InfoCube is always additive, so the total value will be 50 + 70 = 120, which is wrong.
    Correct me: what feature will work here to get the updated value of 70 into the cube from the write-optimized DSO?
    2) As the DataSource is delta-capable with an ADDITIVE delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs, much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a WODSO in the initial layer, followed by a standard DSO, purely for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ----> standard DSO
    In your case: DataSource ----> standard DSO ----> WODSO
    In both cases, if the data load design is not accurate, your cube will have incorrect entries.
    For example: today at 9 am: 111, 50 (in the active table).
    Data load to the cube the same day at 11 am: the cube will have 111, 50.
    The same day, the value changes in the standard DSO at 1 pm: 111, 70 (overwrite in the active table).
    If you load data to the cube the same day or the next day, it will have 2 records, one with value 50 and the other with 70. To avoid such scenarios, either plan the loads carefully or change your DTP settings to use the change log table as the source.
    Coming to your case:
    After the load to the standard DSO, load the data to the WODSO with the DTP setting 'Delta Init. Extraction from' set to Change Log.
    Now the data in the WODSO comes from the change log table, and you can load it to the cube in delta mode.
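    To make that concrete with the numbers from the question: reading from the change log, the cube receives both images, -50 (before) and +70 (after), so the additive total becomes 50 + (-50) + 70 = 70, the current value. Reading from the active table instead would add 70 on top of the 50 already loaded and yield the wrong total of 120.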

  • Unable to delete request from write-optimized DSO (Error during rollback)

    Hi Gurus,
    I am trying to delete a delta request from a write-optimized DSO. This request was uploaded with a DTP from another write-optimized DSO.
    The current overall status of the request is RED, and the description of that status is now: 'Error during rollback of request DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR; only rollback allowed'.
    I checked the log of all request operations in the DataStore (from the same line where the red request is now) and I see my several attempts to delete this request under a RED radio button with the title Rollback. The details for this error are the following:
    Could not delete request data from active table
    Message no. RSODSO_ROLLBACK114
    Diagnosis
    The system could not delete the request data from the active table of a write-optimized DataStore object.
    System Response
    Write-optimized DataStore object: DTFISO02
    Active table: /BIC/ADTFISO0200
    Request: DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR
    Procedure
    Search for Notes containing the key words "Delete write-optimized DSO PSA"
    I am relatively new to SAP BI 7.0 and I do not know how to delete this request. Any help will be highly appreciated!
    Leticia

    Hi Leticia:
    Take a look at the SAP Notes below.
    Note 1111065 - "701: Delta consistency check for write-optimized DSOs"
    Note 1263877 - "70SP20: Delta consistency check for write-optimized DSOs"
    Note 1125025 - "P17:PSA:DSO:ODSR missing in PSA process for write-opt. DSO"
    Additionally, some ideas from the alternative presented in the blog below by KMR might help you.
    "How to generate a selective deletion program for info provider"
    Regards,
    Francisco Mílán.

  • 4 LSA architecture - EDW layer - write optimized DSO settings

    Dear Colleagues,
    I have a question regarding the 4-layer LSA architecture described in the following article:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/306f254c-1e3f-2d10-9da0-bcff4e35e0ef
    When we activate an SAP Business Content dataflow, we may want to add a write-optimized DSO in the EDW layer to get a faster upload from the source system into SAP BI.
    Should we always set the 'do not check uniqueness of data' indicator in the write-optimized DSO to avoid data upload errors?
    Based on your experience, what would be your recommendation?
    Cheers,

    I would suggest switching the indicator ON:
    - If the indicator is ON, it allows loading several records with the same semantic key, which are then resolved in the next layer.
    - If the indicator is OFF, it will not allow loading the same record twice and throws an error.
    Have you seen this?
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso

  • Semantic group for a write-optimized DSO

    Hello,
    I am loading from a DataSource to a write-optimized DSO. While doing so, I am not able to set a semantic key/group for the DTP. I have turned error handling on using the setting 'Valid records update, no reporting (request RED)'.
    Is this setting not possible for a write-optimized DSO? If not, why not? And if yes, what am I missing?
    Thanks a lot in advance,
    Prakash

    Thank you,
    here are my findings:
    - If data has already been loaded with delta via the DTP, it is not possible to change the semantic groups any more; all requests have to be deleted first.
    - Error handling must be turned on.
    The issue is resolved.
    Marek

  • Data archiving for Write Optimized DSO

    Hi Gurus,
    I am trying to archive data in a write-optimized DSO.
    It allows me to archive on a request basis, but it archives entire requests in the DSO (that is, all the data).
    What I want is to select which requests to archive (a request selection of my own).
    Please guide me.
    I found the details below on SDN; kindly check:
    Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving of standard DSOs. This means that partial request archiving is not possible; only complete requests can be archived.
    The characteristic for the time slice can be a time characteristic present in the WDSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
    The actual process of archiving remains the same, i.e.:
    Create a data archiving process
    Create and schedule archiving requests
    Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
    Please check the below OSS Note:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Testing Delta Consistency for write-optimized DSO

    Hi,
    Please let me know how to carry out testing of delta consistency for a write-optimized DSO.
    Is it just a matter of checking whether the 'Check Delta Consistency' box is set under the settings, or is there more to it?
    Please do let me know.
    Thanks & Regards,
    Lavanya.

    Hi,
    The delta consistency flag for a write-optimized DSO is used to ensure data consistency between the WO DSO and any target it propagates to. For example, take the dataflow ZWDSO1 -> ZSDSO1, where ZWDSO1 has the 'check delta consistency' flag set to ON, and you load data from ZWDSO1 to ZSDSO1 via request 1.
    When the load is successful and you try to delete request 1 from ZWDSO1, deletion should not be possible; you will also not be able to switch the flag OFF, because the delta has already been propagated, and changing the flag could break delta consistency.
    Now, if you first delete request 1 from ZSDSO1 and then try to delete request 1 from ZWDSO1, deletion should be possible.
    Hope the explanation helps!
    Regards,
    Rakesh

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear BWers,
    I am facing a strange error while using process chains to delete data from a data target that is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • Semantic keys for write optimized DSO

    Hi experts,
    Can anyone tell me more about the semantic key in a write-optimized DSO?
    I have a DSO concerning sales orders with 3 DataSources.
    How should the semantic key be defined?
    I have schedule line number, document number and position number in the semantic key, but when I load from the PSA I get an error with duplicate data.
    Any clues?
    Thanks.

    Hi Oliver,
    If you specify characteristics as semantic fields, then when an error record occurs during the load, the subsequent records with the same semantic (characteristic) combination will not be updated into the DSO even if they are correct; they are written to the error stack to ensure data quality.
    In a write-optimized DSO the technical key fields are generated automatically; in the semantic fields you specify the characteristics that should act as primary keys (though not exactly as such).
    Thanks
    Sreekanth.

  • Write-optimized DSO - Standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow in which the 1st layer is a write-optimized DSO, from which we load data to a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    Suppose I have 2 requests that need to be updated from the write-optimized DSO to the standard DSO: they are updated in a single request. We are on SAP BW 701, patch level 005.
    Please suggest.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13.
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, then once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
    Regards,
    Sudheer.

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the DTP option "all new records", but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work the way it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have changed: if you use the option to delete the content of a PSA table via a process chain, it will fail when the DataSource is changed, due to a newly generated PSA table ID.
    Regards,
    Harald

  • Update of write optimized DSO by csv file

    Hi Gurus,
    I have observed a few things while trying to upload a write-optimized (WO) DSO from a flat file in BI 7.0. The data flow is as follows:
    DataSource -> transformation -> data target (DSO - WO).
    From a test perspective, I updated 5 records up to the PSA with an InfoPackage and subsequently updated them to the DSO using a DTP. When I checked in Manage, I found records transferred = 5 and added = 5, which is okay.
    Then I updated the same 5 records to the PSA again and triggered the DTP. Now the DTP brings 10 records (5 + 5): records transferred = 10 and added = 10. When I checked the header tab in the DTP monitor, I found that the selection includes both PSA request IDs.
    Again I loaded the same 5 records to the PSA; a new PSA request ID was generated, and the DTP extracted this PSA ID along with the 2 old ones already transferred. Now records transferred = 10, added = 15. Why transferred 10? I am getting confused here: I expected it to follow the same pattern and show transferred = 15 and added = 15. There is no routine that generates additional records, so that is not possible at all. Has anybody observed this strange behaviour?
    Why is this happening? I expected it to bring only 5 records every time. Is it something specific to write-optimized DSOs, or am I doing something wrong here? Is there any setting that tells it not to select the old PSA requests that were already updated to the DSO?
    Please clarify.
    Soumya

    Is your DTP full or delta?
    Here is an interesting discussion:
    /thread/348238 [original link is broken]
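    For what it's worth, that difference would explain the counts above: a full DTP selects all requests currently in the PSA on every run, whereas a delta DTP keeps track of the source requests it has already transferred and only fetches the new ones.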
