Testing Delta Consistency for write-optimized DSO

Hi,
Please let me know how to carry out testing of delta consistency for a write-optimized DSO.
Is it just a matter of checking whether the 'Check Delta Consistency' checkbox under the DSO settings is set, or is there more to it?
Please do let me know.
Thanks & Regards,
Lavanya.

Hi,
The delta consistency flag for a write-optimized DSO is used to ensure data consistency between the write-optimized DSO and any target it propagates data to. For example, suppose you have the data flow ZWDSO1 -> ZSDSO1, and ZWDSO1 has the 'Check Delta Consistency' flag set to ON. You load data from ZWDSO1 to ZSDSO1 via request 1.
Once the load is successful, if you try to delete request 1 from ZWDSO1, the deletion should not be possible. You will also not be able to switch the flag to OFF, because the delta has already been propagated and changing the flag could break delta consistency.
Now, if you delete request 1 from ZSDSO1 first and then try to delete request 1 from ZWDSO1, the deletion should be possible.
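To make the rule concrete, here is a small, purely illustrative sketch (plain Python rather than ABAP or any SAP API; the class names and request IDs are invented) of the deletion order the 'Check Delta Consistency' flag enforces:

```python
# Illustrative model of the 'Check Delta Consistency' rule: a source request
# may only be deleted once it is no longer present in any target it was
# propagated to.

class WriteOptimizedDSO:
    def __init__(self, name, check_delta_consistency=True):
        self.name = name
        self.check_delta_consistency = check_delta_consistency
        self.requests = set()      # requests loaded into this DSO
        self.propagated = {}       # request -> set of target names

    def load(self, request):
        self.requests.add(request)

    def propagate(self, request, target):
        """Load a request from this DSO into a target (e.g. ZSDSO1)."""
        target.load(request)
        self.propagated.setdefault(request, set()).add(target.name)

    def delete_request(self, request, targets_by_name):
        """Deletion is refused while the request still exists in a target."""
        if self.check_delta_consistency:
            still_in = {t for t in self.propagated.get(request, set())
                        if request in targets_by_name[t].requests}
            if still_in:
                raise RuntimeError(
                    f"Cannot delete {request}: still present in {still_in}")
        self.requests.discard(request)


class TargetDSO:
    def __init__(self, name):
        self.name = name
        self.requests = set()

    def load(self, request):
        self.requests.add(request)

    def delete_request(self, request):
        self.requests.discard(request)


zwdso1 = WriteOptimizedDSO("ZWDSO1")
zsdso1 = TargetDSO("ZSDSO1")
zwdso1.load("REQUEST_1")
zwdso1.propagate("REQUEST_1", zsdso1)

try:
    zwdso1.delete_request("REQUEST_1", {"ZSDSO1": zsdso1})   # refused
except RuntimeError as e:
    print(e)

zsdso1.delete_request("REQUEST_1")                           # delete in the target first
zwdso1.delete_request("REQUEST_1", {"ZSDSO1": zsdso1})       # now allowed
```

In short: delete the propagated request from the target first, and only then can the source request be deleted from the write-optimized DSO.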
Hope the explanation helps!
Regards,
Rakesh

Similar Messages

  • Data archiving for Write Optimized DSO

    Hi Gurus,
    I am trying to archive data in a write-optimized DSO.
    It allows me to archive on a request basis, but it archives all requests in the DSO (i.e. all data).
    However, I want to archive a range of requests of my own choosing (from this request to that request).
    Please guide me.
    I found the details below on SDN; kindly check.
    Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving used for standard DSOs. This means that partial request archiving is not possible; only complete requests can be archived.
    The characteristic for the time slice can be a time characteristic present in the write-optimized DSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
    The actual archiving process remains the same, i.e.:
    Create a Data Archiving Process
    Create and schedule archiving requests
    Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
    Please check the OSS Note below:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Semantic keys for write optimized DSO

    Hi experts,
    Can anyone tell me more about the semantic key in a write-optimized DSO?
    I have a DSO concerning sales orders with 3 DataSources.
    How do you define the semantic key?
    I have schedule line number, document number and position number in the semantic key, but when I load the PSA data, I get a duplicate data error.
    Any clues?
    Thanks.

    Hi Oliver,
    If you specify characteristics as semantic fields, then when you load data to the DSO and an erroneous record arrives, the subsequent records with the same characteristic (semantic) combination will not be updated into the DSO even if they are correct; they are written to the error stack to ensure data quality.
    In a write-optimized DSO the technical key fields are generated automatically, and in the semantic fields you can specify the characteristics that should act roughly like primary keys (though not exactly).
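    As a rough illustration of that behaviour, here is a minimal sketch (plain Python, not how BW implements it; the field names and records are invented): once a record in a semantic group fails, later records of the same group are routed to the error stack as well.

```python
# Illustrative sketch: once a record with a given semantic-key combination
# fails, all later records with the same combination are held back in the
# error stack so they can be reprocessed together, preserving their order.

def load_with_semantic_groups(records, semantic_key, is_valid):
    """records: list of dicts; semantic_key: tuple of field names."""
    error_stack, loaded, failed_keys = [], [], set()
    for rec in records:
        key = tuple(rec[f] for f in semantic_key)
        if key in failed_keys or not is_valid(rec):
            failed_keys.add(key)
            error_stack.append(rec)   # held back, even if itself correct
        else:
            loaded.append(rec)
    return loaded, error_stack

records = [
    {"doc": "4711", "item": 10, "qty": 5},
    {"doc": "4711", "item": 20, "qty": -1},   # invalid record
    {"doc": "4711", "item": 20, "qty": 3},    # correct, but same key -> error stack
    {"doc": "4712", "item": 10, "qty": 7},
]
loaded, stack = load_with_semantic_groups(
    records, semantic_key=("doc", "item"), is_valid=lambda r: r["qty"] >= 0)
print(len(loaded), "loaded,", len(stack), "in error stack")
```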
    Thanks
    Sreekanth.

  • Unable to delete request from write-optimized DSO (Error during rollback)

    Hi Gurus,
    I am trying to delete a delta request from a Write-Optimized DSO. This request was uploaded with a DTP from another Write-optimized DSO.
    The actual overall status of the request is RED and the description of that status is now: 'Error during rollback of request DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR; only rollback allowed'.
    I checked the log of all request operations in the DataStore (from the same line where the red request now sits) and I see my several attempts to delete this request under a red radio button with the title Rollback. The details for this error are the following:
    Could not delete request data from active table
    Message no. RSODSO_ROLLBACK114
    Diagnosis
    The system could not delete the request data from the active table of a write-optimized DataStore object.
    System Response
    Write-optimized DataStore object: DTFISO02
    Active table: /BIC/ADTFISO0200
    Request: DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR
    Procedure
    Search for Notes containing the key words "Delete write-optimized DSO PSA"
    I am relatively new to SAP BI 7.0 and I do not know how to delete this request. Any help will be highly appreciated!
    Leticia

    Hi Leticia:
    Take a look at the SAP Notes below.
    Note 1111065 - "701: Delta consistency check for write-optimized DSOs"
    Note 1263877 - "70SP20: Delta consistency check for write-optimized DSOs"
    Note 1125025 - "P17:PSA:DSO:ODSR missing in PSA process for write-opt. DSO"
    Additionally, some ideas from the alternative presented on the blog by KMR might help you.
    "How to generate a selective deletion program for info provider"
    Regards,
    Francisco Mílán.

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I do a data load for a write-optimized DSO, I get the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant and Billing Item as the semantic key in the DSO.
    For this DSO, I am getting data from a test ECC system, in which the Sales Document Number column is mostly blank for this DataSource.
    When I go into the error stack of the DSO, all the rows that have a blank Sales Document Number are displayed. For all of these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in other threads that a write-optimized DSO does not care about the key values and loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, the data gets aggregated depending on the key figure setting in the transformation. The two options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document No. is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the SDSO it might not throw an error.
    Check the key figure value in the SDSO for that Sales Doc No. and Item No. and try to find out what the value is. It may be the sum of all the common records, or the key figure value of the last common record.
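    To illustrate Raj's point with a toy example (plain Python, invented field names; not SAP code), the two key-figure settings behave like this when records share the same key:

```python
# Illustrative aggregation of records that share the same key fields,
# depending on the key-figure setting in the transformation.

def aggregate(records, key_fields, kf_field, mode="overwrite"):
    """mode: 'add' (sum up key figures) or 'overwrite' (last record wins)."""
    result = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        value = rec[kf_field]
        if key in result and mode == "add":
            result[key] += value
        else:
            result[key] = value
    return result

# Blank sales document number and item 10 for every record:
rows = [{"doc": "", "item": 10, "amount": 50},
        {"doc": "", "item": 10, "amount": 70}]
print(aggregate(rows, ("doc", "item"), "amount", mode="add"))        # {('', 10): 120}
print(aggregate(rows, ("doc", "item"), "amount", mode="overwrite"))  # {('', 10): 70}
```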
    Regards
    Raj Rai

  • Write Optimized DSO Problem

    Hi Friends,
    I have to create a process chain for a write-optimized DSO. I know that every day we have to delete the data from both the data target and the PSA, but I am confused about how to set up the daily deletion of data from the PSA of my DataSource.
    Can anybody help me with this issue?
    Thanks and Regards
    pedamarla

    Hi,
    While creating the process chain, select the Delete PSA process type, give the DataSource name and schedule it for a daily run. After that, run the load through the process chain.
    Cheers...
    Puneesh

  • Can a write-optimized DSO be used for Delta upload

    Hi,
    Can anyone please answer the following:
    1. Can a write-optimized DSO be used for delta upload?
    2. Is industry-specific content available in BI Content?
    Thanks&Regards
    Satya

    Hi,
    A write-optimized DataStore object does not support image-based delta; it supports request-level delta, and you get a brand-new delta request for each data load.
    Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    A write-optimized DataStore object thus supports request-level delta only. In order to capture before- and after-image deltas, you have to post the latest request into further targets such as a standard DataStore object or an InfoCube.

  • Write optimized DSO - Standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow whose first layer is a write-optimized DSO, from which we load data into a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    For example, if I have 2 requests that need to be updated from the write-optimized DSO to the standard DSO, they are updated in a single request. We are on SAP BW 701, patch level 005.
    Please advise.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13.
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
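    Conceptually, the difference between the indicator settings can be sketched like this (illustrative Python pseudo-logic under the assumption that each DTP request transfers exactly one source request; not SAP code):

```python
# Illustrative sketch of the DTP behaviour when the source holds several
# not-yet-transferred requests.

def run_dtp(pending_source_requests, retrieve_until_no_more_new_data):
    """Each DTP request transfers exactly one source request.
    If the indicator is set, further DTP requests are generated
    automatically until the source has no more new requests."""
    dtp_requests = []
    pending = list(pending_source_requests)
    while pending:
        dtp_requests.append(pending.pop(0))   # one source request per DTP request
        if not retrieve_until_no_more_new_data:
            break                             # 'Get One Request Only' behaviour
    return dtp_requests

source = ["REQ_1", "REQ_2", "REQ_3"]
print(run_dtp(source, retrieve_until_no_more_new_data=False))  # ['REQ_1']
print(run_dtp(source, retrieve_until_no_more_new_data=True))   # ['REQ_1', 'REQ_2', 'REQ_3']
```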
    Regards,
    Sudheer.

  • Semantic group for a write optimized DSO

    Hello,
    I am loading from a DataSource to a write-optimized DSO. While doing so, I am not able to set a semantic key/group for the DTP. I have turned error handling on using the setting 'Valid records update, no reporting (request red)'.
    Is this setting not possible for a write-optimized DSO? If not, why not? And if it is possible, what am I missing?
    Thanks a lot in advance,
    Prakash

    Thank you,
    here are my findings:
    - if data has been loaded with delta using the DTP, it is not possible to change the semantic groups any more; all requests have to be deleted first
    - error handling must be turned on
    The issue is resolved.
    Marek

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data using standard DSO 0FI_GL_12.
    From the write-optimized DSO we are loading delta records into an InfoCube. Please provide your input on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a write-optimized DSO, as there is no image concept in a write-optimized DSO.
    For example, if I am using a standard DSO, the change log table and the image concept ensure that the updated value reaches the cube.
    let us assume
    Active Table
    111            50
    111            70 (overwrite)
    Change Log Table
    111            -50        (X -- Before Image)
    111             70    ( '  ' -- After Image) symbol for after image is 'Space'
    So if we load this as a delta to the target, the two records above from the change log table get loaded to the cube, and the cube ends up with 70 as the updated value.
    If am using 'Write Optimized',
    Active Table
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, since an InfoCube is always additive, the total value will be 50 + 70 = 120, which is wrong, correct?
    Please correct me: what mechanism ensures that the updated value of 70 reaches the cube from the write-optimized DSO?
    2) As the DataSource is delta-capable with an 'ADDITIVE' delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a WODSO in the initial layer and then a standard DSO, purely for mass data load/staging purposes.
    In best practice : Data source ----> WODSO ---> std. DSO
    In your case : Data source ----> Std.DSO  -----> WODSO.
    In both cases, if the data load is not designed carefully, your cube will end up with incorrect entries.
    For example: today at 9 am the active table holds 111, 50.
    You load data to the cube the same day at 11 am: the cube then holds 111, 50.
    The same day at 1 pm the value changes in the standard DSO: 111, 70 (overwrite in the active table).
    If you load data to the cube the same day or the next day, it will have two records, one with value 50 and the other with 70. To avoid such scenarios we should plan the loads carefully, or change the DTP settings so that the source table is the change log table.
    Coming to your case:
    After the load to the standard DSO, load data to the WODSO with the DTP setting 'Delta Init. Extraction From' set to Change Log.
    The data in the WODSO then comes from the change log table, and you can load it to the cube in delta mode.
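    The arithmetic behind Madhu's question and this answer can be sketched as follows (plain illustrative Python, no SAP objects involved): the change log's before/after images net out to the correct value in an additive cube, whereas re-loading full overwrite values double-counts.

```python
# Illustrative arithmetic only (not SAP code): why loading the change log
# into an additive InfoCube gives the correct value, while re-loading full
# values from an overwrite-style active table double-counts.

cube = 0

# Initial delta: the change log carries the new value 50 as an after image.
change_log_delta_1 = [+50]
cube += sum(change_log_delta_1)        # cube = 50

# The document changes from 50 to 70 in the DSO (overwrite).
# Change log: before image -50, after image +70  ->  net effect +20.
change_log_delta_2 = [-50, +70]
cube += sum(change_log_delta_2)        # cube = 70  (correct)

# If instead the full records 50 and 70 were loaded from a write-optimized
# DSO's active table, the additive cube would hold 50 + 70 = 120 (wrong).
wrong_cube = 50 + 70
print(cube, wrong_cube)                # 70 120
```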

  • Missing PARTNO field in Write Optimized DSO

    Hi,
    I have a write-optimized DSO for which the partition has been deleted (reason unknown) in the Dev system.
    For the same DSO, partition parameters exists in QA and production.
    Now while transporting this DSO to QA, I am getting an error "Old key field PARTNO has been deleted", and the DSO could not be activated in the target system.
    Please let me know how I can re-insert this technical key PARTNO in my DSO.
    I presume it has something to do with the partitioning of the DSO.
    Please help.

    Hi,
    Since the write-optimized DataStore object only consists of the table of active data, you do not have to activate the data, as is necessary with the standard DataStore object. This means that you can process data more quickly.
    The loaded data is not aggregated; the history of the data is retained. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. The record mode responsible for aggregation remains, however, so that the aggregation of data can take place later in standard DataStore objects.
    The system generates a unique technical key for the write-optimized DataStore object. The standard key fields are not necessary with this type of DataStore object. If standard key fields exist anyway, they are called semantic keys so that they can be distinguished from the technical keys. The technical key consists of the Request GUID field (0REQUEST), the Data Package field (0DATAPAKID) and the Data Record Number field (0RECORD). Only new data records are loaded to this key.
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    PS: Excerpt from http://help.sap.com/saphelp_nw2004s/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
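    To visualise the technical key described in the excerpt, here is a small illustrative sketch (plain Python, not the actual BW implementation; the record contents are invented): every stored row stays unique through 0REQUEST / 0DATAPAKID / 0RECORD, even when the semantic key repeats.

```python
# Illustrative sketch of the technical key of a write-optimized DSO:
# 0REQUEST / 0DATAPAKID / 0RECORD make every stored row unique, so records
# with the same semantic key (e.g. the same document number) can coexist.

def load_request(active_table, request_id, packages):
    """packages: list of data packages, each a list of records (dicts)."""
    for pak_no, package in enumerate(packages, start=1):
        for rec_no, record in enumerate(package, start=1):
            row = {"0REQUEST": request_id,
                   "0DATAPAKID": pak_no,
                   "0RECORD": rec_no,
                   **record}
            active_table.append(row)

active_table = []
load_request(active_table, "REQU_ABC",
             [[{"doc": "4711", "amount": 50}],
              [{"doc": "4711", "amount": 70}]])   # same semantic key, kept twice
for row in active_table:
    print(row)
```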
    Hope this helps.
    Best Regards,
    Rajani

  • Update of a write-optimized DSO from a CSV file

    Hi Gurus,
    I have observed a few things while trying to load a write-optimized (WO) DSO from a flat file in BI 7.0. The data flow is as follows:
    data source -> transformation -> data target (DSO - WO).
    From a test perspective, I loaded 5 records up to the PSA with an InfoPackage and subsequently updated them to the DSO using a DTP. When I check in Manage, I find records transferred = 5 and added = 5, which is okay.
    Then I loaded the same 5 records to the PSA again and triggered the DTP. Now the DTP brings 10 records (5 + 5): records transferred = 10 and updated = 10. When I checked the header tab in the DTP monitor, I found that the selection includes both PSA request IDs.
    Again I loaded the same 5 records to the PSA; a new PSA request ID was generated, and the DTP extracts this PSA ID along with the two old ones already transferred. Now records transferred = 10 and added = 15. Why transferred 10? I am getting confused here. I was expecting it to behave the same way, i.e. transfer 15 and add 15. There is no routine that generates additional records, so that cannot be the cause. Has anybody observed this strange behaviour?
    Why is this happening? I was expecting it to bring only 5 records every time. Is it something specific to a write-optimized DSO, or am I doing something wrong here? Is there a setting or parameter that prevents selecting the old PSA requests that have already been updated to the DSO?
    Please clarify.
    Soumya

    Is your DTP full or delta?
    Here is some interesting discussion.
    /thread/348238 [original link is broken]
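    Since the linked thread is broken, here is a rough illustrative contrast (plain Python with invented names, and only an assumption about the likely cause rather than a confirmed diagnosis): a full DTP re-selects every PSA request on each run, while a delta DTP only picks up requests it has not transferred yet.

```python
# Illustrative contrast (not SAP code) between a full DTP and a delta DTP
# reading from the PSA: full re-reads all PSA requests each run, delta only
# the requests not yet transferred by this DTP.

def run_dtp(psa_requests, transferred, full=True):
    if full:
        selection = list(psa_requests)                  # everything in the PSA
    else:
        selection = [r for r in psa_requests if r not in transferred]
    transferred.update(selection)
    return selection

psa = []
done = set()
for run in range(1, 4):
    psa.append(f"PSA_REQ_{run}")                        # one new PSA request per run
    print("full :", run_dtp(psa, set(), full=True))
    print("delta:", run_dtp(psa, done, full=False))
```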

  • Standard DSO - Write Optimized DSO, key violation, same semantic key

    Hello everybody,
    I'm trying to load a write-optimized DSO from another standard DSO, and then the "famous" error is raised:
    During loading, there was a key violation. You tried to save more than
    one data record with the same semantic key.
    The problematic (newly loaded) data record has the following properties:
    o   DataStore object: ZSD_O09
    o   Request: DTPR_D7YTSFRQ9F7JFINY43QSH1FJ1
    o   Data package: 000001
    o   Data record number: 28474
    I've seen many previous posts regarding the same issue, but none is quite the same as mine:
    [During loading, there was a key violation. You tried to save more than]
    [Duplicate data records at dtp]
    ...each of them suggests making some changes to the semantic key. Here's my particular context:
    Dataflow goes: ZSD_o08 (Standard DSO) -> ZSD_o09 (Write-Optimized DSO)
    ZSD_o08 Semantic Keys:
    SK1
    SK2
    SK3
    ZSD_o09 Semantic Keys:
    SK1
    SK2
    SK3
    SK4 (value is taken in a routine as SY-DATUM-1)
    As far as I can see there are no records with repeated semantic keys in ZSD_o08; this is confirmed by querying the active data table of the ZSD_o08 ODS. Looking at the temporary storage of the failed DTP for the specific data package with the error, I cannot see anything "weird" either.
    Let's assume that the semantic key is crucial and must stay as it is currently set.
    Could you please advise? I look forward to your quick response. Thank you and best regards,
    Bernardo

    Hi  Bernardo:
    By maintaining the settings on your DTP you can indicate whether data should be extracted from the active table or the change log table, as described below.
    >Double-click the DTP that transfers the data from the standard DSO to the write-optimized DSO, click on the "Extraction" tab, and in the group at the bottom select one of the 4 options:
    >Active Table (With Archive)
    >Active Table (Without Archive)
    >Archive (Full Extraction Only)
    >Change Log
    >Hit the F1 key to access the documentation
    >
    >===================================================================
    >Indicator: Extract from Online Database
    >The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted.  For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
    >For Extraction from the DataStore Object, you have the following options:
    >Active Table (with Archive)
    >The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    >Active Table (Without Archive)
    >The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    >Change Log
    >The data is read from the change log of the DataStore object.
    >For Extraction from the InfoCube, you have the following options:
    >InfoCube Tables
    >Data is only extracted from the database (E table and F table and aggregates).
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage.
    Have you modified the default settings on the DTP? How is the DTP configured right now (or how was it configured before your testing)?
    Hope this helps,
    Francisco Milán.

  • Error occurs while activating a 'Write Optimized' DSO.

    I am getting error " There is no PSA for infosource 'XXXX'  and source system 'XXX' while activating a newly defined DSO object.
    I am able to activate a Standard DSOs, however the error occurs while activating a 'Write Optimized' DSO

    Hi,
    For a write-optimized DSO, check whether you have ticked the uniqueness-of-records check. If that check is active and two identical records come from the source in a single load, you will get an error.
    From SAP help
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    Thanks
    Srikanth

  • Duplicate Semantic Key in Write Optimized DSO

    Gurus
    When a write-optimized DSO is used, a unique index named KEY is generated on the semantic key fields of the DSO (assuming, of course, that the 'Do Not Check Uniqueness of Data' indicator is not checked).
    See help https://help.sap.com/saphelp_crm60/helpdata/en/a6/1205406640c442e10000000a1550b0/frameset.htm
    If that indicator is checked, however, the DSO can contain records with duplicate semantic keys.
    My question is: What happens to these duplicates when a request level delta update is done to a Standard DSO or Infocube?
    Do duplicates end up in the error stack? Are they simply aggregated in further loads? - because this would be a problem for reporting (double-counting).
    thanks
    tony

    Hi Tony,
    It will aggregate the data in some undesired way.
    Read on...
    https://help.sap.com/saphelp_crm60/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    If you want to use write-optimized DataStore objects in BEx queries, we recommend that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
    Hope it helps...
    Regards,
    Ashish
