Updating Write-Optimized ODS

How do you update a write-optimized ODS? Are there APIs available?

Hi Satya,
Some important points on write-optimized ODS:
• The system does not generate SIDs for write-optimized DataStore objects, and you do not need to activate them. This means that you can save and further process data quickly. Reporting is possible on the basis of these DataStore objects. However, SAP recommends that you use them as a consolidation layer and update the data to additional InfoProviders, standard DataStore objects, or InfoCubes.
• For performance reasons, SID values are not created for the characteristics that are loaded. The data is still available for BEx queries. However, in comparison to standard DataStore objects, you can expect slightly worse performance because the SID values have to be created during reporting.
• If you want to use write-optimized DataStore objects in BEx queries, SAP recommends that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
Regards,
Anil
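
To see why the last point matters, here is a minimal sketch in Python (hypothetical records; an illustration only, not SAP code) of how duplicate semantic keys in a write-optimized DSO get double-counted when a query aggregates them:

    # Illustration only (not SAP code): a write-optimized DSO keeps every
    # loaded record, so the same semantic key can occur more than once.
    from collections import defaultdict

    # Hypothetical records: (document number = semantic key, amount = key figure)
    wo_dso = [
        ("DOC1", 100.0),
        ("DOC2", 250.0),
        ("DOC1", 100.0),  # same semantic key loaded again in a later request
    ]

    totals = defaultdict(float)
    for doc, amount in wo_dso:
        totals[doc] += amount  # query-time aggregation over the semantic key

    print(dict(totals))  # {'DOC1': 200.0, 'DOC2': 250.0} - DOC1 double-counted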

Similar Messages

  • Error while loading data from write optimized ODS to cube

    Hi All,
    I am loading data from a write-optimized ODS to a cube.
    I have done Generate Export DataSource,
    scheduled the InfoPackage with one selection for a full load,
    and then got the following errors in Transfer IDocs & TRFC:
    Info IDoc 1: IDoc with errors added
    Info IDoc 2: IDoc with errors added
    Info IDoc 3: IDoc with errors added
    Info IDoc 4: IDoc with errors added
    Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
    The processing steps below are green
    and show an update of 4 new records to data package 1.
    Please provide inputs for the resolution.
    Thanks & Regards,
    Rashmi.

    Please let me know what more details you need.
    If I press F1 on the error, I get the following message:
    Messages from source system
    See also: Processing Steps Request
    These messages are sent by IDoc from the source system. Both the extractor itself and the service API can send messages. When errors occur, several messages are usually sent together.
    From the source system, there are several types of messages that can be differentiated by the so-called Info IDoc status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in the source system and sent to BI. The number of records received in BI is checked against this information.
    Thanks & Regards,
    Rashmi.
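
    For intuition, the check described above boils down to a count comparison; a minimal sketch (made-up numbers, plain Python, not the actual BW implementation):

      # Sketch with made-up numbers (not BW code): the status-2 Info IDoc
      # reports how many records the source system extracted; BI compares
      # that figure with the number of records it actually received.
      records_selected_in_source = 10  # reported by the status-2 Info IDoc
      records_arrived_in_bi = 4        # counted from the received data packages

      if records_arrived_in_bi != records_selected_in_source:
          print("selected number does not agree with transferred number")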

  • Queries on Write Optimized ODS

    Hi All,
    Do BEx queries work efficiently on a write-optimized ODS?
    Regards,
    Satya

    Hi Satya,
    Some important points on write-optimized ODS:
    • The system does not generate SIDs for write-optimized DataStore objects, and you do not need to activate them. This means that you can save and further process data quickly. Reporting is possible on the basis of these DataStore objects. However, SAP recommends that you use them as a consolidation layer and update the data to additional InfoProviders, standard DataStore objects, or InfoCubes.
    • For performance reasons, SID values are not created for the characteristics that are loaded. The data is still available for BEx queries. However, in comparison to standard DataStore objects, you can expect slightly worse performance because the SID values have to be created during reporting.
    • If you want to use write-optimized DataStore objects in BEx queries, SAP recommends that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
    Regards,
    Anil

  • Updating Write Optimized DataStore Objects

    Are there any APIs available to update write-optimized DataStore objects?

    Hello,
    You may wish to check the OSS note below:
    954550 - Unable to convert status for write-opt. DataStores
    This should solve the problem; if it doesn't, please raise a customer message with SAP.
    Hope it Helps
    Chetan
    @CP..

  • How to update request of write-optimised ODS to other ODS or Cube by DTP?

    Hi, gurus.
    I've tried to update a write-optimized ODS request to another ODS by DTP in both full and delta mode, but to no avail.
    So is this possible? How do I do it?
    Thanks.

    Hi,
    Please check the following things:
    1. Is the request green in the write-optimized DSO?
    2. Does your target DSO already contain the request? A delta DTP from a write-optimized DSO sends data request by request, so once a request has been loaded to the target, it will not be loaded again by a delta DTP (see the sketch below).
    3. Are there any filter settings in the DTP? Suppose you have a filter on CALYEAR 2008, but your write-optimized DSO has no data for 2008.
    If the above does not solve your problem, please paste the DTP monitor log.
    Regards
    Anindya
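
    A minimal sketch of point 2 (hypothetical request IDs, plain Python, not SAP code): a delta DTP from a write-optimized DSO transfers each source request once and skips requests already posted to the target.

      # Hypothetical request IDs (not SAP code): request-by-request delta.
      source_requests = ["REQU_001", "REQU_002", "REQU_003"]
      already_in_target = {"REQU_001", "REQU_002"}

      delta = [r for r in source_requests if r not in already_in_target]
      print(delta)  # ['REQU_003'] - only the new request is transferred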

  • Duplication Error while loading data in write optimized DSO

    Hi Experts,
    I have an issue. In BI 7 I'm trying to load data into a write-optimized ODS from the Controlling DataSource 0CO_OM_CCA_10. I'm getting the data properly in the PSA, but while loading it into my write-optimized DSO I get a duplication error, although my key fields and data fields are properly placed in my data target (the write-optimized DSO).
    Please let me know how to load it successfully.
    Thanks in Advance.
    Amit

    Hi,
    Thanks for your reply.
    I'm getting this error message:
    Diagnosis
        During loading, there was a key violation. You tried to save more than
        one data record with the same semantic key.
        The problematic (newly loaded) data record has the following properties:
        o   DataStore object: GWFDSR02
        o   Request: DTPR_4BA3GF8JMQFVQ8YUENNZX3VG5
        o   Data package: 000006
        o   Data record number: 101
    Although I have selected the key fields that identify a unique record, I still get the duplication error.
    I have also referred to the BI Content for this DataSource and found that it has the same key fields as mine.
    Debjani: I need unique records without duplication, and I'm doing a full load in the DTP.
    What is to be done? Please help.
    Thanks in advance.
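
    For context, the key violation in the diagnosis above is what the uniqueness check enforces; a minimal sketch (invented records, plain Python, not SAP code):

      # Invented records (not SAP code): with the uniqueness check active,
      # the unique index "KEY" rejects a second record with the same
      # semantic key, which surfaces as the key violation shown above.
      seen_keys = set()
      incoming = [("1000", "CC01"), ("1001", "CC01"), ("1000", "CC01")]

      for record in incoming:
          if record in seen_keys:
              print(f"key violation: duplicate semantic key {record}")
          else:
              seen_keys.add(record)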

  • Write-optimized parallel loads?

    Hello BI gurus,
    Are parallel loads possible for a write-optimized ODS?
    If possible, what are the limitations?
    Please let me know your opinion.
    Thanks in advance,
    BWer

    In write-optimized DSOs the technical key is generated automatically when loading data, and the standard keys are not really required. If you still have standard keys, they are called "semantic keys". Those semantic keys can be bundled into semantic groups in the DTP (see the sketch below).
    http://help.sap.com/saphelp_nw2004s/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    Hope it helps,
    Gili
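
    The bundling Gili describes can be pictured as follows (invented data, plain Python, not SAP code): semantic grouping keeps all records that share a semantic key together, so the resulting bundles can be processed in parallel safely.

      # Invented data (not SAP code): one bundle per semantic key.
      from collections import defaultdict

      records = [("DOC1", 10), ("DOC2", 20), ("DOC1", 30), ("DOC3", 40)]

      bundles = defaultdict(list)
      for key, value in records:
          bundles[key].append(value)  # same key always lands in the same bundle

      for key, values in bundles.items():
          print(key, values)  # each bundle could go to its own parallel process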

  • Update of write optimized DSO by csv file

    Hi Gurus,
    I have observed a few things while trying to upload a write-optimized (WO) DSO from a flat file in BI 7.0. The data flow is as follows:
    data source -> transformation -> data target (DSO - WO).
    From a test perspective, I loaded 5 records to the PSA with an InfoPackage and subsequently updated them to the DSO using a DTP. When I checked in Manage, I found records transferred = 5 and added = 5, which is okay.
    Then I loaded the same 5 records to the PSA again and triggered the DTP. Now the DTP brings 10 records (5 + 5): records transferred = 10 and added = 10. When I checked the header tab in the DTP monitor, I found the selection includes both PSA request IDs.
    Again I loaded the same 5 records to the PSA; a new PSA request ID was generated, and the DTP extracted this PSA request along with the two already transferred. Now records transferred = 10 and added = 15. Why transferred 10? I am getting confused here. I was expecting it to follow the same pattern, in which case it should have transferred 15 and added 15. There is no routine that generates additional records, so this should not be possible at all. Has anybody observed this strange behaviour?
    Why is this happening? I was expecting it to bring only 5 records every time. Is it something specific to write-optimized DSOs, or am I doing something wrong here? Is there any setting that tells it not to select the old PSA requests that were already updated to the DSO?
    Please clarify.
    Soumya

    Is your DTP full or delta?
    Here is some interesting discussion:
    /thread/348238 [original link is broken]
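
    The behaviour Soumya describes hinges on the extraction mode; a minimal sketch (hypothetical PSA request IDs, plain Python, not SAP code) of how a full DTP differs from a delta DTP when selecting PSA requests:

      # Hypothetical PSA request IDs (not SAP code), 5 records per request.
      psa_requests = ["PSA_01", "PSA_02", "PSA_03"]
      transferred_by_delta = {"PSA_01", "PSA_02"}

      full_selection = list(psa_requests)  # a full DTP selects every request again
      delta_selection = [r for r in psa_requests
                         if r not in transferred_by_delta]  # only the new one

      print(full_selection)   # ['PSA_01', 'PSA_02', 'PSA_03'] -> 15 records
      print(delta_selection)  # ['PSA_03'] -> 5 records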

  • What is the main difference of Direct update DSO and Write optimized DSO

    What is the main difference between a direct update DSO and a write-optimized DSO?

    Hi Chandra:
    Check this link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/content.htm
    You can find another difference on page 147, section "Reclustering DataStore Objects" of the document "Enterprise Data Warehousing"
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/67efb9bb-0601-0010-f7a2-b582e94bcf8a?quicklink=index&overridelayout=true
    >You can only use reclustering for standard DataStore objects and DataStore objects for direct update. You cannot use reclustering for write-optimized DataStore objects. User-defined multidimensional clustering is not available for write-optimized DataStore objects.
    Regards,
    Francisco Milán.

  • Difference between -  Write Optimized and Direct Update DSO

    Hi Gurus,
    I know the similarities between the write-optimized and direct update DSOs,
    but I want to know the differences between the two.
    Can any expert explain the difference, please?
    Thanks

    Hi,
    Write-optimized DSO:
    A write-optimized DSO is designed to be the initial staging area for source system data, from which the data can be transferred to a standard DSO or an InfoCube.
    The data can be written immediately to further data targets, and we save the activation time. Reporting is also possible on this DSO.
    SAP recommends using a write-optimized DataStore object as the EDW inbound layer and updating the data into further targets such as standard DataStore objects or InfoCubes.
    Direct update DSO:
    A DataStore object for direct update contains data in a single version; the data is therefore stored in the same form in which it was written to the DSO by the application. We can use this type of DSO for the APD (Analysis Process Designer).
    In Integrated Planning we can also use this DSO and enter data directly using transaction RSINPUT.
    Thanks
    Madhavi

  • Error occurs while activating a 'Write Optimized' DSO.

    I am getting the error "There is no PSA for InfoSource 'XXXX' and source system 'XXX'" while activating a newly defined DSO object.
    I am able to activate standard DSOs; the error only occurs while activating a write-optimized DSO.

    Hi,
    For a write-optimized DSO, check whether you have ticked the uniqueness check for the records. If you set that and two identical records come from the source in one load, you will get an error.
    From SAP help
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    Thanks
    Srikanth

  • Write-optimized DSO to standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow whose first layer is a write-optimized DSO, from which we load data to a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    If I have 2 requests that need to be updated to the standard DSO from the write-optimized DSO, it updates them in a single request. We are on SAP BW 7.01, patch level 005.
    Please suggest.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13:
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
    Regards,
    Sudheer.
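
    The behaviour Sudheer describes amounts to a loop: when a DTP request finishes, it checks the source for further requests and triggers a new DTP request if any remain; a minimal sketch (invented request queue, plain Python, not SAP code):

      # Invented request queue (not SAP code): "Retrieve Until No More New Data".
      source_queue = ["REQU_A", "REQU_B", "REQU_C"]

      while source_queue:                # keep going while the source has requests
          request = source_queue.pop(0)  # one DTP request per source request
          print(f"processing {request}")
          # if the queue is still non-empty here, a follow-up DTP request
          # is generated automatically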

  • Duplicate Semantic Key in Write Optimized DSO

    Gurus
    When a write-optimized DSO is used, the semantic key fields get a unique index KEY (assuming, of course, that the Do Not Check Uniqueness of Data indicator is not checked).
    See help https://help.sap.com/saphelp_crm60/helpdata/en/a6/1205406640c442e10000000a1550b0/frameset.htm
    If that indicator is checked, however, the DSO can contain duplicate records.
    My question is: what happens to these duplicates when a request-level delta update is made to a standard DSO or InfoCube?
    Do the duplicates end up in the error stack? Or are they simply aggregated in further loads? That would be a problem for reporting (double counting).
    thanks
    tony

    Hi Tony,
    It will aggregate the data in some undesired way.
    Read on...
    https://help.sap.com/saphelp_crm60/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    If you want to use write-optimized DataStore objects in BEx queries, we recommend that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
    Hope it helps...
    Regards,
    Ashish
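
    To make the "undesired" aggregation concrete, a toy sketch (invented figures, plain Python, not SAP code): an InfoCube adds key figures, so duplicates from the write-optimized DSO are summed, while a standard DSO set to overwrite keeps only the last record per key.

      # Invented figures (not SAP code): downstream fate of duplicates.
      duplicates = [("DOC1", 100.0), ("DOC1", 100.0)]

      cube_total = sum(amount for _, amount in duplicates)      # 200.0 - summed
      dso_active = {key: amount for key, amount in duplicates}  # {'DOC1': 100.0}

      print(cube_total)  # the cube double-counts
      print(dso_active)  # the standard DSO (overwrite) keeps one record per key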

  • Missing PARTNO field in Write Optimized DSO

    Hi,
    I have a write-optimized DSO for which the partition has been deleted (reason unknown) in the Dev system.
    For the same DSO, partition parameters exist in QA and production.
    Now while transporting this DSO to QA, I am getting the error "Old key field PARTNO has been deleted", and the DSO cannot be activated in the target system.
    Please let me know how I can re-insert this technical key PARTNO in my DSO.
    I am presuming it has something to do with the partitioning of the DSO.
    Please help.

    Hi,
    Since the write-optimized DataStore object only consists of the table of active data, you do not have to activate the data, as is necessary with the standard DataStore object. This means that you can process data more quickly.
    The loaded data is not aggregated; the history of the data is retained. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. The record mode responsible for aggregation remains, however, so that the aggregation of data can take place later in standard DataStore objects.
    The system generates a unique technical key for the write-optimized DataStore object. The standard key fields are not necessary with this type of DataStore object. If standard key fields exist anyway, they are called semantic keys so that they can be distinguished from the technical keys. The technical key consists of the Request GUID field (0REQUEST), the Data Package field (0DATAPAKID) and the Data Record Number field (0RECORD). Only new data records are loaded to this key.
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    PS: Excerpt from http://help.sap.com/saphelp_nw2004s/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    Hope this helps.
    Best Regards,
    Rajani
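
    The technical key described in the quoted help text can be mimicked like this (invented values for 0REQUEST, 0DATAPAKID and 0RECORD, plain Python, not SAP code):

      # Invented values (not SAP code): the technical key combines the request
      # GUID (0REQUEST), data package (0DATAPAKID) and record number (0RECORD),
      # so every loaded record is unique even when its semantic key repeats.
      def technical_key(request_guid, data_package, record_number):
          return (request_guid, data_package, record_number)

      key1 = technical_key("REQU_4BA3GF8JMQ", 1, 101)
      key2 = technical_key("REQU_4BA3GF8JMQ", 1, 102)
      print(key1 != key2)  # True: two records, two distinct technical keys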

  • Semantic keys for write optimized DSO

    Hi experts,
    Can anyone tell me more about the semantic key in a write-optimized DSO?
    I have a DSO concerning sales orders with 3 DataSources.
    How do you define the semantic key?
    I have schedule line number, document number, and position number in the semantic key, but when I load from the PSA, I get an error with duplicate data.
    Any clues?
    Thanks.

    Hi Oliver,
    If you specify characteristics as semantic fields, then when you load data to the DSO and an error record occurs, the following records with the same characteristic (semantic) combination will not be updated into the DSO even if they are correct; they are written to the error stack to ensure data quality (see the sketch below).
    In a write-optimized DSO, the technical key fields are taken automatically; in the semantic fields you can then specify the characteristics that should act as primary keys (though not exactly primary keys).
    Thanks
    Sreekanth.
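
    Sreekanth's point can be sketched as follows (invented records, plain Python, not SAP code): once a record with a given semantic key errors, later records with the same key are held back in the error stack too, even if they are valid.

      # Invented records (not SAP code): error-stack behaviour per semantic key.
      error_keys = set()
      updated, error_stack = [], []

      loads = [("K1", 10, True), ("K2", 20, False), ("K2", 30, True)]
      for key, value, is_valid in loads:
          if not is_valid or key in error_keys:
              error_keys.add(key)
              error_stack.append((key, value))  # held back with its predecessor
          else:
              updated.append((key, value))

      print(updated)      # [('K1', 10)]
      print(error_stack)  # [('K2', 20), ('K2', 30)] - valid follower held back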
