Stack Dump while creating Write Optimized DSO on BW 7.3, MS SQL - any SAP Notes?

While attempting to create a Write Optimized DSO on BW 7.3, MS SQL, we got a short dump in ABAP program CL_RSD_ODSO_VERS==============CP with the short text "Access using NULL object reference is not possible."
Has anyone seen any SAP Notes on this subject?

Hi John,
Was any field deleted from the DSO? If so, the system is probably unable to resolve the reference to that field in the DSO table. Errors like this come up very often when fields are deleted from a DSO. The solution is to also remove the field manually from the table. The system is supposed to do this itself, but sometimes the resulting inconsistency causes issues like these.

Similar Messages

  • Unable to delete data target contents of Write-Optimized DSO in Process Chain

    Hi Experts,
    We are using SAP NetWeaver BW 7.01 and need to delete the entire data target contents of a Write-Optimized DSO in the process chain before the next data load.
    I included this step in the process chain, but it fails with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
    This process type works in BW 7.0 but not in BW 7.01.
    I found that the program RSSM_DELETE_WO_DSO_REQUESTS can be used to delete old requests in a Write-Optimized DSO as of BW 7.01 SP07, per SAP Note 1437407, but it is still not working even after implementing it: the prerequisite for deleting a request is that its data mart status is updated, and that does not happen for this program.
    BW 7.3 has a process type option to delete requests from a Write-Optimized DSO directly, but it is not available in 7.01.
    Could you please suggest how to resolve this issue in BW 7.01?
    Many thanks for your help in advance.
    Regards,
    Madhu

    Create an ABAP program as in the attached code.
    You can then use that ABAP program in process chains through an ABAP process variant.
    The ABAP variant should have the following properties:
    Select call mode Synchronous, call from Local, and type Program.
    Enter your ABAP program name under "program name" and create one program variant for each write-optimized DSO.
    Please refer to the documentation on using ABAP programs in process chains for further details; a rough sketch of such a program follows below.
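    Since the original attachment is not included here, below is a minimal sketch of what such a program might look like, assuming the goal is simply to clear the active table of one write-optimized DSO. The program name Z_DELETE_WO_DSO and the table /BIC/AZWODSO00 are invented placeholders, and the sketch does not clean up request administration entries, so treat it as an illustration rather than the original attachment.

    REPORT z_delete_wo_dso.

    * Assumption: /BIC/AZWODSO00 is the active table of the
    * write-optimized DSO ZWODSO; adjust it per DSO and create
    * one program (or variant) per DSO, as described above.
    CONSTANTS gc_tabname TYPE tabname VALUE '/BIC/AZWODSO00'.

    START-OF-SELECTION.
      DELETE FROM (gc_tabname).              "dynamic table name
      IF sy-subrc = 0.
        COMMIT WORK.
        WRITE: / 'Active table', gc_tabname, 'cleared.'.
      ELSE.
        WRITE: / 'No rows deleted from', gc_tabname.
      ENDIF.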
    Hope this helps

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target that is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check ST22 for any short dump related to this issue.

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I load data into the Write Optimized DSO, I get the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant, and Billing Item as the semantic key in the DSO.
    This DSO gets its data from a test ECC system, in which the Sales Document Number column is blank for most records of this DataSource.
    When I go into the error stack of the DSO, all the rows with a blank Sales Document Number are displayed. For all these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in other threads that a Write Optimized DSO doesn't care about the key values; it loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, the data gets aggregated depending on the setting made in the transformation for the key figures. The two options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document No is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the standard DSO it might not throw an error.
    Check the KF value in the standard DSO for that Sales Doc No and Item No and try to find out what it is. It may be the sum over all the common data records, or the KF value of the last common record. A toy illustration of the "add up" option follows below.
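    For intuition, here is a small self-contained ABAP sketch (field names are invented) of the "add up" behaviour: COLLECT sums the numeric non-key fields of rows whose character-like key fields match, which mirrors what happens to key figures when records share the same semantic key.

    TYPES: BEGIN OF ty_rec,
             docno TYPE c LENGTH 10,    "semantic key (blank here)
             item  TYPE n LENGTH 6,     "semantic key
             qty   TYPE i,              "key figure
           END OF ty_rec.
    DATA: lt_recs TYPE STANDARD TABLE OF ty_rec,
          ls_rec  TYPE ty_rec.

    ls_rec-docno = ''.  ls_rec-item = 10.  ls_rec-qty = 5.
    COLLECT ls_rec INTO lt_recs.   "first row inserted
    ls_rec-qty = 3.
    COLLECT ls_rec INTO lt_recs.   "same key: qty becomes 8, still one row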
    Regards
    Raj Rai

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
    I am using a write optimized DSO ZMYDSO to store data from several sources (two datasources and one DSO).
    I have disabled the check of uniqueness in the DSO, but I defined a semantic key for the fields ZCLIENT, ZGUID, ZSOURCE, ZPOSID which are used in a non-unique index.
    In the given case, I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abstract coding:
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
    * some other logic [...]
      DELETE FROM /BIC/AZMYDSO00
        WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
          AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
          AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
    ENDLOOP.
    COMMIT WORK AND WAIT.
    During loading (after the transformation step, in the update step), I get the following messages (not every time):
    1. Error while writing the data. (RSAODS131)
    2. Could not save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027).
    Diagnosis: DataPackage XY could not be saved. Reasons for this could be a violation of key uniqueness (duplicate data) or a general database error.
    3. Error in the substep of updating the DataStore.
    I have checked the system log (SM21) and the short dumps (ST22), but I could not find an exact error description.
    I guess I am creating some inconsistencies or locks (I also checked SM12) that interrupt the load process. I also tried serial updating within the DTP (I reduced the number of batch processes to 1), with no success.
    Perhaps the loading of one specific package takes longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against deleting rows within the end routine?
    Regards,
    Philipp

    Hi,
    is ZMYDSO the name of the DSO?
    And is this the end routine of the transformation while loading the same DSO?
    if so, we never do such a thing.
    You are comparing the DSO with the data that is flowing in, and then deleting that data from the DSO...
    This doesn't actually make sense, because while data is being loaded into a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against modifications; you can only read data from it.
    If your requirement is that existing duplicate records should not arrive again in the DSO, you can delete the data from the SOURCE_PACKAGE in the start routine, like below:
    * <FIELDS> and <CONDITION> are placeholders for your model.
    SELECT <FIELDS> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      WHERE <CONDITION>.
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      " drop incoming rows that already exist in the DSO
      DELETE SOURCE_PACKAGE
        WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
          AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
          AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
    ENDLOOP.
    Or, if your requirement is that you need to delete the old data from the DSO for the same keys that are newly arriving, in order to load the new data into the DSO, you could do something like this in the start routine:
    SELECT <FIELDS> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /BIC/ZCLIENT = SOURCE_PACKAGE-/BIC/ZCLIENT
        AND /BIC/ZGUID   = SOURCE_PACKAGE-/BIC/ZGUID
        AND /BIC/ZSOURCE = SOURCE_PACKAGE-/BIC/ZSOURCE
        AND /BIC/ZPOSID  = SOURCE_PACKAGE-/BIC/ZPOSID.
    * now update the new values you want to write, in the loop
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      " code for manipulation of WORK_AREA
      " write a MODIFY statement to update the package; in a start
      " routine this is SOURCE_PACKAGE (RESULT_PACKAGE only exists
      " in the end routine)
      MODIFY TABLE SOURCE_PACKAGE FROM WORK_AREA
        TRANSPORTING <FIELDS>.
    ENDLOOP.
    hope it helps,
    Regards,
    Joe

  • Write optimized DSO - Standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow in which the first layer is a write-optimized DSO, from which we load data to a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    If I have 2 requests that need to be updated from the write-optimized DSO to the standard DSO, they are updated in a single request. We are on SAP BW 7.01, patch level 005.
    Please suggest.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13:
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I suggest you set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
    Regards,
    Sudheer.

  • Missing PARTNO field in Write Optimized DSO

    Hi,
    I have a write-optimized DSO for which the partition has been deleted (reason unknown) in the Dev system.
    For the same DSO, partition parameters exist in QA and production.
    Now while transporting this DSO to QA, I am getting the error "Old key field PARTNO has been deleted", and the DSO could not be activated in the target system.
    Please let me know how I can re-insert this technical key PARTNO in my DSO.
    I'm presuming it has something to do with the partitioning of the DSO.
    Please help.

    Hi,
    Since the write-optimized DataStore object only consists of the table of active data, you do not have to activate the data, as is necessary with the standard DataStore object. This means that you can process data more quickly.
    The loaded data is not aggregated; the history of the data is retained. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. The record mode responsible for aggregation remains, however, so that the aggregation of data can take place later in standard DataStore objects.
    The system generates a unique technical key for the write-optimized DataStore object. The standard key fields are not necessary with this type of DataStore object. If standard key fields exist anyway, they are called semantic keys so that they can be distinguished from the technical keys. The technical key consists of the Request GUID field (0REQUEST), the Data Package field (0DATAPAKID) and the Data Record Number field (0RECORD). Only new data records are loaded to this key.
    You can specify that you do not want to run a check to ensure that the data is unique. If you do not check the uniqueness of the data, the DataStore object table may contain several records with the same key. If you do not set this indicator, and you do check the uniqueness of the data, the system generates a unique index in the semantic key of the InfoObject. This index has the technical name "KEY". Since write-optimized DataStore objects do not have a change log, the system does not create delta (in the sense of a before image and an after image). When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    PS: Excerpt from http://help.sap.com/saphelp_nw2004s/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
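    To make the technical key concrete, here is a hedged ABAP sketch; the table name /BIC/AZWODSO00 and the exact column types are assumptions, but in the active table the three technical key columns generated from 0REQUEST, 0DATAPAKID and 0RECORD are named REQUEST, DATAPAKID and RECORD:

    TYPES: BEGIN OF ty_techkey,
             request   TYPE c LENGTH 30,   "request GUID (0REQUEST)
             datapakid TYPE n LENGTH 6,    "data package (0DATAPAKID)
             record    TYPE n LENGTH 8,    "record number (0RECORD)
           END OF ty_techkey.
    DATA lt_keys TYPE STANDARD TABLE OF ty_techkey.

    * Peek at the generated technical key of the active table.
    SELECT request datapakid record
      FROM /bic/azwodso00                  "assumed active table name
      INTO TABLE lt_keys
      UP TO 10 ROWS.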
    Hope this helps.
    Best Regards,
    Rajani

  • Changing of Write Optimized DSO to Standard DSO

    Dear Experts,
    I have created a few Write Optimized (WO) DSOs based on the requirement, and I have also created a few reports on these WO DSOs.
    The problem is that when I create an InfoSet on the WO DSO, a standard DSO, and a cube (3 InfoProviders in total in this InfoSet), it throws an error when I display data from the InfoSet.
    I came to know that the problem is with the WO DSO, so I want to change this WO DSO to a standard DSO.
    FYI, we are in the development stage only.
    If I copy the same DSO and make it a standard DSO, the reports I created on the original will be disturbed, so I want an alternate solution.
    Regards,
    Phani.
    Edited by: Sai Phani on Nov 12, 2010 5:25 PM

    Hi Sai
    Write-optimized DSOs always help with data load performance. I am sure you created the WO DSO to take advantage of exactly this point.
    However, WO DSOs are not suitable when it comes to reporting.
    So instead of converting the WO DSO to a standard DSO (where you do lose the data load performance advantage), why don't you connect the WO DSO to a standard DSO, extracting data from the WO DSO into the standard DSO and using the standard DSO for reporting? This would give you the benefit during data loads as well as reporting.
    Cheers
    Umesh

  • Write-Optimized DSO Activation Issue

    Hi Experts,
    Whenever I activate the (write-optimized) DSO, I get an error like
    "no PSA for InfoSource and source system in BI 7.0".
    Can you please provide a solution for this issue?
    Considerations:
    1. This write-optimized DSO doesn't contain any semantic keys; all fields are taken as data fields.
    2. I tried both checking and unchecking the uniqueness-of-data checkbox.
    Thanking You,
    R.Dama

    Hi Experts,
    I know that only the active data table is available in a WO DSO. I am not trying to add an activation step to a process chain,
    and I am not getting this error at data load time.
    What I am saying is that while creating a WO DSO, we still need to activate the DSO itself, right?
    The error occurs in this initial step, while creating the WO DSO.
    Please help me and clarify why I am getting that error message.

  • Write Optimized DSO Problem

    Hi Friends,
    I have to create a process chain for a write-optimized DSO. I know that every day we have to delete the data from both the data target and the PSA, but I am confused about how to set up daily deletion of data in the PSA of my DataSource.
    Can anybody help me with this issue?
    Thanks and Regards
    pedamarla

    Hi,
    While creating the process chain, select the process type Delete PSA, give the DataSource name, and schedule it for a daily run. After that, run the load through the process chain.
    Cheers...
    Puneesh

  • Duplicate Semantic Key in Write Optimized DSO

    Gurus
    In a write-optimized DSO, the system generates a unique index KEY on the semantic key fields only if the "Do not check uniqueness of data" indicator is not set.
    See help https://help.sap.com/saphelp_crm60/helpdata/en/a6/1205406640c442e10000000a1550b0/frameset.htm
    If that indicator is set, the DSO can contain duplicate records.
    My question is: what happens to these duplicates when a request-level delta update is done to a standard DSO or InfoCube?
    Do the duplicates end up in the error stack? Are they simply aggregated in further loads? That would be a problem for reporting (double counting).
    thanks
    tony

    Hi Tony,
    It will aggregate the data in some undesired way.
    Read on...
    https://help.sap.com/saphelp_crm60/helpdata/en/b6/de1c42128a5733e10000000a155106/frameset.htm
    If you want to use write-optimized DataStore objects in BEx queries, we recommend that they have a semantic key and that you run a check to ensure that the data is unique. In this case, the write-optimized DataStore object behaves like a standard DataStore object. If the DataStore object does not have these properties, unexpected results may be produced when the data is aggregated in the query.
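    If you want to see whether a write-optimized DSO actually contains duplicates before loading further, a quick hedged check like the following counts repeated semantic keys directly on the active table (the table and field names here are invented placeholders):

    DATA: BEGIN OF ls_dup,
            docno TYPE c LENGTH 10,
            item  TYPE n LENGTH 6,
            cnt   TYPE i,
          END OF ls_dup,
          lt_dups LIKE STANDARD TABLE OF ls_dup.

    * Count semantic keys occurring more than once.
    SELECT /bic/zdocno /bic/zitem COUNT( * )
      FROM /bic/azwodso00               "assumed active table
      INTO TABLE lt_dups
      GROUP BY /bic/zdocno /bic/zitem
      HAVING COUNT( * ) > 1.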
    Hope it helps...
    Regards,
    Ashish

  • Problem loading data into write optimized DSO

    Hi ,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I changed a normal DSO into a write-optimized DSO. I have 2 DataSources to be loaded into the write-optimized DSO: one for demand and one for inventory.
    Loading the demand data from PSA to DSO works fine, without any error.
    But while loading the inventory data from PSA to DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, transformation, and DTP and loading the data into the write-optimized DSO again, but no luck; I always get the above error message.
    Can someone please suggest what could be done to avoid these error messages and load the data successfully?
    Thanks in advance.

    Hi,
    Check whether any routines are written in the transformations.
    Check the data structures of the data target and the DataSource as well: is there any change in the structure?
    Data will normally load up to the PSA; if there is a change in the structure of the DSO, this error may occur, so please check that.
    Check the below blog:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra

  • Update of write optimized DSO from a CSV file

    Hi Gurus,
    I have observed a few things while trying to load a write-optimized (WO) DSO from a flat file in BI 7.0. The data flow is as follows:
    DataSource -> transformation -> data target (DSO - WO).
    From a test perspective, I loaded 5 records up to the PSA with an InfoPackage and subsequently updated them to the DSO using a DTP. When I check in Manage, I find records transferred = 5 and added = 5, which is okay.
    Then I loaded the same 5 records to the PSA again and triggered the DTP. Now the DTP brings 10 records (5 + 5): records transferred = 10 and updated = 10. When I checked the header tab in the DTP monitor, I found that the selection includes both PSA request IDs.
    Again I loaded the same 5 records to the PSA; a new PSA request ID was generated, and the DTP extracted this PSA ID along with the 2 old, already transferred ones. Now records transferred = 10, added = 15. Why were 10 transferred? I am getting confused here; I was expecting it to behave the same way and transfer 15 and add 15. There is no routine that generates additional records, so that is not possible at all. Has anybody observed this strange behaviour?
    Why is this happening? I was expecting it to bring only 5 records each time. Is it something specific to write-optimized DSOs, or am I doing something wrong here? Is there a setting that tells it not to select the old PSA requests that were already updated to the DSO?
    Please clarify.
    Soumya
    Message was edited by:
            Soumya Mishra

    Is your DTP full or delta?
    Here is some interesting discussion:
    /thread/348238 [original link is broken]

  • Write optimized DSO to standard

    Hello,
    I converted a write-optimized DSO to a standard DSO. As expected, the transformation became inactive; I redid the mapping and tried to activate it again, and it throws an error: cannot activate the transformation.
    The error it gives is:
    Syntax error in GP_ERR_RSTRAN_MASTER_TMPL, row 1.181 (-> long text)
    Error during generation
    Error when activating Transformation ........................
    Can anyone please suggest how to activate this transformation?
    Cheers,
    Vikram

    Try the program RSDG_TRFN_ACTIVATE (if it exists in your BW system) to activate the transformation; a minimal call sketch follows below.
    Also see whether you can delete the complete DTP and transformation, log in to RSA1 again, and create new ones
    (if possible, also delete this transformation from table RSTRAN after deleting the DTP).
    Also check the following OSS Notes on this:
    977922 & 957028
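    If the report exists, it can also be launched from a one-line program; the selection screen fields differ between releases, so this hedged sketch just opens it without passing any parameters:

    * Assumes RSDG_TRFN_ACTIVATE is present, as suggested above.
    SUBMIT rsdg_trfn_activate VIA SELECTION-SCREEN AND RETURN.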

  • Semantic group for a write optimized DSO

    Hello,
    I am loading from a DataSource to a write-optimized DSO. While doing so, I am not able to set a semantic key/group for the DTP. I have turned error handling on using the setting 'Valid records update, no reporting (request RED)'.
    Is this setting not possible for a write-optimized DSO? If not, why not? And if it is, what am I missing?
    Thanks a lot in advance,
    Prakash

    Thank you,
    here are my findings:
    - if data has been loaded with delta using the DTP, it is no longer possible to change the semantic groups - all requests have to be deleted first
    - error handling must be turned on
    The issue is resolved.
    Marek
