Unable to delete data target contents of Write-Optimized DSO in Process Chain

Hi Experts,
We are using SAP NetWeaver BW 7.01 and need to delete the entire data target contents of a Write-Optimized DSO in the process chain before the next data load.
I included this step in the process chain, but it fails with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
This process type worked in BW 7.0 but does not work in BW 7.01.
I found that the program RSSM_DELETE_WO_DSO_REQUESTS can be used to delete old requests in a Write-Optimized DSO as of BW 7.01 SP07, as per SAP Note 1437407. However, it still does not work even after implementing this program, because the prerequisite for deleting a request is that its data mart status has been updated, and that is not happening here.
BW 7.3 has a process type to delete requests from a Write-Optimized DSO directly, but it is not available in 7.01.
Could you please suggest how to resolve this issue in BW 7.01?
Many thanks for your help in advance.
Regards,
Madhu

Create an ABAP program like the attached code.
You can then use that ABAP program in the process chain through an ABAP process variant.
The ABAP variant should have the following properties:
Select the call mode 'Synchronous', call the program in the local system, and choose the type 'Program'.
Enter your ABAP program name under "Program Name" and create one program variant for each write-optimized DSO.
Please refer to the documentation on how to use ABAP programs in process chains for further details.
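For illustration, a minimal sketch of such a program is shown below. The program name ZDEL_WODSO_CONTENTS and the default DSO name ZWODSO1 are only examples, and the sketch simply clears the active table (/BIC/A<DSO>00) of the write-optimized DSO; it does not clean up the request administration, which is exactly the part that SAP Note 1437407 and the BW 7.3 process type take care of.

REPORT zdel_wodso_contents.

" Hypothetical example: clear the active table of a write-optimized DSO.
" Adjust P_DSO to the technical name of your DSO.
PARAMETERS: p_dso(30) TYPE c DEFAULT 'ZWODSO1'.

DATA: gv_tabname TYPE tabname,
      gv_rows    TYPE i.

START-OF-SELECTION.
  " The active table of a write-optimized DSO follows the naming
  " pattern /BIC/A<DSO name>00.
  CONCATENATE '/BIC/A' p_dso '00' INTO gv_tabname.

  TRY.
      " Delete the entire contents of the active table.
      DELETE FROM (gv_tabname).
      gv_rows = sy-dbcnt.
      COMMIT WORK.
      WRITE: / 'Rows deleted from', gv_tabname, ':', gv_rows.
    CATCH cx_sy_dynamic_osql_semantics.
      " The generated table name does not exist in the dictionary.
      MESSAGE 'Active table not found' TYPE 'E'.
  ENDTRY.

Before using something like this in a production chain, please verify in a sandbox that the deletion leaves the DSO administration in a consistent state on your release.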
Hope this helps

Similar Messages

  • "Completely delete data target content" setting in InfoPackage

    Gurus,
    In an InfoPackage to load a cube that has two sources, Source 1 and Source 2, there is a setting
    "Completely delete data target content" under Data Targets.
    The InfoPackage is for loading data from Source 1. Does this mean that the setting "Completely delete data target content" deletes just the requests related to Source 1 from the cube, or all the requests in the cube, before making any other load?
    Please advise.
    Thanks,
    Simmi

    Hello Simmi,
    This is the content from the help regarding this feature, and it says that the entire content of the cube will be deleted.
    Delete InfoCube contents completely
    Use
    With every load process, before updating the data in an InfoCube, you can delete the entire content of this InfoCube. Select the field 'Delete entire InfoCube content' for this.
    If you are using the InfoPackage in a process chain, the setting is hidden in the scheduler, since it is represented in the Process Chain Maintenance through its own process type and is maintained there.
    Regards,
    Praveen

  • Deletion of data target contents Vs delete overlapping requests

    hi,
    When do we go for 'delete overlapping requests'? If it is applicable to full loads as well as delta loads, then let me start with the full load case: we have the other option, 'delete data target contents', with which we can delete the daily full load without going for 'delete overlapping requests'.
    Please let me know the exact difference between DELETE OVERLAPPING REQUESTS FROM INFOCUBE and DELETE DATA TARGET CONTENTS.

    hi
    When you have a delta upload twice daily, the date in the previous request and in the second request is the same, so you would use this option to delete the overlap so that data is not loaded twice.
    Don't forget to assign points.
    Regards
    N Ganesh

  • 4 LSA architecture - EDW layer - write optimized DSO settings

    Dear Colleagues,
    I have a question regarding the 4-layer LSA architecture described in the following article.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/306f254c-1e3f-2d10-9da0-bcff4e35e0ef
    When we activate an SAP Business Content data flow and want to add a write-optimized DSO in the EDW layer to get a faster upload from the source system to SAP BI,
    should we always check the "do not check uniqueness of data" setting in the write-optimized DSO to avoid data upload errors?
    Based on your experience, what would be your recommendation?
    Cheers,

    I would suggest checking it ON.
    - If the check is ON, it allows loading several records with the same semantic key, which can then be handled in the next layer.
    - If the check is OFF, it will not allow loading the same record twice and throws an error.
    Have you seen this?
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso

  • Semantic keys for write optimized DSO

    Hi experts,
    Can anyone tell me more about semantic key in a write optimized DSO ?
    I have a DSO concerning sales orders with 3 DataSources.
    How can you define the semantic key?
    I have schedule line number, document number and position number in the semantic key, but when I load from the PSA, I get an error about duplicate data.
    Any clues ?
    Thanks.

    Hi Oliver,
    If you specify characteristics as semantic fields, then when you load data to the DSO and an error record occurs, the following records with the same characteristic (semantic) combination will not be updated into the DSO even though they are correct; they are written to the error stack to ensure data quality.
    In a write-optimized DSO, the technical fields are taken automatically; in the semantic fields you can specify the characteristics which should act like primary keys (but not exactly).
    Thanks
    Sreekanth.

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • Unable to delete data from a partition

    Hi,
    Unable to delete data from a partition.
    I am using the command:
    ALTER TABLE TEST DROP PARTITION DATE_20090820090000
    It reports 0 rows deleted, but there are 200 rows in the partition.
    The partition is getting dropped but the data remains.
    I want a command that deletes the data in the partition along with the partition itself.
    Any help would be appreciated.

    SQL> CREATE TABLE range_part (
    prof_history_id NUMBER(10),
    person_id NUMBER(10) NOT NULL,
    organization_id NUMBER(10) NOT NULL,
    record_date DATE NOT NULL)
    PARTITION BY RANGE (record_date) (
    PARTITION yr0 VALUES LESS THAN (TO_DATE('01-JAN-2007','DD-MON-YYYY')),
    PARTITION yr7 VALUES LESS THAN (TO_DATE('01-JAN-2008','DD-MON-YYYY')),
    PARTITION yr8 VALUES LESS THAN (TO_DATE('01-JAN-2009','DD-MON-YYYY')),
    PARTITION yr9 VALUES LESS THAN (MAXVALUE) );
    Table created.
    SQL> INSERT INTO range_part VALUES (1, 1, 1, SYSDATE-720);
    1 row created.
    SQL> INSERT INTO range_part VALUES (2, 2, 2, SYSDATE-360);
    1 row created.
    SQL> INSERT INTO range_part VALUES (3, 3, 3, SYSDATE-180);
    1 row created.
    SQL> INSERT INTO range_part VALUES (4, 4, 4, SYSDATE);
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> SELECT * FROM range_part;
    PROF_HISTORY_ID PERSON_ID ORGANIZATION_ID RECORD_DATE
    1 1 1 31-AUG-2007 05:53:22
    2 2 2 25-AUG-2008 05:53:22
    3 3 3 21-FEB-2009 05:53:22
    4 4 4 20-AUG-2009 05:53:22
    SQL> SELECT * FROM range_part PARTITION(yr7);
    PROF_HISTORY_ID PERSON_ID ORGANIZATION_ID RECORD_DATE
    1 1 1 31-AUG-2007 05:53:22
    SQL> alter table range_part drop partition yr7;
    Table altered.
    SQL> SELECT * FROM range_part PARTITION(yr7);
    SELECT * FROM range_part PARTITION(yr7)
    ERROR at line 1:
    ORA-02149: Specified partition does not exist
    SQL> SELECT * FROM range_part;
    PROF_HISTORY_ID PERSON_ID ORGANIZATION_ID RECORD_DATE
    2 2 2 25-AUG-2008 05:53:22
    3 3 3 21-FEB-2009 05:53:22
    4 4 4 20-AUG-2009 05:53:22

  • Unable to delete request from write-optimized DSO (Error during rollback)

    Hi Gurus,
    I am trying to delete a delta request from a Write-Optimized DSO. This request was uploaded with a DTP from another Write-optimized DSO.
    The actual overall status of the request is RED and the description of that status is now: 'Error during rollback of request DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR; only rollback allowed'.
    I checked the log of all Request Operations in the DataStore (from the same line where the red request is now) and I see my several attempts to delete this request under a RED radio button with the title Rollback. The details for this error are the following:
    Could not delete request data from active table
    Message no. RSODSO_ROLLBACK114
    Diagnosis
    The system could not delete the request data from the active table of a write-optimized DataStore object.
    System Response
    Write-optimized DataStore object: DTFISO02
    Active table: /BIC/ADTFISO0200
    Request: DTPR_4JW6NLSVDUNYY3GTD6F4DQJWR
    Procedure
    Search for Notes containing the key words "Delete write-optimized DSO PSA"
    I am relatively new to SAP BI 7.0 and I do not know how to delete this request. Any help will be highly appreciated!
    Leticia

    Hi Leticia:
    Take a look at the SAP Notes below.
    Note 1111065 - "701: Delta consistency check for write-optimized DSOs"
    Note 1263877 - "70SP20: Delta consistency check for write-optimized DSOs"
    Note 1125025 - "P17:PSA:DSO:ODSR missing in PSA process for write-opt. DSO"
    Additionally, some ideas from the alternative presented on the blog by KMR might help you.
    "How to generate a selective deletion program for info provider"
    Regards,
    Francisco Mílán.

  • Write-Optimized DSO data deletion

    Hello All,
    We have a requirement to delete old data from a write-optimized DSO. The specific requirement is to delete data older than 15 days (requests older than 15 days) from the DSO. I could not find any process type that could be used in process chains to automatically delete the old data based on our settings (similar to the ones used for PSA or change-log deletion). Have any of you come across a process type or a batch job that can be scheduled to delete old data in a write-optimized DSO? Your help is much appreciated.
    Thanks,
    Veera

    Hi Nagesh,
    Thanks for your answer. But the use of these function modules would still require custom development to identify the requests to be deleted. We are trying to see if SAP has any standard process for deleting old requests out of a write-optimized DSO, or if anyone was successful in achieving this with the least amount of customization.
    Thanks,
    Veera
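    For what it is worth, a small custom program along the following lines could implement the age-based deletion, assuming the write-optimized DSO itself carries a load-date characteristic. This is only a sketch under that assumption: the DSO name ZWODSO1 and the field /BIC/ZLOADDT are hypothetical, and the request administration entries are not cleaned up, so the deleted requests would still be visible in the manage screen.

    REPORT zdel_wodso_old_data.

    " Sketch: delete rows older than GC_DAYS days from the active table
    " /BIC/AZWODSO100, based on a hypothetical load-date field /BIC/ZLOADDT.
    CONSTANTS: gc_tabname TYPE tabname VALUE '/BIC/AZWODSO100',
               gc_days    TYPE i       VALUE 15.

    DATA: gv_cutoff TYPE d,
          gv_where  TYPE string,
          gv_rows   TYPE i.

    START-OF-SELECTION.
      " Keep only data loaded during the last GC_DAYS days.
      gv_cutoff = sy-datum - gc_days.

      " Build a dynamic WHERE clause with the cutoff date as a literal.
      CONCATENATE '/BIC/ZLOADDT < ''' gv_cutoff '''' INTO gv_where.

      DELETE FROM (gc_tabname) WHERE (gv_where).
      gv_rows = sy-dbcnt.
      COMMIT WORK.

      WRITE: / 'Rows deleted from', gc_tabname, ':', gv_rows.

    Such a program could then be scheduled via an ABAP process variant in the process chain, in the same way as described in the answer at the top of this thread.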

  • How to create data target contents.

    hi sap gurus,
    Please guide me on how to create meta chains on a cube or ODS with the following steps:
    1. Complete deletion of data target content.
    2. Execute InfoPackage.
    Please help me as soon as possible, and also please forward some process chain documents.

    Hi,
    You can find the process types 'Complete Deletion of Data Target Contents' and 'Execute InfoPackage' in your process chain maintenance screen. Just drag and drop them and create a process chain as per your data flow.
    Check this doc for process chains:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
    help.sap.com:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/8f/c08b3baaa59649e10000000a11402f/frameset.htm
    Check these docs also: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/19683495-0501-0010-4381-b31db6ece1e9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/36693695-0501-0010-698a-a015c6aac9e1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9936e790-0201-0010-f185-89d0377639db
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/263de690-0201-0010-bc9f-b65b3e7ba11c

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write optimized DSO objects in our staging area. These DSO are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and solving. This also provides the opportunity to see the differences between two data loads.
    For the normal operation the most recent package in the PSA should be loaded into these DSO-objects (as normal data staging in BW 3.5 and before) .
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This will cause duplicate record errors when there are more data loads in the PSA.
    We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more a work around. Call me naive but it should be working as it did in BW3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have been changed: if you use the option to delete the content of a PSA table via the process chain, it will fail when the DataSource is changed, due to a newly generated PSA table ID.
    Regards,
    Harald

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
    I am using a write optimized DSO ZMYDSO to store data from several sources (two datasources and one DSO).
    I have disabled the check of uniqueness in the DSO, but I defined a semantic key for the fields ZCLIENT, ZGUID, ZSOURCE, ZPOSID which are used in a non-unique index.
    In the given case, I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abstract coding:
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      " some other logic [...]
      DELETE FROM /BIC/AZMYDSO00
        WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
          AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
          AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
    ENDLOOP.
    COMMIT WORK AND WAIT.
    During the Loading (after the transformation step in the updating step), I get the messages (not every time):
    1.     Error while writing the data. (RSAODS131)
    2.     Could not Save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027).
    Diagnosis: DataPackage XY could not be saved. Reasons therefore could be violation of key uniqueness (duplicate data) or general database error.
    3.     Error in the substep of updating DataStore.
    I have checked the system log (SM21) and the system dumps (ST22) but I could not find an exact error description.
    I guess I am creating some inconsistencies or locks (I also checked SM12), so that the load process is interrupted. But I also tried serial updating within the DTP (I reduced the number of batch processes to 1). No success.
    Perhaps the loading of one specific package could take longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against the deletion of rows within the end routine?
    Regards,
    Philipp

    Hi,
    Is ZMYDSO the name of the DSO?
    And is this the end routine of the transformation that loads this same DSO?
    If so, we never do such a thing.
    You are comparing the DSO with the data that is flowing in and then deleting the data from the DSO.
    That doesn't actually make sense, because while data is being loaded into a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against modifications of its data. You can only read data from it.
    If your requirement is that existing duplicate records should not arrive in the DSO, you can delete the data from SOURCE_PACKAGE in the start routine like below:
    SELECT <fields> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE WHERE <CONDITION>.
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      DELETE SOURCE_PACKAGE
        WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
          AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
          AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
    ENDLOOP.
    Or, if your requirement is to delete the old data from the DSO for the same keys that are newly arriving, so that the new data can be loaded into the DSO, you could do something like this in the start routine:
    SELECT <fields> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /BIC/ZCLIENT = SOURCE_PACKAGE-/BIC/ZCLIENT
        AND /BIC/ZGUID   = SOURCE_PACKAGE-/BIC/ZGUID
        AND /BIC/ZSOURCE = SOURCE_PACKAGE-/BIC/ZSOURCE
        AND /BIC/ZPOSID  = SOURCE_PACKAGE-/BIC/ZPOSID.
    " now update the new values you want to write in the loop
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      " code for manipulation of WORK_AREA
      " write a MODIFY statement to update the RESULT_PACKAGE
      MODIFY RESULT_PACKAGE FROM WORK_AREA TRANSPORTING <fields>.
    ENDLOOP.
    hope it helps,
    Regards,
    Joe

  • Load Data with 7.0 DataSource from Flat file to Write Optimized DSO

    Hi all,
    we have a problem loading data from a flat file using a 7.0 DataSource.
    We have to load a flat file (monthly) into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads the data in delta mode from the DataSource into the WO DSO.
    When I load the second file into the DataSource, the DTP loads all the data present in the DataSource and not only the new data, as expected with delta mode.
    Has anyone any tips to help me?
    Thank you for help.
    Regards
    Emiliano

    Hi,
    I am facing a similar problem.
    I am using a write-optimized DSO and I have only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
    When I do a delta load from PSA to DSO, I expect only that 1 request to get loaded into the DSO.
    But it is picking up the data from 3 other requests and doubling the records...
    Can you please help me; how did you manage to get around that issue?
    Cheers,
    Nisha

  • Changes to write optimized DSO containing huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are amount and currency).
    We are able to make the changes in Development (with the DSO containing data). But when we try to
    transport the changes to our QA system, the transport hangs. The transport triggers a job which
    fills up the logs, so we need to kill the job, which aborts the transport.
    Has anyone of you had the same experience? Do we need to empty the DSO so we can transport
    successfully? We really don't want to empty the DSOs, as it will take time to reload.
    Any help?
    Thank you very muhc for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs; some sort of conversion for all the records?
    Marco

  • Problem loading data into write optimized dso.....

    Hi ,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I have changed a normal DSO into a write-optimized DSO. I have 2 DataSources to be loaded into the write-optimized DSO: one for Demand and one for Inventory.
    The loading of Demand data from PSA to DSO happens fine without any error.
    Now, while loading the Inventory data from PSA to DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, transformation and DTP and tried to load the data again into the write-optimized DSO, but no luck; I always get the above error message.
    Can someone please suggest what could be done to avoid the above error messages and load the data successfully?
    Thanks in advance.

    Hi,
    Check the transformations: are there any routines written?
    Check the data structure of the cube and the DS as well.
    Are there any changes in the structure?
    Data will load up to the PSA normally; if there are any changes in the structure of the DSO, then the error may occur. Just check it.
    Check the blog below:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra
