Multiple data loads in PSA with write optimized DSO objects

Dear all,
Could someone tell me how to deal with this situation?
We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and resolution. This also gives us the opportunity to see the differences between two data loads.
For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
We already tried the "all new records" functionality in the DTP, but it only loads the oldest data package and does not process the newer PSA loads.
Does any of you have a solution for this?
Thanks in advance.
Harald

Hi Ajax,
I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with changed PSA IDs: if you use the option to delete the content of a PSA table via a process chain, the step will fail once the datasource is changed, because a new PSA table ID is generated.
Regards,
Harald

Similar Messages

  • Issue with Write Optimized DSO in BI 7.0

    Hi Guys,
    I have an issue regarding a write-optimized DSO in BI 7.0.
    First, I will explain the scenario.
    We have around 5 master data objects, and a source system to load the master data from.
    The design is like this:
    There is a DSO which gets the data from 30 master data sources and feeds 30 master data objects (InfoObjects).
    The master data load process is: before loading the new data, the previous load requests have to be deleted from this write-optimized DSO, and then the current loads are loaded. This entire process is done through process chains.
    In this regard, everything has been going smoothly. But for the last few days, the deletion of requests in the write-optimized DSO has been failing (red status) with an error (please check the screenshot below), even though the previous requests in the DSO do get deleted and the new load request is added in the DSO Manage tab.
    What I am thinking is that this is merely a front-end problem. If I am not wrong, please help me in this regard.
    Please let me know whether there are any notes related to this issue, or some other solution.
    Thanks in advance
    Peter ....

    How was this resolved? What procedure did you follow?

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
    I am using a write optimized DSO ZMYDSO to store data from several sources (two datasources and one DSO).
    I have disabled the check of uniqueness in the DSO, but I defined a semantic key for the fields ZCLIENT, ZGUID, ZSOURCE, ZPOSID which are used in a non-unique index.
    In the given case, I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abridged code:
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      " some other logic [...]
      DELETE FROM /BIC/AZMYDSO00
        WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
          AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
          AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
    ENDLOOP.
    COMMIT WORK AND WAIT.
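    (A general ABAP note, not from the original post: issuing a DELETE FROM /BIC/AZMYDSO00 once per row of RESULT_PACKAGE is also expensive. A common alternative is to collect the key values into an internal table inside the loop and issue one array deletion after ENDLOOP, e.g. DELETE /BIC/AZMYDSO00 FROM TABLE lt_keys, with lt_keys being a hypothetical table of key rows.)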
    During the loading (in the update step after the transformation step), I get the following messages (not every time):
    1. Error while writing the data (RSAODS131).
    2. Could not save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027).
    Diagnosis: DataPackage XY could not be saved. Possible reasons are a violation of key uniqueness (duplicate data) or a general database error.
    3. Error in the substep of updating the DataStore.
    I have checked the system log (SM21) and the system dumps (ST22) but I could not find an exact error description.
    I guess I am creating some inconsistencies or locks (I also checked SM12), so that the load process is interrupted. But I also tried serial updating within the DTP (I reduced the number of batch processes to 1), with no success.
    Perhaps the loading of one specific package takes longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against deleting rows within the end routine?
    Regards,
    Philipp

    Hi,
    is ZMYDSO the name of the DSO?
    And is this the end routine of the transformation that loads this same DSO?
    If so, we never do such a thing.
    You are comparing the DSO with the data that is flowing in and then deleting the data from the DSO, which doesn't actually make sense: while data is being loaded into a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against any modification of its data. You can only read data from it.
    If your requirement is that existing duplicate records should not arrive in the DSO, you can delete them from SOURCE_PACKAGE in the start routine, like below:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE WHERE <CONDITION>.
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      DELETE SOURCE_PACKAGE
        WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
          AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
          AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
    ENDLOOP.
    Or, if your requirement is to delete from the DSO the old data for the same keys that are newly arriving, so that the new data can be loaded into the DSO, you could do something like this in the start routine:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /BIC/ZCLIENT = SOURCE_PACKAGE-/BIC/ZCLIENT
        AND /BIC/ZGUID   = SOURCE_PACKAGE-/BIC/ZGUID
        AND /BIC/ZSOURCE = SOURCE_PACKAGE-/BIC/ZSOURCE
        AND /BIC/ZPOSID  = SOURCE_PACKAGE-/BIC/ZPOSID.
    " now update the new values you want to write in the loop
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      " code for manipulation of WORK_AREA
      " write a MODIFY statement to update RESULT_PACKAGE, e.g.
      MODIFY RESULT_PACKAGE FROM WORK_AREA
        TRANSPORTING FIELDS
        WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
          AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
          AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
    ENDLOOP.
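    (One general ABAP caveat with the sketch above: guard the FOR ALL ENTRIES selection with a check that SOURCE_PACKAGE is not empty, because FOR ALL ENTRIES against an empty internal table ignores the WHERE clause and selects every row of /BIC/AZMYDSO00.)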
    hope it helps,
    Regards,
    Joe

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • Load Data with 7.0 DataSource from Flat File to Write Optimized DSO

    Hi all,
    we have a problem loading data from a flat file using the 7.0 DataSource.
    We have to load a flat file (monthly) into a write-optimized DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads the data in delta mode from the DataSource into the write-optimized DSO.
    When I load the second file into the DataSource, the DTP loads all the data present in the DataSource, and not only the new data as expected when using delta mode.
    Does anyone have any tips to help me?
    Thank you for help.
    Regards
    Emiliano

    Hi,
    I am facing a similar problem.
    I am using a write-optimized DSO and I have only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
    When I do a delta load from the PSA to the DSO, I expect only that 1 request to be loaded into the DSO.
    But it is picking up the data from 3 other requests and doubling the records...
    Can you please help me: how did you manage to get out of that issue?
    Cheers,
    Nisha

  • Problem loading data into write optimized dso.....

    Hi ,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I have changed a normal DSO into a write-optimized DSO. I have 2 datasources to be loaded into the write-optimized DSO: one for demand and one for inventory.
    Loading the demand data from the PSA to the DSO works fine, without any error.
    But while loading the inventory data from the PSA to the DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, the transformation and the DTP and loading the data into the write-optimized DSO again, but no luck; I always get the above error message.
    Can someone please suggest what could be done to avoid the above error messages and load the data successfully?
    Thanks in advance.

    Hi,
    Check the transformations: are any routines written there?
    Check the data structure of the cube and the DataSource as well.
    Are there any changes in the structure?
    Data will load up to the PSA normally; if there are any changes in the structure of the DSO, this error may occur. Just check it.
    Check the blog below:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I do a data load for a write-optimized DSO, I am getting the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant and Billing Item as the semantic key in the DSO.
    For this DSO, I am getting data from a test ECC system, in which the Sales Document Number column is mostly blank for this DataSource.
    When I go into the error stack of the DSO, all the rows with a blank Sales Document Number are displayed. For all of these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in other threads that a write-optimized DSO does not care about the key values; it loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, the data gets aggregated depending on the setting made in the transformation for the key figures. The 2 options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document No is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the SDSO it might not throw an error.
    Check the KF value in the SDSO for that Sales Doc No and Item No and try to find out what the KF value is. It may be the sum over all the common records or the KF value of the last common record.
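    To illustrate with made-up numbers: if two records share the same key and carry key figure values 50 and 70, "add up" leaves 120 in the target, while "replace" leaves 70, the value of the last record processed.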
    Regards
    Raj Rai

  • Segmentation fault error during data load in parallel with multiple rules

    Hi,
    I'm trying to do a SQL data load in parallel with multiple rules (4 or 5 rules, maybe), and I'm getting a "segmentation fault" error. I tested with 3 rules files and it worked fine. We're using Essbase System 9.3.2, with UDB (v8) as the SQL data source. The ODBC driver is DataDirect 5.2 DB2 Wire Protocol Driver (ARdb222). Please let me know if you have any information on this.
    thx.
    Y

    Hi Thad,
    I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, as currencies are defined by SAP and the currency code part (3 characters) is plain English.
    Could this be caused by some inconsistency in the data?
    I would like to know which currency had special characters in that particular record.
    Hope that helps.
    Regards
    Mr Kapadia
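    If you need to hunt for such values, one way is a quick ABAP check with the CO ("contains only") operator. A minimal sketch; the table LT_DATA and its CURRENCY component are hypothetical stand-ins for whatever structure holds the suspect records:
    TYPES: BEGIN OF ty_rec,
             currency TYPE c LENGTH 5,
           END OF ty_rec.
    DATA lt_data TYPE STANDARD TABLE OF ty_rec. " hypothetical holder of the suspect records
    DATA ls_data TYPE ty_rec.
    DATA lv_allowed TYPE string VALUE 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 '.
    LOOP AT lt_data INTO ls_data.
      " Flag currency codes containing anything besides A-Z, 0-9 and space
      IF NOT ls_data-currency CO lv_allowed.
        WRITE: / 'Suspicious currency value in row', sy-tabix.
      ENDIF.
    ENDLOOP.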

  • Disk throughput drops when inserting data packages in write-optimized DSO

    Hi all,
    we are currently testing our freshly installed SAN.
    To see the performance gain in BI, I'm currently doing some test loads.
    And during the monitoring of those loads, I noticed something I'd like someone to explain :-):
    I execute a DTP from PSA to a write-optimized DSO.
    The number of parallel processes = 9
    Update method = serial extraction, immediate parallel processing
    Number of records transferred: over 23,000,000
    OK, in the first phase (reading the PSA) only one process is used (serial extraction). When I look in OS07, I notice we have very good throughput: over 66,000 KB/s transferred. Very nice!
    But as soon as BI starts inserting the data packages and parallel processing kicks in, the throughput drops to around 4,000 KB/s, and sometimes we get 20,000 KB/s at most. That's not too good.
    We have a massive SAN, but the BI system does not seem to use it.
    I was wondering why this is the case. I already toyed around with the package size, but it's always the same.
    I also noticed that the allocated processes don't all seem to be active. I allocated 9 BTC processes to this load.
    They are all used, but we only see 3 inserts at the same time, max. Also in the DTP monitor, only 3 packages are processed at the same time. As it's a write-optimized DSO, RSODSO_SETTINGS does not apply, I presume.
    Any ideas?
    tnx!

    Hi,
    can you please try to set a filter in the DTP and pull the data again?
    I am not sure why the first data package takes a long time while the other data packages take less time.
    Do you have any start routine with logic along the lines of "IF datapak = 1, then do this"?
    Please check.
    regards
    Gopal

  • Changes to write optimized DSO containing huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are an amount and a currency).
    We were able to make the changes in development (with the DSO containing data). But when we tried to
    transport the changes to our QA system, the transport hung. The transport triggered a job which
    filled up the logs, so we had to kill the job, which aborted the transport.
    Has anyone of you had the same experience? Do we need to empty the DSO so that we can transport
    successfully? We really don't want to empty the DSOs, as reloading will take time.
    Any help?
    Thank you very much for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs; some sort of conversion for all the records?
    Marco

  • Write optimized DSO to standard DSO loading by delta request by request is not working

    Hi All,
    We have a data flow whose first layer is a write-optimized DSO, from which we load data into a standard DSO.
    We are using a delta DTP to load the data with the option "Get All New Data Request By Request", but it is not working.
    If I have 2 requests that need to be updated from the write-optimized DSO to the standard DSO, they are updated in a single request. We are on SAP BW 7.01, patch level 005.
    Please suggest a solution.

    Hi Prasanna,
    For reasons of downward compatibility, the system behaves differently for DTPs that were created before SAP NetWeaver 7.0 Support Package Stack 13.
    DTPs for which the indicator is set only get the first new request, even if there is more than one new source request at the time of processing. This restricts the way in which these DTPs can be used in process chains, because requests accumulate in the source and the target may not contain the current data. The Retrieve Until No More New Data indicator is therefore displayed for these DTPs.
    I Suggest you to set this indicator and activate the DTP.
    If you set the Retrieve Until No More New Data indicator and then activate the DTP, once it completes processing, a DTP request checks whether there are any further requests in the source. If the source contains more requests, a new DTP request is automatically generated and processed. Once the DTP is activated, the indicator is no longer visible in DTP maintenance.
    This applies when the DTP is started by a process chain and also when you start the DTP directly from DTP maintenance.
    The indicator is set by default for new DTPs that get data request by request.
    If you do not select the indicator, the label for the Get All New Data Request By Request indicator changes to Get One Request Only. Once the DTP is activated, only this indicator is displayed in DTP maintenance.
    Regards,
    Sudheer.

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data from the standard DSO 0FI_GL_12.
    From the write-optimized DSO we are loading delta records into an InfoCube. Please give your input on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a write-optimized DSO, as there is no image concept in a write-optimized DSO.
    For example, if I am using a standard DSO, we have the change log table, and the image concept allows the updated value to reach the cube.
    Let us assume:
    Active table
    111            50
    111            70 (overwrite)
    Change log table
    111           -50    (X: before image)
    111            70    (' ': after image; the symbol for the after image is a space)
    So if we load this record to the target as a delta, the above two records from the change log table are loaded to the cube, and the cube ends up with 70 as the updated value.
    If I am using a write-optimized DSO:
    Active table
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, the InfoCube is always additive, so the total value will be 50 + 70 = 120, which is wrong.
    Correct me: what mechanism will get the updated value of 70 into the cube from the write-optimized DSO?
    2) As the datasource is delta-capable and has an ADDITIVE delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    In best practice, we use a write-optimized DSO in the initial layer and then a standard DSO, just for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ----> standard DSO.
    In your case: DataSource ----> standard DSO ----> WODSO.
    In both cases, if the data load design is not accurate, your cube will have incorrect entries.
    For example: today at 9 am: 111, 50 (in the active table).
    Data load to the cube the same day at 11 am: the cube will then have 111, 50.
    The same day, the value changes in the standard DSO at 1 pm: 111, 70 (overwritten in the active table).
    If you load data to the cube the same day or the next day, it will have 2 records, one with value 50 and the other with 70. To avoid such scenarios we should plan the loads carefully, or change your DTP settings to 'source table: change log table'.
    Coming to your case:
    After the load to the standard DSO, load the data to the WODSO with the DTP setting 'Delta Init. Extraction from' set to 'Change Log'.
    With the data now available in the WODSO from the change log table, you then load it to the cube in delta mode.

  • Unable to delete data target contents of Write-Optimized DSO in Process Chain

    Hi Experts,
    We are using SAP NetWeaver BW 7.01 and we need to delete the entire data target contents of a write-optimized DSO in the process chain before the next data load.
    I included this step in the process chain, but it still fails, with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
    This process type works in BW 7.0 but not in BW 7.01.
    However, I found that we can use the program RSSM_DELETE_WO_DSO_REQUESTS to delete old requests in a write-optimized DSO as of BW 7.01 SP07, as per SAP Note 1437407. But it is still not working even after implementing this program, because the prerequisite for deleting a request is that its data mart status is updated, which does not happen for the program.
    BW 7.3 has a process type to delete the requests from a write-optimized DSO directly, but it is not available in the 7.01 version.
    Could you please suggest how to resolve this issue in BW 7.01?
    Many thanks for your help in advance.
    Regards,
    Madhu

    Create an ABAP program as in the attached code.
    Then you can use that ABAP program in process chains through an ABAP variant.
    The ABAP variant should have the following properties:
    select call mode "Synchronous", call from "Local", and "Program".
    Give your ABAP program name in "Program Name" and create one program variant for each write-optimized DSO.
    Please refer to "How to use ABAP programs in process chains" for further details.
    Hope this helps
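    The attached code itself is not reproduced in this thread. As a purely illustrative sketch (not the poster's attachment), such a wrapper program could simply delegate to the standard report named earlier; the variant name Z_WODSO_VAR is a hypothetical placeholder for a variant you would maintain on that report's selection screen. This only shows the process chain wrapper mechanics; it does not by itself address the data mart status prerequisite mentioned above.
    REPORT z_delete_wodso_requests.
    " Illustrative only: hand over to the deletion report from SAP Note 1437407.
    " Its selection-screen fields differ by release, so supply them via a
    " saved variant (here the hypothetical Z_WODSO_VAR).
    SUBMIT rssm_delete_wo_dso_requests
      USING SELECTION-SET 'Z_WODSO_VAR'
      AND RETURN.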

  • Write-Optimized DSO data deletion

    Hello All,
    We have a requirement to delete old data from a write-optimized DSO; specifically, to delete data older than 15 days (requests older than 15 days) in the DSO. I could not find any process type usable in process chains that would automatically delete the old data based on our settings (similar to the one used for PSA or change-log deletion). Has any of you come across a process type or a batch job that can be scheduled to delete old data in a write-optimized DSO? Your help is much appreciated.
    Thanks,
    Veera

    Hi Nagesh,
    Thanks for your answer. But the use of these function modules would still require custom development to identify the requests to be deleted. We are trying to find out whether SAP has any standard process for deleting old requests out of a write-optimized DSO, or whether anyone has achieved this with the least amount of customization.
    Thanks,
    Veera

  • Load performance Write-Optimized DSO

    Dear all,
    I'm looking for practical tips on improving load performance from the PSA to a write-optimized DSO (41M records) via a DTP.
    All parameters that could be tweaked have been checked (e.g. packet size, batch jobs, uniqueness-of-data flag, etc.) and optimized.
    However, this load remains extremely slow (the init load from the source into BW is faster).
    The BW system runs on SP18.
    Please share any tips or recommendations to help us solve this major issue.
    Your help is much appreciated!
    Thanks
    JvB

    Hi JvB,
    Here are some options I can think of; I will let you know if I remember anything else:
    1) Increase the number of parallel processes in the DTP, i.e. DTP -> Goto -> Settings for Batch Monitor -> Number of parallel processes, to 4 or 5.
    2) Note 409641 - Examples of packet size dependency on ROIDOCPRMS
    The general formula for data transfer is:
    packet size = MAXSIZE * 1000 / transfer structure size
    but not more than MAXLINES.
    E.g. if MAXLINES is smaller than the result of the formula, only MAXLINES records are transferred per packet into BW (see the worked example after this list).
    3) Check whether there are any locks or deadlocks in ST04, and check System Analysis (SM21), which will show you all the details, as well as short dumps in ST22; there may be some.
    4) Check and analyze the DSO in RSRV. If there are any issues, repair them.
    5) If you are loading huge volumes of data, split the records by filters in the DTP, e.g. plant, calendar year or calendar month selections.
    Also try partitioning, creating indexes/secondary indexes, or archiving old data.
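    A worked example for the formula in point 2, with illustrative numbers (not taken from the note): suppose ROIDOCPRMS has MAXSIZE = 20,000 (KB) and MAXLINES = 15,000, and one transfer structure record is 1,000 bytes wide. The formula gives 20,000 * 1,000 / 1,000 = 20,000 records per packet; since that exceeds MAXLINES, each packet is capped at 15,000 records.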
    Check these links for further reference:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    /message/2987899#2987899 [original link is broken]
    Good Luck.
    Regards
    Satish Arra
    Edited by: Satish Arra on Feb 7, 2009 9:28 PM
