Problem loading data into write-optimized DSO

Hi,
I am having a problem loading data from the PSA to a write-optimized DSO.
I have changed a standard DSO into a write-optimized DSO. I have two DataSources to be loaded into it: one for Demand and one for Inventory.
Loading the Demand data from the PSA to the DSO works fine, without any errors.
While loading the Inventory data from the PSA to the DSO, however, I get the errors below:
"Data Structures were changed. Start Transaction before hand"
Exception CX_RS_FAILED logged
I have tried reactivating the DSO, transformation, and DTP and loading the data again into the write-optimized DSO, but no luck; I always get the above error message.
Can someone please suggest what could be done to avoid these error messages and load the data successfully?
Thanks in advance.

Hi,
Check the transformations: are any routines written in them?
Also check the data structure of the target and of the DataSource. Have there been any structure changes?
Data loads into the PSA normally; if the structure of the DSO has changed, this error can occur, so check that.
Check the blog below:
/people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
Let us know the status.
Reg
Pra

Similar Messages

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
When I load data into a write-optimized DSO, I get the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant, and Billing Item as the semantic key in the DSO.
The DSO gets its data from a test ECC system, in which the Sales Document Number column is mostly blank for this DataSource.
When I go into the error stack of the DSO, all rows with a blank Sales Document Number are displayed. For all of these rows, the Item Number is 10.
Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in other threads that a write-optimized DSO does not care about the key values and loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
Is the Item Number a key field?
When all the key fields are the same, the data gets aggregated depending on the setting made in the transformation for the key figures. The two options for key figures are:
1. Add up the key figures
2. Replace the key figure
Since the Sales Document Number is blank and the Item Number is the same, the key figures for these records may get added up or replaced, and due to this property of the DSO it might not throw an error.
Check the key figure value in the DSO for that Sales Document Number and Item Number and try to work out how it was derived: it may be the sum over all the common records, or the key figure value of the last common record.
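To illustrate with hypothetical numbers: suppose two incoming records share the blank Sales Document Number and Item Number 10, one with key figure value 50 and one with 70. With 'Add up the key figures' the DSO ends up with 120; with 'Replace the key figure' it ends up with 70 (the last record wins). Which of the two values you find in the DSO tells you which setting is active.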
    Regards
    Raj Rai

  • Duplication Error while loading data in write optimized DSO

    Hi Experts,
I have an issue. In BI 7 I'm trying to load data into a write-optimized DSO from the controlling DataSource 0CO_OM_CCA_10. The data arrives properly in the PSA, but while loading it into my write-optimized DSO I get a duplication error, although the key fields and data fields are properly placed in my data target.
Please let me know how to load it successfully.
    Thanks in Advance.
    Amit

    Hi,
Thanks for your reply.
    I'm getting this error message:
    Diagnosis
        During loading, there was a key violation. You tried to save more than
        one data record with the same semantic key.
        The problematic (newly loaded) data record has the following properties:
        o   DataStore object: GWFDSR02
        o   Request: DTPR_4BA3GF8JMQFVQ8YUENNZX3VG5
        o   Data package: 000006
        o   Data record number: 101
Although I have selected key fields that identify a unique record, I still get the duplication error.
I have even referred to the BI Content for this DataSource and found that it has the same key fields as mine.
Debjani: I need unique records without duplication, and I'm doing a full load in the DTP.
What is to be done? Please help.
    Thanks in advance.
    Edited by: Amit Kotwani on Sep 26, 2008 10:56 AM
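A quick way to verify such a suspicion is to check whether the source data really is unique on the semantic key. A minimal classic-ABAP sketch (LT_DATA and the key field names FISCVARNT/DOC_NUMBER/ITEM are hypothetical placeholders for the actual PSA structure and semantic key):
* Copy the records and reduce the copy to unique semantic keys.
DATA: LT_UNIQUE LIKE LT_DATA,
      LV_ALL    TYPE I,
      LV_UNIQUE TYPE I.
LT_UNIQUE = LT_DATA.
SORT LT_UNIQUE BY FISCVARNT DOC_NUMBER ITEM.
DELETE ADJACENT DUPLICATES FROM LT_UNIQUE
  COMPARING FISCVARNT DOC_NUMBER ITEM.
DESCRIBE TABLE LT_DATA LINES LV_ALL.
DESCRIBE TABLE LT_UNIQUE LINES LV_UNIQUE.
IF LV_UNIQUE < LV_ALL.
* At least one semantic key occurs more than once in the source -
* exactly what the DTP uniqueness check complains about.
ENDIF.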

Is a DTP mandatory in order to load into a Write-Optimized DSO?

    I'm wondering if a DTP is mandatory in order to load data into a WO-DSO.
Reason: in BI 7 I have a 3.x load process for order item data:
    2LIS_11_VAITM -> Transfer Rules -> Comm.Area -> Update Rules -> standard ODS.
I now wanted to load the same data from 11_VAITM into a WO-DSO. I simply created the WO-DSO using the template of the standard ODS. This means that the key fields of the WO-DSO are Request#, Datapak-ID, and Record#. The semantic key is identical to the key used in the standard ODS.
I then created the same Update Rules from the InfoSource going into the new WO-DSO and, last but not least, simply changed the InfoPackage so that the data is now loaded into both the standard ODS and the WO-DSO.
    2LIS_11_VAITM -> Transfer Rules -> Comm.Area -> Update Rules -> standard ODS
                                                                             |-> Update Rules -> WO-DSO
What happens is quite annoying and not understandable right now: using this process, I always get only one record in the new WO-DSO (always the first record of the data flow), never more. The PSA clearly shows all the records, with different record numbers per request/datapak, so the key elements of the WO-DSO are clearly unique.
However, as only one record ever appears in the new WO-DSO, I'm really wondering whether the overall process is correct at all and whether I actually need a DTP to load into WO-DSOs.
Am I right, or are there other ideas related to my problem?
    Thanks

As far as I understand, in order to use transformations you need to use a DTP; InfoPackages will not work.
But in your case you are using update rules, so it should work.
Still, because the write-optimized DSO is a newer technology, it may be that you simply have to use a DTP to load it from the PSA.

Delta loading procedure from Write-Optimized DSO to InfoCube

    Hi All,
We are using a write-optimized DSO in our project, into which I am loading data via the standard DSO 0FI_GL_12.
From the write-optimized DSO we are loading delta records into an InfoCube. Please give me your input on the following questions:
1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a write-optimized DSO, as there is no image concept in a write-optimized DSO.
Example: with a standard DSO we have the change log table, and the image concept allows the updated value to reach the cube.
Let us assume:
Active table:
111            50
111            70 (overwrite)
Change log table:
111           -50    ('X' -- before image)
111            70    (' ' -- after image; the symbol for the after image is space)
If we load this as a delta to the target, the two records above from the change log table get loaded into the cube, and the cube ends up with 70 as the updated value.
With a write-optimized DSO:
Active table:
111            50
111            70 (overwrite)
When these records are loaded into the cube, the InfoCube is always additive, so the total value will be 50 + 70 = 120, which is wrong.
Correct me: what mechanism will get the updated value of 70 into the cube from the write-optimized DSO?
2) As the DataSource is delta-capable with an 'additive' delta process, are only the delta records (based on request ID) loaded into the InfoCube, with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
As a best practice, we use a write-optimized DSO in the initial layer and then a standard DSO, just for mass data loads/staging purposes.
Best practice:  DataSource ----> WODSO ----> standard DSO
In your case:   DataSource ----> standard DSO ----> WODSO
In both cases, if the data load is not designed accurately, your cube will have incorrect entries.
For example: today at 9 am: 111, 50 (in the active table).
Data load to the cube the same day at 11 am: the cube then has 111, 50.
The same day at 1 pm the value changes in the standard DSO: 111, 70 (overwrite in the active table).
If you load data to the cube the same day or the next day, it will have two records, one with value 50 and the other with 70. To avoid such scenarios we should plan the loads carefully, or change the DTP setting to 'Source table: change log table'.
Coming to your case:
After the load to the standard DSO, load the data to the WODSO with the DTP setting 'Delta Init. Extraction from': Change Log.
The data is then available in the WODSO from the change log table, and you can load it into the cube in delta mode.
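To make the arithmetic concrete with the numbers from the question: with the change log as the delta source, the delta request carries the before image -50 and the after image 70, so the cube ends up with 50 + (-50) + 70 = 70, which is the correct updated value. Loading the two active-table rows additively instead would give 50 + 70 = 120.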

  • DataLoad into Write-optimized DSO with DTP and semantic groups

Hi gurus,
I'm going crazy with my current problem.
I searched the other posts on this topic, but I did not find a solution.
Here is my situation:
I created a write-optimized DSO with semantic key (0ucinstalla, 0calmonth, zbelnum, 0unit) and three key figures.
Now I'm loading data from several cubes into the DSO (I need the historical data).
In every transformation I implemented an expert routine which collects the data into the RESULT_PACKAGE.
In the routine I even clear the record number before collecting the result rows.
In the DTPs I'm using a semantic group on 0ucinstalla, in order to put all rows for one 0ucinstalla into one data package.
Each DTP has to run once, in full mode.
But when I schedule the third DTP, I get the message "duplicate record".
I checked the active data in the DSO; there was no record with that semantic key.
I found a record with the same 0ucinstalla for a different month, which in my opinion is not a duplicate record...
Is there a dependency between the semantic key in the DSO and the semantic group in the DTP?
How can I solve this error?
Regards,
Philipp

    Hi,
Thanks for your fast replies!
@ Passing by:
I know the option "Do not check uniqueness of data".
No duplicates arrive at the DSO, and I would like to keep the uniqueness check for the future, in case duplicate records really do arrive.
@ Durgesh Gandewar: Thanks for the hint, but I checked that website as well...
    Regards,
    Philipp

  • Error while loading data from write optimized ODS to cube

    Hi All,
I am loading data from a write-optimized ODS to a cube.
I have done 'Generate Export DataSource'
and scheduled the InfoPackage with one selection for a full load.
Then it gave me the following error under 'Transfer IDocs & tRFC':
    Info IDOC 1: IDOC with errors added
    Info IDOC 2: IDOC with errors added
    Info IDOC 3: IDOC with errors added
    Info IDOC 4: IDOC with errors added
Data Package 1: arrived in BW; Processing: selected number does not agree with transferred number
The 'Processing' step below is green
and shows an update of 4 new records for data package 1.
Please provide input on how to resolve this.
    Thanks & Regards,
    Rashmi.

Please let me know: what more details do you need?
If I press F1 for error details, I get the following message:
    Messages from source system
    see also Processing Steps Request
    These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
    From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
    Thanks & Regards,
    Rashmi.

Load Data with 7.0 DataSource from Flat File to Write-Optimized DSO

    Hi all,
we have a problem loading data from a flat file using the 7.0 DataSource.
We have to load a flat file (monthly) into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads the data in delta mode from the DataSource into the WO DSO.
When I load the second file into the DataSource, the DTP loads all the data present in the DataSource, and not only the new data, as expected with delta mode.
Does anyone have any tips to help me?
    Thank you for help.
    Regards
    Emiliano

    Hi,
I am facing a similar problem.
I am using a write-optimized DSO and have only one request in the PSA (I deleted all previous requests from the PSA and the DSO).
When I do a delta load from the PSA to the DSO, I expect only that one request to get loaded into the DSO.
But it picks up the data from three other requests and doubles the records...
Can you please help me? How did you manage to get out of that issue?
    Cheers,
    Nisha

  • Data archiving for Write Optimized DSO

    Hi Gurus,
I am trying to archive data in a write-optimized DSO.
It allows me to archive on a request basis, but it archives entire requests in the DSO (i.e., all the data).
I want to archive a selection of requests of my own choosing (from this request to that request).
Please guide me.
I found the details below on SDN; kindly check:
Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving of a standard DSO. This means that partial request archiving is not possible; only complete requests can be archived.
The characteristic for the time slice can be a time characteristic present in the WDSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
The actual process of archiving remains the same, i.e.:
    Create a Data Archiving Process
    Create and schedule archiving requests
    Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
Please check the OSS Note below:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Do Not Check Uniqueness of Data in Write Optimised DSO

    Hello,
I am working with a write-optimized DSO that already has a billion records in it. The flag 'Do Not Check Uniqueness of Data' is checked in the settings (so it definitely does not check for uniqueness of data). I am thinking of removing this flag and activating the DSO again. I want to remove the flag because it enables Request ID input in LISTCUBE on the DSO; without it, LISTCUBE never comes back with results (I have to analyze aggregations).
I tried removing this flag and then activating the DSO in a production system with 17 million records, which took 5 minutes for the activation (index creation). So the maths says that for a billion records the activation transport will take around 6 hours when moving to production.
    Questions:
How does this flag check the uniqueness of a record? Does it check against the active table or against the index?
To what extent will subsequent data loads through the DTP slow down?
Are there any other factors/risks/precautions to consider?
Let me know if the questions are not clear or if further input is required from my side.
    Thanks.
    Agasti

    Hi,
Please go through this blog:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
As far as your questions are concerned, I hope the above blog answers most of them; if it doesn't, please read the thread mentioned below.
    Use of setting "Do Not Check Uniqueness of Data" for Write Optimized DSO
    Regards
    Raj

Problem with data loading into a DSO (write-optimized DSO)

    Hi all,
While loading data from the PSA to the DSO, I am getting the following error:
"You attempted to assign a field to a typed field symbol,
but the field does not have the required type."
    Regards
    Paddy

You are trying to map a field in the PSA to a field in the DSO with a different data type. Check that field and correct the mapping.
    *Assign points if helpful
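For illustration, here is a minimal sketch of the kind of type conflict behind this message (all names are made up; in a real transformation the failing assignment sits in generated code, not hand-written code):
DATA: LV_CHAR(10) TYPE C VALUE '20080101'.
FIELD-SYMBOLS: <FS_DATE> TYPE D.
* Dynamically assigning a CHAR 10 field to a field symbol typed as a
* date (D) raises ASSIGN_TYPE_CONFLICT - 'the field does not have the
* required type':
ASSIGN ('LV_CHAR') TO <FS_DATE>.
Aligning the data types of the mapped fields (or reactivating the objects so that the generated program matches the current structures) removes the conflict.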

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and resolution. This also gives us the opportunity to see the differences between two data loads.
For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
We already tried the DTP functionality "all new records", but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA IDs after they have been changed: if you use the option to delete the content of a PSA table via a process chain, it will fail when the DataSource is changed, due to the newly generated PSA table ID.
    Regards,
    Harald

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
I am using a write-optimized DSO, ZMYDSO, to store data from several sources (two DataSources and one DSO).
I have disabled the uniqueness check in the DSO, but I defined a semantic key over the fields ZCLIENT, ZGUID, ZSOURCE, and ZPOSID, which are used in a non-unique index.
In the given case I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abstract coding:
LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
* some other logic [...]
  DELETE FROM /BIC/AZMYDSO00
    WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
      AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
      AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
      AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
ENDLOOP.
COMMIT WORK AND WAIT.
During the load (after the transformation step, in the update step), I get the following messages (not every time):
1.     Error while writing the data (RSAODS131)
2.     Could not save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027)
Diagnosis: DataPackage XY could not be saved. Reasons could be a violation of key uniqueness (duplicate data) or a general database error.
3.     Error in the substep of updating the DataStore
I have checked the system log (SM21) and the system dumps (ST22), but I could not find an exact error description.
I guess I am creating inconsistencies or locks (I also checked SM12) that interrupt the load process. I also tried serial updating within the DTP (I reduced the number of batch processes to 1). No success.
Perhaps loading one specific package could take longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against deleting rows within the end routine?
    Regards,
    Philipp

    Hi,
is ZMYDSO the name of the DSO?
And is this the end routine of the transformation that loads this same DSO?
If so, we never do such a thing.
You are comparing the DSO with the data that is flowing in and then deleting data from the DSO.
That doesn't actually make sense, because while data is being loaded into a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against any modification of its data. You can only read from it.
If your requirement is that records which already exist in the DSO should not arrive again, you can delete them from the SOURCE_PACKAGE in the start routine, like this:
SELECT <fields> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
  WHERE <condition>.
LOOP AT INTERNAL_TABLE INTO WORK_AREA.
  DELETE SOURCE_PACKAGE
    WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
      AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
      AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
      AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
ENDLOOP.
Or, if your requirement is to delete the old data from the DSO for a key that is arriving again, in order to load the new data, you could do something like the following (note that this belongs in the end routine, since RESULT_PACKAGE is only available there):
SELECT <fields> FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
  FOR ALL ENTRIES IN RESULT_PACKAGE
  WHERE /BIC/ZCLIENT = RESULT_PACKAGE-/BIC/ZCLIENT
    AND /BIC/ZGUID   = RESULT_PACKAGE-/BIC/ZGUID
    AND /BIC/ZSOURCE = RESULT_PACKAGE-/BIC/ZSOURCE
    AND /BIC/ZPOSID  = RESULT_PACKAGE-/BIC/ZPOSID.
* Now update the new values you want to write, in a loop:
LOOP AT INTERNAL_TABLE INTO WORK_AREA.
* code for manipulation of WORK_AREA
* write a MODIFY statement to update the RESULT_PACKAGE:
  MODIFY RESULT_PACKAGE FROM WORK_AREA TRANSPORTING <fields>
    WHERE /BIC/ZCLIENT = WORK_AREA-/BIC/ZCLIENT
      AND /BIC/ZGUID   = WORK_AREA-/BIC/ZGUID
      AND /BIC/ZSOURCE = WORK_AREA-/BIC/ZSOURCE
      AND /BIC/ZPOSID  = WORK_AREA-/BIC/ZPOSID.
ENDLOOP.
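As an aside, if duplicates can also arrive within one and the same data package (an assumption; nothing above says they do), a common start routine idiom is to drop them up front:
SORT SOURCE_PACKAGE BY /BIC/ZCLIENT /BIC/ZGUID /BIC/ZSOURCE /BIC/ZPOSID.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
  COMPARING /BIC/ZCLIENT /BIC/ZGUID /BIC/ZSOURCE /BIC/ZPOSID.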
    hope it helps,
    Regards,
    Joe

  • Load performance Write-Optimized DSO

    Dear all,
    I'm looking for some practical tips concerning improving load performance from the PSA to a write-optimized DSO (41M records) via a DTP.
    All parameters that could be tweaked have been checked (e.g. packet size, batch jobs, uniqueness of data flag, etc.) and optimized.
However, this load remains extremely slow (the init load from the source into BW is faster).
    The BW system runs on SP18.
Please share any tips or recommendations to help us solve this major issue.
    Your help is much appreciated!
    Thanks
    JvB

    Hi JvB,
Here are some options I can think of; I'll let you know if I remember anything else:
1) Increase the number of parallel processes in the DTP: DTP -> Goto -> Settings for Batch Monitor -> Number of parallel processes, e.g. to 4 or 5.
2) See Note 409641 (Examples of packet size dependency on ROIDOCPRMS). The general formula for the data transfer is:
packet size = MAXSIZE * 1000 / transfer structure size
but not more than MAXLINES, i.e. if MAXLINES is smaller than the result of the formula, only MAXLINES records are transferred into BW (a worked example follows after this list).
3) Check for locks or deadlocks in ST04, check the system log in SM21 (it will show you all the details), and look for short dumps in ST22; there may be some.
4) Check and analyze the DSO in RSRV. If there are any issues, repair them.
5) If you are loading huge volumes of data, split the records with filters in the DTP, e.g. plant, calendar year, or calendar month selections.
Also try partitioning, creating indexes/secondary indexes, or archiving old data.
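The worked example for point 2, with made-up numbers: with MAXSIZE = 20,000 (KB) and a transfer structure 500 bytes wide, the formula gives 20,000 * 1,000 / 500 = 40,000 records per packet; if MAXLINES were 30,000, only 30,000 records per packet would be transferred, since MAXLINES caps the result.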
Check these links for further reference:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    /message/2987899#2987899 [original link is broken]
    Good Luck.
    Regards
    Satish Arra
    Edited by: Satish Arra on Feb 7, 2009 9:28 PM

  • Error while deleting data from a write optimized DSO using a Process Chain?

Dear BWers,
I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Has anybody had a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

Please check whether you get any short dump in ST22 related to this issue.
