0RECORDMODE: overwrite versus addition

Good day,
Please assist. I have created an ODS (version 3.0), and 0RECORDMODE forms part of the communication structure. I need to load sales data for the same dealer and the same month, and the data should be added, not overwritten. How do I get the ODS to add the data rather than overwrite it?
Thanks,
Cornelius

Hi,
The value of InfoObject 0RECORDMODE determines whether the update rules for key figures support addition or overwrite. This InfoObject is required for delta loads and is added by the system if a DataSource is delta-capable; it is added to an ODS during the creation process. Records are updated during the delta process using a variety of ROCANCEL/0RECORDMODE values, including N for a new record image, A for an additive image, and Y for an update record image used when ODS key figures are processed with a minimum or maximum aggregation. We will limit our discussion here to the following four more commonly used values:

  • X indicates the before image of a record
  • D deletes the record
  • R denotes a reverse image
  • A blank character represents a record's after image

When delta requests are processed by the ODS, the ROCANCEL values assigned in R/3 and maintained by InfoObject 0RECORDMODE automatically update the data target in different ways, depending on whether the ODS update rules are configured for Addition or Overwrite mode.
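The behaviour described above can be sketched in plain Python (not ABAP — a simplified conceptual model of the active table; the function name and record layout are invented for this illustration):

```python
# Simplified model of how an ODS active table reacts to 0RECORDMODE values.
# 'X' (before image) carries negated key figures; '' (after image) carries
# the new values; 'R' reverses a record; 'D' deletes it outright.

def apply_delta(active, key, recordmode, value, mode='overwrite'):
    """Apply one delta record to the active table (dict: key -> key figure)."""
    if recordmode == 'D':                      # deletion: drop the row entirely
        active.pop(key, None)
    elif recordmode in ('X', 'R'):             # before/reverse image (negated value)
        if mode == 'addition':
            active[key] = active.get(key, 0) + value
        # under overwrite, the matching after image simply replaces the row
    else:                                      # '' or 'N': after image / new record
        if mode == 'addition':
            active[key] = active.get(key, 0) + value
        else:
            active[key] = value
    return active

# Overwrite: before image -100 is ignored, after image 130 replaces the row.
ods = {}
for rm, v in [('', 100), ('X', -100), ('', 130)]:
    apply_delta(ods, 'doc1', rm, v)
print(ods)        # {'doc1': 130}

# Addition: the before/after pair yields the net change: 100 - 100 + 130 = 130.
ods_add = {}
for rm, v in [('', 100), ('X', -100), ('', 130)]:
    apply_delta(ods_add, 'doc1', rm, v, mode='addition')
print(ods_add)    # {'doc1': 130}
```

Both modes arrive at the same result here precisely because the delta carries the before image — which is why 0RECORDMODE matters for additive update rules.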

Similar Messages

  • Addition and Overwrite update types for ODS

    Hello BW Experts,
    I have an issue.
    For a particular order and cost element on the R/3 side I have four cost figures.
    When I do a full upload from the related standard DataSource to the ODS with update type Overwrite, I get only the last cost figure for the same order and cost element.
    I am loading data from this ODS further into an InfoCube.
    The problem is that I want the sum of all the cost figures for the same order and the same cost element.
    But as the last cost figure is overwritten in the ODS, I am not getting the correct sum.
    So I have changed the update type from Overwrite to Addition.
    Now I am getting the sum of all cost figures correctly.
    However, I am concerned that if I load further data into the ODS by full upload, all the cost figures will be doubled.
    Please explain what to do in such a case.
    Thanks in advance,
    Amol

    Hello Amol,
    Check whether the DataSource supports delta. You can see this in the RSA6 DataSource display — check the delta checkbox; it can also be found in the ROOSOURCE table.
    If it does, switch from full loads to init/delta loads.
    Only the init may take some time; the daily deltas should not take long.
    One more thing to add to the earlier responses: you can automate the deletion of similar requests in the InfoPackage settings, so that you need not manually delete the full-upload request every day (if you are running full uploads on a daily basis).
    Hope it helps.
    Regards,
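    Amol's doubling concern can be sketched in plain Python (a conceptual illustration, not BW code — the request handling and names are invented):

    ```python
    # Why repeated full loads double key figures under update type Addition,
    # and how deleting the overlapping earlier request avoids it.

    def load_full(requests, new_request, delete_similar=False):
        """requests: list of loaded full-upload requests (dicts: key -> value)."""
        if delete_similar:
            requests.clear()              # automated "delete similar request" step
        requests.append(dict(new_request))
        return requests

    def totals(requests):
        """Addition mode: every request still in the target contributes."""
        out = {}
        for req in requests:
            for key, value in req.items():
                out[key] = out.get(key, 0) + value
        return out

    full = {'order1/costelem1': 400}      # sum of the four cost figures

    reqs = load_full([], full)
    reqs = load_full(reqs, full)          # second daily full load
    print(totals(reqs))                   # {'order1/costelem1': 800} -> doubled

    reqs = load_full([], full)
    reqs = load_full(reqs, full, delete_similar=True)
    print(totals(reqs))                   # {'order1/costelem1': 400} -> correct
    ```

    Switching to init/delta loads avoids the issue altogether, since each delta then carries only the net change.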

  • 2LIS_11_VAITM - Deleted Line Items

    I am having an issue with sales order line items that have been deleted.
    I load my data using 2LIS_11_VAITM into a DSO EDW layer. I then pass this up to another DSO layer. I noticed that users were deleting line items from orders, and this was causing an issue in my data. To remove the problem, I have linked ROCANCEL to 0RECORDMODE in the technical rule up to my EDW layer, and now in my EDW layer the line item is removed completely when it is deleted from the ERP system and the deletion is passed into BW.
    I got the idea for this from the thread "Deleted line item appears in BI report".
    My problem lies in the layer above, where I am still left with a line in the DSO relating to the deleted item, but with the key figures set to zero. Is there any way to also get this line deleted, as in my EDW layer?
    Thanks,
    Joel

    Hi Joel,
    What is not completely clear to me is what happens in the first DSO. Are the deleted records removed from the active table, or are the key figures set to 0?
    Furthermore, what is the update rule for the key figures: overwrite or addition (in the transformation to both DSOs)?
    The record mode is indeed the crucial factor. It must become clear which one is delivered by the DataSource: 'R' for a reversed record, or 'D' for an entire deletion? And what is the record mode in the update to the second DSO?
    Best regards,
    Sander
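    Joel's symptom can be modelled in a few lines of Python (a conceptual sketch, not BW code): a 'D' record mode removes the row from the active table, while an after image that merely carries zeroed key figures leaves the row in place.

    ```python
    # Simplified DSO active table: 'D' drops the row; anything else overwrites
    # the key figure with the incoming (possibly zero) value.

    def update_dso(active, key, recordmode, value):
        if recordmode == 'D':
            active.pop(key, None)        # row disappears, as in the EDW layer
        else:
            active[key] = value          # row remains, even if the value is 0

    edw, upper = {}, {}
    update_dso(edw, 'item10', '', 25)
    update_dso(upper, 'item10', '', 25)

    # The deletion arrives: the EDW layer receives 'D', but the upper layer
    # only receives an after image with zeroed key figures.
    update_dso(edw, 'item10', 'D', 0)
    update_dso(upper, 'item10', '', 0)

    print(edw)       # {}  -> row gone
    print(upper)     # {'item10': 0}  -> row survives with zero key figures
    ```

    This is why Sander's question about the record mode passed to the second DSO is the crucial one: the upper layer behaves differently depending on whether 'D' is forwarded or replaced by a zeroed after image.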

  • Data in the cube is showing multiple entries when compared with ODS

    Hello BW Gurus,
    We have a waste report in production planning on a Cube and an ODS separately. The same InfoPackage loads both targets (i.e. the same InfoSource), but when we run a report on the Cube, the records show multiple entries (the key figures do not match those of the ODS), whereas the ODS records are correct, matching R/3. There are six key figures in total, of which four are pulled from R/3 and two are populated in BW.
    An example:
    The waste report in PP is run for plant 1000, period 12/2005, and process order 123456. The operational scrap should be 2.46% and the component scrap 3.00% for material 10000000. The report shows 7.87% for planned operational waste % and 9.6% for planned component waste %. These values are not correct. The ODS values for order 123456 match the data in R/3 for component and operational scrap.
    There is a start routine in the ODS and also in the cube. I am not good at ABAP, so I am requesting your help.
    Here is the ODS Code:
    tables: /BI0/PPRODORDER.

    loop at data_package.
      select single COORD_TYPE PRODVERS
        into (/BI0/PPRODORDER-COORD_TYPE, /BI0/PPRODORDER-PRODVERS)
        from /BI0/PPRODORDER
        where PRODORDER = data_package-PRODORDER
          and OBJVERS = 'A'.
      if sy-subrc = 0.
        if /BI0/PPRODORDER-COORD_TYPE = 'XXXX'
        or /BI0/PPRODORDER-COORD_TYPE = 'YYYY'.
          data_package-PRODVERS = space.
        else.
          data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
        endif.
      endif.
      if data_package-calday = space
      or data_package-calday = '00000000'.
        if data_package-TGTCONSQTY NE 0.
          data_package-calday = data_package-ACTRELDATE.
        endif.
      endif.
      modify data_package.
    endloop.
    Here is Cube Code:
    tables: /BI0/PPRODORDER,
            /BIC/ODS.

    TYPES:
      BEGIN OF ys_mat_unit,
        material  TYPE /bi0/oimaterial,
        mat_unit  TYPE /bi0/oimat_unit,
        numerator TYPE /bi0/oinumerator,
        denomintr TYPE /bi0/oidenomintr,
      END OF ys_mat_unit.

    DATA:
      l_s_mat_unit TYPE ys_mat_unit,
      e_factor     TYPE p DECIMALS 5.

    loop at data_package.
      select single COORD_TYPE PRODVERS
        into (/BI0/PPRODORDER-COORD_TYPE, /BI0/PPRODORDER-PRODVERS)
        from /BI0/PPRODORDER
        where PRODORDER = data_package-PRODORDER
          and OBJVERS = 'A'.
      if sy-subrc = 0.
        if /BI0/PPRODORDER-COORD_TYPE = 'XXX'
        or /BI0/PPRODORDER-COORD_TYPE = 'YYY'.
          data_package-PRODVERS = space.
        else.
          data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
        endif.
      endif.
      if data_package-calday = space
      or data_package-calday = '00000000'.
        if data_package-TGTCONSQTY NE 0.
          data_package-calday = data_package-ACTRELDATE.
        endif.
      endif.
      data_package-agsu = 'GSU'.
      data_package-agsu_qty = 0.
      select single gr_qty base_uom
        into (/BIC/ODS-gr_qty, /BIC/ODS-base_uom)
        from /BIC/ODS
        where prodorder = data_package-prodorder
          and material = data_package-material.
      if sy-subrc = 0.
        if /BIC/ODS-base_uom = 'GSU'.
          data_package-agsu_qty = /BIC/ODS-gr_qty.
        else.
          SELECT SINGLE * FROM /bi0/pmat_unit
            INTO CORRESPONDING FIELDS OF l_s_mat_unit
            WHERE material = data_package-material
              AND mat_unit = 'GSU'
              AND objvers = 'A'.
          IF sy-subrc = 0.
            IF l_s_mat_unit-denomintr <> 0.
              e_factor = l_s_mat_unit-denomintr / l_s_mat_unit-numerator.
              multiply /BIC/ODS-gr_qty by e_factor.
              data_package-agsu_qty = /BIC/ODS-gr_qty.
            ENDIF.
          else.
            CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
              EXPORTING
                INPUT                = /BIC/ODS-gr_qty
                NO_TYPE_CHECK        = 'X'
                ROUND_SIGN           = ' '
                UNIT_IN              = /BIC/ODS-base_uom
                UNIT_OUT             = 'GSU'
              IMPORTING
                OUTPUT               = DATA_PACKAGE-gsu_qty  " note: writes to GSU_QTY, not AGSU_QTY
              EXCEPTIONS
                CONVERSION_NOT_FOUND = 1
                DIVISION_BY_ZERO     = 2
                INPUT_INVALID        = 3
                OUTPUT_INVALID       = 4
                OVERFLOW             = 5
                TYPE_INVALID         = 6
                UNITS_MISSING        = 7
                UNIT_IN_NOT_FOUND    = 8
                UNIT_OUT_NOT_FOUND   = 9
                OTHERS               = 10.
          endif.
        endif.
      endif.
      modify data_package.
    endloop.
    Somehow the AGSU quantity is not populating in the cube, and when I debug the code I can see a clean record in the internal table, but not in the cube.
    Your suggestions and solutions would be highly appreciated.
    Thanks,
    Swathi
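    For reference, the manual conversion branch in the routine above (e_factor = denomintr / numerator, then multiply) reduces to the following arithmetic, mirrored here in plain Python with invented example values:

    ```python
    def convert_to_gsu(gr_qty, numerator, denomintr):
        """Mirror of the routine's manual conversion: factor = denominator / numerator."""
        e_factor = denomintr / numerator
        return gr_qty * e_factor

    # Example: a material whose /BI0/PMAT_UNIT entry has numerator = 4 and
    # denomintr = 1, i.e. 4 base units correspond to 1 GSU:
    print(convert_to_gsu(100, 4, 1))   # 25.0
    ```

    Note also that the UNIT_CONVERSION_SIMPLE branch of the routine writes its OUTPUT into GSU_QTY rather than AGSU_QTY, which may be one reason AGSU_QTY stays empty for records that take that branch.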

    Hi Swathi,
    In an ODS we have the option of Overwrite and Addition; in a Cube, however, we have only Addition. That is why you are getting multiple entries.
    If you are running a daily full load on the cube, please delete the earlier requests.
    At any point in time there should be only one full-load request in the cube. Hope this solves your problem.
    Regards,
    Monika

  • Active table in DSO

    Hi Friends,
    What is the behaviour of the active table of a DSO: does it always work with Overwrite, or with Addition as well?
    Or is it based on the settings of the fields in the transformation?
    Thanks in advance.
    Thanks & Regards,
    Vas

    Hi Suneel,
    1) Try to activate the request manually; then you will get the green symbol. The reason the request is red, although your long texts are green, might be system performance, or there might be other errors in the 'Processing' tab. Check it and activate the request manually.
    2) You can 'repeat' the ODS by right-clicking on it, because the activation of records might have been terminated; the request will then become green.
    It might help you.
    Thanks & regards,
    Madhu

  • Data load for target after target enhancement

    Dear all,
    We are using BI 7.00, and in one of our designs the data is first loaded to the ODS and then to the Cube. Now we want to get the value of one more field. This field is already available in the DataSource, and data is flowing up to the PSA. I have added that field to the ODS. My problem starts here: I want the data to flow for all the previous requests, i.e. the earlier requests in the ODS and the Cube (prior to the enhancement), without disturbing the data load.
    Kindly provide step-by-step instructions. I do not want the earlier data to be deleted. The new field's value should be updated in the previous loads to the ODS and the Cube.
    Regards,
    M.M

    Hi,
    The Overwrite option is available in the update mode of the key figures.
    It is the key figures that determine whether the DSO is in overwrite or addition mode.
    Go to the mappings of each InfoObject in the transformation from the DataSource to the DSO, check the mapping type of each key figure, and see whether it is overwrite or not.
    To avoid reloading the cube, one option is to add the fields to both the cube and the DSO and then activate the transformations, mappings, DTPs, etc.
    After that, load the data to the DSO and then schedule the delta from the DSO to the cube. This will bring all the changes in the DSO into the cube.
    This delta may be a huge one, however, and may fail depending on the data volume in the DSO.
    You can try this; if it does not work, you may have to delete the cube and reload it.
    Thanks
    Ajeet

  • Where to check UPDATE TYPE in Transformation in BI

    Hi All,
    In BI, where do we check the update type (i.e. whether the field/InfoObject is Overwrite or Addition) in a transformation?
    In BW 3.5 we can check this in the Update Rule maintenance, but where do we check the same in BI?
    Shankar

    Hi,
    You can check the update type in a transformation:
    --> in the target, double-click on the first column (Rule Type)
    --> Aggregation
    --> you can see both options (Overwrite, Summation)
    This applies when the target is a DSO; in the case of a Cube, you will have only Summation.
    You can check the update type in a DSO update rule as well:
    --> in the update rule, single-click on a key figure (Amount, Qty, ...)
    --> the update rule details will appear
    --> there you can see that the Update Type section has three options (Addition, Overwrite, No update)
    In the case of a cube you only have (Addition, No update).
    Best Regards
    Obaid

  • Loading data from Cube to ODS

    Hi All,
    I have a cube which contains projects data. I want to load the data from the cube to a write-optimized ODS. In the cube I have multiple records for a project, but while loading the data to the ODS I have to summarise it so that there is one record per project. I.e., if the cube has data as follows:
    Project    Cost
    abc         100
    abc         200
    abc         300
    then in the ODS the record should be:
    Project    Cost
    abc         600
    How do I achieve this?
    Thanks,
    Satya

    Hi Satya,
    Generate an export DataSource from the cube (right-click on the cube -> Generate Export DataSource).
    Create update rules for the ODS, and in the update rule for Cost you will find two options:
    1. Overwrite
    2. Addition
    Choose Addition and activate the update rules. This will work exactly the way you want.
    One more thing: put the Project InfoObject into the key fields of the ODS, and all the other InfoObjects into the data fields.
    Hope that helps.
    Regards
    Mr Kapadia
    Assigning points is the way to say thanks in SDN.
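    The effect of this setup — Project as the only key field, with the Cost key figure set to Addition — can be sketched in plain Python (a conceptual illustration, not BW code):

    ```python
    # With Project as the key field and Addition as the update type, multiple
    # cube rows for the same project collapse into one summed ODS row.

    def load_with_addition(rows, key_field, value_field):
        ods = {}
        for row in rows:
            key = row[key_field]
            ods[key] = ods.get(key, 0) + row[value_field]
        return ods

    cube_rows = [
        {'project': 'abc', 'cost': 100},
        {'project': 'abc', 'cost': 200},
        {'project': 'abc', 'cost': 300},
    ]
    print(load_with_addition(cube_rows, 'project', 'cost'))   # {'abc': 600}
    ```

    The key-field choice does the grouping; the Addition update type does the summing.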

  • What are the key steps & order to follow: changes in my flat file structure

    Hi,
    I have a Cube which sits on an ODS.
    In the ODS there are 6 characteristics (Char1, Char2, ..., Char6) and 3 key figures (KF1, KF2, and KF3).
    The ODS is loaded from a flat file.
    Through an update rule and a start routine, the ODS updates the cube.
    Now I have a new requirement to add 2 new characteristics (Char10, Char20) and 2 new key figures (KF55, KF66); i.e. the flat file will now come in with these new fields.
    I have an idea, but since this is the first time I really have to implement it, I need to be sure.
    What are the key steps that I need to go through, and in what order?
    Thanks

    What version are you running?
    Will you need to load history data for these new fields, or is it just going forward?
    Is it all one-to-one mapping for the new fields?
    1. Make sure you know to which dimensions you need to add the new characteristics — a new dimension for these two?
    2. Are the new characteristics to be made key fields?
    3. Is the update type for the key figures Overwrite or Addition?
    Then enhance the Cube and DSO, and change the TRFN/TR/UR.

  • Blog Clarification: Role of BI developer to get Deltas functional (part 1)

    Hi,
    I would appreciate some clarification on the following blog that I reviewed:
    /people/swapna.gollakota/blog/2007/12/27/how-does-a-datasource-communicates-delta-with-bw
    This blog seems to suggest that, as a BI developer, you need to know the delta type of each DataSource (selected in R/3), and that with that knowledge you will be in a better position to choose the update type (Overwrite or Addition) in the ODS/DSO in the BW/BI environment.
    At least that is what I took away from the review.
    1. For each DataSource you activate as a BW developer in the BW environment, do you really need to go to R/3 and study the DataSource through the table RODELTAM to see the delta type (e.g. ABR, ABR1, etc.) and all the settings, such as serialization = 1 or 2?
    2. I thought these DataSources, e.g. 2LIS_02_HDR, are SAP-delivered and ready to be used, aren't they?
    So I understood that a decision has to be made between ABR and ABR1 as the delta type.
    i. Whose responsibility is it to make such decisions?
    ii. The R/3 folks or the BW developer?
    iii. What influences such decision making?
    iv. In this table (RODELTAM) on R/3 I saw several "X"s under different columns; who sets these?
    3. There were also discussions as to whether a DataSource supports a DSO/Cube or not.
    Does this support the argument that one should study a DataSource on the R/3 side in the table RODELTAM before using it?
    Thanks

    1. For each DataSource you activate as a BW developer in the BW environment, do you really need to go to R/3 and study the DataSource through the table RODELTAM to see the delta type (e.g. ABR, ABR1, etc.) and all the settings, such as serialization = 1 or 2?
    If you are a newbie using it for the first time, of course you have to. But I would simply use transaction RSA2 rather than those tables.
    2. I thought these DataSources, e.g. 2LIS_02_HDR, are SAP-delivered and ready to be used, aren't they?
    So I understood that a decision has to be made between ABR and ABR1 as the delta type.
    i. Whose responsibility is it to make such decisions?
    ii. The R/3 folks or the BW developer?
    iii. What influences such decision making?
    iv. In this table (RODELTAM) on R/3 I saw several "X"s under different columns; who sets these?
    Of course it is the BW developer who deals with BW-related things.
    iii. What influences such decision making?
    Your data model.
    3. There were also discussions as to whether a DataSource supports a DSO/Cube or not.
    Does this support the argument that one should study a DataSource on the R/3 side in the table RODELTAM before using it?
    Whether a DataSource supports a DSO or a Cube is the whole point of determining the delta type.
    Hope this helps.

  • Polymorphism in programmatic view objects

    We are trying to benefit from view object polymorphism. We have programmatic VOs that talk directly to a service and do not have any backing EOs. The hierarchy is as follows:
    - Annot
    - VisualAnnot
    - StampAnnot
    - TextAnnot
    - NonVisualAnnot
    We need to get a collection of polymorphic rows when we get a list of Annots. We have defined StampAnnot and TextAnnot as subtypes of Annot in the AM. However, all the rows that we get are of type Annot with the minimal set of attributes Annot should have, and not StampAnnot or TextAnnot with the additional attributes.
    We tried overriding the createRowFromResultSet() method in the Annot VO so that we can generate a row of the appropriate type. However, that also does not work, because we still use the ViewDef of Annot and attribute validation fails. Our questions are:
    - Why is polymorphism not working in our case, while we have defined a discriminator field and have the proper subtypes defined in our AM? Must one have EOs to make polymorphism work?
    - If we need to override some additional framework methods, any advice or samples to help us with that?
    thanks and regards

    Hello,
    Steve Muench's sample #132 implements what you are trying to accomplish. Inspect the sample and let us know if you have questions about it.
    http://blogs.oracle.com/smuenchadf/resource/examples#132
    Juan C.

  • Results in  the query

    Hi Experts,
    I have a query which calculates the quantity for a particular date, and the results in the query are wrong. I checked the data in the PSA; it matches the data from the source system, and the key figure is set to Addition in the update rules. Can anyone let me know why I am getting the wrong results?
    Thank you,

    Hi Shetty,
    Your question is very vague. Most of the time, overwriting or addition depends on the way the system is designed and on the requirement. As far as I understood, you have data in a cube or ODS, and your report values do not match the values in the PSA. If that is the case, I would go step by step:
    1. Select some of the records that have erroneous values.
    2. For the same records (based on primary keys), check the data in the PSA and the data in the ODS or cube. If you have any intermediate stages, check them too. Then you can find out where it is going wrong.
    3. If the data in one target (PSA) does not match the immediate next target (ODS or cube), then there is certainly a problem with the start routine, the mapping, or the addition/overwriting. It depends on the system you are working on.
    4. If the data in the ODS or cube does not match the report, then you can check the query logic.
    Hope this helps. Let me know if you need more information.
    Thanks.

  • Loading the data from essbase studio to the cube

    I am building an application using Essbase Studio 11.1.2.
    I have been able to create the hierarchies, but I am not clear on the process of assigning the data piece.
    In this instance all the data is in a table named "FullPolicyCount".
    What is the process for a successful load of the data to the cube?
    Please advise

    Hi,
    Take a look at the PSA data. The key is defined by the request, the record number, the data packet, and the change indicator. Use the same key for your ODS/DSO and you will get all records into your ODS/DSO. Use update mode Overwrite for the key figures (but just because it is the default); with the above-mentioned key it does not matter whether your update mode is Overwrite or Addition.
    regards
    Siggi
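    Siggi's point can be illustrated in plain Python (conceptual sketch, not BW code): when the technical fields (request, data packet, record number) are part of the key, no two incoming records collide, so Overwrite and Addition necessarily produce the same active table.

    ```python
    # With the full technical key, every record has a unique key, so the choice
    # of update mode is irrelevant.

    def load(ods, records, mode):
        for key, value in records:
            if mode == 'overwrite' or key not in ods:
                ods[key] = value
            else:
                ods[key] += value
        return ods

    # key = (request, data packet, record number): no collisions possible
    records = [(('REQ1', 1, 1), 10), (('REQ1', 1, 2), 10), (('REQ2', 1, 1), 10)]
    print(load({}, records, 'overwrite') == load({}, records, 'addition'))   # True
    ```

    This only holds because of the key choice; with a semantic key (e.g. dealer and month), the two modes diverge as soon as two records share a key.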

  • Inventory SnapShot ODS

    Are the key figures defined in the "How to... Inventory" guide for the snapshot ODS set to Overwrite or Addition? Thanks

    As I understand it, it is Addition; the load source is 2LIS_03* as in the cube. The idea is to have a picture of the stock at the end of each period.

  • No field values in ODS!

    Hi,
    We have four fields in the ODS for which the data is empty.
    I checked the R/3 tables for the respective fields: there is data, and there are no transfer rules or update rules written against these fields.
    How can I trace the missing data?
    Thanks,
    Ravi

    Hi Ravi,
    If there is no mapping in the update rules or transfer rules, these fields will simply remain blank.
    Now you want to load these fields? What is your data flow? Are you loading the ODS in Overwrite or Addition mode? From which DataSource?
    Hope it helps
    Srini
