DSO to Cube Delta Failure

Hi All
We appear to have a major problem.
Yesterday the invoice delta from DSO to Cube failed. My colleague traced the problem to invalid characters in the PSA. However, the request had loaded successfully into the DSO (I don't know why, because the load into the DSO usually fails when invalid characters are present).
So we deleted that request from the DSO (even though it was green), fixed the invalid characters in the PSA and then manually updated from PSA to DSO. All went well so far.
Then we ran the delta load from the DSO (data mart) to the cube, but it came back with the error 'Previous delta has not finished so cannot proceed'. All the monitors had RED status, but still no good. So we left it, thinking that today's process chain run would fix it.
But it didn't. Today's error is 'Delta update for ZINV_CUBE is invalidated. Message no. RSBM113'.
The data in the DSO is all correct and up to date. However, the data mart status for all requests (going back to Nov 2007) is missing. In the InfoSource view the outgoing request (REQU_*) for yesterday is missing and today's REQU_* has red status. Yesterday's incoming request (ODSR_*) is followed by today's incoming request.
From reading previous posts I think I need to do an init without data transfer, but will that leave the cube with two missing requests (today's and yesterday's)? How do I get them into the cube?
I could delete the cube contents and perform an init with data transfer, but I don't feel comfortable with that as it involves nearly two years' worth of data. Also, will all DSO requests get rolled up into a single cube request?
All advice very much appreciated
Asif

Hi,
Hopefully the requests for today and yesterday are still there in the PSA.
Now manually set the QM status of the failed requests in the DSO and the cube to red - this is exactly where you hit the 'previous delta has not yet finished' issue.
Check SM12 for any locks that may still be present.
Then delete the requests from the cube (this also removes the data mart status from the DSO), followed by the requests in the DSO.
Now load the data to the DSO manually, activate the DSO, and then update it to the cube manually via Additional Functions -> Update Data to Targets -> Delta Update. This will load the complete data for the two days.
There is no need to perform an init if the data is present in the PSA.
Regards,
Edited by: Krishna Rao on May 21, 2009 6:10 PM

Similar Messages

  • Delta records not updating from DSO to CUBE in BI 7

    Hi Experts,
    Delta records are not updating from DSO to CUBE.
    In the DSO the key figure value shows '0', but in the CUBE the same record shows '-1'.
    I checked the change log table of the DSO; it has 5 records:
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M    -1
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M     0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL     0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL     1
    ODSR_4LH8CXKUJPW2JDS0LC775N4MH     0
    but the active data table has one record with value 0.
    How do I correct the delta load?
    Regards,
    Jai

    Hi,
    I think initially the value was 0 (ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0, new image in the change log) and this got loaded to the cube.
    Then the value was changed to 1 (ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0, before image, and ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1, after image). This request updates the cube with value 1, so the cube has 2 records, one with value 0 and one with value 1.
    The value was then changed back to 0 (ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - (-1), before image, and ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0, after image). These two records get aggregated and update the cube with -1.
    The cube now has 3 records, with values 0, 1 and -1; the effective total is 0, which is correct.
    Is this not what you see in the cube? Were the earlier requests deleted from the cube?
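    To summarise the arithmetic above in one place (this only restates the three change-log requests already listed, it adds no new data):

    Change log request                Images                 Posted to cube   Running total in cube
    ODSR_4LH8CXKUJPW2JDS0LC775N4MH    new image 0             0                0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL    before 0 / after 1     +1                1
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M    before -1 / after 0    -1                0

    As long as all three requests are in the cube, a query that aggregates the key figure should therefore show 0.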

  • New field added to cube, delta DTP from DSO to cube is failing

    Dear all,
    The scenario in BI 7.0 is:
    data source --(delta InfoPackages)--> DSO --(delta DTP)--> Cube
    Data is loaded daily via a process chain.
    We added a new field to the cube; the transformation from DSO to cube is active and the transport was successful.
    Now, delta from DSO to cube is failing.
    Error is: Dereferencing of the NULL reference,
    Error while extracting from source <DSO name>
    Inconsistent input parameter (parameter: Fieldname, value DATAPAKID)
    My conclusion: the system is unable to load the delta because of the new field, and it wants us to initialize again (am I right?).
    Is my only choice to delete the data from the cube and perform the init DTP again, or is there another way?
    Thanks in advance!
    Regards,
    Akshay Harshe

    Hi Durgesh / Murli,
    Thanks for the quick response.
    @Durgesh: we have mapped an existing DSO field to a new field in the cube, so yes, in the DTP I can see the field in the filter. So I have to do a re-init.
    @Murli: everything is active.
    Actually there are further complications, as the cube has many more sources, so I wanted to avoid selective deletion.
    Regards,
    Akshay

  • Data is not coming correctly through delta from DSO to Cube

    HI All,
    When there is a change in the Org. Unit, Job or any other characteristic of an employee, the delta of the data source picks up the changed records.
    I have used a DSO in front of the cube. The DSO active table now has one record per employee, with the same scheduled hours as ECC.
    But when loading to the cube, these scheduled hours get added to the scheduled hours already in the cube, which doubles the scheduled hours for that employee. For example:
    DSO Data.
    Emp No.     Org. Unit     Sch. Hrs
    1                    ABC                              16
    CUBE Data
    Emp NO.     Org. Unit     Sch. Hrs
    1                    ABC               16
    Now Employee Org. Unit get changed to XYZ.
    DSO Data.
    Emp NO.     Org. Unit     Sch. Hrs
    1                    XYZ               16
    CUBE Data
    Emp NO.     Org. Unit     Sch. Hrs
    1                    ABC               16
    1                    XYZ               16
    I need data as it is in the DSO.
    Please help, thanks in advance.
    Arvind

    Hi Arvind,
    A DSO and an InfoCube handle characteristic values differently: in a DSO you can define key fields,
    whereas in an InfoCube every characteristic value acts as part of the key. That is the reason why, when a characteristic value changes,
    the DSO active table record is simply updated as long as the key fields have not changed (the new data field value overwrites the old one in the active table), whereas the InfoCube creates a new entry.
    To get only the current value each time, you need to explore other options, such as completely deleting the InfoCube data and doing a full load every day.
    Other options depend on the design, the data volume and other parameters.
    Hope this helps.
    Regards
    Mayank

  • Semantic Partitioning delta issue while loading data from DSO to Cube - BW 7.3

    Hi All,
    We have created the semantic partitioning with the help of a BAdI. Everything looks good:
    the first time I loaded the data it was a full update.
    The second time I initialized the delta and pulled the delta from the DSO to the cube. The DSO is a standard DSO, whereas the cube is partitioned via semantic partitioning. What I can see is that only the records updated in the latest delta show up in the report; all the rest are ignored by the system.
    I tried compression but it still did not work.
    Has anyone faced this kind of issue?
    Thanks

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by 'the cube is not being loaded'? Are no records extracted from the DSO? Is the load completing with 0 records?
    - How is the data loaded from the DSO to the InfoCube? Full? Are you loading to the PSA first or not?
    - Is there data already in the InfoCube?
    - Is there change log data for the DSO, or did someone delete all the PSA data?
    Since there are so many possible reasons for the behaviour you are seeing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada
    <a href="http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222">Biking for MS Relief</a>

  • Delta between DSO and Cube

    Hi All,
    I have been working on a scenario where we have to realign the org structure information of some sales orders. We have existing sales order data in a DSO, which feeds a cube. As part of the realignment some master data attributes, more specifically Customer Master attributes, will be changed in R/3 for those sales orders.
    These attributes are part of the transaction data load into the DSO from the source. Since this transaction load takes time, to speed up the realignment in BW I have used a routine to look up the master data attributes from 0CUSTOMER in BW.
    So before BW pulls the realigned data (sales orders) from R/3, I will run a self-loop on the DSO in which it looks up the 0CUSTOMER master data in BW to get the updated customer attributes (of course the 0CUSTOMER load will happen before we do any realignment of the source data).
    Once the self-loop transformation is complete and the sales orders are updated with the new customer attributes, I will load the data to the cubes (delta load).
    Once the sales order realignment is finished in R/3, I will pull the same set of data into the DSO.
    Now the question is: is it necessary to do a re-init of the cube before I load the second set of data to the cube?
    I guess it is not necessary.
    Please share your thoughts.
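    For illustration only, here is a minimal sketch of what such a 0CUSTOMER lookup could look like as an end routine in the DSO self-loop transformation. This is not the poster's actual routine: the attribute 0SALES_GRP and the target field names are placeholders, and the generated type names (such as _ty_s_TG_1) have to be taken from the routine frame of your own transformation.

      " End routine body (BW 7.x transformation, DSO self-loop).
      " RESULT_PACKAGE is supplied by the generated method signature.
      TYPES: BEGIN OF ty_cust,
               customer  TYPE /bi0/oicustomer,
               sales_grp TYPE /bi0/oisales_grp,  " placeholder attribute
             END OF ty_cust.
      DATA: lt_cust TYPE SORTED TABLE OF ty_cust
                    WITH UNIQUE KEY customer,
            ls_cust TYPE ty_cust.
      FIELD-SYMBOLS: <fs_result> TYPE _ty_s_tg_1.

      " Read the active 0CUSTOMER attributes once per data package.
      SELECT customer sales_grp
             FROM /bi0/pcustomer
             INTO CORRESPONDING FIELDS OF TABLE lt_cust
             WHERE objvers = 'A'.

      " Overwrite the attribute in each record with the current master data value.
      LOOP AT result_package ASSIGNING <fs_result>.
        READ TABLE lt_cust INTO ls_cust
             WITH TABLE KEY customer = <fs_result>-customer.
        IF sy-subrc = 0.
          <fs_result>-sales_grp = ls_cust-sales_grp.
        ENDIF.
      ENDLOOP.

    A production version would normally restrict the SELECT with FOR ALL ENTRIES on the data package instead of reading the whole attribute table, but the principle is the same.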

    Hello Partho,
    I understand your key figures are set to overwrite in the DSO. But in the BI 7 flow, the delta from DSO to cube is request based.
    When you do the self-loop, the same records (which I guess you have already loaded to the cube) will get a different request ID.
    As a result, your change log table will get new entries after activation, so the same records will be loaded to the cube again, which will double all your key figures in the cube.
    I believe that to correct the historical records in the cube you either have to delete the cube contents and then run the delta from DSO to cube again; as the first delta will then read the active table of the DSO, you will get all key figures correct in the cube.
    Otherwise, you have to delete all the previously loaded records from the cube selectively.
    Regards
    Anindya

  • Loading full (DTP) to DSO and then delta to InfoCube

    Hi Experts,
    I am loading data from Oracle and the data volume is very large.
    The only option I have in the InfoPackage (to PSA) is a full load.
    I am loading with a full DTP to the DSO and then a delta DTP to the cube.
    So for the next load I should delete the requests in the PSA and the DSO and keep the request in the cube.
    How can I delete the request in the DSO without deleting the request in the cube?
    Is there any alternate solution to improve the load time?
    Please advise.
    Thank you.
    Priya

    Hi Priya,
    You are loading the data to the DSO and after that from the DSO to the cube, is that correct?
    If so, you will have to do a full load to the cube as well.
    Otherwise, another option is to choose a write-optimized DSO instead of a standard DSO.
    If this is still not clear, please post more detailed information so that I can make a suggestion.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R/3, so I am getting a few fields from the flat file and a few from R/3. They form one record when they are loaded to the DSO, so I am getting one record in the active data table. But when I load the data from that DSO to the cube (the data goes from the change log to the cube), I am getting 2 separate records in the cube (one loaded from the flat file and one loaded from R/3), which I don't want. I want only one record in the cube, just like in the active data table of the DSO.
    I can't take the data from the active data table because I need a delta load.
    Could you please advise what I can do to get that one record in the cube?
    Please help.

    Ravi,
    I am sending the data through a DTP only, but is there any solution to get one record? In another scenario I am getting data from 2 different ERP sources and getting one record in the DSO and in the cube as well.
    But that is not happening for this second scenario, where I am getting data from a flat file and ERP and trying to create one record.

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube, so they contained no requests at all. The cube load uses "Delta Update". First the DSO was loaded with 138,300 records successfully, then I activated the DSO. But when I executed the load (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this?
    Thank you so much!

    Hi BI User,
    For a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • While loading data from DSO to CUBE (which extraction option?)

    hi friends,
    My question is: when we load data from one data target to another, the DTP screen gives four options for extracting the data (1. active data table (with archive), 2. active data table (without archive), 3. change log table, and a fourth I don't remember). So when we update data from DSO to CUBE, which option should we choose, the 2nd or the 3rd?
    <removed_by_moderator>
    Edited by: Julius Bussche on Jan 4, 2012 9:26 AM

    - If you want to do a full load, always select the 2nd option, i.e. Active Data Table (without archive).
    - If you want to do a delta load, you can select either option 2 or 3:
       (i) Active Data Table (without archive) - the first load will be a full load from the active table and after that it will be delta from the change log.
       (ii) Change Log table - you will have to do a full load using a full DTP first, and then the delta can be done using this delta DTP (but you will also need to do an init using this DTP).
    - If you have already loaded using a full DTP and now just want to run deltas, select the 3rd option and do an init before running the delta.
    I hope it helps.
    Regards,
    Gaurav

  • Data from DSO to CUBE

    hi gurus,
    Please tell me from which table of the DSO (change log or active data) the cube will pick up the records,
    and also whether the overwrite functionality works with the DSO change log.
    Awaiting a quick response.
    Thank you

    Hi,
    Answer is already in my previous post.
    i.e. delta load - from the Change Log.
    But in the BI 7.0 data flow, when you are using a DTP, the delta behaviour is based on your selection, as described below.
    BW Data Manager: Extract from Database and Archive?
    The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization); because of the delta logic, the following requests must all be extracted from the change log.
    For Extraction from the DataStore Object, you have the following options:
    Active Table (with Archive)
    The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    Active Table (Without Archive)
    The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    Change Log
    The data is read from the change log of the DataStore object.
    For Extraction from the InfoCube, you have the following options:
    InfoCube Tables
    Data is only extracted from the database (E table and F table and aggregates).
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage.
    I hope this gives you a clear idea.

  • DSO to Cube

    Hi Experts,
      This is an issue related to data loads from DSO to Cube.
      A request was loaded successfully to the DSO but stayed in yellow status in the cube for a long time.
      I deleted the request and tried to reload it from the DSO to the cube. While doing this I got an information message 'No new deltas in DataStore object for update', and I get 2 options, FULL and INIT; there is no DELTA option.
    How can I reload that particular request from the DSO to the cube without disturbing any other data in the cube or DSO?
    Thanks
    Ashok

    Hi Ashok,
    Create a DTP from the DSO to the cube with Full as the update mode, and the data will be loaded successfully into the cube.
    The reason for the earlier behaviour: the load must have been running in delta mode, and since there are no new or changed records at DSO level for the given selections, nothing was loaded into the cube.
    Regards
    Ramsunder

  • Inventory Management Extractors via DSO to Cube 0IC_C03

    Dear experts,
    To fulfil the requirements of the Data Warehouse spirit (Entry, Harmonization and Reporting layers), I want to use a data flow for Inventory Management from the extractors 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM via one or more DSOs and then on to cube 0IC_C03.
    In this forum there are several open threads about this topic, but none has been resolved! Many hints point to the "How to... Inventory Management" paper; some say "You can use DSOs!" while others say "Don't use DSOs!". So where is the truth? Is there anybody who can describe a really practicable way to implement the above-mentioned data flow?
    I'm looking forward to your suggestions and answers.
    Thanks in advance and regards
    Patrick
    --> The data flow has to be in BW 7.0

    Hi Patrick,
    Yes indeed there is.
    Using DSOs in inventory flow is absolutely possible. Here's what you need to know and do:
    1 - Firstly, don't be apprehensive about it. Knowledge is power! We know that inventory uses the ABR delta update methodology, and that is supported by DSOs, so no doubt about it.
    2 - Secondly, inventory is special because of the non-cumulative key figures and the several rule groups at cube level for BX and BF.
    3 - Now, as you want to use a DSO, and I am presuming it will be used for staging purposes, use a write-optimized DSO as the first layer. When mapping from the DataSource to this DSO, keep a one-to-one mapping from fields to InfoObjects.
    4 - Keep in mind that from the InfoSource to 0IC_C03 there are multiple rule groups in the transformations.
    These rule groups have different KPIs mapped with routines. The point is that from ECC only one field for quantity (BWMNG) and one field for value (BWGEO) comes in, but from the InfoSource to 0IC_C03 these same fields are mapped in different rule groups to different KPIs (stock in transit, vendor consignment, valuated stock etc.) based on the stock category, stock type and BW transaction key (see the sketch after this list).
    5 - So create a write-optimized DSO, map it one-to-one from the DataSource fields, and create/copy the transformations from this DSO to the cube, maintaining the same rule groups and the same logic.
    6 - You can also use a standard DSO, create the rule groups and logic at DSO level, and then map one-to-one from the DSO to the cube.
    Keep in mind that these rule groups and the logic in them must be precisely the same as in the standard flow.
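    For illustration only, a rough sketch of the kind of rule routine these rule groups typically contain (the target key figure, the source field names PROCESSKEY and BWMNG, and the key values below are placeholders - take the real mappings from the delivered 0IC_C03 transformations, not from this sketch):

      " Rule routine body for a hypothetical receipt-quantity key figure.
      " SOURCE_FIELDS and RESULT are supplied by the generated routine frame.
      CLEAR result.
      " Post the quantity only for process keys that represent receipts
      " ('000' and '001' are placeholder values, not the delivered ones).
      IF source_fields-processkey = '000' OR
         source_fields-processkey = '001'.
        result = source_fields-bwmng.
      ENDIF.

    Each rule group repeats this pattern with a different target key figure and a different set of process keys, which is how the single quantity and value fields from ECC end up in several cube KPIs.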
    This should work.
    Debanshu

  • Source system - DSO-1 - DSO-2 - Cube (Performance improvement)

    Hello ppl,
    If I have a data flow like written below:
    Source system --> DSO-1 --> DSO-2 --> Cube
    Situation is:
    1.     The data load from the source system to DSO-1 is full and cannot be changed to delta; that's the restriction.
    2.     The amount of data flowing into DSO-1 is in the millions daily (because it is a full load and growing).
    3.     Currently all DTPs between the source system and the cube are maintained as full.
    The requirement is to improve performance. Is it possible to make the DTPs delta after DSO-1? Since I am deleting all records from DSO-1 and its PSA daily, I am not sure whether it will work. If yes, how?
    Any suggestions...
    Regards,
    Akshay

    A DTP delta is a request-based delta from a write-optimized DSO; the DTP must have the delta option selected.
    A DTP delta is request-based for a standard DSO as well.
    You will find the initialization option in the DTP on the Execute tab (select Delta as the DTP type on the Extraction tab).
    Since you will be doing the init from a DSO, the system will also give you the option of selecting which DSO table to extract from,
    i.e. Active table or Change log.
    Select Active if you are using a write-optimized DSO, and Change log if you are using a standard DSO for DSO-1.
    Now, coming back to your case: using a write-optimized DSO will definitely improve performance, as you don't have to activate the requests, but I have recently had a very bad experience with write-optimized DSOs regarding delta,
    so I don't trust a write-optimized DSO for pulling delta from it.
    If you want to use it, test it as much as possible before relying on it. I would still go with a standard DSO (the beauty of the change log table).
    Now, regarding the DTP: once you load the data from the DSO you can load the init information even after the load. Alternatively, load the first request from the DTP without data transfer for the init and then run deltas; both will work fine. It is similar to the "init without data transfer" option in InfoPackages.
    Refer to this thread for a really nice discussion on this:
    BI7: "Delta Init. Extraction From..." under DTP Extraction tab?

  • Delta failure

    HI all,
    A delta load from DSO to CUBE failed. In this case we need to set the request to red, delete the request from the cube, and then delete the data mart status in the DSO.
    My question is: why do we need to set the request to red and delete it first, and only then reset the data mart status? What happens if we do it the other way round?
    Can any one give suitable answer.
    Regards,
    Bunny

    Hi Pranav,
    The reason why we turn the bad request red and then delete the request before deleting the data mart status is as follows:
    1) If we turn the request red, we are telling the source system that the last delta was not correct and the records need to be extracted again.
    2) The delta queue keeps the last delta records only until the next delta run goes green.
    If we don't turn the request red, and a green request is then deleted accidentally or for some other reason, we risk losing data from the source system. Hence, as a precaution, we always turn a request that needs to be re-extracted red before deleting it.
    We cannot delete the data mart status while the request still exists in the data target(s). The system will not allow you to delete the data mart status directly; it will say that data already exists in the target(s).
    Hope this clarifies your doubts about deltas.
    Regards,
    Manoj
