Moving data from DSO to cube

Experts,
We are in a situation where we might have to move data from cube to DSO. We are doing a POC and have found that even though the DSO is set to overwrite, the data sums up for the same billing document. I would like to know:
1) is there any o

Hi,
As the data in the cube is already additive, if you move it to a DSO in overwrite mode you will get incorrect data.
So, in order to get correct data, it is always best to keep the DSO in additive mode.
e.g.:
Request ID   Customer   Quantity
20           1000       15
21           1000       10

In this situation you would want to report:
Customer   Quantity
1000       25

An ODS in overwrite mode, without the request ID as a key, would give:
Customer   Quantity
1000       10

An ODS in overwrite mode, with the request ID as a key, would give:
Customer   Quantity
1000       25

An ODS in additive mode, with or without the request ID as a key, would give:
Customer   Quantity
1000       25
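The three scenarios can be simulated in a few lines of Python (illustrative only — this is not BW code; the request ID, customer and quantity values are the ones from the example above):

```python
# Illustrative simulation of ODS/DSO update modes; not actual BW code.
# Each record is (request_id, customer, quantity).
records = [(20, 1000, 15), (21, 1000, 10)]

def load(records, key_fields, overwrite):
    """Apply records to a DSO-like active table keyed by key_fields,
    then return what a report aggregated by customer would show."""
    table = {}
    for rec in records:
        key = tuple(rec[i] for i in key_fields)
        if overwrite:
            table[key] = rec[2]                      # last record wins
        else:
            table[key] = table.get(key, 0) + rec[2]  # quantities add up
    return sum(table.values())

# Overwrite, key = customer only: request 21 replaces request 20
print(load(records, key_fields=[1], overwrite=True))     # 10
# Overwrite, key = request ID + customer: both records survive
print(load(records, key_fields=[0, 1], overwrite=True))  # 25
# Additive, key = customer only: quantities are summed
print(load(records, key_fields=[1], overwrite=False))    # 25
```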
Hope this helps
-Vikram

Similar Messages

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
I have an issue when uploading Sales data from DSO to Cube. I am using BI 7.0 and have uploaded all sales document level data to my DSO. I then use a transformation rule to calculate the Sales Value in the DTP to the cube. The cube holds customer-wise aggregated data.
    In DSO I have NetPrice(KF) and Delivered_QTY(KF).  I do a simple multiplication routine in the transformation from DSO to Cube.
RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use Active Table(With out Archive) on the DSO to get the data since this is my first load.
The issue is that the Sales Value figure in the cube is incorrect: I am getting impossibly large values.
    Can someone please help me.
    Shanka

    Hi,
Are you sure the cube has customer-wise aggregated data? It will always aggregate the key figure values for the same set of characteristics.
Did you check the other key figures as well — are they also inflated, or is the problem with this key figure only?
During the data load, the records may be aggregated first for the same characteristic values, with the multiplication happening afterwards. If that is the case, you may have to multiply the values before they are stored in the data package and then let them aggregate; this can be achieved through a start routine.
But first verify whether other key figures have the same issue.
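The aggregate-then-multiply problem described here can be sketched in plain Python (hypothetical numbers, not an actual ABAP start routine):

```python
# Two document lines for the same customer (hypothetical values).
lines = [{"net_price": 10.0, "dlv_qty": 5}, {"net_price": 10.0, "dlv_qty": 3}]

# Wrong: if the key figures are aggregated first and multiplied afterwards,
# the result is sum(price) * sum(qty) — inflated.
wrong = sum(l["net_price"] for l in lines) * sum(l["dlv_qty"] for l in lines)
print(wrong)  # 20.0 * 8 = 160.0

# Right: multiply per record (what a start routine would do), then aggregate.
right = sum(l["net_price"] * l["dlv_qty"] for l in lines)
print(right)  # 50.0 + 30.0 = 80.0
```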
    Thanks
    Ajeet

  • Problem while  data processing TRANSACTION data from DSO to CUBE

    Hi Guru's,
we are facing a problem while processing transaction data from DSO to cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I will suggest you to check a few places where you can see the status
1) SM37 job log (search by BI request name): it should give you the details about the request. If it is active, make sure the job log is being updated at frequent intervals.
2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running. Check if it is accessing/updating some tables or not doing anything at all.
If it is running and you can see it active in SM66, you can wait some time to let it finish.
3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred.
You can also try SM50/SM51 to see what is happening at the system level, such as reading/inserting tables.
If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Unable to load data from DSO to Cube

    Good morning all,
I was trying to load data from DSO to Cube for validation. Before loading the new data, I deleted everything from the DSO and the Cube; they contain no requests at all. The cube uses "Delta Update". First, the DSO loaded 138,300 records successfully and I activated the DSO. But when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this?
    Thank you so much!

    Hi BI User,
For a delta upload into the data target, there must be an initialization request in the data target with the same selection criteria.
So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Regarding  Loading the data from DSO to cube.

    Hello Experts,
I have a DSO which loads data from the PSA using a 7.0 transformation (via DTP), and a cube which loads data from that DSO using 3.x transfer rules. I deleted the init request of an InfoPackage before loading the data into the DSO. But when I load the data from DSO to cube (right-click on the DSO -> Additional Functions -> update the 3.x data to targets), I get an error: 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
    Please help me with this.
I want to load the data in the init request to the cube.
    Thanks

    Hi Shanthi,
Thanks for the reply. I have already deleted the init request from the source system to the DSO and tried again, but I am still getting the error.
    Thanks

  • DTP load data from DSO to CUBE can't debug

    Hi,all expert.
I ran into a problem yesterday.
When I load data from the DataSource to the DSO and debug the start routine, it works fine. But when I load data from DSO to cube and debug the routine, I get this error:
inconsistent input parameter (parameter: <unknown>, value <unknown>)
message no: RS_EXCEPTION 101.
I don't know why; please help me. Thank you very much.

    Hi,
An alternative way to do this is to put a small infinite loop in your start routine. When you start the load, go to SM50 and debug the work process executing the load; from there you can break out of the infinite loop and carry on with the debugging. Please let me know if you require further information.
    Thanks,
    Arminder

  • When I extracting data from DSO to Cube by using DTP.

When I extract data from DSO to Cube using a DTP,
I get the following errors.
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is the brand new BI 7.3 system. Implementing the PSCD and TRM.
I have used the standard Business Content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level. When I try to pull the data to the InfoProvider level (cube) using a DTP, I get the above error.
My observation: whenever I use a DSO as the source for any target, such as another DSO or a cube, it throws the same kind of error for any flow, including a simple flat file.
Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 system.
Please help me with this issue; I am not able to move forward and it is very urgent.

Hello,
have you solved this problem? I have the same error. If you have solved it, can you help me please?
Yimi Castro Garcia

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
while I am loading data from DSO to CUBE, the load job is getting cancelled.
In the job overview I got the following messages:
    SYST: Date 00/01/0000 not expected.
    Job cancelled after system exception ERROR_MESSAGE
What can be the reason for this error? I have successfully loaded the data into 2 layers of DSO before loading it into the CUBE.

Hi,
are you loading a flat file to the DSO?
If so, check the date field of the data in the PSA and replace any invalid dates with correct ones.
I think you are using a write-optimized DSO, which is similar to the PSA: it takes all the data from the PSA into the DSO.
So correct the data in the PSA, then load to the DSO and then to the CUBE.
If you don't need the date field in the CUBE, remove its mapping in the cube's transformation, activate it, reactivate the DTP and trigger it; it will work.

  • Error while loading data from DSO to Cube

    Dear Experts,
I have changed the update routines of the update rule for the DSO and of the update rule between the DSO and the cube. When I upload data into the DSO, it works fine, but uploading from DSO to cube gives an error.
    Error Message :
    DataSource 8ZDSO_ABB does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 30.09.2010 14:05:17.
    The time stamp in the BW system is 09.05.2008 16:41:10.
I executed program RS_TRANSTRU_ACTIVATE_ALL, but it still gives the same error.
    Regards,
    Anand Mehrotra.

    Dear Experts,
    Thanks for help.Problem solved.
    Regards,
    Anand Mehrotra.

  • Change Log Data from DSO to CUBE

    Hi Experts,
Is it possible to load/transfer the change log data as well from DSO to Cube?
    Thanks

    Hi,
    Yes it is possible.
    Create a DTP from DSO to cube.
    In the extraction tab of DTP choose the option
    Datasource Change log
    Hope it Helps!!
    Reward if useful
    Manish

  • While loading data from DSO TO CUBE.(which Extraction option)

    hi friends,
my question is: when we load data from one data target to another, the DTP screen offers four options for extracting the data (1. active data table (with archive), 2. active data table (without archive), 3. change log table, and a fourth I don't remember). So when we update data from DSO to CUBE, which option should we choose, the 2nd or the 3rd?

- If you want to do a full load, always select the 2nd option, i.e. Active Data Table (without archive).
- If you want to do a delta load, you can select either option 2 or 3:
   (i) Active Data Table (without archive): the first load will be a full load from the active table, and after that it will be a delta from the change log.
   (ii) Change log table: you will have to do a full load using a full DTP first, and then the delta can be done using this DTP (but you will also need an init using this DTP).
- If you have already loaded with a full DTP and now just want to run deltas, select the 3rd option and do an init before running the delta.
    I hope it helps.
    Regards,
    Gaurav

  • Semantic Partitioning Delta issue while load data from DSO to Cube -BW 7.3

    Hi All,
We have created the semantic partitioning with the help of a BAdI to perform the partitions. Everything looks good.
The first time, I loaded the data with a full update.
The second time, I initialized the delta and pulled the delta from DSO to Cube. The DSO is standard, whereas the cube is partitioned via semantic partitioning. What I can see is that only the records updated in the latest delta show up in the report; all others are ignored by the system.
I tried compression but it still did not work.
Has someone faced this kind of issue?
    Thanks

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
    - How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for DSO or did someone delete all the PSA data?
Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada

  • Data from DSO to CUBE

    hi gurus,
please tell me from which table of the DSO (change log or active data) the cube will pick the records,
and whether the overwrite functionality works via the DSO change log.
Awaiting a quick response, please.
    Thank you

    Hi,
The answer is already in my previous post, i.e. a delta load reads from the change log.
But in the BI 7.0 data flow, when you are using a DTP, the delta behaviour depends on your selection, as below.
    BW Data Manager: Extract from Database and Archive?
    The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted.  For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
    For Extraction from the DataStore Object, you have the following options:
    Active Table (with Archive)
    The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    Active Table (Without Archive)
    The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    Change Log
    The data is read from the change log of the DataStore object.
    For Extraction from the InfoCube, you have the following options:
    InfoCube Tables
    Data is only extracted from the database (E table and F table and aggregates).
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage.
I hope this gives you a clear idea.
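The change-log mechanics behind delta extraction — a before-image with reversed signs plus an after-image for each changed record — can be illustrated with a small simulation (field names are made up; this is not BW code):

```python
# Illustrative simulation: how change-log before/after images let an
# additive cube track an overwrite in the DSO.
cube_total = 15          # cube already holds the first load for a customer

# The DSO overwrites quantity 15 with 10; the change log then contains:
change_log = [
    {"recordmode": "X", "qty": -15},  # before-image, key figure sign reversed
    {"recordmode": " ", "qty": 10},   # after-image, the new value
]

for rec in change_log:
    cube_total += rec["qty"]          # the cube simply adds both images

print(cube_total)  # 10 -> the cube now matches the overwritten DSO value
```

This is why the delta DTP must read from the change log after initialization: the active table alone no longer carries the information needed to reverse the old value in an additive target.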

  • ERROR :  "Exceptions in Substep: Rules" while loading data from DSO to CUBE

    Hi All,
    I have a scenario like this.
I was loading data from the ODS to the cube daily, and everything worked fine until 2 days ago. I added some characteristic fields to my cube without changing any key figures. When I restarted the reload from ODS to cube, I got the error "Exceptions in Substep: Rules" after some data packages had loaded successfully.
I then deleted the added fields from the InfoCube and tried to reload the data again, but I am facing the same error: "Exceptions in Substep: Rules".
Can anyone tell me how to get out of this situation?
    Regards,
    Komik Shah

    Dear,
    Check some related SDN posts:
    zDSo Loading Problem
    Re: Regarding Data load in cube
    Re: Regarding Data load in cube
    Re: Error : Exceptions in Substep: Rules
    Regards,
    Syed Hussain.

  • Error in loading data from DSO to Cube

    Hi
Daily process chains are executed and the data is loaded from the DSO to several cubes.
On 07.11.2008 there was an error in one of the cubes. The short dump shows the message DBIF_REPO_PART_NOT_FOUND. I looked into note 854261 and checked with our Basis colleague; he said our kernel and patch levels are higher than those mentioned in the note.
The strange thing is that from 08.11.2008 until today the loads have been running fine; the problem occurred only on that date.
So now I have several requests in the cube with lots of data but WITHOUT reporting, because of the error on 07.11.2008.
Can anyone help in solving this issue? I want to re-execute the load and see whether the problem occurs again.
    Regards
    Annie

    Hi,
Check the reconstruction tab. If the request is green, you could try deleting and reconstructing the incorrect request.
    Regards.
