Data problem from DSO to cube

Hi,
I am facing a data problem going from DSO to cube.
I am getting this data into the DSO:
plant   material   doc_num   itemno   qty
p1      m1         1         1        10
p1      m1         1         2        20
In the DSO I am using these key fields:
doc_no
item_no
doc_year
and these data fields:
material
plant
qty
While loading data into the cube, it shows only
p1 m1 1 2 20
Item 1 is missing.
There is no item number field in my target fields?
Please let me know your ideas.

Hi,
The problem is in your ODS/DSO.
In your ODS you have 'Doc_num' as a key field.
For both items 1 & 2, you have the same key field (Doc_Num) value.
So the first record gets overwritten by the second one, and from your ODS only the latest record reaches the cube.
If you want to send both records to the cube, you need to bring Doc_num down to the data fields area of the ODS.
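Conceptually, activation behaves like keyed insertion into an internal table: records with the same key collapse to the latest one. A plain-ABAP sketch of this overwrite behaviour (all names and types are made up for illustration; this is not the actual BW activation code):

    " Made-up structure: DOC_NUM is the only key, ITEM_NO is a data field.
    TYPES: BEGIN OF ty_dso,
             doc_num TYPE n LENGTH 10,  " key field
             item_no TYPE n LENGTH 4,   " data field, NOT part of the key
             qty     TYPE i,            " data field
           END OF ty_dso.

    DATA: lt_active TYPE HASHED TABLE OF ty_dso WITH UNIQUE KEY doc_num,
          ls_rec    TYPE ty_dso.

    " Item 1 arrives first ...
    ls_rec-doc_num = '1'. ls_rec-item_no = '1'. ls_rec-qty = 10.
    INSERT ls_rec INTO TABLE lt_active.

    " ... and is overwritten by item 2, which carries the same key value.
    ls_rec-item_no = '2'. ls_rec-qty = 20.
    MODIFY TABLE lt_active FROM ls_rec.  " only 'p1 m1 1 2 20' survives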
Hope this helps.
Cheers
Praveen

Similar Messages

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. Until now, the data loads in production happened using InfoPackages:
    a. InfoPackage1 from the DataSource to the ODS, and
    b. InfoPackage2 from the ODS to the CUBE.
    Now that we have transported the migrated dataflow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I tried step b (above), it loaded the CUBE fine using the InfoPackage. So I am unable to understand why the DTP failed while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. Creating a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Partial data loading from DSO to cube

    Dear All,
    I am loading data from a DSO to a cube through a DTP. The problem is that some records get added to the cube through data packet 1 (around 50 crore records), while through data packet 2 records are not getting added.
    It is a full load from DSO to cube.
    I have tried deleting the request and executing the DTP again, but the same number of records get added to the cube through data packet 1, and after that no records are added; the request remains in yellow state only.
    Please suggest.

    Nidhuk,
    Data loads transfer package by package. Your story sounds like it got stuck in the second package or something. I suggest you check the package size and try increasing it to a higher number, to see if anything changes. 50 records per package is kind of low; your load should not spread out into so many packages.
    Regards, Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Data loading from DSO to Cube

    Hi,
    I have a question.
    In the book TBW10 I read about the data load from DSO to InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10, so if we send the values 10, -10, and 30 (delta), shouldn't the total be 40 instead of 30?
    Please can someone explain this to me?
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value from the after image will be added as 30.
    So it will be 10 - 10 + 30 = 30.
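    To make the arithmetic concrete, here is a minimal plain-ABAP sketch of how an additive key figure absorbs a before/after-image pair from the change log (variable names are made up; this is not the actual BW update code):

        DATA: lv_cube  TYPE i VALUE 10,           " value already in the cube
              lt_delta TYPE STANDARD TABLE OF i,  " change-log delta records
              lv_rec   TYPE i.

        " Before image cancels the old value, after image adds the new one.
        lv_rec = -10. APPEND lv_rec TO lt_delta.
        lv_rec = 30.  APPEND lv_rec TO lt_delta.

        LOOP AT lt_delta INTO lv_rec.
          lv_cube = lv_cube + lv_rec.             " key figure aggregates with SUM
        ENDLOOP.

        " lv_cube is now 30: 10 - 10 + 30.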
    Thank you.
    Regards,
    Vinod

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R3, so I get a few fields from the flat file and a few from R3. They create one record when they are loaded into the DSO. Now I am getting one record in the active data table, but when I load the data from that DSO to the cube (the data goes from the change log to the cube), I am getting 2 separate records in the cube (one which was loaded from the flat file and one which was loaded from R3), which I don't want. I want only one record in the cube, just like I have in the active data table of my DSO.
    I can't get the data from the active data table because I need delta loads.
    Would you please advise what I can do to get that one record in the cube?
    Please help.

    Ravi
    I am sending the data through DTP only, but is there any solution to get one record? In another scenario I am getting data from 2 different ERP sources, and I get one record in the DSO and in the cube as well.
    But that is not happening for this second scenario, where I am getting data from a flat file and ERP and trying to create one record.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed, and the error stack showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them; the changes were saved and I tried to execute again, but then the message says that the request ID should be repaired or deleted before the execution. I tried to repair it, but it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save the changes, they don't get saved.
    What should I do to resolve the issue?
    Thanks
    Prasad

    Hi Prasad,
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack will also get deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will get accumulated in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the "Update" tab, you will get the option "Error DTP". If it is not already created, you will get the option "Create Error DTP". Click there and execute the Error DTP. The Error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution.
    Regards,
    Debjani

  • While loading data from DSO to CUBE (which extraction option?)

    hi friends,
    My question is this: when we load data from one data target to another, we have four options in the DTP screen for extracting the data (1. active data table (with archive), 2. active data table (without archive), 3. change log table, and the fourth I don't remember). So when we update data from DSO to CUBE, which option should we choose, the 2nd or the 3rd?
    <removed_by_moderator>
    Edited by: Julius Bussche on Jan 4, 2012 9:26 AM

    - If you want to do a full load, always select the 2nd option, i.e. Active Data Table (without archive).
    - If you want to do a delta load, you can select either option 2 or 3:
       (i) Active Data Table (without archive) - the first load will be a full load from the active table, and after that it will be delta from the change log.
       (ii) Change log table - you will have to do a full load using a full DTP, and then delta can be done using this DTP (but you will also need to do an init using this DTP).
    - If you have already loaded using a full DTP and now just want to run delta, select the 3rd option and do an init before running the delta.
    I hope it helps.
    Regards,
    Gaurav

  • Filter in DTP load from DSO to cube by forecast version not working?

    Hi All,
    Can anyone help with the code below? I wrote it to filter the data update from DSO to cube in the DTP, filtering on the next period's forecast version. The code is not working: it also loads data of the present forecast version. Can anyone help, please?
    DATA: l_idx       LIKE sy-tabix,
          l_date      TYPE sy-datum,
          t_gjahr     TYPE t009b-bdatj,
          t_buper     TYPE t009b-poper,
          l_period(6) TYPE c.

    " Locate the existing filter line for ZFCSTVERS and remember its index.
    READ TABLE l_t_range WITH KEY fieldname = 'ZFCSTVERS'.
    l_idx = sy-tabix.

    CLEAR: t_buper, t_gjahr.
    l_date = sy-datum.
    CALL FUNCTION 'DATE_TO_PERIOD_CONVERT'
      EXPORTING
        i_date  = l_date
        i_periv = 'Z1'
      IMPORTING
        e_buper = t_buper
        e_gjahr = t_gjahr.

    " If the period is 012, increase the year by 1 and set the period to 001.
    IF t_buper = '012'.
      t_gjahr = t_gjahr + 1.
      t_buper = '001'.
    ELSE.
      " Otherwise increase just the period by 1.
      t_buper = t_buper + 1.
    ENDIF.

    CONCATENATE t_gjahr t_buper+1(2) INTO l_period.

    l_t_range-fieldname = 'ZFCSTVERS'.
    l_t_range-low       = l_period.
    l_t_range-sign      = 'I'.
    l_t_range-option    = 'EQ'.
    " Overwrite the existing ZFCSTVERS line instead of appending a second one;
    " an appended extra 'I EQ' line keeps the present forecast version
    " selected as well, which is exactly the symptom described above.
    IF l_idx > 0.
      MODIFY l_t_range INDEX l_idx.
    ELSE.
      APPEND l_t_range.
    ENDIF.
    p_subrc = 0.
    sk
    Edited by: SK Varma Penmatsa on Jan 23, 2012 2:30 PM

    Hi Praveen/Raj,
    Basically, PCS_PER_PACK is a key figure I have in the DSO, which I use to calculate the total number of pieces, stored in a key figure in the cube. The transformation rule that calculates TOTAL_PCS is a routine:
    within this routine I multiply PACKS * PCS_PER_PACK to calculate the figure. I do not store PCS_PER_PACK in the cube.
    Is it this rule you want me to check for whether aggregation is set to SUM or overwrite? This rule should add up the total pieces; if I say overwrite, it might not give me the desired result?
    The thing I cannot figure out is: since the transformation rules go record by record, why would it add up the PCS_PER_PACK figure before calculating?
    I cannot access the system at the moment. As soon as I can, I will check on it and get back to you.
    Thanks once again for your quick response to my need.
    regards
    dilanke

  • Problem while processing transaction data from DSO to CUBE

    Hi Gurus,
    We are facing a problem while processing transaction data from DSO to cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (give the BI request name); it should give you the details about the request. If it is active, make sure that the job log is getting updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not. See if it is accessing/updating some tables or not doing anything at all.
    If it is running, and you are able to see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the details tab. It may be in the update rules.
    4) ST22: check if any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting into tables etc.
    If you feel it is active and running, you can verify by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when uploading sales data from DSO to cube. I am using BI 7.0, and I have uploaded all sales-document-level data to my DSO. Then I use a transformation rule to calculate the sales value in the DTP to the cube. The cube has customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from DSO to cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use Active Table (without archive) on the DSO to get the data, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect: I am getting very large values, which is impossible.
    Can someone please help me?
    Shanka

    Hi,
    Are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
    Did you check the values of the other key figures as well, and whether they are also inflated, or is the problem with this key figure only?
    During the data load, the records may be aggregated first for the same characteristic values, and only then does the multiplication happen. If that is the case, you may have to multiply the values before they are aggregated in the data package, and then let them aggregate; that can be achieved through a start routine, as sketched below.
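    A minimal sketch of such a start routine, assuming a made-up key figure ZSALESVAL has been added to the source structure to carry the per-record product (adapt the field names to your transformation):

        " Inside the generated start routine, where SOURCE_PACKAGE is available.
        FIELD-SYMBOLS <source_fields> LIKE LINE OF SOURCE_PACKAGE.

        " Compute the product per record, before any aggregation over
        " identical characteristic combinations can happen.
        LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
          <source_fields>-zsalesval = <source_fields>-net_price
                                    * <source_fields>-dlv_qty.
        ENDLOOP.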
    But first verify whether the other key figures also have the same issue.
    Thanks
    Ajeet

  • Error while loading data from DSO to Cube

    Dear Experts,
    I have changed the update routines of the update rule for the DSO and of the update rule between the DSO and the cube. When I upload data into the DSO, it works fine. But when uploading from the DSO to the cube, it gives an error.
    Error message:
    DataSource 8ZDSO_ABB does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 30.09.2010 14:05:17.
    The time stamp in the BW system is 09.05.2008 16:41:10.
    I executed the program RS_TRANSTRU_ACTIVATE_ALL, but it still gives the same error.
    Regards,
    Anand Mehrotra.

    Dear Experts,
    Thanks for the help. Problem solved.
    Regards,
    Anand Mehrotra.

  • DTP load of data from DSO to CUBE can't be debugged

    Hi, all experts.
    I met a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it is OK. But when I load data from the DSO to the CUBE and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no. RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small piece of code for an infinite loop in your start routine, along the lines of the sketch below. When you start the load, go to SM50 and debug the process which is executing the load; from there you can break out of the infinite loop and carry on with the debugging. Please let me know if you require further information.
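    A minimal sketch of that infinite-loop trick (the flag name is made up):

        " Temporary debugging aid at the top of the start routine.
        DATA lv_wait TYPE c VALUE 'X'.
        WHILE lv_wait = 'X'.
          " The load spins here. Attach the debugger to this work process
          " from SM50, then set lv_wait to space to leave the loop.
        ENDWHILE.

    (Remember to take the loop out again once you are done.)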
    Thanks,
    Arminder

  • Semantic partitioning delta issue when loading data from DSO to Cube - BW 7.3

    Hi All,
    We have created the semantic partitioning with the help of a BAdI to perform the partitioning. Everything looks good:
    the first time I loaded the data, it was a full update.
    The second time I initialized the delta load and pulled the delta from DSO to cube. The DSO is standard, whereas the cube is partitioned with the help of semantic partitioning. What I can see is that only the records which were updated in the latest delta show up in the report; all the rest are ignored by the system.
    I tried compression, but it still did not work.
    Has someone faced this kind of issue?
    Thanks

  • Issue while loading data from DSO to Cube

    Gurus,
    I'm facing a typical problem while loading a cube from a DSO. The load is based on certain conditions.
    Though the data in the DSO satisfies those conditions, the cube is still not being loaded.
    All the loads are managed through process chains.
    Would there be any specific reason for this type of problem?
    Any pointers would be of the greatest help!
    Regards,
    Yaseen & Soujanya

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by "cube is not being loaded"? Are no records extracted from the DSO? Is the load completing with 0 records?
    - How is the data loaded from the DSO to the InfoCube? Full? Are you first loading to the PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for the DSO, or did someone delete all the PSA data?
    Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada
    Biking for MS Relief: http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222

  • Errors when extracting data from DSO to Cube by using DTP

    When I am extracting data from DSO to Cube by using DTP,
    I am getting the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) (Message no. RSBK242).
    Data package processing terminated (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors' (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard business content objects in FI-CA (dunning history header, item, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level. When I try to pull the data to the InfoProvider level (cube) using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, like another DSO or a cube, it throws the same kind of error for any flow, including a simple flat file one.
    Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 system.
    Please help me out on this issue; I am not able to move forward. It is very urgent.

    Hello,
    have you solved the problem?
    I have the same error...
    If you have solved this error, can you help me please? I have the same error.
    yimi castro garcia
    [email protected]
