Lookup of Data from DSO

I have DSO1 and DSO2, and I am loading data from DSO1 to DSO3. But I need some additional data from DSO2, so I need to write a routine in the transformation from DSO1 to DSO3 that looks up DSO2 for the additional fields.
All three DSOs have the same key fields. Can anyone throw some light on how to go about this? Which routine do I need to use (an end routine, I guess)?
Regards,
Samir

Hi,
How can I load the data from DSO1 to DSO2? They both hold different sets of data even though they have the same key fields.
I want to consolidate the data from DSO1 and DSO2 into DSO3. So I am loading data from DSO1 to DSO3, and I need to look up DSO2 for three fields.
Can anyone tell me whether this is the right approach? Apart from that, I have another doubt:
1) When I load data from DSO1 to DSO3 and pick up the fields from DSO2 using an end-routine lookup between DSO1 and DSO2, what will be the status of these three fields in the transformation between DSO1 and DSO3? Do I need to maintain the transformation rules as routines, or just leave them as they are?
I hope I am clear.
Regards,
Samir
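A minimal end-routine sketch for this kind of lookup, with all names as placeholders (/BIC/ADSO200 for DSO2's active table, DOC_NUMBER for the shared key field, FIELD1-FIELD3 for the three extra fields; _ty_s_TG_1 is the generated result structure type). Adjust the names to your objects:

* End routine in the DSO1 -> DSO3 transformation.
* Read DSO2's active table once per package instead of once per record.
DATA: lt_dso2 TYPE SORTED TABLE OF /bic/adso200
              WITH UNIQUE KEY doc_number.

FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1,
               <ls_dso2>       TYPE /bic/adso200.

IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT * FROM /bic/adso200
    INTO TABLE lt_dso2
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE doc_number = RESULT_PACKAGE-doc_number.
ENDIF.

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  READ TABLE lt_dso2 ASSIGNING <ls_dso2>
       WITH TABLE KEY doc_number = <result_fields>-doc_number.
  IF sy-subrc = 0.
    <result_fields>-field1 = <ls_dso2>-field1.
    <result_fields>-field2 = <ls_dso2>-field2.
    <result_fields>-field3 = <ls_dso2>-field3.
  ENDIF.
ENDLOOP.

On the second doubt: since the three fields are filled in the end routine, their rules in the transformation can normally stay empty, with no routine needed there; just make sure the end routine is allowed to update them, for example via the end routine's "Fields Updated" setting if your release offers it.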

Similar Messages

  • Delete data from DSO only with a part of the key

    Hi experts,
    I have a DSO with 9 key fields and I want to delete some entries. Our source system only provides 2 fields of the key. My problem is that BI cannot delete from the DSO unless I have the full key.
    Now I want to get the rest of the key fields from the active table of the DSO. Can somebody help me with the ABAP code? Or is there another possibility?
    I already found this thread in SDN, but it is for BW 3.5:
    Delete data from ODS with only part of the key
    Thanks
    Ralf

    Ralf,
    the thread you pointed out is about passing deletion entries to the DSO / target, namely with recordmode = 'D'.
    Are you looking at deleting data from the DSO as a one-time activity, or do you want to handle it with a start / end routine and pass deletion entries on to the other data targets that get their data from this DSO? A sketch of the key completion follows below.
    Arun
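    A sketch of the key-completion step, assuming /BIC/AMYDSO00 as the active table and KEYFLD1/KEYFLD2 as the two key fields the source system delivers (all names are placeholders):

    * Complete the 9-field key from the DSO's active table.
    DATA: lt_full_keys TYPE STANDARD TABLE OF /bic/amydso00,
          lv_keyfld1   TYPE /bic/amydso00-keyfld1,
          lv_keyfld2   TYPE /bic/amydso00-keyfld2.

    * lv_keyfld1 / lv_keyfld2 hold the partial key from the source system.
    SELECT * FROM /bic/amydso00
      INTO TABLE lt_full_keys
      WHERE keyfld1 = lv_keyfld1
        AND keyfld2 = lv_keyfld2.

    Each row of lt_full_keys now carries the complete key, so the rows can either be turned into deletion records (0RECORDMODE = 'D') for a load, or used as selections for a selective deletion, depending on whether this is a one-time cleanup or part of the regular flow.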

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when uploading sales data from DSO to cube. I am using BI 7.0 and have uploaded all sales-document-level data into my DSO. Then I use a transformation rule to calculate the sales value when running the DTP to the cube. The cube holds data aggregated by customer.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF), and I do a simple multiplication routine in the transformation from DSO to cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment the DTP reads the active table (without archive) of the DSO, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect; I am getting very large values, which is impossible.
    Can someone please help me?
    Shanka

    Hi,
    are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
    Did you check the other key figures as well? Are they also inflated, or is the problem with this key figure only?
    During the data load, the records may be aggregated first for the same characteristic values, and the multiplication happens only afterwards. If that is the case, you have to multiply the values while the records are still at document level in the data package and only then let them aggregate; this can be achieved through a start routine (see the sketch below).
    But first verify whether the other key figures show the same issue.
    Thanks
    Ajeet
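    A minimal start-routine sketch of that idea, assuming a source field SALES_VAL is available to take the product (all field names are placeholders; _ty_s_SC_1 is the generated source structure type):

    * Multiply per document record, before any aggregation to the
    * cube level can take place.
    FIELD-SYMBOLS: <source_fields> TYPE _ty_s_sc_1.

    LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
      <source_fields>-sales_val =
          <source_fields>-net_price * <source_fields>-dlv_qty.
    ENDLOOP.

    SALES_VAL can then be mapped 1:1 to the cube's key figure with summation, so only already-multiplied values get aggregated.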

  • Issue in Data from DSO to DSO Target with different Key

    Hello All,
    I am having an issue loading data from a DSO to a DSO target with a different key.
    The source DSO has Employee + Wage + Post Number as its key; the target has Employee + Wage Type as its key.
    My understanding is that DSO semantic grouping works like a GROUP BY clause in SQL. Is that right?
    Also if someone can explain this with a small example, it would be great.
    Many Thanks
    Krishna

    Dear, as explained earlier, your issue has nothing to do with semantic grouping.
    Semantic grouping is only useful when you have written a routine in the transformation for calculations, and in error handling.
    Please go through this blog, which explains the use of semantic grouping very clearly:
    http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
    Now coming to your question above:
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          cc          300
    If we have the semantic group as Employee, with Employee as the key of the target DSO and summation as the update type,
    then the target DSO will have
    Emp                Amount
    100                 700
    200                 200
    In this case, Wage Type takes the value of the last record arriving from the data package. If the record 100 / cc / 300 arrives last, then the wage type will be cc.
    Case 2:
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          aa          300
    If we do the semantic grouping with Employee and Wage Type, and have Employee and Wage Type as the key of the target DSO with summation as the update type,
    then the target DSO will have
    Emp     Wage      Amount
    100          aa          500
    200          aa          200
    100          bb          400
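    In Open SQL terms, case 1 behaves like this aggregation over the source data (assuming /BIC/ADSO100 as DSO 1's active table; all names are placeholders):

    TYPES: BEGIN OF ty_sum,
             employee TYPE /bic/adso100-employee,
             amount   TYPE /bic/adso100-amount,
           END OF ty_sum.
    DATA lt_result TYPE STANDARD TABLE OF ty_sum.

    * Summation per Employee, like a GROUP BY in SQL.
    SELECT employee SUM( amount )
      FROM /bic/adso100
      INTO TABLE lt_result
      GROUP BY employee.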
    Hope this helps.

  • Problem while processing TRANSACTION data from DSO to CUBE

    Hi Guru's,
    we are facing a problem while processing transaction data from DSO to cube: the data packets are being processed and updated very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (search by the BI request name): it should give you the details of the request. If it is active, make sure the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID, etc. from SM37) and see whether the job is running. Check whether it is accessing/updating some tables or not doing anything at all.
    If it is running and you can see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, such as reading/inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from DSO to cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses delta update. First, the DSO was loaded with 138,300 records successfully, and then I activated the DSO. But when I finally executed the load (DSO --> cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this?
    Thank you so much!

    Hi BI User,
    To load a delta update into the data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Regarding loading the data from DSO to cube

    Hello Experts,
    I have a DSO which loads data from the PSA using a 7.0 transformation (via DTP), and a cube which loads data from that DSO using 3.x rules. I deleted the init request of an InfoPackage before loading the data into the DSO. But when I load the data from DSO to cube via right-click on the DSO -> Additional Functions -> Update 3.x Data to Targets, I get the error 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
    Please help me with this.
    I want to load the data of the init request into the cube.
    Thanks

    Hi Shanthi,
    Thanks for the reply. I have already deleted the init request from the source system to the DSO and tried again, but I am still getting the error.
    Thanks

  • DTP load of data from DSO to CUBE cannot be debugged

    Hi,all expert.
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, everything is fine. But when I load data from the DSO to the cube and debug the routine, I get this error:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no. RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small piece of code with an infinite loop into your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging (see the sketch below). Please let me know if you require further information.
    Thanks,
    Arminder
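    A sketch of such a debug hook (lv_exit is a placeholder name): put it at the top of the start routine, start the load, attach the debugger to the running work process via the debugging option in SM50, then change lv_exit in the debugger and step on from there.

    * Debug hook - remember to remove it after the debugging session.
    DATA lv_exit TYPE c LENGTH 1.

    DO.
      IF lv_exit = 'X'.   " set lv_exit to 'X' in the debugger to break out
        EXIT.
      ENDIF.
    ENDDO.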

  • Issue while loading data from DSO to InfoCube

    Hi Experts,
    Can you tell me what the root cause might be? The data coming into the DSO from R/3 is correct and as required, but after loading it from the DSO into the InfoCube, the cube shows wrong data: some line items that were closed appear as open in the cube, and the KF values are not right either.
    Also, there is no routine code involved between the DSO and the InfoCube.
    Thanks in adv .
    NP

    I hope you didn't delete some request from the DSO without deleting the change log; this can cause inconsistency.
    If so, delete the data from the DSO (right-click -> Delete Data) and reload.

  • When extracting data from DSO to Cube by using DTP

    When I am extracting data from DSO to cube using a DTP,
    I am getting the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is a brand-new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard Business Content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level. When I try to pull the data to the InfoProvider level (cube) using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, be it another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat file.
    Please suggest whether I need to maintain some basic settings, since this is a brand-new BI 7.3 system.
    Please help me out on this issue; I am not able to move forward, and it is very urgent.

    hello
    Have you solved the problem?
    I have the same error...
    If you have solved it, can you help me please?
    yimi castro garcia
    [email protected]

  • Error while loading data from DSO to InfoCube

    Hi All,
    When I am running a DTP from DSO to cube, I get the following errors:
    1.  Data package processing terminated
         Message no. RSBK229
    2.  'Processed with Errors'
         Message no. RSBK257
    3. Error while updating to target 0FIGL_C10 (type INFOCUBE)
        Message no. RSBK241.
    These are the errors I am getting. I am new to BI, so how can I solve them?
    All the transformations are active, with no errors.
    Thanks in Advance
    Regards,
    Mrigesh.
    Edited by: montz2006 on Dec 7, 2009 8:45 AM

    Hi All,
    Now I have deleted the data from the DSO and the DataSource and run the DTP again.
    This time the data got from the DataSource to the DSO, and when I ran the DTP between the DSO and the cube, I found no error.
    But when I go to the cube and look at the data, the request has arrived but the data has not been transferred.
    The request status in the cube is green, but no data is coming.
    Regards,
    Mrigesh.

  • Problem in transferring data from DSO to InfoCube

    Hi,
    I am trying to transfer data from a DSO to an InfoCube, but it gives the following message:
    Last delta update is not yet completed
    Therefore, no new delta update is possible.
    You can start the request again
    if the last delta request is red or green in the monitor (QM activity)
    It is not letting me load data from the DSO to the InfoCube. I checked the request number, but there is no request number in the InfoCube. If a request ID were there, I could delete the request from the InfoCube, delete the data mart status in the DSO, and reload. But there is no request ID in the IC, so please guide me on how I can transfer data to the InfoCube despite this problem.
    Actually, a further update was running from the DSO to the IC, but it failed. By executing function module RSPC_PROCESS_FINISH, I set the chain to green before loading the data to the InfoCube, so it started executing the next chain. That is why there is no request ID. So please guide me on how I can retain this request ID.
    Edited by: Purushotham Reddy on Jan 28, 2008 11:19 AM

    Hi Reddy,
    Set the QM status of the failed request to red, even if it is already red, and then run the same InfoPackage used in the chain. This will ask for a repeat of the last delta.
    If this process is clear to you, fine; otherwise read on:
    The other option is to check whether the failed request is present in the PSA. If it is, update the failed request from the PSA; then you won't need a repeat of the last delta. This option is easier, and you can do it from the Manage tab of the data source, which you surely know already.
    Assign points if it helps you.
    Thanks,
    Nagesh.

  • Error in uploading data from DSO to InfoCube in 3.5 (RSA1OLD)

    While uploading data from DSO to InfoCube, I selected the 'Initial update' option.
    In the Schedule tab, when I clicked Start, an error occurred saying:
    Delete init request
    REQU_D4YTAGX8PEOUQJLSKOEPH9EV before running
    init. again with same selection.
    Can anyone let me know how I should delete that request?
    Thanks,
    Soujanya.

    Hi,
    Go to RSRQ, enter the request, and execute. This should bring you to the monitor in the Administration Workbench. From there you will see the InfoSource (it usually has "8" as a prefix); note the InfoSource name. Go to RSA1 to find the InfoSource. From here it gets a bit tricky: go to the DataSource and drill down to the related InfoPackage, then open the InfoPackage scheduling. In the Scheduler menu, the 'Initialization Options for Source System' dialog should open; there you can see the request with the error and delete it. Let me know if this answers your question.
    Regards,
    JD Datuin

  • Looking to move data from OSX 9 to Mountain Lion using external hard drive?

    Looking to move data from OSX 9 to Mountain Lion using external hard drive?

    Thanks. I'd like to get rid of this OSX 9 machine after retrieving a few files. If I wanted to erase its hard drive before I do so, would powering it off, then on again, and immediately tapping the F10 key do the trick, as it does with a PC?

  • Read Data From DSO - In ABAP Program

    Hi Friends,
    Can anyone tell me how to read data from a DSO in an ABAP program on the BI system itself?
    Which is the better way to read data from a DSO in an ABAP program? Is it
    1. reading the data directly from the active table of the DSO, or
    2. reading the data from the DSO through a function module?
    In the latter case, which function module should be used? When I searched the forum, I came across two function modules, BAPI_ODSO_READ_DATA_UC and RSDRI_INFOPROV_READ. Please let me know which one should be used.
    Regards,
    Shyam.

    Hi Shyam,
    The active content of any ODS is stored in database tables (visible in SE11) with the naming convention /BIC/A<odsname>00 for custom-defined DSOs and /BI0/A<odsname>00 for Business Content DSOs.
    For example, if the technical name of the ODS is TEST, then the active table in SE11 is /BIC/ATEST00.
    You can write your own program to read the contents of this database table (see the sketch below).
    Regards,
    Krishna.
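    A minimal sketch of the direct read, using the /BIC/ATEST00 example above (for a program that should not hard-code the generated table name, RSDRI_INFOPROV_READ is generally the more robust choice):

    * Read the active data of DSO TEST directly from its active table.
    DATA lt_data TYPE STANDARD TABLE OF /bic/atest00.

    SELECT * FROM /bic/atest00
      INTO TABLE lt_data
      UP TO 100 ROWS.   " sample read; add a WHERE clause as needed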
