Data in DSO and CUBE

Hi Experts,
Please explain to me in detail, with an example if possible: key fields and dimensions, and
how data records are processed in a DSO and in a cube.
I tried to search but was not able to find what I am looking for.
My requirement:
I am extracting employee benefits records from a non-SAP source system (Oracle tables).
In Oracle there is no date field available, and no history is kept: once an employee's benefits change, the old record is overwritten by the new one, so employee benefits history cannot be tracked there. But in BW I need to store/show the history of employee benefits by calendar day.
Oracle Table Fields
Location (Primary Key)
Emp No (Primary Key)
Insurance Type
Call Allowance Type
Annual Leave (No. of Days)
Pension Scheme
Company Laptop (Yes/No)

hi,
Key fields are the primary keys of your ODS/DSO tables; based on the key field values, the data fields get overwritten. If an incoming record has the same key field values as an existing one, the data fields are overwritten, but the change is captured in the change log table, so the history of changes is available.
Dimensions are the primary keys of the cube. In a cube, records are added (aggregated) by default, not overwritten as in an ODS. Dimension IDs are generated from the SID values of the characteristics.
An ODS can have a maximum of 16 key fields, and a cube a maximum of 16 dimensions.
For your case, include 0CALDAY in the key fields and use a routine to fill the field with the system date (SY-DATUM), as in the sketch below. This keeps track of the history; even without the date field, the change log maintains the history for each load.
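A minimal sketch of what such a routine could look like (illustrative only; this is the rule/update routine for the 0CALDAY key field):

* Routine for the 0CALDAY key field of the DSO.
* Sketch only: stamps every incoming record with the load date, so
* each daily load becomes a dated version of the employee's benefits.
  RESULT = sy-datum.   " system date at the time of the load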
You also have to add 0RECORDMODE to the communication structure of the ODS (InfoSource).
Ramesh

Similar Messages

  • How can I update data of a DSO and Cube using ABAP?

    Hello Experts
    I have a requirement in which I need to update/delete data from a DSO and a cube based on certain keys using ABAP.
    Please let me know how I can modify the contents of a cube or the active table of a DSO (a sketch of the DSO case follows below).
    Thanks
    Sudeep
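    For the DSO part, one way this is sometimes done is to read and update the active table directly in ABAP. A hypothetical sketch, assuming a DSO named ZSALES (active table /BIC/AZSALES00) with fields ZEMPNO and ZAMOUNT, none of which are from the thread; note that writing the active table directly bypasses the change log, so later delta loads to the cube will not see the change:

        " Hypothetical sketch: direct update of the active table of DSO ZSALES.
        " The active table of a standard DSO follows the pattern /BIC/A<name>00.
        DATA lt_active TYPE STANDARD TABLE OF /bic/azsales00.
        FIELD-SYMBOLS <ls_rec> TYPE /bic/azsales00.

        SELECT * FROM /bic/azsales00
          INTO TABLE lt_active
          WHERE /bic/zempno = '00001234'.    " keys of the records to change

        LOOP AT lt_active ASSIGNING <ls_rec>.
          <ls_rec>-/bic/zamount = 20.        " new key figure value
        ENDLOOP.

        UPDATE /bic/azsales00 FROM TABLE lt_active.
        COMMIT WORK.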

    Hi
    I have a requirement where I need to update certain key figures in a DSO after a certain time.
    For example, say a record with key A is loaded to the DSO and the corresponding key figure was loaded with value 10;
    after a few days, because certain parameters in the system have changed, I need to modify the key figure value.
    Currently I do this using a self-transformation, i.e. by loading the same data again into the same DSO (as in the sketch below).
    The amount of data is very large and is causing performance issues in the system.
    Also, A is not the key; I need to modify records for a few combinations which appear multiple times in the DSO.
    The same DSO data is loaded into the cube regularly.
    I may need to update the data after a gap of a few weeks as well.
    This design will be used as a template, and the whole logic needs to stay generic;
    writing a function module could make it generic.
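    A minimal sketch of what the end routine of such a self-transformation might look like (illustrative only; the key figure ZAMOUNT and the adjustment are assumptions, not from the thread):

        " End routine of a DSO -> same DSO (self-loop) transformation.
        " Sketch only: re-derives a key figure for every record looped back
        " through the DSO, so the overwrite updates the stored value.
        " (<RESULT_FIELDS> is provided by the generated routine frame.)
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
          " example adjustment: re-price the stored amount by 10 percent
          <RESULT_FIELDS>-/bic/zamount = <RESULT_FIELDS>-/bic/zamount * '1.1'.
        ENDLOOP.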
    Thanks
    Sudeep

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when I am uploading sales data from a DSO to a cube. I am using BI 7.0 and have uploaded all sales document level data to my DSO. I then use a transformation rule to calculate the sales value when running the DTP to the cube. The cube holds customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from the DSO to the cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use Active Table (without archive) on the DSO to get the data, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect: I am getting very large values, which is impossible.
    Can someone please help me?
    Shanka

    Hi,
    Are you sure that the cube should have customer-wise aggregated data? It will always aggregate the key figure values for the same set of characteristics.
    Did you check the other key figures as well: are they also inflated, or is the problem with this key figure only?
    During the data load, the records may first be aggregated for identical characteristic values, and the multiplication then happens on the aggregated values. If that is the case, you may have to multiply the values per record, before the data package is aggregated; this can be achieved through a start routine (see the sketch below).
    But first verify whether the other key figures show the same issue.
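    A minimal sketch of such a start routine (illustrative only; it assumes the source structure offers a field ZSALESVAL to hold the per-record result, which is then mapped 1:1 to the cube key figure):

        " Start routine of the DSO -> cube transformation.
        " Sketch only: compute price * quantity per document record BEFORE
        " the transformation aggregates records with identical characteristics.
        " (<SOURCE_FIELDS> is provided by the generated routine frame.)
        LOOP AT SOURCE_PACKAGE ASSIGNING <SOURCE_FIELDS>.
          <SOURCE_FIELDS>-/bic/zsalesval =
            <SOURCE_FIELDS>-net_price * <SOURCE_FIELDS>-dlv_qty.
        ENDLOOP.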
    Thanks
    Ajeet

  • Problem while processing TRANSACTION data from DSO to CUBE

    Hi Gurus,
    We are facing a problem while processing transaction data from a DSO to a cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (give the BI request name): this should give you the details about the request. If it is active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not. See if it is accessing/updating some tables or not doing anything at all.
    If it is running and you can see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50/SM51 to see what is happening at the system level, like reading/inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
    While I am loading data from a DSO to a cube, the load job is getting cancelled.
    In the job overview I got the following messages:
        SYST: Date 00/01/0000 not expected.
        Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error? I have successfully loaded the data into 2 layers of DSOs before loading it into the cube.

    hi,
    Are you loading a flat file to the DSO?
    Then check the date field of the data in the PSA and replace the wrong value with the correct date.
    I think you are using a write-optimized DSO, which is similar to the PSA in that it takes over all the data from the PSA.
    So correct the data in the PSA, then load to the DSO and then to the cube.
    If you don't need the date field in the cube, remove its mapping in the cube transformation, activate the transformation, activate the DTP, and trigger it again; it will work.
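    Alternatively, a small rule routine in the transformation could catch the invalid value instead of letting the load cancel; a hypothetical sketch (the source field name DOC_DATE is an assumption):

        " Rule routine for a date target such as 0CALDAY.
        " Sketch only: replace an initial/invalid date with a fallback
        " instead of aborting the load with "Date 00/01/0000 not expected".
        IF SOURCE_FIELDS-doc_date IS INITIAL.
          RESULT = sy-datum.      " or a fixed default such as '19000101'
        ELSE.
          RESULT = SOURCE_FIELDS-doc_date.
        ENDIF.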

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contained no requests at all. The cube uses "Delta Update". First, the DSO was loaded with 138,300 records successfully, and then I activated the DSO. But when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, and then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Regarding loading the data from DSO to cube

    Hello Experts,
    I have a DSO which loads data from the PSA using a 7.0 transformation (via a DTP), and a cube which loads data from that DSO using 3.x update rules. I deleted the init request of the InfoPackage before loading the data into the DSO. But when I load the data from the DSO to the cube by right-clicking on the DSO -> Additional Functions -> Update 3.x data to targets, I get the error 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
    Please help me with this.
    I want to load the data in the init request to the cube.
    Thanks

    Hi Shanthi,
    Thanks for the reply. I have already deleted the init request from the source system to the DSO and tried again, but I am still getting the error.
    Thanks

  • DTP load of data from DSO to CUBE can't be debugged

    Hi, all experts.
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it works fine. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no: RS_EXCEPTION 101
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small infinite loop at the top of your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from inside the debugger you can then break out of the infinite loop and carry on with debugging (see the sketch below). Please let me know if you require further information.
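    A minimal sketch of such a debug hook (a common trick; the flag name is illustrative):

        " Temporary debug hook at the top of the start routine.
        " Attach the debugger to the running work process in SM50, then
        " set lv_exit to 'X' in the debugger to leave the loop.
        DATA lv_exit TYPE c.
        WHILE lv_exit IS INITIAL.
          " spin here until a debugger is attached
        ENDWHILE.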
    Thanks,
    Arminder

  • Data from DSO to CUBE

    hi gurus,
    Please tell me from which table of the DSO (change log or active data) the cube will pick up the records,
    and also whether the overwrite functionality works via the DSO change log.
    Awaiting a quick response, please.
    Thank you

    Hi,
    The answer is already in my previous post, i.e. for a delta load: from the change log.
    But in the BI 7.0 data flow, when you are using a DTP, the delta behaviour is based on your selection, as described below.
    BW Data Manager: Extract from Database and Archive?
    The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization); because of the delta logic, the following requests must all be extracted from the change log.
    For Extraction from the DataStore Object, you have the following options:
    Active Table (with Archive)
    The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    Active Table (Without Archive)
    The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    Change Log
    The data is read from the change log of the DataStore object.
    For Extraction from the InfoCube, you have the following options:
    InfoCube Tables
    Data is only extracted from the database (E table, F table, and aggregates).
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage.
    hope this gives you clear idea.

  • When extracting data from DSO to Cube using a DTP

    When I am extracting data from a DSO to a cube using a DTP,
    I am getting the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard business content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted the data up to the DSO level. When I try to pull the data to the InfoProvider level (cube) using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, be it another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat file one.
    Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 system.
    Please help me out on this issue; I am not able to move forward, and it is very urgent.

    hello,
    Have you solved the problem?
    I have the same error...
    If you have solved it, can you tell me how? Please help me, I have the same error.
    yimi castro garcia
    [email protected]

  • Why do we create indexes for DSOs and Cubes? What is the use of them?

    Hi All,
    Can you please tell me why indexes are created for DSOs and cubes,
    and what the use of creating these indexes is.
    Thanks,
    Sravani

    Hi,
    An index is a copy of a database table that is reduced to certain fields. This copy is always in sorted form; sorting provides faster access to the data records of the table, for example when using a binary search. A table has a primary index and can have secondary indexes. The primary index consists of the key fields of the table and is automatically created in the database along with the table.
    You can also create further indexes on a table; these are called secondary indexes. This is necessary if the table is frequently accessed in a way that does not take advantage of the primary index. Different indexes for the same table are distinguished by separate index names, which must be unique.
    Whether or not an index is used to access a particular table is decided by the database system's optimizer, so an index might improve performance only with certain database systems. You can specify in the index definition whether the index should be used on certain database systems. Indexes for a table are created when the table is created (provided the table is not excluded for the database system in the index definition). If the index fields already uniquely identify each record of the table, i.e. they represent a primary key, the index is referred to as a unique index.
    Indexes are created on DSOs and cubes for performance, and reports created on them will also be more efficient.
    Regards,
    shikha

  • Does the concept of transactional DSOs and Cubes exist in BI 7?

    Hi,
    Does the concept of transactional DSOs and cubes exist in BI 7?
    I see 3 types of cubes [Standard, or VirtualProvider (based on DTP, BAPI or function module)],
    but can't see a transactional cube.
    Also,
    I see 3 types of DSOs (standard, write-optimized, direct update),
    but can't see a transactional DSO.
    See this link on DSOs, with examples etc.:
    http://help.sap.com/saphelp_nw04s/helpdata/en/F9/45503C242B4A67E10000000A114084/content.htm
    I am looking for a similar summary for cubes; have you seen one?
    Thanks

    New terminology in BI 7.x:
    Transactional ODS = DSO for direct update
    Transactional Cube = Real-Time Cube
    Jen

  • Data model that takes part of data to DSO and part to cube

    Hello experts,
    I have a question about the data model I am currently working on. The model is for a reservation agency, where a customer contacts the agency through a sales representative and makes a reservation for a certain event; the customer later has the option of confirming or cancelling the reservation. We will receive data for future reservations every day, with the updated status of future and recently past reservations, the sales representative in charge of each reservation, the income that will be received from each reservation (if the reservation is confirmed) or that was received if the event already took place, and more.
    I have found the need to use a DSO to update the reservation status of incoming reservations, as well as other fields. I am loading from a flat file with 20 fields, and the destination is a cube, after the data gets cleansed in the DSO, which has 2 key fields (reservation code and reservation date). However, not all of the remaining data from the flat file needs to get updated: out of the other 18 fields, only around 9 may change on an update, while the others will never change (for example, the customer name, confirmation date and event name will not change, while the sales representative, reservation status and income received can change from one day to another).
    I was designing a model in which the DSO contained all 18 non-key fields as data fields, so that everything got updated and passed to the cube. But then I thought that, since many fields will never change, it might be better if the DSO only updated those fields that can actually change, while the other fields "jumped" directly to the cube: after the DSO updates the changing fields, the others would already be in the cube. This way the DSO would receive and process less data, making the DSO tables smaller and the process more efficient. But I cannot find a proper way to do this from a single DataSource, since the data has to pass from the DataSource to the DSO and then through a transformation from the DSO to the cube.
    Is there a way to accomplish this, or should I just have all fields go through the DSO and from there to the cube?

    Hi, thanks for the suggestion. The reservation code, as of right now, is not master data, and it is almost always unique, except for rare occurrences where the same customer makes several reservations, each one for a different date (that is why I am using both reservation date and reservation code as key fields for the DSO). Because of that, and the fact that there are lots of reservations made each day, we defined that object without master data (since the master data table would become huge in a short amount of time). Perhaps I should change it to become master data? Or what should we do?

  • While loading data from DSO to CUBE (which extraction option?)

    hi friends,
    My question is this: when we load data from one data target to another, we have four options in the DTP screen for extracting the data (1. active data table (with archive), 2. active data table (without archive), 3. change log table, and a fourth I don't remember). So when we update data from a DSO to a cube, which option should we choose, the 2nd or the 3rd?
    <removed_by_moderator>
    Edited by: Julius Bussche on Jan 4, 2012 9:26 AM

    - If you want to do a full load, always select the 2nd option, i.e. Active Data Table (without archive).
    - If you want to do a delta load, you can select either 2 or 3:
       (i) Active Data Table (without archive): the first load will be a full load from the active table, and after that it will be a delta from the change log.
       (ii) Change log table: you will have to do a full load using a full DTP first, and then the delta can be done using this DTP (but you will also need to do an init using this DTP).
    - If you have already loaded using a full DTP and now just want to run deltas, select the 3rd option and do an init before running the delta.
    I hope it helps.
    Regards,
    Gaurav

  • Reload data from DSO to Cube after error

    Dear Friends,
    I got an error last night during the upload of sales data to the ODS and then into the cube.
    The ODS was updated successfully, but due to my mistake the cube was inactive,
    so the cube was not updated with the ODS data and I got an error.
    I have now fixed the problem.
    Please tell me how I can load that missing data into the cube;
    it is already updated in the DSO.
    Any help highly appreciated.
    thanks with regards
    Malik

    Hi
    Just check whether you have the data in the PSA; I am not sure it will be there, but still check using the request number.
    If you have that particular request in the PSA, you can move it from the PSA to the cube. If it is not there, check whether the cube is loaded via delta or full update: if it is a delta, you should not have any problem, as scheduling the InfoPackage again will pick up the data that did not reach the cube.
    Thanks
    Puneet
