Cube to DSO DTP load

Hi
I am synchronising a DSO with a cube, i.e. doing a full upload from the cube to the DSO. When I check the data in the DSO, it doesn't match the data in the cube; I find some data discrepancies.
Could you please advise.
Thanks
Pratap.

Hi,
If you are loading data from a cube to an ODS, you shouldn't compare record counts between the two targets unless both targets have the same characteristics (i.e. dimensions).
The cube will aggregate key figures based on the combinations of available dimensions, while the ODS will overwrite or aggregate (depending on the update rule) based on its key fields.
First check which fields are available in the
       Cube
       ODS
       Flat file data.
Please explain with an example and full details, like the key fields, the number of common fields, etc., so that we can reply with a full understanding of your concerns.
Regards,
ArunThangaraj
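The difference Arun describes can be illustrated with a small sketch (plain Python, not ABAP; the field names and values are hypothetical): a cube aggregates the key figure over all its characteristics, while a DSO in overwrite mode keeps one row per key, so the record counts of the two targets need not match.

```python
# Hypothetical illustration of why record counts differ between a cube
# and a DSO fed with the same source rows.
rows = [
    {"material": "M1", "plant": "P1", "month": "2011-01", "qty": 10},
    {"material": "M1", "plant": "P2", "month": "2011-01", "qty": 5},
    {"material": "M1", "plant": "P1", "month": "2011-01", "qty": 7},
]

# Cube: aggregates the key figure over ALL characteristics (dimensions).
cube = {}
for r in rows:
    key = (r["material"], r["plant"], r["month"])
    cube[key] = cube.get(key, 0) + r["qty"]

# DSO with key field "material" only, key figure set to OVERWRITE:
# the last record per key wins.
dso = {}
for r in rows:
    dso[r["material"]] = r["qty"]

print(len(cube))  # 2 records: (M1, P1) summed to 17, (M1, P2) = 5
print(len(dso))   # 1 record: M1 overwritten to 7
```

The same three source rows end up as two cube records but only one DSO record, which is why a raw record-count comparison is misleading.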

Similar Messages

  • Data is not loading to cube from DSO

    Hi experts,
I loaded data to a DSO. The problem is that when I try to load the same data to a cube using a DTP, there are no added records.
All records from the DSO are transferred, but 0 records are added. The mapping in the transformation is direct mapping; no routines are involved.
The load is a full load.
Please suggest how to resolve this issue.
    regards,
    Rajesh.

    Hi,
You won't find the concept of tRFCs while running DTPs.
DTP improves performance: a DTP load is faster because it is optimized by parallel processing, and it generates no tRFCs/LUWs at runtime, which minimizes the loading time.
Delete the index, reload the data, and recreate the index.
If the request is still active in SM37, kill the active job and delete the request from the InfoCube.
Then delete the index, trigger the DTP, and create the index for the InfoCube again.
This should work.
    Regards
    KP

  • How to add fields to already loaded cube or dso and how to fill records in

How do I add fields to an already loaded cube or DSO, and how do I fill them? Can anyone tell me the critical issues in the data loading process?

This is a sensitive task when the InfoProviders hold large volumes of data.
The issue is reloading the data when the InfoProvider structures are adjusted.
There are indeed some tricks. See the following:
    http://weblogs.sdn.sap.com/cs/blank/view/wlg/19300
    https://service.sap.com/sap/support/notes/1287382

  • When extracting data from a DSO to a cube using a DTP

    When I extract data from a DSO to a cube using a DTP,
    I get the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard Business Content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted the data up to the DSO level, but when I try to pull the data to the InfoProvider (cube) level using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, such as another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat file.
    Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 system.
    Please help me with this issue; I am not able to move forward and it is very urgent.

    Hello,
    Have you solved the problem? I have the same error.
    If you have solved it, can you help me please?
    yimi castro garcia
    [email protected]

  • DTP load to cube failed with no errors

    Hi,
    I executed a DTP load from cube to cube with a huge data volume, approximately 25 million records, as a full load. The DTP load failed without any errors in any package. The load ran for 1 day and 33 minutes, and the request just turned red without any error messages. I have checked ST22 and the SM37 job logs; no clue.
    I restarted the load and it failed again at approximately the same point, without any error messages. Please throw some light on this.
    Thanks

    Hi,
    Thanks for the response. There are no abnormal messages in the SM37 job logs. This process runs fine in production; we are making a small enhancement.
    08/22/2010 15:36:01 Overlapping check with archived data areas for InfoProvider Cube                             RSDA          230          S
    08/22/2010 15:36:42 Processing of data package 000569 started                                                        RSBK          244          S
    08/22/2010 15:36:47 All records forwarded                                                                            RSM2          730          S
    08/22/2010 15:48:58 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 15:49:48 Processing of data package 000573 started                                                        RSBK          244          S
    08/22/2010 15:49:53 All records forwarded                                                                            RSM2          730          S
    08/22/2010 16:01:25 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 16:02:13 Processing of data package 000577 started                                                        RSBK          244          S
    08/22/2010 16:02:18 All records forwarded                                                                            RSM2          730          S
    08/22/2010 16:10:46 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 16:11:12 Job finished                                                                                00           517          S
    As shown here, the process ended at package 577 and package 578 did not start. I could try executing multiple DTPs, but this process runs fine in production. There are no error messages or short dumps to start troubleshooting from.
    Thanks,
    Mathi

  • Cube DTP load runs forever but doesn't do anything

    Hi experts.
    I was hoping to get some guidance on an issue I'm having loading one of my cubes.  I've searched the forums but can't seem to phrase the search correctly because I'm not getting any results.  I'm on BW 7.x.  Here is the data flow and scenario:
    APO Datasource A --> BW Cube A --> DTP --> BW Cube C
    APO Datasource B --> BW Cube B --> DTP --> BW Cube C
    This is a monthly load that has been in place for several years with no previous issues.  Yesterday everything went fine up until the load from
    BW Cube B --> BW Cube C.  This DTP normally runs about 10-15 minutes.  But yesterday I tried running it twice, once for two hours, and once for 4.5 hours, and never got a single record to load in the target cube.  It just kept showing "0 Data Records" under the Process Request section.  The load never completed and never erred.  I eventually cancelled both runs manually.  Neither SM21 nor ST22 indicated any problems.  I checked with my Basis guy and he did not see any issues on the Oracle side.
    Now, Cube B is the largest cube in my BW at around 48 GB, but I've never had problems with this load before.  Between the first and second attempt I dropped and rebuilt the indexes and verified statistics.  But I don't know what to fix because I can't find any errors.
    Points will be awarded to anyone with viable suggestions or direction to a solution.
    Thanks in advance.

    Hi,
    During your load time, were any new/full loads scheduled in BW production?
    Something else might have been scheduled that day. Track all the loads scheduled in that time frame; you may find changes to the load schedule beyond the usual.
    Did you delete the index before loading data to the cube?
    Can you check the DTP monitor logs in detail?
    During your load, did you check SM51 to see whether free application servers were available to process it?
    Thanks

  • How to decide whether to use a cube or dso?

    Hi,
    When a requirement is given, on what conditions do we decide which data target (i.e. a cube or a DSO) to use?
    To my knowledge:
    1) A DSO has the overwrite option, while a cube is additive.
    2) A DSO can have delta records, while a cube cannot.
    Do we also need to consider these when we decide how to load?
    Please help me decide the data target for a particular requirement.
    Regards
    Naresh.

    Hi,
    If you have to report on a data target, you should generally use a cube (it gives better performance).
    For intermediate storage, we use a DSO.
    Note also that a DSO can be used in additive mode. Regarding delta: a cube can receive delta loads and can also provide a delta to another target.
    Regards,
    Gaurav

  • Regarding Info cube and DSO

    Hi,
    Generally, reporting is done on an InfoCube rather than a DSO.
    Suppose we assign the same DataSource to an InfoCube and a DSO; then both contain the same data.
    An InfoCube has additive and aggregated functionality, whereas a DSO has overwrite functionality.
    Are we using the cube for this functionality only?
    What about the dimensions in a cube: how do they differ from the data fields and key fields in a DSO when we develop the same BEx report on both?
    Please advise me.
    Thanks in advance.
    Thanks & Regards,
    Ramnaresh.p

    It is hard to compare a cube and a DSO; both have their own usage.
    1. An InfoCube is always additive, while a DSO supports overwrite functionality.
    2. In an InfoCube, the combination of all characteristic values forms the key of the fact table, while in an ODS you can specify your own key fields, based on which a unique record is generated in the DSO.
    3. A DSO supports many delta modes (D, R, N, X, after image, before image), while a cube does not support all of them. You cannot delete a record from a cube by key just by loading data, whereas a DSO automatically deletes the record from its active table and generates the reverse entry for the cube.
    4. A DSO is a flat structure and is therefore used to store information at a detailed level, while a cube stores information at an aggregated level.
    So the two structures are very different from each other. One can replace the other in some places, but each object has its own functionality.
    - Danny
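    Point 3 above can be sketched in plain Python (a hypothetical illustration, not SAP code, with made-up values): when a record changes, the DSO's change log emits a before image (the negated old value) and an after image (the new value), and posting both into an additive cube nets out to the correct result.

```python
# Hypothetical sketch: how a DSO change log corrects an additive cube.
cube_total = 0

# Initial load: the record's key figure is 100.
cube_total += 100

# The record changes from 100 to 60 in the DSO. Its change log carries:
before_image = -100   # reverses the previously posted value
after_image = 60      # posts the new value
cube_total += before_image + after_image

print(cube_total)  # 60 -- the additive cube now reflects the changed value
```

    This is why a cube fed by a DSO delta stays correct even though the cube itself can only add, never overwrite.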

  • DTP Load Error (  ASSIGN_TYPE_CONFLICT)

    Hi All,
    I have added a new field to an Open Hub table and then tried to load data into it.
    In the DTP, I have a filter selection on CALMONTH = 201409.
    I am getting a dump: ASSIGN_TYPE_CONFLICT.
    Please help me!
    Thanks
    Prakash

    Hi  Ramanjaneyulu,
    Thanks for the reply...
    My DTP is a full load with the selection CALMONTH = 201409 (OLAP variable), running on a daily basis in P. As per the requirement, I added a new field and triggered the load. I checked the data at the cube level with the same selection and found no data, but I am getting the ASSIGN_TYPE_CONFLICT dump in ST22.
    Please help me
    Thanks
    Prakash

  • DTP load in Yellow status forever

    All,
    The DTP load to this particular target remains in yellow status forever, even though only 5,000 records are transferred. The trace shows the problem is a SELECT on a table in the end routine of the transformation.
    The SELECT is on a big master data table (material), but when I ran the same selection in SE16 there were not many records, just 3,000. So why would the trace say that all the time is being spent querying this database table?
    This all works in D but not in Q.
    Another important note: there was no such change to the material table; however, there were four attributes for which we had enabled 'conversion' in the transfer structure.
    Could this be the issue, if not the code itself?
    Thanks

    Any updates please?
    More information:
    The DTP loads from one DSO to another. I checked a similar scenario in the same system; things look fine there.
    I have an end routine in the transformation. When I look at the DTP monitor for the status, it says:
    Extract. from DataStore xxxx: 4709 Data Records ---status GREEN
    Filter Out New Records with the Same Key : 5000 -> 5000 Data Records------status GREEN
    ODSO xxxxx-> ODSO yyyyy-----status YELLOW
    -Transformation Start---status GREEN
    -Rules---status GREEN
    -End Routine---status GREEN
    No Message: Transformation End
    No Message: Update to DataStore Object yyyyyy
    No Message: Set Technical Status to Green
    No Message: Set Overall Status to Green
    Thanks!

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data arrives in the PSA perfectly (238 records came in), but when I trigger the DTP, I get only 6 records.
    Please can anyone suggest something.
    Thanks,
    Gayatri.

    Hi Gayatri,
    If you have already loaded some data to the DSO and you are now running a delta, is it possible that it is picking up only the delta data?
    Or: do you have any start/end routines or rule routines that delete records based on some condition?
    It also depends on the key fields you selected in the DSO. If the selected key field has repeated values, the records are aggregated while loading into the DSO: if you have 10 rows for a key field value of, say, 101, the DSO will hold only one row with value 101 (10 rows become 1 row), with the key figure either summed or overwritten, depending on what you selected in the rule details for that key figure (right-click the key figure mapping > Rule Details to see whether it is Overwrite or Summation).
    Also, as mentioned in the posts above, you can go to the DSO > Manage and check the number of rows transferred versus the number of rows added.
    Hope this is clear and helpful!
    Regards,
    Pavan
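    The collapsing Pavan describes can be sketched in plain Python (hypothetical keys and quantities, not SAP code): rows sharing the chosen DSO key fold into one row per key, and the key figure is either summed or last-wins depending on the rule detail.

```python
# Hypothetical sketch: incoming rows collapse to one row per DSO key,
# with the key figure handled per the rule detail (Summation vs Overwrite).
incoming = [("101", 4), ("101", 9), ("101", 2), ("102", 5)]  # (key, qty)

summed, overwritten = {}, {}
for key, qty in incoming:
    summed[key] = summed.get(key, 0) + qty  # rule detail: Summation
    overwritten[key] = qty                  # rule detail: Overwrite (last wins)

print(len(incoming), "->", len(summed))   # 4 -> 2 rows after collapsing by key
print(summed["101"], overwritten["101"])  # 15 vs 2 for the repeated key
```

    A drop from 238 PSA records to 6 DSO records is consistent with this if the chosen key fields repeat heavily in the source data.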

  • How to tell record amount in Cube or DSO in GB

    All,
    I am trying to figure out how to see the data volume of cubes and DSOs in GB. I appreciate any help.
    Regards,
    Mike

    Thanks for your reply, Srikanth. I do not have DB02 access, but I do have access to ORA_COCKPIT. What I see there is Space > Tablespaces > Overview > Detailed Analysis.
    I remember there was a program that showed the data volume for cubes, but not for DSOs. I cannot recall the program name.

  • SQL error while activating DSO and loading Mater Data

    Hello Experts,
    We have process chains which include a DSO activation process and a master data load.
    While executing a chain, we get the error:
    An error occurred during confirmation of process 000008. See long text.
    Message no. RSODSO_PROCESSING024
    Diagnosis
    Note the detailed system message:
    An SQL error occurred when executing Native SQL.
    When I go to transaction ST22,
    the diagnosis is below:
    Runtime Errors         DBIF_RSQL_SQL_ERROR
    Exception              CX_SY_OPEN_SQL_DB
    Date and Time          20.05.2010 22:18:02
    What happened?
         The database system detected a deadlock and avoided it by rolling back
         your transaction.
    What can you do?
         If possible (and necessary), repeat the last database transaction in the
          hope that locking the object will not result in another deadlock.
         Note which actions and input led to the error.
         For further help in handling the problem, contact your SAP administrator
         You can use the ABAP dump analysis transaction ST22 to view and manage
         termination messages, in particular for long term reference.
    Could you please help me with how to deal with this issue?
    Regards
    Sinu Reddy

    It seems that the target you're loading to is already locked by another load. If it concerns master data, it is also possible that the master data is locked by a change run. Check your scheduling and make sure you don't load the same target at the same moment.
    M.

  • DTP Load is not progressing for Particular DataPackage

    Hi All,
    I am facing a problem with my DTP load. I triggered a DTP run.
    It has around 82 data packages, and almost 80 of them completed successfully.
    Two particular data packages have been running for a long time, say 2 hours, and do not get past the extraction step.
    When I check the job overview for that DTP run, the job is finished, and those 2 data packages have no information like "Processing of data package started".
    Kindly help me fix this issue.
    Thanks & Regards,
    Gowri NK

    Hi Gowri,
    If your DTP is loading to an InfoCube, check whether the index of the InfoCube has been dropped. Because a DTP loads data packages in parallel, it is possible for the DTP and the index update to block each other. Since this can cause a deadlock, the resulting dump can show up after hours without you noticing it.
    So my recommendation (at least if this is the problem):
    Load by process chain:
    1. Drop Index on Infocube
    2. Start DTP
    3. Create Index on Infocube.
    Kind regards,
    Jürgen
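    The conflict Jürgen describes can be modelled with a toy sketch (plain Python threads, not SAP internals): if concurrent workers touch the "data" and "index" resources in inconsistent orders they can deadlock, whereas a fixed acquisition order, or removing the index from the picture entirely by dropping it before the load, lets every parallel package complete.

```python
import threading

# Toy model: two shared resources, "data" and "index". Every parallel
# package loader acquires them in the SAME fixed order, so no two workers
# can ever wait on each other in a cycle (the deadlock precondition).
data_lock, index_lock = threading.Lock(), threading.Lock()
results = []

def load_package(pkg):
    with data_lock:        # fixed order: data first...
        with index_lock:   # ...then index -- never the reverse
            results.append(pkg)

threads = [threading.Thread(target=load_package, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # [0, 1, 2, 3] -- all packages complete
```

    Dropping the index before the DTP, as in the process chain above, is the stronger version of the same idea: the index resource simply does not exist during the load, so the conflict cannot arise.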

  • DTP loading to master data is running long due to lock on SID table

    Hi All,
    We have a DTP loading directly from an InfoObject to a master data InfoObject. The DTP usually takes less than 30 minutes, but at times it runs for hours. No dump was found in ST22, and no locks in SM12 apart from this DTP. On analysing the log in SM37, we found exclusive locks on the SID tables.
    18.03.2014 02:06:53 All records forwarded                                                                             RSM2          730
    18.03.2014 02:22:32 EXTRACTION OF DATAPACKAGE 000014                                                                  RSAR          051
    18.03.2014 02:22:33 All records forwarded                                                                             RSM2          730
    18.03.2014 02:38:36 EXTRACTION OF DATAPACKAGE 000015                                                                  RSAR          051
    18.03.2014 02:38:37 All records forwarded                                                                             RSM2          730
    18.03.2014 02:55:25 EXTRACTION OF DATAPACKAGE 000016                                                                  RSAR          051
    18.03.2014 02:55:25 All records forwarded                                                                             RSM2          730
    18.03.2014 03:13:56 SQL: 18.03.2014 03:13:56 ALEREMOTE                                                                DBMAN         099
    18.03.2014 03:13:56 TRUNCATE TABLE "/BI0/0600000066"                                                                  DBMAN         099
    18.03.2014 03:13:56 SQL-END: 18.03.2014 03:13:56 00:00:00                                                             DBMAN         099
    18.03.2014 03:13:56 SQL: 18.03.2014 03:13:56 ALEREMOTE                                                                DBMAN         099
    18.03.2014 03:13:56  LOCK TABLE "/BI0/0600000066" IN EXCLUSIVE MODE                                                   DBMAN         099
    18.03.2014 03:13:56 NOWAI                                                                                             DBMAN         099
    18.03.2014 03:13:56 SQL-END: 18.03.2014 03:13:56 00:00:00                                                             DBMAN         099
    18.03.2014 03:13:58 EXTRACTION OF DATAPACKAGE 000017                                                                  RSAR          051
    18.03.2014 03:13:59 All records forwarded                                                                             RSM2          730
    18.03.2014 03:34:24 EXTRACTION OF DATAPACKAGE 000018                                                                  RSAR          051
    18.03.2014 03:34:25 All records forwarded                                                                             RSM2          730
    18.03.2014 03:56:03 EXTRACTION OF DATAPACKAGE 000019                                                                  RSAR          051
    18.03.2014 03:56:03 All records forwarded                                                                             RSM2          730
    18.03.2014 04:19:55 EXTRACTION OF DATAPACKAGE 000020                                                                  RSAR          051
    18.03.2014 04:19:56 All records forwarded                                                                             RSM2          730
    18.03.2014 04:44:07 SQL: 18.03.2014 04:44:07 ALEREMOTE                                                                DBMAN         099
    18.03.2014 04:44:07 TRUNCATE TABLE "/BI0/0600000068"                                                                  DBMAN         099
    18.03.2014 04:44:09 SQL-END: 18.03.2014 04:44:09 00:00:02                                                             DBMAN         099
    18.03.2014 04:44:09 SQL: 18.03.2014 04:44:09 ALEREMOTE                                                                DBMAN         099
    18.03.2014 04:44:09  LOCK TABLE "/BI0/0600000068" IN EXCLUSIVE MODE                                                   DBMAN         099
    18.03.2014 04:44:09 NOWAI                                                                                             DBMAN         099
    Is the lock on the SID table the reason for the delay? If so, how can we avoid it?
    If not, what are the other possible causes of this issue?
    Thanks in advance.

    Hello Yogesh,
    The priority of the job is 'A'. Is this related to the lock on the SID table? When I observe the system logs for successful runs, I don't see the lock on the SID table.
    please refer the below log for successful run.
    19.03.2014 07:16:21 EXTRACTION OF DATAPACKAGE 000051
    19.03.2014 07:16:22 All records forwarded
    19.03.2014 07:16:34 EXTRACTION OF DATAPACKAGE 000052
    19.03.2014 07:16:34 All records forwarded
    19.03.2014 07:16:47 EXTRACTION OF DATAPACKAGE 000053
    19.03.2014 07:16:48 All records forwarded
    19.03.2014 07:17:02 EXTRACTION OF DATAPACKAGE 000054
    19.03.2014 07:17:03 All records forwarded
    19.03.2014 07:17:17 EXTRACTION OF DATAPACKAGE 000055
    19.03.2014 07:17:17 All records forwarded
    19.03.2014 07:17:31 EXTRACTION OF DATAPACKAGE 000056
    19.03.2014 07:17:32 All records forwarded
    19.03.2014 07:17:45 EXTRACTION OF DATAPACKAGE 000057
    19.03.2014 07:17:45 All records forwarded
    19.03.2014 07:17:58 EXTRACTION OF DATAPACKAGE 000058
    19.03.2014 07:17:59 All records forwarded
    19.03.2014 07:18:11 EXTRACTION OF DATAPACKAGE 000059
    19.03.2014 07:18:11 All records forwarded
    19.03.2014 07:18:24 EXTRACTION OF DATAPACKAGE 000060
    19.03.2014 07:18:24 All records forwarded
    19.03.2014 07:18:37 EXTRACTION OF DATAPACKAGE 000061
    19.03.2014 07:18:37 All records forwarded
    19.03.2014 07:18:50 EXTRACTION OF DATAPACKAGE 000062
    19.03.2014 07:18:51 All records forwarded
    19.03.2014 07:19:04 EXTRACTION OF DATAPACKAGE 000063
    19.03.2014 07:19:06 All records forwarded
    19.03.2014 07:19:19 EXTRACTION OF DATAPACKAGE 000064
    19.03.2014 07:19:20 All records forwarded
    19.03.2014 07:19:32 EXTRACTION OF DATAPACKAGE 000065
    19.03.2014 07:19:33 All records forwarded
    19.03.2014 07:19:47 EXTRACTION OF DATAPACKAGE 000066
    19.03.2014 07:19:47 All records forwarded
    19.03.2014 07:19:53 EXTRACTION OF DATAPACKAGE 000067
    19.03.2014 07:19:53 Start of Serial Processing Processing is Executed in the Same LUW
    19.03.2014 07:19:53 Processing started for batch ID DTPR_1739894_1 process 3
    19.03.2014 07:19:53 Processing completed for batch ID DTPR_1739894_1 process         3
    19.03.2014 07:19:53 End of Serial Processing Executed in the Same LUW
    19.03.2014 07:19:54 Technical status 'Green' (user ALEREMOTE)
    19.03.2014 07:19:57 Set overall status to 'Green' (user ALEREMOTE)
    19.03.2014 07:20:41 Processing completed for batch ID DTPR_1739894_1 process         2
    19.03.2014 07:20:41 Job finished
    Thanks
