Full load from a DSO to a cube processes fewer records than available in the DSO

We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, in which I register a counter at material and store level if stock is available.
The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
The key in the DSO is:
0MATERIAL
0PLANT
0STOCKTYPE
0STOR_LOC
0BOM
of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
As we had a growing number of records, we decided to delete in the start routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records.
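For illustration, a minimal sketch of such a start routine deletion (the field name /BIC/ZSTOCKQTY is a placeholder for the actual on-hand quantity key figure of the DSO):

    * Start routine body (BW 7.x transformation) - a minimal sketch.
    * /BIC/ZSTOCKQTY is a hypothetical field name for the on-hand
    * quantity in SOURCE_PACKAGE; replace it with the real one.
    DELETE source_package WHERE /bic/zstockqty <= 0.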
Now comes the funny part of the story:
Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same amount. After the change we of course expected to write out fewer. To my total surprise, with the same unchanged DTP I was now reading 45 million records, while writing out the expected smaller number.
When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads retrieved only some 33 million from the same unchanged set of records.
When checking in PROD: same result. We have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
What am I missing? Is there some compression going on? Why would the number of records in a DSO differ from the number processed in the DataPackages when I am making a FULL load without any filter restrictions and with only a semantic grouping in place on part of the DSO key?
Any idea or thought is appreciated.

Thanks, Gaurav.
I did check whether there were more/any loads done in between; there were none in the test system. As I mentioned, it was a fresh copy from PROD to TEST, and I compared the number of entries in the DSO; that seems to match between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported, to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO was changed.
Both DTPs, in TEST and PW2, load from the activated DSO [without archive]. The DTPs had not been changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the amount processed in the DataPackages is more than 10 million per full load, even in PROD.
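For anyone wanting to reproduce the comparison, a quick cross-check sketch that counts the rows in the active table, which is what the full DTP extracts from here (ZOHSTCK is a hypothetical DSO name; by the standard naming convention its active table would be /BIC/AZOHSTCK00):

    * Sketch: count the rows a full DTP from the active table should see.
    * ZOHSTCK is a placeholder DSO name - adjust the table name.
    DATA lv_count TYPE i.
    SELECT COUNT(*) FROM /bic/azohstck00 INTO lv_count.
    WRITE: / 'Records in DSO active table:', lv_count.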
I really appreciate the knowledgeable answer; I just wish it had pointed out something I missed.

Similar Messages

  • Full load from R/3 failed due to bad data and the 'no PSA' option selected

    SDN Experts,
    A full load from R/3 into the ODS failed, and the 'no PSA' option was selected in the InfoPackage. How do I fix it? Can I rerun the InfoPackage to load from R/3 again? Maybe it will fail again? Is this a design defect?
    I will assign points. Thank you.
    Les

    Hi,
    There is absolutely no problem in re-executing the package, but before that, check
    that your ODS and InfoSource are active and that the update rules, if any, are also
    active. Check that the target InfoProvider is selected properly along with the PSA
    option, and check whether you have any subsequent processes, such as updating data
    from the ODS to a cube and so on.
    You can also check the data availability in R/3 for the respective DataSource by
    executing RSA3.
    Hope this helps; assign points if useful.
    Cheers,
    Pattan.

  • Delta and Full load from a single Delta ODS

    I have the delivered ODS 0GT_DS01, which is loaded from R/3 as a delta update. This ODS is then loaded to the delivered InfoCube 0GT_C01 as a delta.
    My question is this: can I create update rules and an InfoPackage to do a full load from the 0GT_DS01 ODS to my custom ODS ZGT_DS01?
    I seem to remember from my BW310 class that in release 3.5, if I have an ODS-to-InfoCube delta update, all other updates from that ODS must be deltas as well.
    Regards,
    Mike...

    Siva,
    I created my update rules without a problem.
    I created the InfoPackage with my ZGT_DS01 ODS as the target, set the update mode to FULL, and saved the InfoPackage. After saving, I went to the drop-down menu, Scheduler ==> Initialization For This DataSource, and there is a delta present.
    The InfoPackage works OK when I run it from RSA1, but when I put it in a process chain I get the error "Delta upload is only possible after successful initialization". What I see in the Manage screen for my ZGT_DS01 ODS is a request for a full load followed by a request for a delta.
    Any ideas why the InfoPackage works in RSA1 but not in a process chain?
    Regards,
    Mike...

  • Automated data load from APO to SEM transactional cube

    Hi,
    We have a BW-SEM system integrated with an APO system.
    I can see automated data loads from APO to a SEM transactional cube, with the InfoPackage name "Request loaded using the APO interface without monitor log".
    I don't see any InfoPackage by this name in either system (APO or SEM).
    I am not sure how it is configured.
    I would appreciate any input on how this happens.
    Thanks in advance

    Hi,
    As I mentioned, the starting point will be transaction BPS0. There will be two planning areas created (if I am correct): one for the SEM cube and the other for the APO cube. A better way to find them is to go to transaction SE16, enter table UPC_BW_AREA, and key in the cube names in the cube field; this will give you the planning area names. Then look for a multi-planning area which includes the two areas (this is available in table UPC_AREAM).
    Then go to BPS0, where you will have to find which function is being used to post the data.
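    If it is easier, the same lookup can be done with a small ABAP snippet instead of SE16. This is a sketch only; the field names AREA and INFOCUBE are assumptions about the layout of UPC_BW_AREA, so verify them in SE11 first:

        * Hypothetical sketch: find the planning area(s) behind a cube.
        * The field names are assumptions - check UPC_BW_AREA in SE11.
        DATA lt_area TYPE TABLE OF upc_bw_area.
        SELECT * FROM upc_bw_area INTO TABLE lt_area
          WHERE infocube = '0SEM_C01'.   " hypothetical cube name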
    thanks

  • Can I do parallel full loads from ODS to cube?

    Hi,
    Usually I do one full update load from ODS to cube. To speed up the process, can I do parallel full update loads from ODS to cube?
    Please advise.
    Thanks, Vijay

    Assuming that the only connection we are talking about is between a single ODS and a single cube, I think the only time you could speed anything up is in a full drop-and-reload scenario: you could create multiple InfoPackages based on selections and execute them simultaneously.
    If the update is a delta, there is really no way to do it.
    How many records are we talking about? Is there logic in the update rules?

  • Load from 3 ODS to 1 cube

    Hello All,
    Our requirement is to load the cube from 3 ODSs. Each time we load data from the 3 ODSs to the cube, the load fails because the job is terminated in the source system.
    Pop-up error:
    Express document generated  error in data selection Check load from Author.
    Termination error:
    Job in source system terminated --> Request is set to red.
    Message no. RSM078
    We think there is some problem in the data selection from the ODS.
    Please suggest a solution, or a checklist for loading data from 3 ODSs into a cube.
    Thanks & Regards,
    Pallavi.

    Check SAP Note 869445, and these threads:
    Job terminated in source system --> Request set to red
    Load data into InfoCube 0FIGL_C10
    Thanks

  • Perform INIT, DELTA, and FULL Load from R3 to BI 7.0

    Hi,
    Could anyone please give me information on how to perform the following data loads from the R/3 system to the BI 7.0 system:
    1. INIT Load
    2. Delta Load
    3. Full Load
    4. RealTime Data Acquisition
    Where can I find the settings for all of the above in the DTP?
    For real-time data acquisition, don't I have to load the data from the DataSource into the PSA?
    Does the data go directly from the DataSource to the data target through the daemon process? I read the SAP Help docs but am still a little confused; could anyone please explain it clearly?
    Thanks, I will assign the points.

    The first three are similar to 3.5.
    For RDA, check out this blog by KJ:
    Real-Time Data Acquisition - BI@2004s

  • Data load from BPC into BI 7 cubes for BEx reporting

    Hi,
    Can we pull BPC planning data into BI cubes so that BEx reports can be created combining data from BPC cubes and BI cubes?
    Also, can we create BEx reports on BPC cubes just as we do for BI reporting cubes?
    Let me give an example of the scenario we face:
    We have actuals data in a BI 7 basic cube (the actuals cube). Planning is done in BPC, so I understand that the planned data is automatically stored in a BPC cube in the BI 7 system. I want to load the data from this BPC cube into the BI actuals cube and create a combined BEx report showing actuals and plan data.
    Please let me know.
    Thanks,
    Archana

    As of now, if you report data in the BPC7NW cubes through BEx, you may not get the same results as you get from BPC reports. The reason is that BEx won't do the same calculations (for example, sign reversals) that the BPC client does. In addition, you won't be able to report the results of dimension member formulas through BEx. If you have to get data out of the BPC cubes into other cubes, you can go through the SQE (shared query engine); that way, your results will match what you get in BPC reports. Once you get the data out through the SQE into other cubes, you can use BEx to report on it if you wish.
    More functionality to accomplish BEx reporting for BPC data will be available in the near future.
    Regards
    Pravin

  • Steps to create data loading from flat file to InfoCube in BI

    Hi,
    I am very new to BW and need some help. When I try to create an InfoSource, I get a pop-up window asking whether to create transaction data or master data.
    After creating the InfoSource, I don't know how to assign it to the source system (which I created).
    When I open the context menu of the InfoSource, I don't have an option to assign the DataSource.
    And one more thing: when I am creating the InfoCube, I can't understand how to create it.
    Please, can someone help me map the fields to the source system?
    Regds
    Dave.

    Hi,
    For a flat file upload, first you need to create the source system.
    Then you need to create the InfoSource based on the format of the flat file you are going to upload, or vice versa, depending on your requirements.
    Once your InfoSource is ready, right-click on it and select 'Assign DataSource'. Here you can assign your flat file DataSource. Then create an InfoPackage and, in the InfoPackage, give the path from which the file is to be uploaded to BW.
    Also see the link below for the flat file upload procedure:
    http://help.sap.com/saphelp_nw04s/helpdata/en/8e/dbe92341c84242be2c7d3917f1c197/frameset.htm
    Cheers,
    Kedar

  • Full load DTP is not creating an outbound file when there is no data in the DSO

    Hi folks!
    We have created an Open Hub Destination on one DSO. That DSO holds 10 million records.
    We created a full update DTP to write a file with some selections. In the DSO there is no data matching those selections.
    I would like to know whether the file is created when there is no data for a full update DTP. For a delta update, a file is created even when there is no data matching the selections.
    Please clarify ASAP.
    Thanks in advance
    Peter

    Hi Peter,
    Please check the DTP selections: if there is no data for those selections, you will get a blank file.
    An Open Hub destination creates two files: one is the InfoObject list file, where all the InfoObjects (characteristics and key figures) are listed, and the other contains the data matching the DTP selections.
    So check the selections in the DTP.
    With Regards,
    Ravi Kanth

  • Revaluate data record at the time of loading from flat file or BI Cube

    Hello Friends,
    I want to revaluate a data record at load time using a transformation file or conversion file, based on some condition.
    For example, I have a rule to identify whether a record is supposed to be multiplied by -1 or not:
    *if (ID(1:5) = str(00070) then(Record-1)
          ID(1:5) = str(00071) then (Record-2)
    Can you please guide me on how I can achieve this using a transformation file or a conversion file?
    Regards,
    Vishal.

    Hi Nilanjan,
    Thanks for the reply.
    I tried the script you suggested in the conversion file for Account, but it is not working for me.
    I even tried simple multiplication and addition in the Formula column, and neither worked:
    External   -->   *
    Internal   -->    *
    Formula    -->  Value * -1
    The above conversion file for Account was not working for me, so I then tried:
    Formula    -->  Value + 100
    That did not work either.
    Kindly suggest what I am doing wrong in the above file.
    Thanks,
    Nilanjan.

  • Delta and Full Load question for cube and ODS

    Hi all,
    I need to push a full load from a delta ODS.
    I have a process chain in which the steps are as below:
    1. R/3 extractor to ODS1 (delta)
    2. ODS1 to ODS2 (delta)
    3. ODS2 to cube ---> needs to be a full load
    Now, when I run the process chain with automatic further processing, ODS2 does an init/delta to the cube.
    How can I make this a full load instead?
    Can anyone offer any guidance?
    Thanks,
    KS

    Hi,
    1. R/3 extractor to ODS1 (delta): this is OK; you can normally put the delta InfoPackage in the process chain.
    2. ODS1 to ODS2 (delta): this flows automatically from ODS1 to ODS2 (you need to select 'Update data automatically to data targets' at the time of ODS creation).
    3. ODS2 to cube (needs to be a full load): create update rules from ODS2 to the cube, then create an InfoPackage between ODS2 and the cube and do full loads. You can delete the data in the cube before the load and then do a full load to the cube.
    Note: in ODS2, do not select 'Update data automatically to data targets'.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Nov 21, 2008 1:57 PM

  • Partial data loading from DSO to cube

    Dear All,
    I am loading data from a DSO to a cube through a DTP. The problem is that some records (around 50 crore) are added to the cube through data package 1, while through data package 2 no records are added at all.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records is added to the cube through data package 1, and after that no more records are added; the request remains in yellow status.
    Please suggest.

    Nidhuk,
    Data loads transfer package by package. Your story sounds like the load got stuck in the second package or thereabouts. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not be spread out over too many packages.
    Regards. Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Loading from ODS to Cube in process chain

    Hi Experts,
    How can I do a full load from ODS to cube when using further processing in a process chain? Your help is much appreciated.
    Thanks,
    Bill

    Hi,
    You can use a DTP for this.
    Create a transformation between the DSO and the cube.
    Create the DTP and run it.
    See also these threads on loading data from one cube to another:
    Loading data from one cube to another cube
    Cube to cube data loading
    how to upload data from cube to cube
    Can we pull data from one cube to another cube
    Data Load Steps:
    Reading data from another cube
    Hope this helps.
    Thanks,
    JituK

  • Error in DTP loading from 1st level DSO to 2nd level DSO

    Hi Experts,
    I am facing a strange kind of error while loading data through a DTP included in a process chain.
    We are running a full load from a DSO to a target DSO.
    But it gives the error "Error in substep: Start routine".
    When I checked the error logs, they redirected me to the start routine, where a deletion statement is written.
    We checked with some ABAP people; the code contains no errors.
    So we repeated the DTP (right-click and repair), and it worked fine this time.
    There are several other DTPs with the same source and target; as per the client's requirement, we divided the loads rather than loading everything at once.
    The same error happened with the other DTPs as well, but repeating them solved the problem again.
    This is only a workaround, and it makes no sense that it has to be done manually every time even though it is scheduled in a process chain.
    If anyone has encountered a similar problem, please share your ideas/information.
    Regards,
    Ajay

    Hi,
    I had the same problem before. To debug the code, I put a breakpoint in the routine and couldn't find any error. Then I loaded the data and it failed again. Then I removed the breakpoint from the transformation and activated it again. Whenever we activate a transformation, a program is generated; if that program is not generated, this kind of problem occurs.
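    A hedged side note: if the generated program turns out to be missing, reactivating the transformation regenerates it. The standard report RSDG_TRFN_ACTIVATE (run from SE38) can perform this activation as well, for example:

        * Sketch: call the standard transformation activation report,
        * which regenerates the transformation program. Verify the
        * report name in SE38 on your release first.
        SUBMIT rsdg_trfn_activate VIA SELECTION-SCREEN AND RETURN.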
    Thanks
    Srikanth
