Partitioning a cube in a Process chain

Hi,
How can I launch the partitioning of my cube in a process chain?
Is there an ABAP program I can use with a variant?
I have 3 cubes (one per Year : Y, Y-1, Y-2).
I would like to create a process chain for shifting my data:
Delete data of cube Y-2
Load data of year Y-1 into cube Y-2
Delete data of cube Y-1
Load data into cube Y-1
Selective deletion in cube Y
When my cubes are empty I would like to make a new partitioning with the next year.
           this year | next year | old partitioning    | new partitioning
cube Y     2007      | 2008      | 001.2007->012.2007  | 001.2008->012.2008
cube Y-1   2006      | 2007      | 001.2006->012.2006  | 001.2007->012.2007
cube Y-2   2005      | 2006      | 001.2005->012.2005  | 001.2006->012.2006
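The rolling re-partitioning above can be sketched as a small helper (illustrative Python, not SAP code; the cube names and the 001.YYYY->012.YYYY notation follow the table):

```python
def partition_ranges(next_year):
    """Map each rolling cube to its new 0CALMONTH partition range
    (001.YYYY -> 012.YYYY) once the chain has shifted the data."""
    return {
        cube: (f"001.{next_year - offset}", f"012.{next_year - offset}")
        for offset, cube in enumerate(("Y", "Y-1", "Y-2"))
    }

print(partition_ranges(2008))
# {'Y': ('001.2008', '012.2008'), 'Y-1': ('001.2007', '012.2007'), 'Y-2': ('001.2006', '012.2006')}
```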
Thanks for any help.

Hi,
Run transaction DELETE_FACTS and generate a deletion program for your cubes with a select-option for the year. Create variants for those programs to delete the relevant years.
Create a chain with the following steps:
1st: ABAP to delete year-3 from the last cube
2nd: load year-2 from the previous cube into the deleted cube
3rd: ABAP to delete year-2 from the previous cube
4th: load year-1 from the actual cube
5th: delete year-1 from the actual cube
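The five steps could be written down as an ordering helper (hypothetical Python sketch; the cube labels and tuple layout are illustrative only):

```python
def shift_plan(current_year):
    """Return the five chain steps above, in order, for a given year.
    Each entry is (action, cubes involved, year of data affected)."""
    y = current_year
    return [
        ("delete", "cube Y-2",             y - 3),
        ("load",   "cube Y-1 -> cube Y-2", y - 2),
        ("delete", "cube Y-1",             y - 2),
        ("load",   "cube Y -> cube Y-1",   y - 1),
        ("delete", "cube Y",               y - 1),
    ]

for step in shift_plan(2008):
    print(step)
```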
Hope this helps!
kind regards
Siggi

Similar Messages

  • Error While Deleting cube Contents in Process Chain

    Hi All,
    I am getting an error when I try to delete the contents of two cubes in a process chain. The data of one cube is deleted successfully, but deleting the second one results in an error.
    Following are the messages :
    1. Database table /BIC/FZCFEXP01 was deleted and recreated
    2. Database table /BIC/EZCFEXP01 was deleted and recreated
    3. Database table /BIC/FZCFEXP02 was deleted and recreated
    4. Database table /BIC/EZCFEXP02 was deleted and recreated
    5. Content of data target ZCFEXP01 successfully deleted
    6. Content of data target ZCFEXP01 successfully deleted
    7. System error: RSDU_DATA_EXISTS_TABLE_DB6 / DB6
    8. Drop cube failed in data target ZCFEXP02
    Can anyone Help ??
    Sam
    Any pointers . Help is highly appreciated.
    Edited by: Samir Bihari on May 14, 2009 8:54 AM
    Edited by: Samir Bihari on May 14, 2009 8:55 AM

    Hi,
    I'm not sure about this issue, but it may be related to the Support Package level...
    Just check these notes:
    Note 385293
    Note 987754
    rgds,

  • How to close an open request for a RealTime Cube in a Process Chain ?

    Hi,
    I've a real-time InfoCube, let's call it C1RT. C1RT is connected to another standard InfoCube C2 via a DTP.
    The users enter the planning data; it's stored in C1RT. I want a process chain that loads data from C1RT to C2.
    What I've done so far, and what doesn't work, is:
    Process chain PC1:
    Start -> Change load mode from input-ready to load-ready for C1RT -> Run DTP.
    I currently have to go to C1RT, right-click, Manage, click the yellow light of the open request and change it to green; after this the DTP transfers the data successfully.
    I want to automate the yellow-to-green step in PC1.
    Any ideas ?
    Best Regards,
    Miguel P.

    Hi,
    no report necessary.
    I had a similar problem lately.
    Changing the cube load mode in the process chain is enough, but there are some settings to be made for the cube.
    Look it up in OSS note 1063466.
    regards
    Cornelia

  • Loading from ODS to Cube failed in process chain

    Hi All,
    I have a process chain for SD loads scheduled every night. It gave me an error while loading data from the Billing ODS to the Billing cube (via further processing; the data is not available in the PSA).
    The error it gave is as follows:
    Error while updating data from DataStore object ZSD_O03
    Error 7 when sending an IDoc
    Time limit exceeded. 3
    Time limit exceeded.     
    Errors in source system
    Now the request is loaded in the cube with YELLOW status, with 87125 transferred records and 62245 added. But there is no way to check whether the complete data has reached the cube.
    I did not delete the previous request from the cube and tried to repeat the process chain from the same point... it then gave me the following error:
    -Last delta incorrect. A repeat must be requested. This is not possible.
    Please tell me what I should do... I must not miss the delta records in the cube.
    1) Should I delete the previously loaded request in the cube and repeat the process chain from the same point... will it get all the delta records in that case?
    2) Should I delete the request from the cube, manually repeat the last delta for the cube, and then manually run the remaining process chain?
    Please help me out soon.
    Thanks and Regards,

    Hello Niyati
    you can use the 2nd option.
    Your error message is coming because of a lack of background (BGD) processes and memory... just set the status of the request in the cube to red, delete it, and run a repeat load from the InfoPackage.
    If the number of records is high, then go for the InfoPackage option "load data into PSA and then subsequently into data targets".
    Thanks
    Tripple k

  • Creating cube index in process chain

    Hi,
    From my previous post I realized that we build the index of the cube first and then delete the overlapping request from the cube.
    (http://help.sap.com/saphelp_nw04/helpdata/en/d5/e80d3dbd82f72ce10000000a114084/frameset.htm)
    If I design a process chain in which I delete the overlapping request first and then build the index, the checking view does not give me any error.
    The process chain also works fine.
    How does it hurt to have it this way, and what is the concept behind the sequence recommended by SAP?
    Thanks,
    sam

    Hi Sam,
    Writing performance is increased when there is no index, so we delete the index before updating data.
    Reading performance is increased when there is an index, so we create the index again so that queries execute quickly by reading the data in the InfoCube.
    Best practice suggests that old requests be deleted before loading new ones, so we delete overlapping requests.
    Hope it helps.

  • How to rollup cube data in Process chain?

    I have loaded data into a cube, and when I look at the technical status it is green, but there is nothing in the "request for reporting available" box. I went to the Rollup tab and manually executed the rollup for the request. It worked.
    But is there any way I can add a process type to do this automatically? I see there are three process types related to rollup; which one should I use?
    1. Initial Fill of New Aggregates
    2. Roll Up of Filled Aggregates/BIA Indexes (what does this mean?)
    3. Initial Activation and Filling of BIA Indexes
    Thank you!

    Dear experts,
    We have the following problem: for certain figures we have two daily updates at ODS level: one at night and one at noon. During the week, at noon we load from the ODS into the cubes only the data that was entered the same day (selection on CPU date in the InfoPackage). After the night load we load the data that was entered the previous day (selection on CPU date in the InfoPackage). In this process we have a step in the process chain that deletes the overlapping request of the previous day (which was loaded at noon).
    Our process chain for loading the data from the ODS into the cubes looks as follows:
    Delete indexes of the cubes -> Load data from ODS into cubes -> Generate cube indexes -> Delete overlapping requests in InfoCubes
    After filling the BIA indexes on this cube, the process chain gives an error message at the stage where the overlapping request should be deleted. The error message says:
    Uncaught Exception: Keyfigure is of type Float.
    To work around this we currently delete the BIA indexes manually, delete the overlapping requests, and fill the BIA indexes again. Since the functions 'delete BIA index' and 'fill BIA index' are not available in RSPC, we cannot do this automatically in the process chain.
    I also tried, as mentioned above, adding the step 'roll up filled BIA indexes' to the process chain, but the chain check produces a message stating that I cannot do this when the steps 'delete indexes of the cube' and 'generate indexes of the cubes' are included in the same chain.
    Does anybody know a solution for deleting overlapping requests in a process chain with filled BIA indexes?
    Many thanks in advance for your kind reply.
    Best regards,
    Ilja Dekeyser

  • Process chain questions

    Hi All,
    I designed a process chain with a data flow like this:
    R/3 -> ODS1 -> ODS2 -> Cube.
    The process chain sequence is:
    Start >> Load data >> Activate ODS1 data >> Load data to ODS2 >> Activate ODS2 data >> Delete index >> Load data to cube.
    My process chain is working and the data loads are good. But I am getting warning messages for the three variants "Activate ODS1 data", "Load data to ODS2" and "Activate ODS2 data".
    The message for the two activation variants is as follows:
    A type "Update ODS Object Data (Further Update)" process has to follow process "Activate ODS Object Data" var.ACTIVATEZMONDET in the chain
    Message no. RSMPC016
    The message for the loading variant is as follows:
    A type "Activate ODS Object Data" process cannot precede process "Execute InfoPackage" var. ZPAK_3YF4Q8R0RSHTDJH74WZOAXBC6 in the chain
    Message no. RSMPC013
    The version I am working on is BW 3.5. My question is: is it OK to have these warning messages, or do I need to do anything else?
    Thank you.

    Hi Visu,
    If your data loads are running fine, you can ignore the warning messages. The system warns you whenever it suspects that a process is missing from the chain of required events, but there is more than one way to achieve a load. If the data volume loaded into the cube is not very high, e.g. just a daily delta, you can even go ahead and remove the index-deletion process.
    Hope this helps...

  • To set delay in process chain using a program

    Hi all,
    We have a requirement in which loads from nearly 15 sources (DSOs) go into the cube through 15 process chains. The sources are loaded from the PI interface, so it can happen that 10 loads start simultaneously and reach the cube at the same time.
    In the cube we have a delete-overlapping-request step, so we get a deadlock when the chains try to delete the previous request at the same time.
    What we have planned is to write a program that sets a delay depending on the lock entries present for that particular cube. But we are still not sure how far this will help.
    If anyone has faced a similar issue or can anyone suggest on this would be much appreciated.
    Thanks
    Merlin
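
    The delay idea from the question could be sketched generically as a retry loop with a randomized wait (plain Python, not SAP's lock/ENQUEUE API; `is_locked`, the try count and the delays are all assumptions for illustration):

    ```python
    import random
    import time

    def run_when_unlocked(action, is_locked, max_tries=10, base_delay=30.0):
        """Wait until the target cube is free, then run the action.
        Random jitter makes it less likely that several chains retry
        and collide again at exactly the same moment."""
        for _ in range(max_tries):
            if not is_locked():
                return action()
            time.sleep(base_delay + random.uniform(0, base_delay))
        raise TimeoutError("target still locked after all retries")
    ```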

    I believe each of your 15 local chains contains the load-to-cube process. Why don't you take the load to the cube out of the local chains? In the meta chain you could sequence the loads from the 15 ODSs, so that you have parallel processing up to the ODSs (assuming the rest of the local chain is used to fetch from R/3) and then sequential loading into the cube.
    I don't think writing a program will help much here.

  • Deleting Data/ Process Chains

    Hi All,
    I want to delete data from a number of cubes, and after deleting I also want to make sure that no data is loaded into these cubes via any process chains that already exist. My question is: by looking at a cube, how can I find out whether a process chain loads it?

    I found them. If I do not see process chains, does that mean that particular cube is not used in any process chains?
    Message was edited by:
            xcaliber

  • How to run Multi provider with process chain

    Hi Experts,
    I have a process chain (including 3 cubes); now I want to run this MultiProvider through the process chain.
    Can anyone give me the step-by-step process?
    Thanks

    Hi,
    Include the 3 cubes in the process chain. A MultiProvider is only a structure; it does not physically contain data. When a query runs, it fetches the data from the 3 cubes that you loaded via the process chain and displays it in the report.
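
    The union behaviour described above can be illustrated with a toy sketch (plain Python; the `cubes` lists of rows merely stand in for InfoCubes):

    ```python
    def read_multiprovider(cubes, predicate):
        """A MultiProvider stores nothing itself: at query time it simply
        collects the matching rows from every underlying InfoCube."""
        return [row for cube in cubes for row in cube if predicate(row)]

    cube_2007 = [{"year": 2007, "amount": 10}]
    cube_2008 = [{"year": 2008, "amount": 20}]
    print(read_multiprovider([cube_2007, cube_2008], lambda r: r["amount"] > 5))
    ```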

  • Compression through process chain

    Hi Gurus,
    I have 76 cubes in my production system.
    I want to compress all cubes through a process chain every weekend, adding all cubes to one process chain...
    Please advise.
    Thanks, Alex

    When using the 'Compression of the InfoCube' process variant, you don't have the capability of giving a date range or year/month of requests to compress. It can only compress requests older than n days.
    So you'd have to create two process variants, one with marker update and one without, for each month of requests to be compressed. Since you haven't compressed for two years, the values to put in 'Collapse only those requests that were loaded XXX days ago' would be:
    730
    700
    670
    640
    610
    580
    550
    So in total you're going to end up with something like 50 process variants (25 with marker update and 25 without). You'd then want to execute them in inverse order (730 first, then 700, then 670, and so on) so that each run does only 30 days' worth of work.
    Then, once you have finished compressing the InfoCubes, you can remove all of these process variants and keep only one with marker update and one without, set to however many days old you wish. (SAP recommended to us not to compress requests that are less than 21 days old, in case there's an issue with a request: once you compress a request you can no longer identify and delete data by request.)
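
    The ladder of 'XXX days ago' values can be generated rather than typed by hand (illustrative Python; the 730-day backlog and 30-day slices follow the text above):

    ```python
    def compression_thresholds(oldest_days=730, slice_days=30):
        """Age thresholds for 'Collapse only those requests that were
        loaded XXX days ago', oldest first, in 30-day slices."""
        return list(range(oldest_days, 0, -slice_days))

    thresholds = compression_thresholds()
    print(len(thresholds), thresholds[:4], thresholds[-1])
    # 25 values: 730, 700, 670, ... down to 10 -> hence 25 variant pairs
    ```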

  • Automate loads through Process Chains

    Hi,
    I have two Infocubes, one a delivered one and the other a custom one. The delivered one should be loaded everyday and it is a full load (we would delete its contents completely and reload every night). The custom cube is an exact replica of the delivered one and is also a full load from the delivered cube. But the custom cube stores monthly snapshots of the data and is hence loaded only on the last day of a month.
    I want to automate the loads to both these cubes by creating Process Chains. How do I do it? The catch in this is if the load to standard cube fails on last day of month, then the load to snapshot cube should not be triggered at all.
    Also note that we are using BW 3.5.
    Thanks.

    Another thought:
    Make two chains, daily and monthly...
    In the daily chain your delivered cube will load...
    In the monthly chain: your daily load and, after that, "on success", the monthly load will happen.
    Note that you need to set the start variants for both chains accordingly, and the daily chain should not run on the day of the month when the monthly chain runs.
    Hope it helps.
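
    The "last day of the month" condition for the monthly chain's start could be checked like this (illustrative Python; in BW 3.5 the scheduling itself would live in the start variant or a factory calendar):

    ```python
    import calendar
    from datetime import date

    def is_last_day_of_month(d):
        """True only on the final calendar day of d's month -- the day
        the monthly snapshot chain should be allowed to run."""
        return d.day == calendar.monthrange(d.year, d.month)[1]

    print(is_last_day_of_month(date(2008, 2, 29)))  # True (leap year)
    ```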

  • I want to know about cubes, ODSs, MultiProviders, InfoPackage groups, process chains

    Hi all,
    1. How to write technical specifications for cubes, ODSs, MultiProviders, InfoPackage groups and process chains?
    2. How to decide the size of a cube?
    3. What is the default size and what is the maximum size, and what about the ODS case?
    4. Performance-wise, which is better: a MultiProvider or a cube?
    5. Why is a cube better (performance-wise) than an ODS?
    6. What is the maximum number of characteristics and key figures that can be inserted in an ODS?
    7. Does the SID concept exist for ODS? If yes, when are the SIDs generated?
    Thanks
    cheta.

    Hi
    1. From the functional specification requirements, find the DataSources by using the Business Content check (offline) or the Metadata Repository (online); you need TREX installed in your system for the search. Do the gap analysis and note down the objects that are not in Business Content, and create Z objects (DataSources, ODSs or InfoObjects, cubes). Draw the bubble diagram, derive the ER diagram from it, and finally do the logical information model (cube model) and decide what you need in the data flow.
    2. There is a Quick Sizer on the Service Marketplace which can help you decide the sizing of a BI project. Take care of the trade-offs between line-item dimensions and normal dimensions, which have a large impact on your sizing.
    3. There is no such default sizing; it depends on your reporting requirements, i.e. whether you need detailed reporting or snapshot reporting.
    4. There are no performance trade-offs between a MultiProvider and a cube. From a performance point of view we have to consider other factors such as aggregates, indexing, partitioning etc.
    6. The maximum number of key fields is 16 and of data fields is 749.
    7. Yes, SIDs are created for an ODS when you check the flag for BEx Reporting.
    Thanks
    Chandru

  • Process Chain which loads data to a cube

    Hi All
    Please help me create a process chain in BI 7.0. I have to load data into a cube, and before that I need to delete the existing request in the cube. I think I should use "Delete overlapping requests from InfoCube". In the maintain-variant screen for this process type, which object types do I need to add? Do I need to add both Execute InfoPackage and Data Transfer Process objects?
    Regards
    Naga

    Hi Ravi
    I am loading the data from PSA to the cube using a DTP. Actually my DataSource is an export DataSource (8ZHYP-, prefix 8). So according to your answer I should use the DTP object type in the process type "delete overlapping request". But when I create a variant for that process type it lists: Indexes > Create Index > Data Transfer Process > Request to delete data in the InfoCube.
    I just want to delete the data in the InfoCube before I load the data. So can I delete all the remaining processes?
    Regards
    Naga

  • Cannot update data from DSO to Cube with further update in process chain

    Hi Guys,
    This is the message I am getting when I try to update to the cube via "further update" in a process chain. The process fails at the further-update step with the message below.
    At DSO level I can see the data mart status, but there is no request at cube level.
    Error Message:
    Cannot update request REQU_D57WAP7Z2WAI1CG5Y5OQ5W7JR in data target
        Message no. RSM1523
    Diagnosis
        System checks show that request REQU_D57WAP7Z2WAI1CG5Y5OQ5W7JR cannot be
        updated to a data target.
        The system checked whether
        o   request REQU_D57WAP7Z2WAI1CG5Y5OQ5W7JR is already in the data target
        o   the data target has active update rules for the InfoSource
        o   the data target was selected for updating
        o   locks exist
        o   for a DataStore object as a data target, that the request was
            updated in the correct order, if the DataStore object stipulates
    When I check the monitor screen, under the Status tab it says "A caller 01, 02 or equal to or greater than 20 contains an error message", and under the Details tab "Data Package 1: arrived in BW; Processing: Data packet not yet processed".
    Regards,
    Krishna.
    Edited by: krishna Krishna on Mar 4, 2009 6:37 PM

    Hi
    It seems you used an InfoPackage: when you select "update data into data target", the system prompts you with an InfoPackage. That is the system-generated InfoPackage being used in the process chain variant to update data into the data target.
    Create a new InfoPackage/DTP manually, use that InfoPackage/DTP in the process chain variant, and then try again.
    Chandu.
