Planning Cube Load process

Hi Gurus,
How do I load data into DP cubes from a BW 3.5 system?

If the BW 3.5 system is an external system, you first need to build an InfoCube in the APO data mart and load the data from the external BW system into APO BW. Then create the planning area, POS, CVCs etc., and load the planning data from the APO InfoCube into the planning area using transaction /SAPAPO/TSCUBE. DP cubes are virtual and have no physical existence. Please check this [DP Process|http://help.sap.com/saphelp_scm50/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm] and also go through the [Best Practices|http://help.sap.com/bp_scmv250/BBLibrary/HTML/DPL_EN_DE.htm]: open the link and click on the Configuration Guide; it will walk you through the process step by step.
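If the InfoCube-to-planning-area step has to run unattended, the report behind /SAPAPO/TSCUBE can be submitted directly. A minimal ABAP sketch, assuming /SAPAPO/RTSINPUT_CUBE is that report on your release and that a variant ZCUBE2PA (cube, planning area, key-figure assignment) has already been saved:

    " Hedged sketch: run the /SAPAPO/TSCUBE load in the background
    " with a saved variant (variant name is a placeholder).
    SUBMIT /sapapo/rtsinput_cube
      USING SELECTION-SET 'ZCUBE2PA'
      AND RETURN.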

Similar Messages

  • Automate the Cube loading process using script

    Hi,
    I have created the Essbase cube using Hyperion Essbase Studio 11.1.1 and my data source is Oracle.
    How can I automate the data loading process into the Essbase cubes using .bat scripts?
    I am very new to Essbase. Can anyone help me on this in detail?
    Regards
    Karthi

    You could automate the dimension building and data loading using ESSCMD or MaxL scripts and then call them via .bat scripts.
    There are various threads related to this topic; in general, you can follow these steps.
    In any script, first provide the login credentials and select the database:
    LOGIN server username password ;
    SELECT Applic_name DB_name ;
    To build a dimension:
    BUILDDIM location rulobjName dataLoc sourceName fileType errorLog
    e.g.: BUILDDIM 2 rulfile 4 username password 4 err_file;
    For the data load:
    IMPORT numeric dataFile fileType y/n ruleLoc rulobjName y/n [ErrorFile]
    e.g.: IMPORT 4 username password 2 "rulfile_name" "Y";
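    A hedged end-to-end sketch assembling those pieces (server, application, database, rule-file and credential names are all placeholders; the BUILDDIM/IMPORT arguments mirror the SQL-source examples above):
    nightly_load.scr:
        LOGIN "essbase_server" "admin_user" "password" ;
        SELECT "SampleApp" "SampleDb" ;
        BUILDDIM 2 "dimbuild" 4 "sql_user" "sql_password" 4 "dimbuild.err" ;
        IMPORT 4 "sql_user" "sql_password" 2 "dataload" "Y" ;
        EXIT ;
    nightly_load.bat:
        ESSCMD nightly_load.scr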
    Regards,
    Cnee

  • Automating the cube load process

    Hi All,
    I have created the Essbase cube using Hyperion Integration Services 9.3.1; my data source is a star schema residing in an Oracle DB. I can successfully load the cube and see the results in Smart View.
    I want to know how I can automate the data loading process (both dimensions and fact) into the Essbase cubes using either UNIX shell scripts or Windows .bat scripts.
    Your inputs are appreciated. Thanks in advance.
    Regards
    vr

    What everyone has shown you is the use of ESSCMD or MaxL; however, you stated you are using EIS. Until you get to 11.1 and use Essbase Studio, those won't help you. What you will need to do is go into EIS and select your metadata outline. Pretend you are going to build the cube manually: start scheduling it, and when you get to the screen that allows you to schedule it or create a file, create the file only. You will want the name to end in .cbs. (Hint: if you select both outline and data load, you will get all the statements in one script.)
    After you are done, look on the EIS server under arborpath\EIS\Batch; you should see the .cbs file you created. It is a regular text file, so you can look at it. If you open it, you will see loaddata and/or loaddimensions statements; these are what do the work.
    To run it, create a batch file with the command: olapicmd -fscriptFileName > logFileName
    For more information about the Integration Services Shell, see http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/eis_sysadmin.pdf - Chapter 5 tells you everything you need to know.
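    For example, a one-line batch file (paths and file names are illustrative) can then be scheduled nightly:
    rem run_eis_load.bat - executes the generated .cbs script via the EIS shell
    olapicmd -fC:\Hyperion\EIS\Batch\myload.cbs > C:\logs\myload.log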

  • Plan Cube Load Plan Mode - Green Yellow Request

    When planning, the request status remains yellow during all changes and functions. If I switch the cube to load mode and then back to plan mode, this request turns green and a new yellow request appears. My planning application will have many planning functions and data inputs, and I would like to be able to create a new request for each planning function for ease of troubleshooting/repair. Is there standard functionality to do this?
    I have found I can create a process chain which will switch to load mode then back to plan mode, and can execute this process chain via the web template, but was wondering if there was a better method to doing this.
    Thanks!

    Hi,
    please note that, for example, saving plan data is only allowed while the cube is in plan mode. That means as soon as two planners are working with your application, both may want to save data at the same point in time. Of course they cannot work on the same data set, because the locks prevent this, but they can still try to save at the same time. In your case that means the first user sets the cube to load mode to close the request (e.g. by using a planning function), and the second user gets an error message because the cube is in load mode and cannot be planned right now. Do you really want to run into this situation? I would not do it.
    If you do, you can create a custom planning function that switches the cube. The function modules RSAPO_SWITCH_BATCH_TO_TRANS and RSAPO_SWITCH_TRANS_TO_BATCH perform the switch.
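    A minimal ABAP sketch of the switch (the importing parameter name I_INFOCUBE and the cube name ZPLAN01 are assumptions; verify the signatures in SE37 on your release):
    " Hedged sketch: toggle a real-time cube between plan and load mode.
    CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'   " plan mode -> load mode (closes the yellow request)
      EXPORTING
        i_infocube = 'ZPLAN01'.                   " placeholder cube name
    CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'   " load mode -> back to plan mode
      EXPORTING
        i_infocube = 'ZPLAN01'.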
    Hope this helps ...
    Regards Matthias Nutt
    SAP Consulting Switzerland

  • Defragmenting Planning Cube with Refresh

    Has anybody ever come across a script that will defragment the Planning application cubes (say, 3 cubes in an app)?
    For plain Essbase cubes it's easy, I know: clear level-0 data, reset the cube, load the data and aggregate.
    For Planning cubes the process needs to be 100% automatic. It also requires an automatic refresh of the Planning application before the aggregation step, to bring the currency rates into Essbase.
    If the Essbase and Planning servers are different, how do I accomplish this with a fully automatic script, without any manual intervention?
    Any suggestions will be appreciated.
    Thanks,
    Tarini

    Try the following utility to refresh your cube:
    [http://www.network54.com/Forum/58296/message/1201822528/Re-+CubeRefresh-exe]
    Use a batch file or shell script to launch the utility above, then use MaxL (launched from the same batch file) to export the data from the Essbase cubes, clear, and reload to defragment, as sketched below. I would install the Essbase client on your Planning server (so you can run MaxL from your Planning server). Schedule it with whatever scheduling tool you prefer.
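    A hedged MaxL sketch of the export/clear/reload step (server, credentials, application, database and file names are placeholders; substitute your own calc scripts for the default calc if needed):
    login admin_user password on essbase_server;
    export database PlanApp.Plan1 level0 data to data_file 'lev0.txt';
    alter database PlanApp.Plan1 reset data;
    import database PlanApp.Plan1 data from local data_file 'lev0.txt' on error write to 'lev0_err.log';
    execute calculation default on database PlanApp.Plan1;
    logout;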
    - Jake

  • Creating a cube compression process chain

    Hi,
    I want to create a cube compression process chain so that it automates compressing the cube.
    Could anyone please provide me the steps to create a process chain that automates cube compression.
    Regards,
    Prem.
    Please search the forum

    Hi,
    You can modify your process chain in RSPC.
    Enter change mode in the planning view; in the left section, look through the available process types to find the Compression process.
    Drag it into the right-hand section, create a variant for it, and enter the cube name. Place it after the cube load process (after the rollup if a rollup exists; otherwise directly after the load).
    Once all changes are done, activate the chain and collect it in a transport to move the changes to the next level.
    Hope this helps.
    Murali

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have a below set up in our system..
    1. A Planning BEx query using which user makes certain entries and writes back data to the planning cube.
    2. An actual reporting cube which gets data from the planning cube above.
    Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
    This involves 2 things..
    1. Change the "Change real-time load behaviour" setting of the planning cube to Planning.
    2. Trigger the DTP which loads data from Planning cube to reporting cube.
    We want to automate the above two steps...
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if that is the correct place to provide them).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request comes and sits in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is triggering, and even if it does, I am not sure whether it will start the process chain automatically. Any ideas, please?
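    If the "Subsequent Processing" route does not fire, one alternative is a small ABAP program that performs the switch from step 3 and raises the SM64 event itself. A hedged sketch (cube and event names are placeholders; the RSAPO_* parameter name is an assumption to verify in SE37):
    REPORT z_switch_and_raise.
    " Hedged sketch: switch the planning cube to load mode, then raise
    " the event that the process chain's start variant is waiting for.
    CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
      EXPORTING
        i_infocube = 'ZPLANCUBE'.        " behaviour -> 'Loading'
    CALL FUNCTION 'BP_EVENT_RAISE'
      EXPORTING
        eventid = 'Z_PLAN_TO_REPORT'.    " event used in the start variant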

    Hi,
    Try doing the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    Hope it helps.

  • Error in delta loading from a planning cube

    hi all,
    I am running a delta load from one planning cube to another. When the changed records are of a magnitude of 100, the load is successful, but when they are of a magnitude of 100,000, the request stays yellow with no data records sent and eventually fails. The packet size is 20,000.
    any help is appreciated.

    Hello Ajay,
    Have you checked that your request is closed in the planning cube? The function module RSAPO_CLOSE_TRANS_REQUEST will do it.
    Making the InfoPackage end "green" when it has no data is a customizing setting:
    Transaction SPRO ==> SAP NetWeaver ==> SAP Business Information Warehouse ==> Automated Processes ==> Monitor Settings ==> Set Traffic Light Color. There you have to set the traffic light to green if no data is available in the system.
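    A one-call ABAP sketch (the parameter name I_INFOCUBE and the cube name are assumptions; check the signature in SE37):
    " Hedged sketch: close the open (yellow) request of the planning
    " cube before the delta extraction runs.
    CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
      EXPORTING
        i_infocube = 'ZPLAN01'.   " placeholder planning cube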
    Hope that helps
    best regards
    Yannick

  • Unable to load data into Planning cube

    Hi,
    I am trying to load data into a planning cube using a DTP.
    But the system throws a message saying that real-time data loading cannot be performed on the planning cube.
    What would be the possible reason for the same?
    Thanks & Regards,
    Surjit P

    Hi Surjit,
    To load data into the cube using a DTP, the cube must be in loading mode.
    The real-time cube behaviour is set to planning mode only when data is entered into the cube via the planning layouts or through the file upload program.
    You can change the behaviour of the cube in RSA1 -> right-click the cube -> Change Real-time Load Behaviour and select the first option (real-time cube can be loaded with data; planning not allowed).
    Best Rgds
    Shyam

  • BPS Processing Question for 2 Planning Cubes

    Hello, can you please give input on the following scenario:
    We have 2 planning cubes. Cube 1 contains selection-criteria records which will be used to select and process the detail records from planning cube 2, e.g. Sales Org = 100, Material = 200, Pct = .05. This cube could have multiple records with different selection criteria.
    Cube 2 holds the detail data to be selected and manipulated based on the records from cube 1. We want to process the records from cube 1 one at a time, as selection criteria to select and process the records of the 2nd cube, e.g. select Sales Org = 100, Material = 200, then Cost = Cost * .05, etc.
    Do you have a best practice or ideas?
    I really appreciate any help. Scott

    I think you should include these 2 cubes in a multi-planning area and implement a FOX function for this scenario. The same combinations of sales org, material, etc. will be grouped into one data packet and processed together (these criterion characteristics should not be marked 'to be changed').
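    A hedged FOX sketch under these assumptions: both cubes sit in one multi-planning area, 0INFOPROV and the key figure are among the fields to be changed, and ZCUBE1/ZCUBE2 and the key-figure names ZPCT/ZCOST are placeholders:
    * Read the percentage from cube 1 and apply it to the cost in cube 2.
    DATA PCT TYPE F.
    PCT = { ZPCT, ZCUBE1 }.
    { ZCOST, ZCUBE2 } = { ZCOST, ZCUBE2 } * ( 1 + PCT ).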
    Hope this helps.

  • Incremental Cube Load Error: job slave process terminated

    Hi,
    For performance reasons we switched to incremental cube loading, i.e. only those partitions whose data has been made available are auto-solved.
    Sometimes the job submitted in the background terminates, and the reason given in dba_scheduler_job_run_details is:
    REASON="Job slave process was terminated"
    There is no definite occurrence pattern for this error. The last entry displayed in xml_load_log is the start of auto-solving a partition.
    After this error occurs, we have to fully aggregate the cube, which of course auto-solves all partitions.
    This error has caused us a lot of trouble: we made many package changes as part of a release to production to introduce incremental cube loading, and once done, we see that the incremental load simply terminates while auto-solving a partition.
    Can anyone assist, please? It is urgent.
    thank you,

    Hi,
    There is a MetaLink (My Oracle Support) note about this issue: check note 443348.1.
    Thanks
    Brijesh

  • Optimize the data load process into BPC Cubes on BW

    Hello Gurus,
    We would like to know how to optimize the data load process. Our scenario: we have the ECC classic ledger, and we are looking for the best way to load data into the BW InfoCubes from an ECC source.
    To complement the question above: from which tables must the data be extracted and then passed to BW so that the consolidation is done? Also, do other modules such as FI or EC-CS have to be considered for this?
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    Have you looked at the BW Business Content extractors available for the classic GL? If not, I suggest you take a look. BW business content provides all the business logic you will normally need to get data out of ECC and into BW for pretty much every ECC application component in existence: [http://help.sap.com/saphelp_nw70/helpdata/en/17/cdfb637ca5436fa07f1fdc0123aaf8/frameset.htm]
    Ethan

  • Is there a way to read data from an IP planning cube with ABAP?

    Hello All.
    My scenario is as follows:
    I have an ODS where we store cost centers to be planned. This ODS is loaded via a manual flat file load (regular data transfer process). In order to avoid inconsistent data in the planning cube, I want to check in my ODS load process whether any cost center which is already planned (i.e. has planned data in my planning cube) is being deleted in the loaded file.
    To do this, I need to access my planning cube (I guess also the planning buffer), so my question is whether there is any function module which retrieves data from a planning level as reference data.
    Thanks a lot and best regards,
    Alfonso.

    Hi Alfonso,
    note 1101726 shows how the plan buffer can be read.
         " Get the plan buffer instance for the InfoProvider,
         " then read its data for the given characteristic selection.
         l_r_plan_buffer = cl_rsplfa_plan_buffer=>if_rsplfa_plan_buffer~get_instance( i_infoprov ).
         l_r_plan_buffer->get_data( EXPORTING  i_t_charsel            = l_t_charsel
                                               i_include_zero_records = rc_false
                                               i_r_msg                = l_r_msg
                                    IMPORTING  e_r_th_data            = l_r_th_data
                                    EXCEPTIONS OTHERS                 = 2 ).
    Please take a look at the note. You do not need to implement the after_burn_selection exit, but you will find sample code showing how to read the planning buffer. Please give it a try.
    Another solution would be to use the function module RSDRI_INFOPROV_READ, but you need to make sure that you first close the yellow request. This can be done using function module RSAPO_SWITCH_TRANS_TO_BATCH.
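    A hedged sketch of the RSDRI_INFOPROV_READ variant (the cube name ZPLAN01, the characteristic and key-figure names, and the flat structure ZS_PLAN_READ, whose field names must match the aliases, are all placeholders):
         " Hedged sketch: read planned cost-center data package by package.
         TYPE-POOLS: rs.
         DATA: ls_sfc  TYPE rsdri_s_sfc,
               lth_sfc TYPE rsdri_th_sfc,
               ls_sfk  TYPE rsdri_s_sfk,
               lth_sfk TYPE rsdri_th_sfk,
               lt_data TYPE STANDARD TABLE OF zs_plan_read,  " placeholder structure
               l_end   TYPE rs_bool,
               l_first TYPE rs_bool VALUE rs_c_true.

         " Request one characteristic and one key figure.
         ls_sfc-chanm = '0COSTCENTER'. ls_sfc-chaalias = 'COSTCENTER'.
         INSERT ls_sfc INTO TABLE lth_sfc.
         ls_sfk-kyfnm = '0AMOUNT'. ls_sfk-kyfalias = 'AMOUNT'. ls_sfk-aggr = 'SUM'.
         INSERT ls_sfk INTO TABLE lth_sfk.

         " Close the yellow request first, as noted above.
         CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
           EXPORTING i_infocube = 'ZPLAN01'.

         WHILE l_end = rs_c_false.
           CALL FUNCTION 'RSDRI_INFOPROV_READ'
             EXPORTING
               i_infoprov    = 'ZPLAN01'
               i_th_sfc      = lth_sfc
               i_th_sfk      = lth_sfk
             IMPORTING
               e_t_data      = lt_data
               e_end_of_data = l_end
             CHANGING
               c_first_call  = l_first.
           " ... compare lt_data with the cost centers in the load file ...
         ENDWHILE.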
    Hope this helps
    Matthias Nutt
    SAP Consulting Switzerland

  • Any ideas on this plan for a process chain?

    Hi,
    I have 6 ODSes. I load them on a daily basis from 6 different flat files. ODS1, ODS2 and ODS3 need to be loaded before ODS4, ODS5 and ODS6.
    Once all six ODSes are loaded, they are aggregated based on two key fields and loaded into a cube. Now I want to automate the process.
    Can you please check if my plan for the process chain is right:
    1. Start Process:   
    direct Scheduling
    Change Selections:
    Start date & Time
    Period Jobs: check
    Periodic Values: Daily
    Restrictions: Always execute job
    2. Indexes:(this flows into the first 3 ODSes)
    Delete indexes
    3. Load Data:
    Load Data ODS1 
    Load Data ODS2
    Load Data ODS3
    4. Activate ODS1
    5. Activate ODS2
    6. Activate ODS3
    (What do I set up here so that the following loads run only if ODS1, ODS2 and ODS3 are successful?)
    7.Load Data
    Load Data ODS4
    Load Data ODS5
    Load Data ODS6
    8.Delete Indexes
    9. Load Data
    Load data from ODS1, ODS2, ODS3, ODS4, ODS5, ODS6 to the CUBE
    10. Activate Cube (Does the cube need activating? Is there a process type like the one for activating an ODS?)
    11. Create Index (Will the delete and create index steps in this plan apply to both the ODSes and the cube?)
    Thanks, I would love to get hints from you. Also, how do I factor in the PSA, i.e. always go to the PSA first and then to the ODS and cube?

    Hi,
      1. Start the process (as per your requirement).
      2. Load the data to ODS1, ODS2 and ODS3 in parallel.
      3. Activate the three ODSes, each with a separate ODS activation process type.
      4. Put in an AND condition.
      5. Load the data to ODS4, ODS5 and ODS6.
      6. Activate ODS4, ODS5 and ODS6.
      7. Delete the index for the cube.
      8. Load the data from the ODSes to the cube.
      9. Create the index.
                     Start
     Load ODS1      Load ODS2      Load ODS3
     Activate ODS1  Activate ODS2  Activate ODS3
                     AND (process)
     Load ODS4      Load ODS5      Load ODS6
     Activate ODS4  Activate ODS5  Activate ODS6
                     AND
               Delete the index
               Load data from the ODSes to the cube
               Create the index
       There is no concept of activating the cube; activation applies only to an ODS.
    Regards,
    Siva.

  • Planning Text Load Issue.

    Team
    I have a data file with TEXT data that I want to load to Planning. I have done the following steps and am facing an issue while running the outline load command. Please advise what the problem could be. I have mentioned all dimensions in the CSV file.
    Version and type I am using:
    Version: 11.1.1.3.0, Classic type
    Set in Planning - Manage Data Load Administration:
    Data Load Dimension: Employee
    Driver Dimension: Account, Member: EMPLOYEE NAME
    Created CSV File (text.csv)
    Employee,Data Load Cube Name,EMPLOYEE NAME,Point-of-View
    TBH1,Wrkforce,SMITH,"Local,Jan,Actual,1st Pass,FY11,12000"
    Test the Execution of the Utility
    /OutlineLoad.sh /S:Hypdevapp.ana.corp.abc.com /A:EXP_GI /U:plnadmin /-M /-N /I:/home/asrivastava/Text.csv /D:Employee /L:/home/asrivastava/log_Text.log /X:/home/asrivastava/err_Text.log
    The error I am getting is below:
    Error occurred loading data record 1: TBH1,SMITH,Wrkforce, Local ,Jan,Actual,1st Pass,FY11,12000
    java.lang.RuntimeException: Not all dimensions were specified.
    Planning Outline data store load process finished with exceptions:  exceptions occured, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were successfully loaded, 1 was rejected.
    Thanks
    Suresh

    Hi John,
    Thanks for the quick reply.
    Today I have executed with /M but still I'm getting the same error.
    Please suggest me if you know any other command/option for the same.
    Thanks,
    Suresh.
