Best practices for loading APO planning book data to a cube for reporting

Hi,
I would like to know whether there are any best practices for loading APO planning book data to a cube for reporting.
I have seen two types of design:
1) The planning book extractor data is first loaded into a cube in the APO BW system, and then transferred to the actual BW system. Reports are run from the actual BW system cube.
2) The planning book extractor data is loaded directly into a cube in the actual BW system.
We do these data loads once a day, during the evening hours.
Rgds
Gk

Hi GK,
What I have normally seen is:
1) Data is extracted from the APO planning area to an APO cube (for backup purposes), weekly or monthly, depending on how much data change you expect and how critical the data is for the business. Backups are mostly monthly for DP.
2) Data is extracted from the APO planning area directly to a DSO in the BW staging layer, and then to BW cubes, for reporting.
For DP this is monthly; for SNP, daily.
You can also use the option 1 that you mentioned. In that case the APO cube is the backup cube, while the BW cube is the one used for reporting, and the BW cube gets its data from the APO cube.
The benefit is that data has to be extracted from the planning area only once, so the planning area is available to jobs and users for more of the time. However, the backup and reporting extractions are mixed in this flow, so an issue in it could impact both the backup and the reporting. We have used this scenario recently and have yet to see the full impact.
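Since these loads run once a day in the evening, the extraction steps are usually wrapped in a process chain. As a minimal sketch, such a chain can also be triggered from a small ABAP wrapper; the chain name ZPC_APO_NIGHTLY is a hypothetical placeholder, and the RSPC_API_CHAIN_START parameters should be verified in your release:

  REPORT z_start_apo_nightly_load.

  DATA lv_logid TYPE rspc_logid.

  " Start the (hypothetical) nightly chain that extracts from the
  " planning area and fills the backup and reporting targets.
  CALL FUNCTION 'RSPC_API_CHAIN_START'
    EXPORTING
      i_chain = 'ZPC_APO_NIGHTLY'
    IMPORTING
      e_logid = lv_logid.

  WRITE: / 'Process chain started, log ID:', lv_logid.

In practice the chain itself is usually scheduled directly as a nightly background job; a wrapper like this is only useful when the start has to be coupled to other custom logic.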
Thanks - Pawan

Similar Messages

  • Synchronisation of two tables in an APO planning book view

    If one has 2 separate tables/grids in an APO planning book data view, is there a way of making them move synchronously, so that if the user moves say x buckets in a direction then this applies to both tables?
    Thanks for any advice...

Unfortunately no, but I would be very interested to know if someone has come up with an enhancement (it would have to be GUI-related).
    Somnath

  • Uploading planning book data error

    Hi Emmanuel,
I think you can help me with this question.
When I try to upload planning book data after using the save-locally function, I am not able to load the data back into the planning book; I get a dump.
I am not able to resolve it. Can you please guide me?
    Thank you very much in advance

    Hi Sandeep,
    When you saved the file using the "Save locally" functionality in the Interactive Planning transaction, did you choose the option "Prepare file for upload at a later time" ? You have to choose that option so that an upload later will work.
Also, please check OSS Note 925933 - "Interactive Planning Excel up/download corrections".
Please tell me if this helps to solve your problem.

  • Publishing an APO planning book to EP

Any ideas on publishing an APO planning book to EP? Are there any business packages available? I couldn't find any on iViewStudio.

    Hello Anand,
I know this post was made a long time ago, but I wanted to check whether you found a business package for the APO planning book to implement it on the portal.
    Thanks and Regards,
    Reena

  • Unable to see the planning book data source

    Hi all,
I am not able to see my planning book DataSource when I run RSA3, nor can I see it in RSA6. But when I try to create a DataSource with the same name, the system says it already exists. Can anybody guide me?
    thanks in advance,
    rahul

Rahul, I would instead go to the planning area and check Data Extraction Tools in the top menu bar. The DataSources, if defined correctly, should show up as options in the DataSource field you see there. Ensure you have followed the steps described at
    http://help.sap.com/saphelp_scm41/helpdata/en/e0/9088392b385f6be10000000a11402f/frameset.htm
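    If you want to check programmatically which versions of the DataSource already exist, the header records live in table ROOSOURCE on the source-system (APO) side. A minimal sketch; the DataSource name 9ADEMAND_PLAN is a hypothetical placeholder:

      REPORT z_check_ds_version.

      DATA: lt_src TYPE STANDARD TABLE OF roosource,
            ls_src TYPE roosource.

      " Read all versions of the DataSource header record.
      SELECT * FROM roosource INTO TABLE lt_src
        WHERE oltpsource = '9ADEMAND_PLAN'.

      IF sy-subrc <> 0.
        WRITE: / 'No DataSource header exists under this name.'.
      ELSE.
        " OBJVERS 'A' = active, 'M' = modified. If only an 'M' version
        " exists, the DataSource will not appear in RSA3/RSA6 until it
        " is activated, e.g. by regenerating it from the planning area.
        LOOP AT lt_src INTO ls_src.
          WRITE: / ls_src-oltpsource, ls_src-objvers.
        ENDLOOP.
      ENDIF.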

  • SNP - Planning Book/Data View not showing up in /SAPAPO/SDP94

    Hi All:
    I created a planning book from a custom SNP planning area. While the planning book appears in the overall list of planning books, I could not see the custom planning book in /SAPAPO/SDP94 - Interactive Supply Network Planning (all Books); I could only see the SAP-supplied planning books there. Does the SDP94 transaction display only the standard SNP planning books? How can I make sure that the custom planning book is displayed in /SAPAPO/SDP94?
    Please help.
    Thanks
    Ryan

    Hi Narayanan,
    a couple of simple checks..
    Can you please check that you have not set a filter criterion in the shuffler of the planning book/data view?
    You could also check whether, by mistake, only a limited number of users are assigned to this planning book and data view.
    Otherwise, you may want to check SAP Note 327371.
    Let me know if this helps.
    Rgds, Sandeep

  • Rename & deletion of planning book & Data view

    Hi Experts,
    Is it possible to rename and delete a planning book and data view? If yes, what is the procedure?
    Please also suggest whether hiding them is possible.

    Hello  jrd333
    You can delete a planning book/data view via transaction /SAPAPO/SDP8B.
    There you can choose the PB/DV that needs corrections.
    You may also need to deactivate the planning area to which the PB is assigned.
    You can do this in transaction /SAPAPO/MSDP_ADMIN.
    This applies to custom planning books/views.

  • Long time for loading Planning books/data views

    Hi
    Could someone offer some pointers/solutions for the extremely long time it takes for a planning view (data view) to load and populate for certain selection profiles that already have a large number of transactional records? In my case it is specifically the runtime dumps thrown for a popular combination (a large number of transactions).
    Urgent suggestions/solutions required; even a few keywords, transaction codes, etc. would help.
    Thanks

    I hope you don't have too many macros hogging memory in the interactive book. Another thing is to be on the latest liveCache (LC) build. Also try to use the default units of measure (keep the data view settings for UoM blank).
    Can you confirm whether you built the TLB view brand new in 5.0, or did you migrate the books? The first opening of a book takes longer due to compilation of certain objects. In SM50, try to identify where the process takes longest: at the client end, in the DB procedure of liveCache, or at the application level.

  • Best Practices for loading large sets of Data

    Just a general question regarding an initial load with a large set of data.
    Does it make any sense to use a materialized view to aid load times for an initial load, or do I simply let the query run for as long as it takes?
    Just looking for advice on what is the common approach here.
    Thanks!

  • What is the best practice for APO Demand Planning

    Hi,
    My client wants to implement demand planning.
    The client has come up with a scenario: a new customer is created in ECC, and if I use the BI-then-APO flow (ECC -> BI -> APO-BI) for demand planning, the user will have to wait another day (as BI always has a one-day delay).
    For this scenario the user is insisting on a direct ECC to APO-DP interface.
    Can anybody suggest what the best practice for demand planning should be?
    ECC -> Standalone BI -> Planning area (planning is done in APO) -> Standalone BI
    Or: ECC -> APO-DP (planning is done in APO) -> Standalone BI system
    I hope I am able to explain my scenario.
    Regards,
    ST

    Hi Sujoy,
    Thanks for the reply.
    (1) I have to get Sales Order data from ECC into BI standalone system Cube.
    (2) Then from this cube same data is sent to SCM APO - BI Cube.
    (3) Planning will be done in SCM APO.
    (4) Planned data is sent back to the standalone BI system, using a data mart on the planning area.
    (5) In BI we will have reporting on a cube that has both ECC sales order data and APO planned data.
    But in this case there is always a delay between data loads (first ECC -> BI, then BI -> APO, then APO -> BI).
    If a new customer is created, the client wants to see demand planning on the latest data.
    How do we do APO DP generally?
    Do we follow the route through BI, or go directly between ECC and APO?
    Hope I am able to explain the scenario..

  • BPC:NW - Best practices to load Transaction data from ECC to BW

    I have a very basic question about loading G/L transaction data into BPC for a variety of purposes; it would be great if you could point me toward best practices/standard ways of building such interfaces.
    1. For Planning
    When we are doing the planning for cost center expenses and need to make variance reports against the budgets, what would be the source Infocube/DSO for loading the data from ECC via BW, if the application is -
    YTD entry mode:
    Periodic entry mode:
    What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
    Based on the data entry mode of the planning application, what is the best way to make use of the 0BALANCE or debit/credit key figures on the BI side?
    2. For consolidation:
    Since we need trading partner information, what are the best practices for loading the actual data from ECC?
    What are the typical mappings to be maintained for movement types against flow dimensions?
    I have seen multiple threads with different responses, but I am looking for the best practices and the scenarios you are using to load such transactions from the OLTP system. I would really appreciate some functional insight into these scenarios.
    Thanks in advance.
    -SM

    For - planning , please take a look at SAP Extended Financial Planning rapid-deployment solution:  G/L Financial Planning module.   This RDS captures best practice integration from BPC 10 NW to SAP G/L.  This RDS (including content and documentation) is free to licensed customers of SAP BPC.   This RDS leverages the 0FIGL_C10 cube mentioned above.
      https://service.sap.com/public/rds-epm-planning
    For consolidation, please take a look at SAP Financial Close & Disclosure Management rapid-deployment solution.   This RDS captures best practice integration from BPC 10 NW to SAP G/L.  This RDS (including content and documentation) is also free to licensed customers of SAP BPC.
    https://service.sap.com/public/rds-epm-fcdm
    Note:  You will require an SAP ServiceMarketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
    The RDS documentation discusses the how/why of best-practice integration. You can also contact me directly at [email protected] for consultation.
    We are also in the process of rolling out the updated 2015 free training on these two RDS.  Please register at this link and you will be sent an invite.
    https://www.surveymonkey.com/s/878J92K
    If the link is inactive at some point after this post, please contact [email protected]

  • Error while loading the planning book

    Hi There,
    I am getting the error below while loading data in the planning book:
    "Error for COM routine using application program (return code 40,016)
    Error reading data - Planning book cannot be processed further"
    I ran /SAPAPO/TS_LCM_CONS_CHECK and /SAPAPO/TS_LCM_REORG, but that did not help; I am still getting the same error.
    Could you please advise?
    Thanks,
    Krishna

    Hi Krishna,
    It looks like a key figure might have been created incorrectly, or there might be a problem in the aggregation/disaggregation of key figures due to a very large number of CVCs.
    Regards
    JB

  • Planning Book Data outside planning horizon

    Hi Experts,
    I am new to APO. The message "Data partly after horizon 01.04.2010 to 01.04.2016" comes up when accessing the planning book in our production system, but not in the testing/quality system (refer to the attachment).
    From going through the forum, it appears we have to extend the planning area using program /SAPAPO/TS_PAREA_INITIALISE.
    To perform this activity, is a backup of the data to a cube compulsory?
    Please let me know if there are any other steps to follow to resolve this issue; because of it, the users are not able to access the planning book.
    Thanks & Regards,
    Srihari.M

    Hi Srihari,
    Check the horizon of the planning area and compare it with your data view: right-click on the planning area and select "Created Time Series Objects".
    Your data view should not cross the end date shown there, which is the reason for your error. Also check the past horizon.
    Please check and confirm
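    If the time series objects really do have to be extended, the standard programs named in this thread can be run in sequence from a small wrapper; a minimal sketch, where the wrapper report name is hypothetical and each selection screen is filled manually when it appears. Re-initializing time series objects can affect data outside the new horizon, so it is generally advisable to back up the planning area first, e.g. by extracting it to a cube:

      REPORT z_extend_planning_horizon.

      " Consistency check of the time series objects first.
      SUBMIT /sapapo/ts_lcm_cons_check VIA SELECTION-SCREEN AND RETURN.

      " Then re-create the time series objects with the extended horizon.
      SUBMIT /sapapo/ts_parea_initialise VIA SELECTION-SCREEN AND RETURN.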
    Regards,
    JB

  • Virtual Cube to load APO Planning area (LC)?

    Hello,
    Does anyone know if it is technically possible to use a virtual cube in APO/BW to load the APO planning area (liveCache)? It would be great to have some SAP documentation to support this.
    Thanks,
    Chad

    Thanks for the reply. I'm actually looking to source data from a non-SAP system and would like to explore the option of using a virtual cube connected to the external system via SAP UDC (Universal Data Connector). The data could be loaded into a basic cube, but this would mean data redundancy. If that can be avoided by using a virtual cube, I would prefer that method. I just wasn't sure whether SAP APO would allow the data to be loaded into liveCache from a virtual cube. I do like the BAPI option as well. If a virtual cube with services is used, is an ABAP function module required to get the data?

  • Best Practice on using and refreshing the Data Provider

    I have a "users" page that lists all the users in a table; let's call it the master page. One can click on the first column of the master page to get to the "detail" page, where one can view and update the user's details.
    Master and detail use two different data providers based on two different CachedRowSets.
    Master CachedRowSet (session scope): SELECT * FROM Users
    Detail CachedRowSet (session scope): SELECT * FROM Users WHERE User_ID=?
    I want the master to be updated whenever the detail page is updated. There are various options to choose from:
    1. I could call masterDataProvider.refresh() after I call the detailDataProvider.commitChanges() - which is called on the save button on the detail page. The problem with this approach is that the master page will not be refreshed across all user sessions, but only for the one saving the detail page.
    2. I could call masterDataProvider.refresh() in the preRender() event of the master page. The problem with this approach is that refresh() will be called every single time someone views the master page. Furthermore, if someone goes to the next page (using the built-in pagination on the master page table), clicks on a user to view its detail, and then closes the detail page, the table does not keep track of the pagination (what page the user was on when he/she clicked on a record to view its detail).
    I can find some workaround to resolve this problem, but I think this should be a fairly common usage (two-page CRUD with master-detail). If we can discuss and document some best practices for doing this, it will help all developers.
    Discussion:
    1. What is the best practice for setting the scope of the data providers and CachedRowSets? I noticed that in the tutorial examples, they used page/request scope for the data provider but session scope for the associated CachedRowSet.
    2. What is the best practice for refreshing the master data provider when a record/row is updated in the detail page?
    3. How do we keep track of pagination (what page the user was on when he/she clicked on the first column in the master page table), so that upon updating the detail page, we can provide the user with a "Close" button to take them back to whatever page number he/she was on?
    Thanks

    Thanks. I think this is useful information for all. Do we even need two data providers and associated row sets? Can't we just use TableRowDataProvider, like this:
    TableRowDataProvider rowData = (TableRowDataProvider) getBean("currentRow");
    If so, I am trying to figure out how to pass this from the master to the detail page. Essentially the detail page uses a row from the master data provider. Then I need the user to be able to change the detail (row) and save the changes (to the table). This is a fairly common issue in most data-driven web apps. I need to design it right, versus just coding.
