Virtual Cube to load APO Planning area (LC)?

Hello,
Does anyone know if it is technically possible to use a virtual cube in APO/BW to load the APO planning area (liveCache)?  It would be great to have some SAP documentation to support this.
Thanks,
Chad

Thanks for the reply.  I'm actually looking to source data from a non-SAP system and would like to explore the option of using a virtual cube connected to the external system via SAP UD Connect (Universal Data Connect).  The data could be loaded into a basic cube, but this would mean data redundancy.  If this can be avoided by using a virtual cube, I would prefer that method.  I just wasn't sure if SAP APO would allow the data to be loaded into liveCache from a virtual cube.  I also like the BAPI option.  If a virtual cube with services is used, is an ABAP function module required to get the data?

Similar Messages

  • Best practices for loading apo planning book data to cube for reporting

    Hi,
    I would like to know whether there are any Best practices for loading apo planning book data to cube for reporting.
    I have seen 2 types of Design:
    1) The Planning Book Extractor data is loaded first to a cube within the APO BW system, and is then transferred to the actual BW system. Reports are run from the actual BW system cube.
    2) The Planning Book Extractor data is loaded directly to a cube within the Actual BW system.
    We do these data loads during evening hours once in a day.
    Rgds
    Gk

    Hi GK,
    What I have normally seen is:
    1) Data would be extracted from APO Planning Area to APO Cube (FOR BACKUP purpose). Weekly or monthly, depending on how much data change you expect, or how critical it is for business. Backups are mostly monthly for DP.
    2) Data extracted from APO planning area directly to DSO of staging layer in BW, and then to BW cubes, for reporting.
    For DP monthly, SNP daily
    You can also use the option 1 that you mentioned below. In this case, the APO cube is the backup cube, while the BW cube is the one that you could use for reporting, and this BW cube gets data from APO cube.
    The benefit in this case is that we have to extract data from the planning area only once, so the planning area is available for jobs/users for more time. However, backup and reporting extraction get mixed in this case, so issues in the flow could impact both the backup and the reporting. We have used this scenario recently and have yet to see the full impact.
    Thanks - Pawan

  • Delta load from planning area to third party system

    Hi All,
    Generally we send a full data load from the DP planning area to a backup cube in the BW system. But suppose we need to send a delta load from the DP planning area to a third-party BW system: what changes do we need to make on the DP planning area side so that we send all the changed data from liveCache? Can this be done in a standard way?
    Thanks

    Hi Tarun,
    It's not possible to extract delta loads from the APO planning areas. Delta loads are only possible with standard BW objects, and a planning area is not a BW object.
    SAP doesn't provide the functionality for delta loads when you use an export DataSource based on the planning area.
    PS: From a backup point of view, a delta load would not make much sense anyway. We take the backup as a snapshot at a particular time, and if restoration is required, we restore the relevant data from the cube to the planning area based on the request number or the date of extraction.
    Thanks - Pawan

  • Datasource on APO Planning Area - Transportation Error

    Hi All,
                 I have created a DataSource on an APO planning area. The DataSource works fine when checked in RSA3 and also on the BW side. When transporting the DataSource from APO Dev to APO QA I get the following error and the transport fails. Please suggest.
    Thanks
    Christopher
       Execution of programs after import (XPRA)
       Transport request   : AD1K909333
       System              : AQ3
       tp path             : tp
       Version and release: 372.04.10 700
       Post-import methods for change/transport request: AD1K909333
          on the application server: hllsap112
       Post-import method RSA2_DSOURCE_AFTER_IMPORT started for OSOA L, date and time: 20080725125524
       Execution of "applications after import" method for DataSource '9ADS_PP_APO'
       Import parameter of AI method: 'I_DSOURCE:' = '9ADS_PP_APO'
       Import parameter of AI method: 'I_OBJVERS:' = 'A'
       Import parameter of AI method: 'I_CLNT:' = ' '
       Import parameter of AI method: 'LV_CLNT:' = '100'
       DataSource '9ADS_PP_APO': No valid entry in table /SAPAPO/TSAREAEX
       Planning area for DataSource '9ADS_PP_APO' does not exist in target system
       Extract structure /1APO/EXT_STRU100002737 is not active
       The extract structure /1APO/EXT_STRU100002737 of the DataSource 9ADS_PP_APO is invalid
       Errors occurred during post-handling RSA2_DSOURCE_AFTER_IMPORT for OSOA L
       RSA2_DSOURCE_AFTER_IMPORT belongs to package RSUM
       The errors affect the following components:
          BC-BW (BW Service API)
       Post-import method RSA2_DSOURCE_AFTER_IMPORT completed for OSOA L, date and time: 20080725125532
       Post-import methods of change/transport request AD1K909333 completed
            Start of subsequent processing ... 20080725125524
            End of subsequent processing... 20080725125532
       Execute reports for change/transport request: AD1K909333
       Reports for change/transport request AD1K909333 have been executed
            Start of................ 20080725125532
            End of.................. 20080725125532
       Execution of programs after import (XPRA)
       End date and time : 20080725125532   Ended with return code:  ===> 8 <===

    Christopher,
    There seems to be no extract structure available for this DataSource in quality, and that is causing the problem. The extract structure created in your scenario is stored in a temporary folder and is not available for transport. You therefore need to have the DataSource generated in quality first, and then transport the active version so that it carries the same changes as in development.
    Regards
    Vijay

  • Help Required for Mapping Key figures from Cube to APO Planning area.

    Hello Experts,
    We have created a cube in APO BW and now we want to map it to a planning area. How can we do this mapping?
    Can anybody explain how we can map key figures?
    What is liveCache used for, and how is it updated?
    Regards
    Ram

    Hi,
    I am not very sure about the 9ARE aggregate (I haven't used it in backups), but RTSCUBE is used to copy time series (TS) key figure data from a cube to a planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then map the characteristics from the cube to the planning area in RTSCUBE, and also map the TS key figures between the cube and the planning area.
    If your key figure is not a time series key figure, then you can't copy it from the cube to the planning area. You could get the data into a cube for some reporting; otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • Back up cube for report from planning area

    Hi Gurus,
            We are in an implementation, and the client wants to see daily data, then delete that data and load the next day's data. In addition, the client wants monthly data in the same cube; both facilities should be in a single InfoCube.
    How is this possible?
    For further info, we have 0CALDAY and 0CALMONTH time characteristics.
    Please suggest some good ideas.
    I will be very thankful for your suggestions.
    Thanks a Lot
    Regards,
    Raj

    Hi Raj,
    You can use SAP BW to create reports for the user. You have the option of using an external BW system or the BW system that is coupled with the APO server. If you have an external BW system, then it is recommended that you report from there. As you may know, APO is an OLTP system and is configured as such. The external BW system will be configured for OLAP access and would thus be much more suitable for reporting purposes.
    You may want to create a SAP RemoteCube so that the data in your report is "as fresh as possible".
    Here are the steps if you will be using the BW component in the APO system:
    1) Create an export datasource for your planning area. (Transaction /n/SAPAPO/SDP_EXTR)
    2) Create an InfoSource and attach the DataSource you have created in step 1. (transaction RSA1, Modelling tab)
    3) Create a SAP RemoteCube and attach your InfoSource and source system to it.
    4) Create a BEx query and a BEx report (either in Web or Excel).
    If you will be using an external BW system here are the steps:
    1) Create an export datasource for your planning area in the APO system (Transaction /n/SAPAPO/SDP_EXTR)
    In the external BW system:
    2) Replicate the DataSource you have created (transaction RSA1, Modelling tab, Source Systems -> choose the APO system -> right-click -> Replicate DataSources).
    3) Create an InfoSource and attach the DataSource that you replicated in step 2. (transaction RSA1, Modelling tab)
    4) Create a SAP RemoteCube and attach your InfoSource and source system to it.
    5) Create a BEx query and a BEx report (either in Web or Excel).
    Note that a RemoteCube is only suitable for a small number of users. If you will have many users, create a Basic InfoCube instead.
    In your BEx query, you can choose the granularity of your report. If you want your report aggregated to monthly level, be sure to include the 0CALMONTH InfoObject.
    Please post again if things are not clear and I will be happy to help out.

  • Issue with loading a Planning Area using TSCUBE

    Hi All:
    I tried to load the data from the production planning area (PA) into the QA PA (two different systems).
    The steps I followed were:
    1.  Create a DataSource for the production PA.
    2.  Download the data from the DataSource into an Excel spreadsheet for a specific date range.
    3.  Save the Excel file in .csv format.
    4.  Create a DataSource in QA and load the data from the .csv file; check the data.
    5.  Create an InfoCube matching the PA and load data from the DataSource in QA.
    6.  Create CVCs based on this InfoCube.
    7.  Load the PA in the QA system using this InfoCube.
    I found that the PA had data for all the key figures except Sales History and Invoice History.  I checked the InfoCube and it had the history data.  Not sure why the history is not loaded.  I have the proportional factor in the InfoCube and I have mapped the UoM to the sales history.
    Could anyone please help me understand?
    Thanks much in advance.
    narayanan

    Did you map the key figure of the cube to the key figure in the planning area?

  • OLAP Virtual cubes in PS 2010: what are their components ?

    Hi,
    I read somewhere that the 3 following OLAP Cubes in Project Server 2010 are virtual cubes built with the "aggregation" of several other cubes. What are these "primary" cubes?
    Thanks
    MSP_Portfolio_Analyzer
    MSP_Project_SharePoint
    MSP_Project_Timesheet

    To understand the Total fields and Dimensions available in the 14 OLAP cubes in Project Server 2010, it is helpful to group the cubes by the Total fields and Dimensions they have in common with one another. Therefore, consider the following logical groupings for the fourteen OLAP cubes:
    Assignment: The Assignment Non Timephased, Assignment Timephased, MSP Portfolio Analyzer, and MSP Project Timesheet OLAP cubes contain assignment data. Each of these cubes includes many of the same Total fields (such as Actual Cost, Actual Overtime Cost, Actual Work, and Actual Overtime Work) and many of the same Dimensions (such as Assignment Owner and Project List).
    Project Workspace: The Deliverables, Issues, MSP_Project_SharePoint, and Windows SharePoint Services (WSS) cubes contain project workspace data. This data includes Risk, Issue, Document, and Deliverable information about each project.
    Timesheet: The EPM Timesheet and Timesheet OLAP cubes contain timesheet data. You can use these two cubes to report on Actual Work Billable and Actual Work Non Billable, for example.
    Project: The Project Non Timephased OLAP cube contains project data. Although this cube has some information in common with several others, the overlap is so limited that I include it in its own group.
    Resource: The Resource Non Timephased and Resource Timephased OLAP cubes contain resource data. These two cubes do not share any Total fields, but do share several Dimensions (such as Booking Type and Resource List).
    Task: The Tasks Non Timephased OLAP cube contains task data. It has only one Total field and six Dimensions, two of which it shares with other cubes. Because it contains so little information, and has so little in common with other cubes, I include it in its own group.
    Put another way:
    MSP_Project_Timesheet: combines the Assignment Timephased, Resource Timephased, and EPM Timesheet cubes.
    MSP_Project_SharePoint: combines the Project Non Timephased, Issues, Risks, and Deliverables cubes.
    MSP_Portfolio_Analyzer: combines the Assignment Timephased and Resource Timephased cubes.
    kirtesh

  • Selected columns in APO planning area

    I have a requirement to copy the data from the columns selected by the user and paste it into another column. My question is: using ABAP code, how can I find out whether a column is selected, or which columns are selected?

    SVK,
    I don't know about ABAP, but Macro function COLUMN_MARKED() should do the trick.
    http://help.sap.com/saphelp_scm70/helpdata/EN/4b/592a4900e2584ae10000000a42189c/frameset.htm
    Best Regards,
    DB49

  • Loading data from Cube to Planning area

    Hi,
             If I am loading data from a cube to a planning area using transaction TSCUBE, does the system load data into the planning area only for the combinations that exist in the cube, or does it load for all CVCs?
    For example,
    I have my CVC as Plant, Material, Customer.
    If there are 4 CVCs in the POS that were previously generated as:
    Plant--Material--Customer
    01--M1--C1
    01--M2--C3
    01--M2--C2
    01--M4--C5
    and the cube has data like this:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    (it does not have the last combination), then if I use the TSCUBE transaction to load data into the planning area from this cube, is the data loaded as:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    i.e. only for the 3 combinations that exist in the cube, with nothing loaded for the last one,
    OR is the data loaded as:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    01--M4--C5--0
    i.e. all 4 combinations, sending 0 for the one the cube does not have?
    Hope I am clear on this question.
    Thanks.
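    To make the two possibilities in the question concrete, here is a small, hypothetical sketch (Python, not SAP code; the data values are taken from the example above, and the two behaviors are labeled A and B purely for illustration):

```python
# Hypothetical illustration of the two possible load behaviors.
# CVCs that exist in the planning object structure:
cvcs = [("01", "M1", "C1"), ("01", "M2", "C3"),
        ("01", "M2", "C2"), ("01", "M4", "C5")]

# Data present in the cube (the fourth CVC has no record):
cube = {("01", "M1", "C1"): 10,
        ("01", "M2", "C3"): 20,
        ("01", "M2", "C2"): 5}

# Behavior A: write only the combinations found in the cube.
load_a = {cvc: qty for cvc, qty in cube.items()}

# Behavior B: write every CVC, defaulting missing combinations to 0.
load_b = {cvc: cube.get(cvc, 0) for cvc in cvcs}

print(len(load_a))                  # 3 combinations written
print(load_b[("01", "M4", "C5")])   # 0 written for the missing combination
```

    Under behavior A the fourth CVC keeps whatever value it already had in the planning area; under behavior B it is overwritten with 0. The rest of the thread hinges on which of these the system actually does.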

    Thanks a lot Vinod, Srinivas and Harish. The reason why I am asking is that we have a scenario where this situation occurs.
    We initially get data from R/3 to BW to APO like this:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    Later, when the customer changes or is bought out by somebody, C1 is changed to C2. Sometimes, when the business does not initially know who the customer is, they just put C1 as a dummy and then replace it with C2 after some time. Then the new record coming in is:
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    BW can identify changes in transaction data but not in master data. What I mean is: when the quantity changes from 10 to 20, the system can identify it in deltas. But if the customer (master data) changes from C1 to C2, the system treats it as a completely new record, so if I use delta loads I get the following:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M1--C2--10
    If I look at Plant and Material level, my data is doubled.
    So we are planning to do a full load, which works like this:
    1. Initial data:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    The CVC is created and the planning area has Qty. 10.
    Then we delete the contents of the cube and do a full load into the cube with the changed customer:
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    This time a new CVC is created, and another 10 is loaded into the planning area.
    If the system loads all CVCs, it would send:
    Plant--Material--Customer--Qty.
    01--M1--C1--0
    01--M1--C2--10
    If the system loads only the combinations in the cube, it loads:
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    But the system already has another 10 for customer C1, duplicating the values.
    We are in trouble in the second case.
    We had to go for this solution instead of realignment because our business has no way of knowing that C1 was replaced by C2.
    Hope I am clear.
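    The doubling effect described above can be sketched as follows (a hypothetical Python illustration, not SAP code): under delta logic a changed customer looks like a brand-new key, so the old record survives alongside the new one, while a full load replaces the contents entirely.

```python
# Hypothetical sketch of why a master-data change doubles delta-loaded data.
planning_area = {}

def delta_load(records):
    # A delta load only adds/updates the keys it receives;
    # it never removes keys that are absent from the delta.
    planning_area.update(records)

def full_load(records):
    # A full load replaces the previous contents entirely.
    planning_area.clear()
    planning_area.update(records)

# Initial load: customer C1.
delta_load({("01", "M1", "C1"): 10})

# Customer master data changes C1 -> C2; the delta carries only the "new" key.
delta_load({("01", "M1", "C2"): 10})

# At plant/material level the quantity is now doubled:
total = sum(qty for (plant, mat, cust), qty in planning_area.items()
            if (plant, mat) == ("01", "M1"))
print(total)  # 20 instead of the expected 10

# A full load with the corrected customer avoids the duplication:
full_load({("01", "M1", "C2"): 10})
```

    This is only a model of the key behavior, not of TSCUBE itself; whether the old CVC is zeroed out in the planning area depends on the load behavior discussed earlier in the thread.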

  • APO DP - loading from InfoCube to planning area

    I am using APO DP V5.
    I have the following situation:
    1. I am extracting sales history data at the end of each day from a connected ECC system into an InfoCube, using delta logic, so in the Cube I just have the new sales history transactions (despatches) for the day
    2. I am then loading data from the Cube to my DP planning area via transaction /SAPAPO/TSCUBE
    I assume I must have the 'add data' flag set for each key figure in this transaction, to ensure I see the consolidated sales history in the planning area.
    Is this the 'best practice' approach for regular loading of sales history? I assume it's better to have a 'delta' approach for the Cube, for improved performance.
    Thanks, Bob Austin

    Hi,
            Good questions!
    1. What does the 'period' really refer to? Is it referring to the date of a particular key figure? Or the 'date of data being added to Cube'?
    A: Both are the same.
    The date is generally the date in your cube, like the calendar day, month etc. This date is in turn based on a time stamp in the format DDMMYYYYHHMMSS; the calendar day is part of this field as DDMMYYYY. The system also recognizes changes by the same time stamp. So if a customer changes the quantity on 05/15 at 3:30 pm, the time stamp is 15052007153000. The calendar day in your cube (your key figure date) is 15052007, and the delta is recognized by changes in the time stamp between the last load and the current system time. So you are talking about the same time field.
    Check whether this is also the case in your system, and let me know if not.
    2. Suppose original dispatch qty = 100 (two weeks ago), and 'today' the customer returns a qty of 60; how does system handle this - I would want this posted to the original date of two weeks ago.
    A: The data from your ECC system is generally brought to an ODS first. The reason is that we overwrite the data there if a record has the same key. If your key for the ODS is customer and division, then you overwrite the customer quantity for that division whenever the value changes. If you need it by time, say per month, include the time in the key; the system then overwrites the value for that month only, and the next month is a new record.
    In your case, if the quantity was 100 two weeks ago and is now 60, and the time stamp is not in the key, the system overwrites it to 60, and you have only 60 when you load it to your ODS and from there to your PA. Delete the delta in your ODS and it shows the same 100 again; then load it to the PA. This is not a good option. The alternative is to include a time characteristic like calweek in your ODS key and load it over to the cube. That way you have two records.
    I hope I answered all your questions. Please feel free to ask more; I would love to answer, and that way I can also brush up on things I haven't used in a while.
    Thanks.
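    The ODS overwrite behavior described above can be sketched like this (hypothetical Python, not SAP code): whether the return overwrites the original posting depends entirely on whether a time characteristic is part of the key.

```python
# Hypothetical sketch of ODS/DSO overwrite semantics.
def load_to_ods(ods, records, key_fields):
    # Records with the same key overwrite; different keys create new rows.
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        ods[key] = rec["qty"]
    return ods

records = [
    {"customer": "C1", "division": "D1", "calweek": "2007W18", "qty": 100},
    {"customer": "C1", "division": "D1", "calweek": "2007W20", "qty": 60},
]

# Key without time: the second posting overwrites the first.
ods1 = load_to_ods({}, records, ["customer", "division"])
print(list(ods1.values()))    # [60]

# Key including the week: both postings are kept as separate rows.
ods2 = load_to_ods({}, records, ["customer", "division", "calweek"])
print(sorted(ods2.values()))  # [60, 100]
```

    The field names here are invented for illustration; the point is only that including a time characteristic in the key preserves history instead of overwriting it.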

  • Load from cube to planning area

    Hi,
    We are facing a problem loading data from a cube to a planning area: we need to distinguish between zero and blank values in the planning area when loading.
    The scenario is like this:
    For a CVC A, on day D the key figure value is blank, on D+1 it is zero, and on D+2 it is 2. I want the same to appear in my planning area. If I load this data into the cube, my day D value will appear as zero (but it was blank), D+1 as zero, and D+2 as 2 in the cube.
    I am using the standard transaction /SAPAPO/TSCUBE to load data from the cube to the planning area. The standard transaction has an 'ignore zero' option, but if I use it, it also stops the actual zero value (which I need) on day D+1 from reaching the planning area, which I don't want.
    I want my planning area data to be blank on day D, zero on D+1, and 2 on D+2 after loading data from the cube to the planning area.
    Can any one put some light and help me out from this issue.
    With regards,
    Sreerama

    Hi Sreerama,
    I am not sure I completely understand the issue: do you have the problem in the cube or in the planning area?
    In order to differentiate a blank from a zero in the planning area, you need to set the flag "zero allowed" in the planning area settings (in the key figure tab, click Details; then for each key figure you can select the "zero allowed" flag or not).
    When you load the data, you should indeed not set the flag "ignore zero value".
    If the issue is in your cube (between the file and the cube), then that is another matter.
    Kind Regards,
    Julien
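    A rough analogy for the "zero allowed" setting (hypothetical Python, not SAP code): the key figure must be able to store "no value" and an explicit 0 as distinct states, and the load must not discard genuine zeros.

```python
# Hypothetical sketch: distinguishing "blank" (no value) from an explicit 0.
# None stands for a blank cell, 0 for a genuine zero posting.
cube_values = {"D": None, "D+1": 0, "D+2": 2}

def load(values, ignore_zero):
    result = {}
    for day, v in values.items():
        if v is None:
            continue            # blanks are simply not written
        if ignore_zero and v == 0:
            continue            # "ignore zero" drops genuine zeros too
        result[day] = v
    return result

# With "ignore zero" the real zero on D+1 is lost:
print(load(cube_values, ignore_zero=True))   # {'D+2': 2}

# Without it, blank and zero stay distinct, as the answer recommends:
print(load(cube_values, ignore_zero=False))  # {'D+1': 0, 'D+2': 2}
```

    This mirrors the advice above: allow zeros in the key figure settings and leave "ignore zero value" unset, so only true blanks are skipped.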

  • DP Planning Area Load using Attributes

    We have an APO DP planning area that is composed of two characteristics ("Material" and "Mill").  All the other data are attributes of one of these characteristics.  We chose this for the flexibility of attributes compared to characteristics and were under the impression that this was "the best practice".
    I have two questions regarding this:
    1)  We now have a requirement to populate a new key figure in DP from our sales forecast, which is done in BI-IP Planning.  The sales forecast is done at the "Brand" level, which is above "Material".  It appears I am unable to load my planning area from a cube that does not have matching characteristics.   Does anyone have an idea of how to load a planning area above the characteristic level?  I think this can be done using SDP94 and the "Save Locally" and "Upload file" options, but I would like to find a smoother way so I can automate the load via a process chain.
    The error I get when I try to load via TSCUBE is:
    Planning area and InfoCube must have common characteristic
    Message no. /SAPAPO/TSM272
    Diagnosis
    You are trying to load data from an InfoCube into a planning area. The planning area and InfoCube should be similar. This means that they should have at least one common characteristic. If they do not, it is impossible to map the data.
    Procedure
    Check the planning area and InfoCube. Make the necessary changes
    ===
    2) Is using attributes to this extent a bad thing from a performance and/or enhancement perspective?  I like the fact that things adjust automatically as attributes change, but now I am worried we may have overused them.  We do notice some speed issues when accessing data at times.
    Thanks so much for any information or comments on this!

    In TSCUBE transaction use the Characteristic Assignment option to map Brand characteristic of infocube to Brand characteristic in Planning Area.
    Hope this helps.
    somnath

  • RE: Need help on cornering the APO BI issue relevant to Planning area

    Hi Guys,
    I am loading historical data from my InfoCube to the APO planning area.
    My planning will be done in APO on a weekly basis.
    For that, my client has configured a fiscal year variant which is specific to 48 periods, not 52 periods.
    My client will be planning in weeks, which start from 07.01.2010, 14.01.2010, 21.01.2010, 31.01.2010, and so on.
    For testing purposes we are taking this from a flat file and loading it into the InfoCube. We created a generic export DataSource into the SCM/APO system and loaded it into a cube, and from that cube I am feeding the planning area.
    When I execute the transaction /n/sapapo/tscube, the data is copied successfully. But when I look at the key figure amount in the planning area (transaction /n/sapapo/sdp94), it is distributed across the weeks.
    Let's say my key figure value is 100 in the InfoCube for the month of January, week 1. The value I see in the planning area is week 1: 25, week 2: 25, week 3: 25, week 4: 25, which totals 100 for the month. But it should not be like that: the 100 should go into one particular week and be displayed as 100 for that week.
    I have calmonth, calday, and fiscper (posting period), which we have maintained in OB29 as 48 periods.
    When I derive calweek in the transformation I get 48 weeks, but when I try to load to the planning area I get an error like "combination is inconsistent" with the calmonth.
    Code with which I have derived calweek from calday:
    DATA: lv_year(4)   TYPE c,
          lv_month(2)  TYPE c,
          lv_day(2)    TYPE c,
          lv_result(6) TYPE c,
          v_poper      TYPE poper.

*   Split CALDAY (format YYYYMMDD) into its components.
    lv_year  = SOURCE_FIELDS-calday+0(4).
    lv_month = SOURCE_FIELDS-calday+4(2).
    lv_day   = SOURCE_FIELDS-calday+6(2).

*   Look up the posting period for this month/day in table T009B.
    SELECT SINGLE poper FROM t009b INTO v_poper
      WHERE bumon = lv_month
        AND butag = lv_day.

    IF sy-subrc = 0.
*     Build the result as YYYYPP (year + two-digit period).
      CONCATENATE lv_year v_poper+1(2) INTO lv_result.
      RESULT = lv_result.
      CLEAR lv_result.
    ENDIF.
    Gurus, can anybody throw some light on this? I will be highly obliged.
    When I load the data from the InfoCube to the planning area using /SAPAPO/TSCUBE, the copy is successful, but the issue is that the key figure value is disaggregating.
    For example, if my sales history for week 1 (01.2010) and for calmonth 01.2010 is 100, it disaggregates the value over the whole month's 4 weeks as 25, 25, 25, 25, but it needs to be written as 100 for week 1; it should stay aggregated as 100 for week 1.
    Do I need to check any characteristic combinations? They all appear consistent.
    Even the periodicities in the planning area and InfoCube are consistent, since I am able to copy into the planning area.
    I don't have calweek in my InfoCube; I derived calweek with the logic provided earlier in this thread. But as of now I am going with calyear, calmonth and fiscper3 (posting period), since 48 posting periods are maintained in OB29 / the T009B table.
    Do I need to implement any OSS notes for this? If I include calweek and calmonth and try to copy into the planning area, I get the error "periodicities are not matching" (/SAPAPO/TSM 232); see SAP Note 1408753 - /sapapo/tscube wrong error message.
    Regards
    Balaram

    Thanks for replying to the thread.
    Where do I maintain the planning object structure (storage bucket profile) and planning book data view (time bucket profile) with the time periodicities specific to the fiscal year variant and posting period?
    Can you please elaborate on this? I am new to the APO BW implementation part, and this is a burning issue.
    What settings do I actually need to make there? I think the InfoCube structures are good, since the data copies into the planning area.
    I have calmonth and fiscper3 for 48 periods in my InfoCube.
    Also, do I need to maintain a calweek in the planning object structure (storage bucket profile) and planning book data view (time bucket profile)?
    When I checked the key figure overview in the planning book, it is maintained there for 48 periods.
    For the planning book data view (time bucket profile), how can we achieve this?
    If you could throw some more light on this: I have an APO counterpart and will ask him to change the master planning object structure accordingly.
    Regards
    Ram

  • How is data loaded from Infocube to Planning area - need technical info

    I would like to find out how data is loaded from a cube to a planning area.
    I know that a cube has tables, but how does that data get loaded into the planning area? Where is the mapping with which I can compare the data in the cube against the same data that reaches the planning area?
    Say, for example, I have the values below in the InfoCube:
    Prod1 --> Loc1 --> available provisioning qty
    AAA --> AB90 --> 100
    Then where do I check in the planning area tables (are there any mapped tables) after running the TSINPUT program? I know it can be checked in the planning book, but I want to check in the planning area tables, if they exist.

    Hi,
    The data is loaded from the InfoCube to the planning area using update rules. The issue you have mentioned seems to be that two requests contain data for the same CVCs in the cube.
    Example: for the same CVC, two requests are available in the InfoCube. When you copy the data using the TSCUBE transaction, whatever data is available in the cube for the CVC gets copied into the planning area.
    CVC1 - cube - old request - 100
    CVC1 - cube - actual request - 200
    CVC1 - planning area = 300 (the value is supposed to be 200, but since the old request also contains data in the cube for the CVC, it gets copied as 300)
    Issue: two requests might contain data for the same CVC.
    Solution: 1. Check the data in the cube using transaction LISTCUBE.  2. Delete the old request and check the data again.  3. If it matches your requirement, run TSCUBE again.
    Please let me know if you need additional information.
    Thanks,
    Jeysraj
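    The request-overlap effect described above can be sketched as follows (hypothetical Python, not SAP code; the request names are invented for illustration):

```python
# Hypothetical sketch: the copy takes the cube total per CVC, so overlapping
# requests for the same CVC are summed into the planning area.
requests = [
    {"request": "OLD_REQ", "cvc": "CVC1", "qty": 100},
    {"request": "NEW_REQ", "cvc": "CVC1", "qty": 200},
]

def tscube_copy(cube_requests):
    planning_area = {}
    for rec in cube_requests:
        cvc = rec["cvc"]
        planning_area[cvc] = planning_area.get(cvc, 0) + rec["qty"]
    return planning_area

print(tscube_copy(requests))  # {'CVC1': 300} -- not the expected 200

# After deleting the old request, the copy gives the intended value:
cleaned = [r for r in requests if r["request"] != "OLD_REQ"]
print(tscube_copy(cleaned))   # {'CVC1': 200}
```

    This is why the suggested fix is to inspect the cube with LISTCUBE and delete the stale request before running the copy again.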
