Selected columns in APO planning area

I have a requirement such that I have to copy the data from the columns selected by the user and paste it into another column. My question is: using ABAP code, how can I find out whether a column is selected, or which columns are selected?

SVK,
I don't know about ABAP, but Macro function COLUMN_MARKED() should do the trick.
http://help.sap.com/saphelp_scm70/helpdata/EN/4b/592a4900e2584ae10000000a42189c/frameset.htm
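For what it's worth, the macro usually ends up looking something like this in the MacroBuilder (pseudo-notation only; ACT_COLUMN() and the 'Source KF'/'Target KF' rows are placeholders for your own planning book objects):

  Macro step (iterating over all columns of the data view):
    IF
      COLUMN_MARKED( ACT_COLUMN() )    "true when the user has selected this column
    THEN
      Target KF ( ACT_COLUMN() ) = Source KF ( ACT_COLUMN() )
    ENDIF

That way the copy fires only for the columns the user marked.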
Best Regards,
DB49

Similar Messages

  • Datasource on APO Planning Area - Transportation Error

    Hi All,
    I have created a DataSource on an APO Planning Area. The DataSource works fine when checked in RSA3 and also on the BW side. However, when transporting the DataSource from APO Dev to APO QA, I get the following error and the transport fails. Please suggest.
    Thanks
    Christopher
       Execution of programs after import (XPRA)
       Transport request   : AD1K909333
       System              : AQ3
       tp path             : tp
       Version and release: 372.04.10 700
       Post-import methods for change/transport request: AD1K909333
          on the application server: hllsap112
       Post-import method RSA2_DSOURCE_AFTER_IMPORT started for OSOA L, date and time: 20080725125524
       Execution of "applications after import" method for DataSource '9ADS_PP_APO'
       Import paramter of AI method: 'I_DSOURCE:' = '9ADS_PP_APO'
       Import paramter of AI method: 'I_OBJVERS:' = 'A'
       Import paramter of AI method: 'I_CLNT:' = ' '
       Import paramter of AI method: 'LV_CLNT:' = '100'
       DataSource '9ADS_PP_APO': No valid entry in table /SAPAPO/TSAREAEX
       Planning area for DataSource '9ADS_PP_APO' does not exist in target system
       Extract structure /1APO/EXT_STRU100002737 is not active
       The extract structure /1APO/EXT_STRU100002737 of the DataSource 9ADS_PP_APO is invalid
       Errors occurred during post-handling RSA2_DSOURCE_AFTER_IMPORT for OSOA L
       RSA2_DSOURCE_AFTER_IMPORT belongs to package RSUM
       The errors affect the following components:
          BC-BW (BW Service API)
       Post-import method RSA2_DSOURCE_AFTER_IMPORT completed for OSOA L, date and time: 20080725125532
       Post-import methods of change/transport request AD1K909333 completed
            Start of subsequent processing ... 20080725125524
            End of subsequent processing... 20080725125532
       Execute reports for change/transport request: AD1K909333
       Reports for change/transport request AD1K909333 have been executed
            Start of................ 20080725125532
            End of.................. 20080725125532
       Execution of programs after import (XPRA)
       End date and time : 20080725125532   Ended with return code:  ===> 8 <===

    Christopher,
    There seems to be no extract structure available for this DataSource in quality, and that is what is causing the problem. The extract structure created in your scenario is a generated (temporary) object and is not available for transport. You need to generate the DataSource in the quality system itself and then transport the active version, so that quality has the same changes as development.
    Regards
    Vijay

  • Virtual Cube to load APO Planning area (LC)?

    Hello,
    Does anyone know if it is technically possible to use a virtual cube in APO/BW to load the APO planning area (liveCache)? It would be great to have some SAP documentation to support this.
    Thanks,
    Chad

    Thanks for the reply. I'm actually looking to source data from a non-SAP system and would like to explore the option of using a virtual cube connected to the external system via SAP UDC (Universal Data Connector). The data could be loaded to a basic cube, but this would mean data redundancy. If this can be avoided by the use of a virtual cube, I would prefer that method. I just wasn't sure if SAP APO would allow the data to be loaded into liveCache from a virtual cube. I do like the BAPI option also. If a virtual cube with services is used, is an ABAP function module required to get the data?
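    If you do go the 'virtual cube with services' route: yes, BW calls a function module that you assign to the cube to fetch the data at read time. Below is a minimal sketch of what such a module tends to look like; the name is invented and the parameter list follows the commonly used SAP template, so treat the exact interface as an assumption and copy it from the template in your own system:

    FUNCTION z_vc_read_external_data.
    " Typical read interface of a VirtualProvider function module:
    "   IMPORTING  i_infoprov    TYPE rsinfoprov      - name of the virtual cube
    "              i_th_sfc      TYPE rsdri_th_sfc    - requested characteristics
    "              i_th_sfk      TYPE rsdri_th_sfk    - requested key figures
    "              i_t_range     TYPE rsdri_t_range   - selections from the query
    "   EXPORTING  e_t_data      TYPE STANDARD TABLE  - result records
    "              e_end_of_data TYPE rs_bool         - 'X' with the last package

    " Body (sketch): call the external system, e.g. over the UDC connection
    " or RFC, map the external fields to the cube's structure, and return
    " the rows in e_t_data.

    ENDFUNCTION.

    Whether /SAPAPO/TSCUBE can then load liveCache from that virtual cube is exactly the point to verify with a small prototype.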

  • Help Required for Mapping Key figures from Cube to APO Planning area.

    Hello Experts,
    We have created a cube in APO BW and now we want to map it to a planning area. Can anybody explain how we can map the key figures?
    Also, what is the use of liveCache, and how is it updated?
    Regards
    Ram

    Hi,
    I am not very sure about the 9ARE aggregate (I haven't used it in backups), but RTSCUBE is used to copy time series (TS) key figure data from a cube to a planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then map the characteristics from cube to planning area in RTSCUBE, and also map the TS key figures between cube and planning area.
    If your key figure is not a time series key figure, then you can't copy it from cube to planning area. You could get the data into a cube for some reporting, but otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • RE: Need help on cornering the APO BI issue relevant to Planning area

    HI Guys,
    I am loading historical data from my InfoCube to an APO planning area.
    My planning in APO will be done on a weekly basis.
    For that, my client has configured a fiscal year variant which is specific to 48 periods, not 52 periods.
    My client will be planning in weeks, starting from 07.01.2010, 14.01.2010, 21.01.2010, 31.01.2010, and so on.
    For testing purposes we are taking this data from a flat file and loading it into the InfoCube. I then created a generated export DataSource into the SCM/APO system, loaded a cube from it, and from that cube I am feeding the planning area.
    When I execute transaction /SAPAPO/TSCUBE the data is copied successfully, but the key figure amount, when I look at it in the planning area in transaction /SAPAPO/SDP94, is distributed across the weeks.
    Let's say my key figure value is 100 in the InfoCube for January, week 1. What I see in the planning area is week 1 = 25, week 2 = 25, week 3 = 25, week 4 = 25, which totals 100 for the month.
    But it should not be like that: the 100 should go into the particular week and be displayed as 100 for that week.
    I have CALMONTH, CALDAY, and FISCPER (posting period), which we have maintained in OB29 as 48 periods.
    When I derive CALWEEK in the transformation I get 48 weeks, but when I try to load to the planning area I get an error that the combination is inconsistent with CALMONTH.
    Code with which I derived CALWEEK from CALDAY:
    DATA: lv_year(4)   TYPE c,
          lv_month(2)  TYPE c,
          lv_day(2)    TYPE c,
          lv_result(6) TYPE c,
          v_poper      TYPE poper.

    lv_year  = SOURCE_FIELDS-calday+0(4).
    lv_month = SOURCE_FIELDS-calday+4(2).
    lv_day   = SOURCE_FIELDS-calday+6(2).

    " T009B is keyed by the fiscal year variant (PERIV); without that
    " restriction the SELECT can pick a period from the wrong variant.
    " 'Z1' is a placeholder - use your 48-period variant from OB29, and
    " for a year-dependent variant also restrict the year (JAHR).
    SELECT SINGLE poper FROM t009b INTO v_poper
      WHERE periv = 'Z1'
        AND bumon = lv_month
        AND butag = lv_day.
    IF sy-subrc = 0.
      " Build YYYYPP from the year and the last two digits of the period
      CONCATENATE lv_year v_poper+1(2) INTO lv_result.
      RESULT = lv_result.
      CLEAR lv_result.
    ENDIF.
    Gurus, can anybody throw some light on this? I will be highly obliged.
    When I load the data from the InfoCube to the planning area using /SAPAPO/TSCUBE, the copy is successful. But the issue is that the key figure value is being disaggregated.
    For example, if my sales history for week 1 (01.2010) and for calendar month 01.2010 is 100, it is disaggregating the value across the whole month's 4 weeks as 25, 25, 25, 25, but it needs to be written as 100 for week 1; rather, it should stay aggregated at the higher level as 100 for week 1.
    Do I need to check any characteristic combinations? All of them are consistent.
    Even the periodicities in the planning area and InfoCube are consistent, since I am able to copy into the planning area.
    I don't have CALWEEK in my InfoCube; I derived CALWEEK with the logic provided earlier in this thread, but as of now I am going with CALYEAR, CALMONTH, and FISCPER3 (posting period), since 48 posting periods are maintained in OB29 (table T009B).
    Do I need to implement any OSS notes for this? If I include CALWEEK and CALMONTH and try to copy into the planning area, I get the error that the periodicities do not match. See SAP Note 1408753 - /sapapo/tscube wrong error message
    /SAPAPO/TSM 232
    Regards
    Balaram

    Thanks for replying to the thread.
    Where do I maintain the planning object structure (storage bucket profile) and planning book data view (time bucket profile) with the time periodicities specific to the fiscal year variant and posting periods?
    Can you please elaborate on this? I am new to the APO BW implementation part, and this is a burning issue.
    What settings do I actually need to make there? I think the InfoCube structures are good, since the data copies into the planning area.
    I have CALMONTH and FISCPER3 for 48 periods in my InfoCube.
    Also, do I need to maintain CALWEEK in the planning object structure (storage bucket profile) and planning book data view (time bucket profile)?
    When I checked the key figure overview in the planning book, it is maintained there for 48 periods.
    For the planning book data view (time bucket profile), how can we achieve this?
    If you could throw some more light on this: I have an APO counterpart, and I will ask him to change the master planning structure accordingly.
    Regards
    Ram

  • Best practices for loading apo planning book data to cube for reporting

    Hi,
    I would like to know whether there are any best practices for loading APO planning book data to a cube for reporting.
    I have seen 2 types of design:
    1) The planning book extractor data is loaded first to a cube within the APO BW system, and then transferred to the actual BW system. Reports are run from the cube in the actual BW system.
    2) The planning book extractor data is loaded directly to a cube within the actual BW system.
    We do these data loads once a day, during evening hours.
    Rgds
    Gk

    Hi GK,
    What I have normally seen is:
    1) Data is extracted from the APO planning area to an APO cube (for BACKUP purposes), weekly or monthly, depending on how much data change you expect or how critical it is for the business. Backups are mostly monthly for DP.
    2) Data is extracted from the APO planning area directly to a DSO in the staging layer of BW, and then to BW cubes, for reporting.
    For DP this is monthly; for SNP, daily.
    You can also use option 1 that you mentioned. In this case the APO cube is the backup cube, while the BW cube is the one you use for reporting, and the BW cube gets its data from the APO cube.
    The benefit in this case is that data has to be extracted from the planning area only once, so the planning area is available to jobs/users for more time. However, backup and reporting extraction get mixed, so issues in the flow could impact both. We have used this scenario recently and are yet to see the full impact.
    Thanks - Pawan

  • Delta load from planning area to third party system

    Hi All,
    Generally we send a full data load from the DP planning area to a backup cube in the BW system. But suppose we have to send a delta load from the DP planning area to a third-party BW system: what changes do we need to make on the DP planning area side so that we send only the changed data from liveCache? Can this be done in a standard way?
    Thanks

    Hi Tarun,
    It's not possible to extract delta loads from APO planning areas. Delta loads are only possible with standard BW objects, and a planning area is not a BW object.
    SAP doesn't provide the functionality for delta loads when you use an export DataSource based on a planning area.
    PS: From a backup point of view, a delta load would not make much sense anyway. We take the backup as a snapshot at a particular time, and if restoration is required, we restore the relevant data from the cube to the planning area based on the request number or the date of extraction.
    Thanks - Pawan

  • RE : BI APO Question Reg Data feeding from cube to Planning area.

    Hi BW Experts,
    I am working on an implementation project for SCM in BW, more precisely with APO BW.
    First, I took historical data as a flat file and loaded it into the external BW InfoCube; that is fine.
    As a second step, I created a generated export DataSource on top of the BW InfoCube, replicated it, and used it as the source for the InfoCube in the APO system's built-in BW, fed from the external BW.
    I also created transformations, and the data is loaded into the BW cube in the APO system; the version characteristic is included.
    When I try to feed the APO cube data to the planning area, I get the following warning (it's not an error):
    1. Key figure copy: InfoCube - planning area (DP) 01.01.2010 to 31.12.2010 -- Successful
    2. No data exists for the selection made (see long text)
    Diagnosis: Data could not be loaded from the Cube for the selection you made. Check whether the Cube actually contains data that is relevant for your selection.
    For the second point: the time characteristics filled in the InfoCube that I am feeding to the planning area are 0CALMONTH, 0CALWEEK, and the fiscal year variant.
    3. Characteristic assignment: No data copied --- Message
    Can you please help me with your thoughts so that I can try to corner the issue? I will be highly obliged.

    Hi,
    As I understand it, you have loaded data from the external BW cube to the APO BW cube and are now loading the planning area from the APO BW cube.
    I hope your settings in transaction /SAPAPO/TSCUBE are correct and that you have selected the correct planning version with the correct cube.
    Check whether the data in the APO BW cube is available for reporting, and whether data is available for the given selection (if any; I guess you are not giving one).
    Thanks,
    S

  • Loading data from APO-BW cube into DP planning area

    Hello,
    Please explain to me how to load data from an InfoCube within APO into the DP planning area. So far, I have generated the data mart, created the update rules for the DP planning area (cube), and replicated the DataSources. The problem I am having is that when I create the InfoPackage, there are no data targets on the data target tab. How can I get the data targets to display? All the other actions appear to be set up correctly (confirmed by BW experts). Can data be loaded from an APO BW cube directly to the DP planning area?
    Thanks

    Hi James,
    Loading of data from an InfoCube to a planning area is done using transaction /SAPAPO/TSCUBE (program /SAPAPO/RTSINPUT_CUBE). You just specify the InfoCube, the target planning area, the selection condition, the InfoCube key figure to planning area key figure mapping, etc.
    Have you tried this transaction already?
    Hope this helps.
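    If this becomes a regular load, it is normally run in the background on a saved variant rather than interactively. A minimal ABAP sketch, assuming a variant ZDP_LOAD that you have created and saved in /SAPAPO/TSCUBE (the variant name is a placeholder):

    " Execute the InfoCube -> planning area copy with a saved variant
    SUBMIT /sapapo/rtsinput_cube USING SELECTION-SET 'ZDP_LOAD' AND RETURN.

    In practice you would schedule this variant in a background job or a process chain step.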

  • Planned orders created at APO side are not coming to the ECC

    Hi All,
    I am not getting the planned orders generated in APO over to the ECC side after running the product heuristic. I have refreshed the stock/requirements list many times on the ECC side and also on the APO side, but the planned orders generated have still not come across to ECC. I am new to APO. It would be a great help if somebody could guide/advise me on this.
    Thanks & Regards
    psamp1

    As well as Sajeev's reply, you also need to ensure that the products and plants you are planning are in an active integration model, and that the transactional data you are expecting is also in an active integration model.
    You can check this by using transaction CFM5 in ECC: enter your products and plants in the general selection options, then tick the checkboxes for Materials, Plants and Planned Orders. On execution the system will tell you whether they are in an active model or not.
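    If the check shows objects missing from an active model, the usual fix is to extend the integration model and reactivate it. In a scheduled setup this is commonly done with the standard CIF background reports behind CFM1/CFM2; a rough sketch, where the variant name ZPLANNED_ORD is a placeholder you would create first:

    " Generate a new integration model version, then activate it
    SUBMIT rimodgen USING SELECTION-SET 'ZPLANNED_ORD' AND RETURN.
    SUBMIT rimodac2 USING SELECTION-SET 'ZPLANNED_ORD' AND RETURN.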

  • Planning areas not active in new APO system

    Hi gurus,
    We just installed an SAP SCM 5.1 (APO) dev system, and I have an issue: the planning areas and planning object structures are showing as not active. Any ideas?
    This is when we go from SAP Menu --> Demand Planning --> Environment --> Administration of Demand Planning and Supply Network Planning
    Thanks
    JS

    To activate the planning area you have to create time series objects: right-click on the planning area and choose "Create Time Series Objects".
    To activate a planning object structure, right-click on it and select "Activate"; you should then see the status as green.

  • Planning area data not updated in infocube in APO system

    Hi all,
    A user is uploading a flat file via an add-on program to InfoCube 1. From this InfoCube, data for the sales forecast key figure is transferred to a planning area key figure using the standard program /SAPAPO/RTSINPUT_CUBE. I can see the updated data in the data views of the planning book (which refers to the same planning area). In the planning book, the sales forecast data is also copied to a second key figure, 'Arrangement FC', and it is correct.
    Now there is a second InfoCube (InfoCube 2) which gets data directly from this planning area (the InfoCube also contains both key figures). But when I checked InfoCube 2, the updated data is available in the sales forecast key figure, while the arrangement forecast key figure still has only old data; the user is running a query on the second key figure, so he is getting a wrong report.
    Since there is no data flow for InfoCube 2 and it gets its data from the planning area, I believe it is a remote InfoCube.
    I have also found the DataSource for this planning area, but I don't know how to investigate further why the data is not updating properly. Please help with this.
    For information: two weeks ago the data was updated properly, but this time it is not.
    The system version is SAP SCM 4.0.

    Hi Vivek
    It is advisable to run the background jobs when the planning books are not being accessed by users, to prevent such inconsistencies. Hence I would advise you to run the jobs during non-working hours; and if you have a global system, you may restrict the jobs to run at regional level.
    In addition, it is good practice to run consistency jobs before and after your background jobs have completed. I hope you are using process chains to execute the sequence of jobs; if yes, you can add consistency check jobs to the process chains.
    Some of the consistency check reports are:
    /SAPAPO/OM17 - liveCache Consistency Check
    /SAPAPO/TSCONS - Consistency Check for Time Series Network
    /SAPAPO/CONSCHK - Model Consistency Check
    and so on.
    You can find these consistency jobs under APO Administration --> Consistency Checks in APO.
    Let me know if this helps.
    Rgds, Sandeep

  • Changes to selected columns to display in Details tab in Task Manager are not saved during a reboot

    I use Task Manager on Server 2008 R2 to monitor performance. I have added several columns to the Processes tab using View > Select Columns: CPU time, I/O read/write/other bytes, etc.
    The equivalent in Server 2012 is the Details tab. I figured out how to add these same columns - right click on the column headers, and choose "Select columns". This is not obvious by the way.
    However these selections are not persistent across a reboot. So every time I reboot, I need to add the columns back to the display. That is not the case with Server 2008 R2 - they are persistent.
    I am using a domain administrator userid, but not using roaming profiles.
    This is a regression vs. Server 2008 R2. Can it be fixed?

    Hi,
    I meant that the added columns are persistent in both Windows Server 2008 R2 and Windows Server 2012 RC.
    You may install a fresh Windows Server 2012 RC to check the symptom.
    Currently, you may perform a System File Checker (SFC /Scannow) to check the result.
    Regards,
    Arthur Li
    TechNet Community Support
    Arthur, I have the exact same problem as David Trounce.  The problem exists, as he described it, and I don't think you are understanding it.  Please pass this problem/bug/defect report to someone else so that it can be logged and fixed before
    RTM.  Thanks! -Kent

  • APO- BI Datasource from Planning Area

    Hi All,
    I need help with APO-BI datasource generated from Planning Area.
    In the Dev environment we had two clients:
    DCLNT020 (Holds APO part) DCLNT010 (Holds BI workbench).
    So a datasource was generated from the Planning area in DCLNT020 --> it was replicated in DCLNT010 --> data from Planning Area was extracted to BI cube using this.
    Now we transported this datasource to the Test environment which has only one client (TCLNT010). I maintained the Source to target mapping there such that DCLNT020 -- TCLNT010 and DCLNT010 -- TCLNT010.
    However the Transport fails and the error message is:
    Cannot replicate DataSource
    Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
    If I go to the Test system and try to generate the transported Datasource directly from the Planning area again, it says this DataSource already exists. However I cannot see this datasource in the system even after replicating and refreshing multiple times.
    Please provide your inputs as to what might be wrong and hat I need to do to solve this.
    TIA
    Amrita

    Hi Amrita Goswami,
    Based on the above post, it seems that in Dev you maintain two clients (one for creation and one for testing), while in the test environment you maintain only one client; keeping the DataSource in a single client should not in itself cause any problem.
    Based on the error
    > Cannot replicate DataSource
    > Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
    There could be two reasons:
    1) You need to replicate the DataSource once you have imported it into the test environment, and then run the program RSDS_DATASOURCE_ACTIVATE_ALL, specifying the source system and the DataSource name, if it's BI 7.0.
    If it's 3.x, you have to execute the program RS_TRANSTRU_ACTIVATE_ALL, specifying the transfer structure name.
    2) RS_AFTER_IMPORT errors are in some cases caused by an improper transport of the update rules.
    The solution would be to recollect the transports: release the DataSource transport first, execute the activities from (1), and then transport the remaining objects.
    Hope that makes it a little clearer!
    Thanks
    K M R

  • APO DP - loading from InfoCube to planning area

    I am using APO DP V5.
    I have the following situation:
    1. I am extracting sales history data at the end of each day from a connected ECC system into an InfoCube, using delta logic, so the cube contains just the new sales history transactions (despatches) for the day.
    2. I am then loading data from the cube to my DP planning area via transaction /SAPAPO/TSCUBE.
    I assume I must have the 'add data' flag set for each key figure in this transaction, to ensure I see the consolidated sales history in the planning area.
    Is this the best-practice approach for regular loading of sales history? I assume it's better to have a delta approach for the cube, for improved performance.
    Thanks, Bob Austin

    Hi,
            Good questions!
    1. What does the 'period' really refer to? Is it referring to the date of a particular key figure? Or the 'date of data being added to Cube'?
    A: Both are the same.
    The date is generally the date in your cube, like the calendar day, month, etc. This date is in turn based on a timestamp in the format DDMMYYYYHHMMSS; the calendar day is part of this field as DDMMYYYY. The system also recognizes changes by this same timestamp. So if a customer changes the quantity on 05/15 at 3.30 pm, the timestamp is 15052007153000. The calendar day in your cube, i.e. your key figure date, is 15052007, and the delta is recognized by the change in timestamp between the last load and the current system time. So you are talking about the same time field.
    Check whether this is the same in your system; let me know if not.
    2. Suppose the original despatch qty = 100 (two weeks ago), and 'today' the customer returns a qty of 60; how does the system handle this? I would want this posted to the original date of two weeks ago.
    A: The data from your ECC system is generally brought into an ODS first. The reason is that we overwrite the data there when a record arrives with the same key. If your key for the ODS is customer and division, then you overwrite the customer quantity for that division whenever the value changes. If you need it by time, let's say per month, include the time in the key; the system then overwrites the value for that month only, and next month it's a new record.
    In your case, if the quantity was 100 two weeks ago and now it's 60, and the timestamp is not in the key, the system overwrites it to 60, and you have only 60 when you load it to your ODS and thereby to your planning area. Deleting the delta in your ODS would show the original 100 again, which you could then load to the planning area, but this is not a good option. The alternative is to include a time characteristic such as CALWEEK in your ODS key and load it over to the cube; that way you have two records.
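    To put numbers on that (using the figures from your question): with only customer and division in the ODS key, the despatch of 100 from two weeks ago is overwritten by the 60 that arrives today, so the planning area only ever sees 60. With CALWEEK added to the key you keep two records, 100 in the old week and 60 in the current week, and the history stays intact when you load to the planning area.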
    I hope I answered all your questions. Please feel free to ask more; I would love to answer, as that way I can also brush up on things that have gone unused.
    Thanks.
