APO DP - load from InfoCube to a 'fixed' cell in planning area

I am using APO DP V5.
Suppose I have a key figure which is fixed for a particular time bucket cell.
If I try to load data into this cell from an InfoCube, what will be the effect? I assume that the cell will NOT be updated.
Regards,
Bob Austin, Atos Origin

Hi,
Fixing information is extracted when the data is saved into an InfoCube, and it is reloaded when transaction /SAPAPO/TSCUBE is used to load the data back into the planning area. If you load the data back only at the level of the basis planning object structure, the data is aggregated to the level of the aggregates, but no fixing information is then available at the aggregate level.
Also, /SAPAPO/TSCUBE does not check fixing information in liveCache when data is written to liveCache; if there is fixed data in the InfoCube, that fixing information will be loaded into liveCache.
You can also refer to SAP Note 1057308.
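The aggregation point above can be sketched in plain Python (illustrative only, not SAP code; the field names are my own assumptions): values roll up from the basis planning object structure to the aggregate level, but the fixing flags do not survive.

```python
# Illustrative sketch, not SAP code: fixing flags exist at detail level
# but are not available once values are aggregated.
detail = [
    {"product": "P1", "location": "L1", "qty": 40, "fixed": True},
    {"product": "P1", "location": "L2", "qty": 60, "fixed": False},
]

def aggregate_by_product(rows):
    """Roll detail rows up to product level; fixing info is dropped."""
    agg = {}
    for r in rows:
        a = agg.setdefault(r["product"], {"qty": 0, "fixed": None})
        a["qty"] += r["qty"]  # quantities aggregate upward...
        # ...but no fixing information exists at the aggregate level.
    return agg

result = aggregate_by_product(detail)
```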
Regards,
Srikanth.

Similar Messages

  • APO DP - loading from InfoCube to planning area

    I am using APO DP V5.
    I have the following situation:
    1. I am extracting sales history data at the end of each day from a connected ECC system into an InfoCube, using delta logic, so in the Cube I just have the new sales history transactions (despatches) for the day
    2. I am then loading data from the Cube to my DP planning area via transaction /SAPAPO/TSCUBE
    I assume I must have the 'add data' flag set for each key figure in this transaction, to ensure I see the consolidated sales history in the planning area.
    Is this the 'best practice' approach for regular loading of sales history? I assume it's better to have a 'delta' approach for the Cube, for improved performance.
    Thanks, Bob Austin

    Hi,
            Good questions!
    1. What does the 'period' really refer to? Is it referring to the date of a particular key figure? Or the 'date of data being added to Cube'?
    A: Both are the same.
    The date is generally the date in your cube, such as the calendar day or month. This date is in turn based on a time stamp in the format DDMMYYYYHHMMSS; the calendar day is the DDMMYYYY part of this field. The system also recognizes changes by this same time stamp. So, if a customer changes the quantity on 05/15 at 3:30 pm, the time stamp is 15052007153000, the calendar day in your cube (your key figure date) is 15052007, and the delta is recognized by changes in the time stamp between the last load and the current system time. So you are talking about the same time field.
    Check whether this is the case in your system, and let me know if not.
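    A small sketch of the time-stamp logic described above (plain Python; the record layout is a made-up illustration, not an SAP structure):

```python
from datetime import datetime

# Delta recognition by a DDMMYYYYHHMMSS time stamp, as described above.
def parse_ts(ts):
    return datetime.strptime(ts, "%d%m%Y%H%M%S")

records = [
    {"ts": "14052007120000", "qty": 100},
    {"ts": "15052007153000", "qty": 60},  # changed on 05/15 at 3:30 pm
]

last_load = parse_ts("15052007000000")

# The delta is every record whose time stamp is newer than the last load.
delta = [r for r in records if parse_ts(r["ts"]) > last_load]

# The calendar day is simply the DDMMYYYY part of the same time stamp.
calday = records[1]["ts"][:8]
```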
    2. Suppose original dispatch qty = 100 (two weeks ago), and 'today' the customer returns a qty of 60; how does system handle this - I would want this posted to the original date of two weeks ago.
    A: The data from your ECC system is generally brought into an ODS first. The reason is that the data is overwritten there whenever a record arrives with the same key. If your key for the ODS is customer and division, then you overwrite the customer quantity for that division whenever the value changes. If you need it by time, say per month, include the time characteristic in the key; the system then overwrites the value for that month only, and the next month is a new record.
    In your case, if the quantity was 100 two weeks ago and is now 60, and the time stamp is not in the key, the system overwrites it to 60, so you have only 60 when you load it into your ODS and from there into your planning area. You would have to delete the delta in your ODS to see the original 100 again and then load that to the planning area, which is not a good option. The alternative is to include a time characteristic such as calendar week in your ODS key and load it over to the cube; that way you have two records.
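    The overwrite behaviour above can be sketched in plain Python (illustrative only; the field names are assumptions, not SAP structures):

```python
# An ODS/DSO with "overwrite" update keeps one value per key.
def load_to_ods(ods, records, key_fields):
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        ods[key] = rec["qty"]  # overwrite semantics: last value wins
    return ods

despatches = [
    {"customer": "C1", "division": "01", "calweek": "200718", "qty": 100},
    {"customer": "C1", "division": "01", "calweek": "200720", "qty": 60},
]

# Key without a time characteristic: the later posting overwrites the 100.
without_time = load_to_ods({}, despatches, ["customer", "division"])

# Key including calweek: both records survive as separate entries.
with_time = load_to_ods({}, despatches, ["customer", "division", "calweek"])
```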
    I hope I answered all your questions. Please feel free to ask more; I would love to answer, as that way I can also brush up on things that have gone unused.
    Thanks.

  • Error /SAPAPO/TSM 041 when loading from InfoCube to SNP planning area

    I am using APO V5.1.
    I have a 'backup' InfoCube with characteristics 9ARNAME and 9AVERSION which I'm loading to an SNP planning area via trans /SAPAPO/TSCUBE. The InfoCube is itself populated from the 9ARE aggregate of the SNP planning area.
    But I get error /SAPAPO/TSM 041 when I run the load, suggesting the absence of a required characteristic or navigation attribute in the cube. Do I need 9ALOCNO, for instance?
    Can anyone advise me here?...

    Hi,
    I am not very sure about the 9ARE aggregate (haven't used it in backups), but RTSCUBE is used to copy time Series (TS) KF data from cube to planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then do a mapping of characteristic from cube to planning area in RTSCUBE, and also map the TS KF between cube and planning area.
    If your KF is not a time series KF, then you can't copy it from cube to planning area. You could get data to cube for some reporting, otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • How is data loaded from Infocube to Planning area - need technical info

    I would like to find out how data is loaded from the cube to the planning area.
    I know that the cube has tables, but how does that data get loaded into the planning area? Where is the mapping from which I can compare the data in the cube with the same data that reaches the planning area?
    Say, for example, I have the values below in the InfoCube:
    Product --> Location --> Available provisioning qty
    AAA         AB90         100
    Then where do I check in the planning area tables (are there any mapped tables) after running the TSINPUT program? I know the data can be checked in the planning book, but I want to check the planning area tables, if they exist.

    Hi ,
    The data is loaded from the InfoCube to the planning area using update rules. The issue you have mentioned seems to be that two requests contain data for the same CVCs in the cube.
    Example: for the same CVC, two requests are available in the InfoCube. When you copy the data using transaction TSCUBE, whatever data is available in the cube for that CVC gets copied into the planning area.
    CVC1 - cube - old request - 100
    CVC1 - cube - actual request - 200
    CVC1 - planning area = 300 (the value is supposed to be 200, but since the old request also contains data in the cube for the CVC, it gets copied as 300)
    Issue: two requests might contain data for the same CVC.
    Solution: 1. Check the data in the cube using transaction LISTCUBE. 2. Delete the old request and check the data again. 3. If it matches your requirement, run TSCUBE again.
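    The arithmetic above can be sketched as follows (illustrative Python, not SAP code): the copy takes whatever the cube holds for a CVC, i.e. the sum over all requests.

```python
# The planning-area copy sums all requests the cube holds for a CVC.
cube = [
    {"request": "OLD", "cvc": "CVC1", "qty": 100},
    {"request": "NEW", "cvc": "CVC1", "qty": 200},
]

def copy_to_planning_area(rows):
    pa = {}
    for row in rows:
        pa[row["cvc"]] = pa.get(row["cvc"], 0) + row["qty"]
    return pa

before = copy_to_planning_area(cube)  # old request still in the cube
after = copy_to_planning_area([r for r in cube if r["request"] != "OLD"])
```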
    Please let me know if you need additional information.
    Thanks,
    Jeysraj

  • Data load from Infocube to Infocube 170 million records.

    Hi All,
    I want to load the data from one InfoCube to another InfoCube. I have developed this in the Development system, but in Production, if I start the full load, will it load, or will it show a timeout error? I have to load about 170 million (17 crore) records from the old InfoCube to the new one.
    Please advise on any precautions I should take before starting the load from the old to the new InfoCube.
    Thanks in Advance
    Shivram

    You need not load the entire 170 million records in one go.
    Do a selective load instead, based on document number, document date, fiscal period, calendar month, or some similar characteristic that cleanly partitions the records.
    This ensures that the data is loaded in small amounts.
    As said above, you can create a process chain:
    drop the indexes on the second cube, then place multiple InfoPackages with different selections one after the other in the process chain, loading the data into cube 2.
    Then rebuild your indexes after the loads are complete, i.e. after all 170 million records have been added to cube 2.
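    The strategy above (drop indexes, load in small selections, rebuild indexes once at the end) can be sketched like this. These classes and methods are plain Python stand-ins, not real SAP objects:

```python
class Cube:
    """Toy stand-in for an InfoCube with droppable indexes."""
    def __init__(self, rows=None):
        self.rows = list(rows or [])
        self.indexed = True
    def drop_indexes(self):
        self.indexed = False
    def build_indexes(self):
        self.indexed = True
    def read(self, month):
        return [r for r in self.rows if r["month"] == month]
    def insert(self, rows):
        self.rows.extend(rows)

def chained_load(source, target, months):
    target.drop_indexes()              # faster mass inserts
    for m in months:                   # one InfoPackage selection each
        target.insert(source.read(m))
    target.build_indexes()             # rebuild once, after all loads

src = Cube([{"month": "01", "qty": 1}, {"month": "02", "qty": 2}])
dst = Cube()
chained_load(src, dst, ["01", "02"])
```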

  • Load from Infocube to an ODS.

    Hi All,
    I need some information on the logic applied when you load from an InfoCube to an ODS.
    I have two requests with 100,000 records each in an InfoCube. All the keys (dimensions/characteristics) in the InfoCube match; the key figures in the two requests are different. When I execute a full load to the ODS, I see only 100,000 records loaded into the ODS, and the key figures have been aggregated.
    I verified the update rules for the key figure, and the "Update Type" is Overwrite.
    Please help me understand whether this is the way it is supposed to work when loading from an InfoCube to an ODS.
    Thanks in advance

    Hi Sandesh,
    So, as you say, this is how it works: "records are aggregated when data is extracted from the cube, even before the DSO update rules come into play".
    In that case, I will never be able to overwrite the key figures, even though I have the update type set to "Overwrite".
    Example: I have 2 requests in the cube:
                 Mat        Plant       KF1
    Req1     1000        07           10
    Req2     1000        07             5
    I loaded this into the ODS.
    Then I have another request in the cube:
    Req3      1000      07        1
    Later, when I load Req3 into the ODS, how will it show up? If the data is aggregated in the cube before it is loaded into the ODS, then the overwrite functionality I selected in the update rules will never be used.
    Please help me understand.
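    If extraction from the cube really does aggregate per characteristic key before the overwrite rule runs, the behaviour can be sketched as below (illustrative Python; the assumption that a later full load re-extracts and sums all requests is mine, not from the thread):

```python
cube = [
    {"req": 1, "mat": "1000", "plant": "07", "kf1": 10},
    {"req": 2, "mat": "1000", "plant": "07", "kf1": 5},
]

def extract(rows):
    """Cube extraction: sum key figures per characteristic key."""
    agg = {}
    for r in rows:
        key = (r["mat"], r["plant"])
        agg[key] = agg.get(key, 0) + r["kf1"]
    return agg

def load_overwrite(ods, extracted):
    ods.update(extracted)  # overwrite update rule: one value per key
    return ods

ods = load_overwrite({}, extract(cube))      # Req1 + Req2 arrive as 15

cube.append({"req": 3, "mat": "1000", "plant": "07", "kf1": 1})
ods = load_overwrite(ods, extract(cube))     # full reload overwrites with 16
```

So the ODS never sees the individual request values, only the aggregate per key.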

  • No data Loaded from infocube to another infocube

    Hi all,
    I am trying to create a copy of an InfoCube, and I am getting this error at the end:
    No request Idoc generated in BW
    Diagnosis
    No request IDoc has been created in BW. A short dump has most probably been logged in BW.
    Procedure
    Look for the short dump belonging to your data request in the short dump overview in BW. Pay attention to the correct date and time in the selection screen.
    You can get the short dump list using the Wizard or from the monitor detail screen via the menu path "Environment -> Short dump -> Data Warehouse".
    Removing errors:
    Follow the instructions in the short dump.
    Please guide me on this.

    Hi,
    Go to ST22, or from the top menu go to the dump overview, and check what the error message is.
    You might have run out of space or processes; it could be anything.
    Gaurav
    Assign points if it helps.

  • Loading data from infocube to datastore

    Hello,
    We are migrating from 3.1 to 2004s and are testing our old process chains.
    In one chain we load data to a DataStore, from the DataStore to an InfoCube, and finally from the InfoCube to another DataStore. Data loads from the DataStore to the InfoCube, but the load from the InfoCube to the final DataStore does not work (it finishes correctly, but no data is transferred).
    I know that with 2004s I need to use a transformation and a DTP to load data from one InfoProvider to another.
    But is it still possible to use the old method, or do I need to change? For the moment we would prefer to change nothing, if possible.
    Thanks,
    Thierry

    Hi,
    The "old" method can still be fully applied in BI 7.0.
    As for the data in the ODS: can you post the error you are getting, or does the data simply not activate? (In that case, include a process for ODS data activation in your chain, and you will be fine.)
    Hope it helps,
    Gili

  • RE : BI APO Question Reg Data feeding from cube to Planning area.

    Hi BW Experts,
    I am working on an implementation project for SCM in BW, more precisely with APO BW.
    For that, I took historical data as a flat file and loaded it into the external BW InfoCube, and that is fine.
    As a second step, I created a generated export DataSource on top of the BW InfoCube, replicated it into BW, and used this export DataSource as the DataSource for the APO BW InfoCube (the built-in BW system) fed from the external BW.
    I also created transformations, and the data is loaded into the BW cube in the APO system. I included the version characteristic as well.
    When I try to feed the APO cube data to the planning area, I get the following warning (it is not an error):
    1. Key figure copy: InfoCube - planning area (DP) 01.01.2010 to 31.12.2010 -- successful
    2. No data exists for the selection made (see long text)
      Diagnosis: Data could not be loaded from the cube for the selection you made. Check whether the cube actually contains data that is relevant for your selection.
    For the second point, I have the time characteristics 0CALMONTH, 0CALWEEK, and FiscVarnt filled in the InfoCube that I am feeding to the planning area.
    3. Characteristic assignment: No data copied --- message
    Can you please help me with your thoughts so that I can try to corner the issue? I would be highly obliged.

    Hi,
    As I understand it, you have loaded data from the external BW cube into the APO BW cube and are now loading the planning area from the APO BW cube.
    I hope your settings in transaction /SAPAPO/TSCUBE are correct and that you have selected the correct planning version with the correct cube.
    Check whether the data in the APO BW cube is available for reporting, and whether data is available for the given selection (if any; I guess you are not specifying one).
    Thanks,
    S

  • HR Full Load error in Payroll loading from EBS

    Hi,
    We have configured HR Analytics, and when we run the full load from EBS 11.10 (Vision Demo database) we receive an error saying it cannot allocate 128 from the temp tablespace. We found that the 11i database temp tablespace has grown hugely (to more than 10 GB) and the hard disk space is fully allocated.
    We found that this is caused by the payroll package (selecting from a view).
    Is there any optimization or other solution for this issue?
    Thanks.
    Nilaksha.

    and remember to fix the bug in the database first:
    UPDATE per_all_people_f
    SET effective_start_date = '07-FEB-02'
    WHERE person_id = 6272
    AND effective_start_date = '04-JAN-91'
    AND effective_end_date > '06-FEB-02';
    because you will see 2 rows if you run:
    SELECT *
    FROM per_all_people_f
    WHERE person_id = 6272
    AND effective_start_date = '04-JAN-91';
    And this isn't possible, because HR is date-tracked and you can't start twice on one date!

  • How to load the data from flat file ( ex excel ) to Planning area directly

    Hi all,
    How can I load data from a flat file (e.g. Excel) directly to the planning area?
    Please help me with this.
    Regards,
    Chandu

    "Download one key figure's data from the planning book (interactive demand plan), make some changes, and upload the data back to the same planning book."
    But may I know why you are thinking of downloading, changing, and uploading just to change the figures for a particular key figure? You can do it in the planning book itself.
    However, not all key figures can be changed. What type of key figure are you speaking of here? Is it like 'Forecast', whose value is based on other key figures, or a key figure where some manual adjustments are to be made, so that it can be edited manually? In both cases, the data can be changed in the planning book itself: in the first case, you change the values of the dependent key figures, and in the second case, you change the key figure directly.
    Please note that you can change the values of key figures only at the detailed level. So, after loading the data in the book, use the drill-down option, maintain the data at the detailed level, and change the figures; this is automatically reflected at the higher level.
    If you are unable to change the values, go to the 'Design' mode of the book, right-click your key figure, and under "Selected Rows", uncheck the "Output Only" option. If you cannot see that option, you are not authorized to change it. See if you can change the authorizations by going to the "Data View" tab in the planning book configuration (/n/sapapo/sdp8b) and changing the value of Status to 3.
    Hope your query is answered by the different solutions offered by the SDN colleagues here.
    Regards,
    Guru Charan.

  • Help Required for Mapping Key figures from Cube to APO Planning area.

    Hello Experts,
    We have created a cube in APO BW, and now we want to map it to the planning area. How can we map it?
    Can anybody explain how we can map key figures?
    Also, what is the use of liveCache, and how is it updated?
    Regards
    Ram

    Hi,
    I am not very sure about the 9ARE aggregate (haven't used it in backups), but RTSCUBE is used to copy time Series (TS) KF data from cube to planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then do a mapping of characteristic from cube to planning area in RTSCUBE, and also map the TS KF between cube and planning area.
    If your KF is not a time series KF, then you can't copy it from cube to planning area. You could get data to cube for some reporting, otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • Can the source files be loaded from target server

    Hi,
    I have owb client on windows2000 and target on linux server. The current plan is to create runtime repository connection for the target and execute mapping from windows where the source files are located.
    Is there a way to put the source files on the target server machine (there is no OWB client installed there)? What is the best practice regarding how OWB and the source files are distributed? Thanks.
    Tarcy

    The problem is not the code or html.
    This: "The Java Runtime Environment cannot be loaded from <\bin\server\jvm.dll>
    indicates that you are attempting to run the server jvm, and it does not exist. This can be because either the java command option "-server" was used, or a configuration file setting.
    As shipped by Sun, the JRE does not include the server jvm; the JDK does. If you want the server jvm in the JRE, copy the \server\ directory and contents from the JDK to the JRE.
    If you installed using defaults,
    copy from: C:\Program Files\Java\jdk1.5.0\jre\bin
    copy to: C:\Program Files\Java\jre1.5.0\bin

  • Difference between Reconstruction and regular load from PSA

    Gurus,
          What is the difference between reconstruction and a load from the PSA into a data target? I understand that during reconstruction we are not loading data from R/3.
          I have a cube with 3 requests in it. I deleted all the requests, went to the PSA, selected one of the requests, and chose "reconstruct from this". In the cube I can see this selected request among the reconstruction requests of the cube, and I pressed the Reconstruction/Insert button there.
          Is this the procedure for reconstruction? If yes, how is it different from a regular load from the PSA?
    Thanks

    Function-wise, both do the same thing, but there is a difference in the data flow. The PSA is used as a staging area. We can choose to load only into the PSA first, analyze the load, do manual editing if needed, and then load into the data target (in this case the reconstruct option is of course not available), or we can load into the data target in parallel with, or sequentially after, loading into the PSA. Having already loaded into the data target, if we somehow need to delete and reload, we can either reconstruct the request or run a normal load from the PSA.

  • Fixed Cells in DP saved and reapply after initialization of the PA

    In our environment, we re-initialize the Planning Area each month and generate characteristic combos from scratch.  When you deinitialize the planning area, Fixed cells are typically deleted.  I'm interested in finding out if there is a way to save the information on Fixed Cells and then be able to reapply them after our system is reloaded.
    Does anyone know if there is anything available in APO to help do this or how I would go about doing this?
    Thanks.
    Tekii

    Hi Tekii,
    You can take the fixing information into a backup InfoCube and reload the data into the planning area along with the fixing information.
    To extract the fixing information into the backup cube, consider the following:
    Since fixing is originally a BW feature that requires 2 key figures (the original one and the fixable key figure), the backup cube always requires these 2 key figures. This can only be done if the fixing information is stored the "old school" way.
    As of the SCM 5.0 design of the DP liveCache, it is much easier to define a key figure as a fixable key figure (and it is very flexible to change this whenever you want). But when you have to save the information for backup reasons, the data extraction does not deliver any information about whether the key figure was fixed.
    In releases lower than SCM 5.0, the fixing information was saved in a reference key figure assigned to the fixable key figure; the fixed key figure had to be assigned in the corresponding field in order to hold the fixing information. For backups, the data source must contain both key figures.
    So follow the same procedure as in SCM 4.x; the reason is simply that the only way to save fixed key figures in BW is via reference key figures.
    So assign the reference key figure to the fixable key figure in RSD1, and then use these 2 key figures in the backup cube.
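    A plain sketch of the reference-key-figure idea (illustrative Python; the key figure names are invented for the example, not real SAP objects): each backup row carries the fixable key figure plus a second key figure holding the fixed value, from which the fixing flag can be re-derived on reload.

```python
# A None in the reference key figure means the cell was not fixed.
backup_rows = [
    {"cvc": "CVC1", "calweek": "202015", "kf_qty": 120, "kf_qty_fix": 120},
    {"cvc": "CVC1", "calweek": "202016", "kf_qty": 80,  "kf_qty_fix": None},
]

def restore(rows):
    """Reload values and re-derive the fixing flag from the reference KF."""
    pa = {}
    for r in rows:
        pa[(r["cvc"], r["calweek"])] = {
            "value": r["kf_qty"],
            "fixed": r["kf_qty_fix"] is not None,
        }
    return pa

pa = restore(backup_rows)
```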
    Hope this helps.
    Regards,
    Sunitha
