IP - Planning cube data deletion

Dear Experts,
We have a planning-enabled cube. The data is loaded and then distributed as per reference data using a planning function, which is run via a planning sequence on the cube.
Unfortunately, due to an issue in CRM, we have to delete the data for the last two months from the cube and reload it.
My doubt: if I just delete the data requests and the planning requests, then reload and run my planning sequence, will it reprocess the data for the last two months correctly, or will there be an issue?
In a normal cube it is quite simple: just delete the requests from the cube and the DTP takes care of the rest.
So please suggest whether there is anything I need to take care of.
Thanks and Regards
Neel

Hi,
If you know exactly what data you want to delete (the selection criteria are clear to you, in your case the last two months), I suggest you go with selective deletion.
You can follow these steps:
1) Check the data in the cube using LISTCUBE with the same selection criteria (in your case the last two months). This confirms that you know, and have checked, what you are about to delete.
2) Select the cube --> right-click --> Change Real-Time Load Behaviour.
3) Select "Planning not allowed" (i.e. the cube can be loaded with data).
4) Select the cube, right-click, and choose Manage.
5) On the Contents tab, click the "Delete Selection" button.
6) Specify the same selection you checked in step 1 and delete the contents.
7) Change the real-time load behaviour back to "Planning allowed; loading not allowed".
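If you prefer to script the selective deletion (for example to repeat it before each reload), the same thing can be done with the standard function module RSDRD_SEL_DELETION. A rough sketch only: the cube name ZPLANCUBE and the 0CALMONTH values are placeholders, and the selection-table field names follow commonly posted examples, so please verify the interface in SE37 first.

  DATA: l_thx_sel TYPE rsdrd_thx_sel,         " deletion selection: one entry per InfoObject
        l_sx_sel  LIKE LINE OF l_thx_sel,
        l_s_range LIKE LINE OF l_sx_sel-t_range,
        l_t_msg   TYPE rs_t_msg.

  " Restrict 0CALMONTH to the two months that should be deleted (placeholder values)
  l_s_range-sign   = 'I'.
  l_s_range-option = 'BT'.
  l_s_range-low    = '201001'.
  l_s_range-high   = '201002'.
  APPEND l_s_range TO l_sx_sel-t_range.

  l_sx_sel-iobjnm = '0CALMONTH'.
  INSERT l_sx_sel INTO TABLE l_thx_sel.

  " Delete the selected slice (the cube must be in load mode, see steps 2/3 above)
  CALL FUNCTION 'RSDRD_SEL_DELETION'
    EXPORTING
      i_datatarget = 'ZPLANCUBE'              " placeholder: your planning cube
      i_thx_sel    = l_thx_sel
    CHANGING
      c_t_msg      = l_t_msg.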

Similar Messages

  • Info cube data deletion fact only

    When we want to delete the data in an InfoCube, we get a dialog box asking us to choose:
    "Do you want to delete only the contents of the fact table in InfoCube IC_SDGR1, or do you want to delete the dimension tables as well?"
    1) fact table only
    2) fact table and dimension tables
    Could anyone tell me when we would use the "fact table only" option? Has anyone encountered this scenario? Generally we use the "fact table and dimension tables" option for deletion.
    Thanks for the replies

    As long as you haven't changed anything in the dimensions, you can leave the data in the dimension tables when deleting; your reload will go a lot faster, as fewer (or no) dimension entries have to be created. On the other hand, if you have changed your cube, transported it to production and need to reload, then you need to delete the entries in the dimension tables as well, because the structures have a new key and all the old entries are useless.
    M.

  • Dimension tables not deleted when cube data deleted

    We are using a process chain to delete the entire contents of the cube prior to the next load.  Over the course of time repeating this everyday, we see that the RSRV check tells us that the "MAX % ENTRIES IN DIMS COMPARED TO F-TABLE"  = 4718% !!  How can this be if each day the data is getting deleted from all related tables?  The max could only be 100% then, correct?
    Please advise.  I could not find an OSS note that the dimensions are not deleted as well with that step in a process chain.
    Thanks, Peggy

    Hi,
    we fixed it by creating our own ABAP code with the following call:
      " Placeholders / assumed types: your_icube holds the InfoCube name; the optional
      " parameters i_t_dime (restrict to specific dimensions) and i_check_only are omitted here.
      DATA: your_icube         TYPE rsinfocube,
            tp_repair_possible TYPE rs_bool,
            it_msg             TYPE rs_t_msg.

      " Remove unused entries from the dimension tables of the InfoCube
      CALL FUNCTION 'RSDRD_DIM_REMOVE_UNUSED'
        EXPORTING
          i_infocube        = your_icube
        IMPORTING
          e_repair_possible = tp_repair_possible
        CHANGING
          c_t_msg           = it_msg
        EXCEPTIONS
          x_message         = 1
          OTHERS            = 2.
    (inserting code in the new SDN frontend is a bit of a nightmare...)
    and included this ABAP in the chain
    You can also go via RSRV and do that manually.
    I don't recommend deleting the whole dimension tables, since the loading time will be drastically higher if your dimensions are empty...
    The above works as well for aggregates...
    hope this helps
    Olivier.

  • IP Exit Planning function copy data from cca to pca planning cubes

    Hello All,
    I have a requirement where I have to copy the characteristics and key figures of the CCA plan cube data to the PCA plan cube. The InfoObjects in the CCA aggregation level are {0AMOUNT, 0COSTCENTER, 0COSTELEMENT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}, which need to be copied to the corresponding InfoObjects in the PCA level {0AMOUNT, 0PROFITCENTER, 0ACCOUNT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}.
    The CCA and PCA aggregation levels are built on top of a multiprovider.
    I could do it with FOX code, but 0COSTELEMENT cannot be mapped to 0ACCOUNT, as these are different fields. Since I have to copy the values of 0COSTELEMENT to 0ACCOUNT, I was wondering how I can do it using an exit function.
    As I have never used an exit function before, I was wondering if somebody could help me out with this.
    By the way, I have read the forums and figured out that I need to create a class in SE24 and use the interface IF_RSPLFA_SRVTYPE_IMP_EXEC; since I am generating some records, I will be using the method IF_RSPLFA_SRVTYPE_IMP_EXEC~INIT_EXECUTE.
    I also read in the forums that there are methods/function modules which can copy data from one aggregation level to another. Anyway, can you tell me how I can loop through the records of the CCA aggregation level and copy them to the PCA aggregation level?

    This thread might help you.
    https://forums.sdn.sap.com/click.jspa?searchID=22634973&messageID=5317176
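    In case it helps to picture the exit function: its core is just a loop over the CCA reference records that re-maps the fields into PCA records. A schematic, self-contained sketch follows; the structures and field names are simplified stand-ins for illustration, not the actual RSPLFA interface types.

      " Simplified stand-in structures for the CCA (source) and PCA (target) records
      TYPES: BEGIN OF ty_cca_rec,
               costcenter(10)  TYPE c,
               costelement(10) TYPE c,
               version(3)      TYPE c,
               calmonth(6)     TYPE n,
               amount          TYPE p DECIMALS 2,
             END OF ty_cca_rec,
             BEGIN OF ty_pca_rec,
               profitcenter(10) TYPE c,
               account(10)      TYPE c,
               version(3)       TYPE c,
               calmonth(6)      TYPE n,
               amount           TYPE p DECIMALS 2,
             END OF ty_pca_rec.

      DATA: lt_cca TYPE STANDARD TABLE OF ty_cca_rec,  " reference data read from the CCA level
            lt_pca TYPE STANDARD TABLE OF ty_pca_rec,  " records to be generated on the PCA level
            ls_cca TYPE ty_cca_rec,
            ls_pca TYPE ty_pca_rec.

      LOOP AT lt_cca INTO ls_cca.
        CLEAR ls_pca.
        " cost center -> profit center: a real implementation would derive/look this up
        ls_pca-profitcenter = ls_cca-costcenter.
        " the mapping FOX cannot do: the cost element value is written to account
        ls_pca-account      = ls_cca-costelement.
        ls_pca-version      = ls_cca-version.
        ls_pca-calmonth     = ls_cca-calmonth.
        ls_pca-amount       = ls_cca-amount.
        APPEND ls_pca TO lt_pca.
      ENDLOOP.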

  • Does exist a way to read data from IP planning cube with ABAP?

    Hello All.
    My scenario is as follows:
    I have an ODS where we store the cost centers to be planned. This ODS is loaded via a manual flat-file load (regular data transfer process). In order to avoid inconsistent data in the planning cube, I want to check in my ODS load process whether any cost center that is already planned (i.e. has plan data in my planning cube) is being deleted in the loaded file.
    To do this, I need to access my planning cube (I guess also the planning buffer), so my question is whether there is any function module which retrieves data from a planning level as reference data.
    Thanks a lot and best regards,
    Alfonso.

    Hi Alfonso,
    note 1101726 shows how the plan buffer can be read.
     " From note 1101726: get the plan buffer for the InfoProvider and read its
     " data for the given characteristic selection.
     l_r_plan_buffer = cl_rsplfa_plan_buffer=>if_rsplfa_plan_buffer~get_instance( i_infoprov ).
     l_r_plan_buffer->get_data( EXPORTING  i_t_charsel            = l_t_charsel
                                           i_include_zero_records = rc_false
                                           i_r_msg                = l_r_msg
                                IMPORTING  e_r_th_data            = l_r_th_data
                                EXCEPTIONS OTHERS                 = 2 ).
    Please take a look at the note. You do not need to implement the after_burn_selection exit, but you will find sample code there showing how to read the planning buffer. Please give it a try.
    Another solution would be to use the function module RSDRI_INFOPROV_READ. But you need to make sure that you first close the yellow request; this can be done using function module RSAPO_SWITCH_BATCH_TO_TRANS.
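    For orientation, a reduced sketch of the RSDRI_INFOPROV_READ variant is shown below; the cube name, the characteristic/key figure and their aliases are placeholders, and the parameter list should be verified in SE37 on your release.

      " Minimal sketch: read the cost centers that already carry plan data
      TYPES: BEGIN OF ty_row,
               costcenter(10) TYPE c,
               amount         TYPE p DECIMALS 2,
             END OF ty_row.

      DATA: lt_sfc        TYPE rsdri_th_sfc,               " characteristics to return
            ls_sfc        LIKE LINE OF lt_sfc,
            lt_sfk        TYPE rsdri_th_sfk,               " key figures to return
            ls_sfk        LIKE LINE OF lt_sfk,
            lt_range      TYPE rsdri_t_range,              " empty = no restriction
            lt_data       TYPE STANDARD TABLE OF ty_row,
            l_end_of_data TYPE rs_bool,
            l_first_call  TYPE rs_bool VALUE 'X'.

      ls_sfc-chanm    = '0COSTCENTER'.
      ls_sfc-chaalias = 'COSTCENTER'.
      INSERT ls_sfc INTO TABLE lt_sfc.

      ls_sfk-kyfnm    = '0AMOUNT'.
      ls_sfk-kyfalias = 'AMOUNT'.
      ls_sfk-aggr     = 'SUM'.
      INSERT ls_sfk INTO TABLE lt_sfk.

      " Close the yellow request first (RSAPO_SWITCH_BATCH_TO_TRANS), then read in packages
      WHILE l_end_of_data IS INITIAL.
        CALL FUNCTION 'RSDRI_INFOPROV_READ'
          EXPORTING
            i_infoprov    = 'ZPLANCUBE'                    " placeholder cube name
            i_th_sfc      = lt_sfc
            i_th_sfk      = lt_sfk
            i_t_range     = lt_range
          IMPORTING
            e_t_data      = lt_data
            e_end_of_data = l_end_of_data
          CHANGING
            c_first_call  = l_first_call
          EXCEPTIONS
            OTHERS        = 1.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        " lt_data now holds one package: compare it against the flat file here
      ENDWHILE.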
    Hope this helps
    Matthias Nutt
    SAP Consulting Switzerland

  • Data requests in planning cube

    Hi Friends,
    I would like to find out how I can delete data that I loaded into a planning cube via planning activities, given that the cube only creates a new request after n thousand records have filled the current request.
    In short, how can I manage the changes I introduced, for example delete the loaded data to undo them?
    regards
    Michael

    Hello Michael,
    There is one more easy option to delete plan data without having to close any open requests. BEWARE: deleting data using this method is difficult to undo once saved.
    If you can identify the exact subset of data that you want to delete, create a planning level which contains all the characteristics and key figures, and create a planning function of type "delete".
    You can then set the selection criteria in the planning level and delete the data. In most cases we have restricted access to this functionality to one or two super users only.
    Hope this helps.
    Sunil

  • A/P & Consignment data pull to Planning Cube.....

    I put this in "BC & Extractors" section, but did not get any response.....
    Need a couple of clarifications (brief background is provided.....):
    1. We do DELTA extractions using 0FI_AP_3 and 0FI_AR_4 into a base cube which includes both 'Open' and 'Cleared' items. From this cube we do a DELTA load to a planning cube on a DAILY basis. We filter the data on Company Code and 'Item Status' (only OPEN items) for planning purposes. The key figure pulled to the planning cube is 0DEB_CRE_DC. The base cube holds data at line-item document level, whereas the planning cube is only at Customer & Vendor level.
    The question is: will this daily DELTA to the planning cube with the filter on open items bring the correct key figure? For example, if the status of one of the items pulled as OPEN in an earlier delta changes to CLEARED the next day, will the key figure reflect the change and show the correct value? Is there any impact on the key figure due to the filter on open items? My thinking is that the key figure 0DEB_CRE_DC will reflect the correct (adjusted) number since it is a summated key figure. I could be totally wrong and want to understand this with comments from experts.
    2. Has anyone worked with pulling Consignment (SMI) data into a planning cube? We pull this data using a generic extractor from SAP table RKWA with some logic. The A/P data I pull contains both SMI and non-SMI transactions; to avoid double counting, how can I avoid bringing in the SMI transactions from A/P? There is no flag to differentiate the SMI / non-SMI transactions....
    I am hoping someone has worked in these areas and can offer expert comments/recommendations.
    Thanks.. Shaun
    Please post one question only once

    Hello Lee,
    For your first question: since you are using a filter on the status of the record, let's say today the status is Open and, because of the filter, the delta picks this record up. If the status then changes to Closed tomorrow, this record will be completely ignored, and hence in the next-level target you will not have the right status at all.
    So you need to plan your filters accordingly.
    Hope this helps.
    Thanks
    Murali

  • How to Get the Current data into Planning Layout from the Planning cube

    Hi,
    I have a problem in BPS. I am selecting the data from the cube based on month and org details, but I want to see only the latest data. How can I get this latest data into the planning layout?
    For example, the cube contains:
    Tran   Cal month   Org   Amt
    1      Jan         a     100
    1      Feb         a     200
    If I read the data with Tran and Org as input values, I should get only the row below, but I am getting the previous data as well:
    Tran   Cal month   Org   Amt
    1      Feb         a     200
    Kindly help me in this regard
    Thanks
    Naveen

    Naveen,
    Are you having an issue where, when you save something in the layout, the data doesn't appear in LISTCUBE? Or is the issue that the latest data you see in the cube doesn't appear in the layout?
    For the former, please check your LISTCUBE selections; for the latter, please check your planning level definition and make sure all the restrictions you have applied are valid for the latest data to be presented in the layout.
    Hope this helps.
    Cheers
    Srini

  • Unable to load data into Planning cube

    Hi,
    I am trying to load data into a planning cube using a DTP.
    But the system throws a message saying that real-time data loading cannot be performed on the planning cube.
    What would be the possible reason for the same?
    Thanks & Regards,
    Surjit P

    Hi Surjit,
    To load data into the cube using a DTP, the cube must be set to loading mode.
    The real-time cube behaviour is set to planning mode only when data is written to the cube via the planning layouts or through the file upload program.
    You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-Time Load Behaviour and select the first option (real-time cube can be loaded with data; planning not allowed).
    Best Rgds
    Shyam
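    If this switch has to happen regularly (for example as a process chain step right before the DTP), it can also be done from a small ABAP program. A sketch below, assuming the commonly cited function modules RSAPO_SWITCH_TRANS_TO_BATCH / RSAPO_SWITCH_BATCH_TO_TRANS with an I_INFOCUBE parameter; the cube name is a placeholder and the interface should be verified in SE37.

      DATA l_cube TYPE rsinfocube VALUE 'ZPLANCUBE'.   " placeholder: your planning cube

      " 1) Switch the real-time cube to load mode ("can be loaded; planning not allowed")
      CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
        EXPORTING
          i_infocube = l_cube.

      " 2) ... run the DTP load here ...

      " 3) Switch back to planning mode ("planning allowed; loading not allowed")
      CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
        EXPORTING
          i_infocube = l_cube.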

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have a below set up in our system..
    1. A planning BEx query with which the user makes certain entries and writes data back to the planning cube.
    2. An actual reporting cube which gets data from the planning cube above.
    Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
    This involves 2 things..
    1. Change the setting "Change real-time load behaviour" of the planning cube to Planning.
    2. Trigger the DTP which loads data from Planning cube to reporting cube.
    We want to automate the above two steps...
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it will start the process chain automatically. Any ideas please?

    hi,
    try to do the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    hope it helps.
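    Regarding the event approach from the original post: the SM64 event can also be raised explicitly from a small ABAP step (for example at the end of the step that closes the planning request), so that the process chain with that event as its start condition fires. A sketch using the standard function module BP_EVENT_RAISE; the event name is hypothetical.

      " Raise the background event defined in SM64 so that the process chain
      " with this event as its start condition is triggered.
      CALL FUNCTION 'BP_EVENT_RAISE'
        EXPORTING
          eventid = 'ZPLAN_DATA_SAVED'    " hypothetical event name from SM64
        EXCEPTIONS
          OTHERS  = 1.
      IF sy-subrc <> 0.
        " the event could not be raised - log or handle the error here
      ENDIF.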

  • How to write data from planning folder into a planning cube in BPS

    Hi All,
    I have an issue writing data from a planning folder to the planning cube.
    I updated the Excel sheet in the planning folder in transaction UPSPL.
    After clicking the save button in the Excel sheet, the data did not get updated into the cube.
    I have set the real-time cube to planning mode.
    Please let me know if there are any settings needed to save the data to the cube, or how it can be done.
    Thanks in advance.
    Regards,
    Lavanya.

    Hi Lavanya,
    What do you mean by "the data did not get updated into the cube"?
    Don't you find a yellow request in the cube administration transaction? What happens if you set it to green?
    Regards,
    Fred

  • Data from planning cube to TEXT FILE IN directories of BW

    hi all
    I have 2 transactional planning cubes in BW. The data in these cubes needs to be uploaded monthly into a single text file so that the APO system can access the data for some purpose.
    My question is how to upload the data from the transactional cubes to the text file; please be as detailed as possible. Is it useful to construct an ODS or a basic cube on top of the transactional cubes?
    If I keep the file in a BW directory, can APO access it?
    Please help me out.
    praveen.

    Hi,
    You could create a MultiCube over your 2 transactional cubes and extract your data with a query on the MultiCube.
    Make sure you restrict 0REQUID ('Request ID') using 0S_RQMRC ('Most Current Data') to be sure you extract the most recently planned data.
    There is no need to convert the transactional cubes to basic cubes.
    To download the query I think you have to use the reporting agent of the broadcaster.
    Thanks for your points if you found this helpful or if this solved your problem.
    Best Regards,
    Filip

  • Newbie questions regarding pulling data from hyperion planning cube w/OBIEE

    Hello there,
    we've recently implemented Essbase and are currently pumping it full of revenue/expense data from our source systems to calculate NOI. This data is stored in a staging table at the detail level, where it is sourced into Hyperion and aggregated. We also have OBIEE 10g (we plan to upgrade to 11g later this year) and we would like to connect to and report out of Essbase. Our ultimate goal is to be able to report on the NOI numbers in the planning cube but have the ability to drill down to the detail level, which is not stored in Essbase. We've heard it is possible, albeit not native, for OBIEE 10g to do this. We've also heard that it is not best practice to use our transactional cube for this type of reporting but to actually create a second "reporting cube".
    What are best practices for getting this NOI data out of Hyperion and merging it with our relational detail reporting? Can we somehow export the data from the cube and store it in a relational database? Should we clone the cube (if that is even possible) and configure both it and our relational source in the BI repository and set up all drill-throughs there?
    Any info is GREATLY appreciated. Thank you.

    I have found information on how to use ODI to extract data from the cube. What I'm really trying to find out, though, is best practices for reporting off summary-level data in Essbase with the ability to drill through to the detail.
    We've heard that reporting off the same cube that users are writing back to and transacting on is bad. Do we need to build a "reporting cube" and then bring that into OBIEE and merge it with a relational source, or is it better to extract the data from Essbase into flat files and join it to detail tables in our relational source?

  • Issues with Writing back data to Planning Cube After adding a new characterstic

    Hello,
    We have two cubes and a MultiProvider on top of them (a reporting cube and a planning cube). An aggregation level has been built on it, and so has a planning BEx query on top of that.
    Before: things were working fine, as we were able to write data back to the planning cube using the BEx query.
    After: We created a new dimension in the planning cube and added a new characteristic (let's call it ZX_KEY). This characteristic holds the concatenated values of some of the other characteristics in its master data tables (for example: Account_Customer ID_Country code_Industry code).
    This new characteristic is also present in the reporting cube and has been added and transferred to the MultiProvider.
    This new characteristic was added to the planning BEx query as one of the rows and we executed it in the Analyzer. After entering the required values for all characteristics (including the newly added ZX_KEY) and the key figure values, we tried saving. This is where the Analyzer throws up the following error:
    ~~ Characteristic combination cannot be assigned to part provider ~~
    ~~ Characteristic 'ZX_KEY'; characteristic value 'R123000000_12_US_TX' ~~
    ~~ Entered values are incorrect: Correct before navigation ~~
    The value we are entering (R123000000_12_US_TX) is picked from the master data table of ZX_KEY by double-clicking and selecting it from the fetched values, so I am not sure why it throws the above error. Request your help on this please.
    regards,
    Karthik

    Hi,
    Try to check that all characteristics are correctly assigned at the MultiProvider level.
    hope it helps

  • How to delete aggregated data in a cube without deleting the aggregates?

    Hi Experts,
    How can I delete aggregated data in a cube without deleting the aggregates?
    Regards
    Alok Kashyap

    Hi,
    You can deactivate the aggregate. The data will be deleted, but the structure and definition will remain.
    Switching an aggregate off is different: no data is deleted, but the aggregate is no longer used by the OLAP processor, so queries fetch the data directly from the cube. A switched-off aggregate temporarily behaves as if it were not built on the InfoCube and is not used when a query is executed, but it still holds the data from previous loads and data can still be rolled up into it. You can temporarily switch off an aggregate to check whether you really need it.
    If you deactivate the aggregate, the system deletes all the data and database tables of the aggregate, but the definition is not deleted, so when you need the aggregate again you only have to fill it again rather than create it from scratch.
    Hope this helps.
    Thanks,
    JituK
