Data requests in planning cube

Hi Friends,
I would like to find out from you how I can delete data that I loaded into a planning cube via planning activities, given that the cube only creates a new request after n thousand records have filled the current one.
In short, how can I manage the changes I introduced, for example by deleting the loaded data to undo them?
regards
Michael

Hello Michael,
There is one more easy option to delete plan data without having to close any open requests. Beware: deleting data using this method will be difficult to undo once saved.
If you can identify the exact subset of data that you want to delete, you can create a planning level which has all the characteristics and key figures. Create a function of type delete.
You can then set the selection criteria in the planning level and delete the data. In most cases, we have restricted the access to this functionality to one or two super users only.
Hope this helps.
Sunil

Similar Messages

  • Does exist a way to read data from IP planning cube with ABAP?

    Hello All.
    My scenario is as follows:
    I have an ODS where we store cost centers to be planned. This ODS is loaded via a manual flat file load (a regular data transfer process). In order to avoid inconsistent data in the planning cube, I want to check in my ODS load process whether any cost center which is already planned (i.e. has plan data in my planning cube) is being deleted in the loaded file.
    To do this, I need to access my planning data cube (I guess also the planning buffer), so my question is whether there is any function module which retrieves data from a planning level as reference data.
    Thanks a lot and best regards,
    Alfonso.

    Hi Alfonso,
    note 1101726 shows how the plan buffer can be read.
         " Excerpt from note 1101726: get the plan buffer instance for the InfoProvider
         " (i_infoprov, l_t_charsel, l_r_msg and l_r_th_data are declared in the note's sample code).
         l_r_plan_buffer = cl_rsplfa_plan_buffer=>if_rsplfa_plan_buffer~get_instance( i_infoprov ).

         " Read the buffered data for the given characteristic selection; zero records are excluded.
         l_r_plan_buffer->get_data( EXPORTING  i_t_charsel            = l_t_charsel
                                               i_include_zero_records = rc_false
                                               i_r_msg                = l_r_msg
                                    IMPORTING  e_r_th_data            = l_r_th_data
                                    EXCEPTIONS OTHERS                 = 2 ).
    Please take a look at the note. You do not need to implement the after_burn_selection exit, but you will find sample code there showing how to read the planning buffer. Please give it a try.
    Another solution would be to use the function module RSDRI_INFOPROV_READ, but then you need to make sure that you first close the yellow request. This can be done using function module RSAPO_SWITCH_BATCH_TO_TRANS.
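    For reference, a minimal sketch of such a read with RSDRI_INFOPROV_READ might look like the following. This is only a sketch: 'ZPLANCUBE', 0COSTCENTER and 0AMOUNT are placeholder names, the selection table is left empty, and the full parameter list of the function module should be checked in your system.
        " Sketch: read one characteristic and one key figure from an InfoProvider.
        TYPES: BEGIN OF ty_data,
                 costcenter TYPE c LENGTH 10,              " column name = CHAALIAS below
                 amount     TYPE p LENGTH 16 DECIMALS 2,   " column name = KYFALIAS below
               END OF ty_data.

        DATA: l_th_sfc  TYPE rsdri_th_sfc,                 " requested characteristics
              l_s_sfc   TYPE rsdri_s_sfc,
              l_th_sfk  TYPE rsdri_th_sfk,                 " requested key figures
              l_s_sfk   TYPE rsdri_s_sfk,
              l_t_range TYPE rsdri_t_range,                " selection restrictions (empty here)
              l_t_data  TYPE STANDARD TABLE OF ty_data.

        l_s_sfc-chanm    = '0COSTCENTER'.
        l_s_sfc-chaalias = 'COSTCENTER'.
        INSERT l_s_sfc INTO TABLE l_th_sfc.

        l_s_sfk-kyfnm    = '0AMOUNT'.
        l_s_sfk-kyfalias = 'AMOUNT'.
        l_s_sfk-aggr     = 'SUM'.
        INSERT l_s_sfk INTO TABLE l_th_sfk.

        CALL FUNCTION 'RSDRI_INFOPROV_READ'
          EXPORTING
            i_infoprov = 'ZPLANCUBE'                       " placeholder InfoProvider
            i_th_sfc   = l_th_sfc
            i_th_sfk   = l_th_sfk
            i_t_range  = l_t_range
          IMPORTING
            e_t_data   = l_t_data
          EXCEPTIONS
            OTHERS     = 2.
        IF sy-subrc <> 0.
          " handle the read error here
        ENDIF.
    As noted above, make sure the yellow request is closed first so that its data is included.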
    Hope this helps
    Matthias Nutt
    SAP Consulting Switzerland

  • A/P & Consignment data pull to Planning Cube.....

    I put this in the "BC & Extractors" section but did not get any response.
    I need a couple of clarifications (a brief background is provided):
    1. We do DELTA extractions using 0FI_AP_3 and 0FI_AR_4 to a base cube which includes both 'Open' and 'Cleared'
    items. From this cube we do a DELTA load to a planning cube on a DAILY basis. We filter the data on Company Code
    and 'Item Status' (only OPEN items) for planning purposes. The key figure pulled to the planning cube is '0DEB_CRE_DC'.
    The base cube has data at line item (document) level, whereas the planning cube is only at Customer & Vendor level.
    The question is: will this daily DELTA to the planning cube with the filter on open items bring the correct key figure? For example,
    if the status of one of the items pulled as OPEN in an earlier delta changes to CLEARED the subsequent day, will the key
    figure reflect the change and show the correct value? Is there any impact on the key figure due to the filter on open items?
    My thinking is that the key figure 0DEB_CRE_DC will reflect the correct (adjusted) number since it is a summated key figure.
    I could be totally wrong and want to understand this with comments from experts.
    2. Has anyone worked with pulling Consignment (SMI) data into a planning cube? We pull this data using a generic extractor
    from the SAP table RKWA with some logic. Since the A/P data I pull contains both SMI and non-SMI transactions, how can I
    avoid bringing in the SMI transactions from A/P and double counting them? There is no flag to differentiate the SMI and
    non-SMI transactions...
    I am hoping someone has worked in these areas and can offer expert comments/recommendations.
    Thanks.. Shaun
    Please post one question only once
    Edited by: Vikram Srivastava on Aug 13, 2010 1:02 PM

    Hello Lee,
    For your first question: since you are using a filter on the status of the record,
    let's say today the status is OPEN; as part of the filter, the delta will pick this record up because the status is OPEN. But if the status is changed to Closed tomorrow, this record will be completely ignored, and hence in the next-level target you will not have the right status at all.
    So you need to plan your filters accordingly.
    Hope this helps.
    Thanks
    Murali

  • Newbie questions regarding pulling data from hyperion planning cube w/OBIEE

    Hello there,
    we've recently implemented Essbase and are currently pumping it full of revenue/expense data from our source systems to calculate NOI. This data is stored in a staging table at the detail level, from where it is sourced into Hyperion and aggregated. We also have OBIEE 10g (we plan to upgrade to 11g later this year) and we would like to connect to and report out of Essbase. Our ultimate goal is to be able to report on the NOI numbers in the planning cube but have the ability to drill down to the detail level, which is not stored in Essbase. We've heard it is possible, albeit not native, for OBIEE 10g to do this. We've also heard that it is not best practice to use our transactional cube for this type of reporting but to actually create a second "reporting cube".
    What are best practices for getting this NOI data out of Hyperion and merging it with our relational detail reporting? Can we somehow export the data from the cube and store it in a relational database? Should we clone the cube (if that is even possible) and configure both it and our relational source in the BI repository and set up all drill-throughs there?
    Any info is GREATLY appreciated. Thank you.
    Edited by: cisGuy on Sep 20, 2010 5:31 PM

    I have found information on how to use ODI to extract data from the cube. What I'm really trying to find out, though, is best practices for reporting off summary-level data in Essbase with the ability to drill through to the detail.
    We've heard that reporting off the same cube that users are writing back to and transacting on is bad. Do we need to build a "reporting cube" and then bring that into OBIEE and merge it with a relational source, or is it better to extract the data from Essbase into flat files and join it to detail tables in our relational source?

  • Overwrite data in the Planning Cube

    Hi Gurus,
    I am new to BPS. I have created a simple manual planning layout for my users
    where they can enter their plan data for either the plan or the forecast version.
    As you all know, the cube is always additive and cannot overwrite the key figures directly.
    I want to know if there is any built-in planning function which would do that.
    For example, right now the users first enter:
    Plant  Profit center  Version  QTY
    1000   2200           SP0      100
    If they then think the number 100 is too much and want to readjust it to 80,
    they want to enter 1000 2200 SP0 80, but the cube is additive, so it makes the QTY 180 instead of 80.
    As of now they enter 1000 2200 SP0 -20 to adjust it to 80, which they don't want to do.
    Hope I communicated well.
    Thanks
    Jay.

    The basic functionality of manual planning is to save in the cube the values which we enter in the layout. If we change 100 to 80 in the layout, after saving, the cube will show 80 only, not 180. Of course the cube is additive, in the sense that, to turn the 100 into 80, the system generates a record of -20 in the cube so that the overall result is 80. The user does not need to enter -20 to get 80 in the cube. Please recheck.

  • How to resolve data request error in cube load

    I apologize in advance if this is too much detail, but here goes...
    We are on BW 3.5 and have only one cube active (0SD_C03).  This cube contains five data targets.  One of these targets had an update request fail due to an error in the transfer rule. 
    Basically, I have a custom characteristic which is populated by parsing out the Product Hierarchy field from R/3.  The BW custom characteristic is a 5-character Alpha field.  Problem is that in this case, the format of the code is not five characters; instead, it is a blank and then four digits.  So instead of being '8613A' it is ' 8613', so the transfer rule terminates with the error 'InfoObject /BIC/ZPRDCODE does not contain alpha-conforming value'.
    I recall being told by consultants to always "fix the data in the source system" but in this case, the data isn't wrong per se.
    So I guess I have two questions:
    1) Can I manually edit the data in the PSA to "correct" the product line field to be a conforming value? i.e. I would change it from ' 8613' to '8613x'. The help says something about deleting the request in order to maintain data in the PSA, but I'm not clear on how to do this.  And once I have edited the data, how do I "release" all the subsequent deltas so they are available for reporting?
    2) Can I change my transfer rules such that these errors don't occur in the future? 
    Again, sorry for all the detail, but I'm fairly desperate. 
    Thanks in advance,
    Michele
    [email protected]

    Hi,
    Regarding your first question (editing the data in the PSA): yes, you can manually edit the data in the PSA. But to do this you first have to delete the request from the data target(s), then edit the data in the PSA, and then schedule the request again from the PSA tree.
    Regarding your second question (changing the transfer rules): I believe you can. Select the 'Conversion' checkbox for this object in the transfer rules so that the InfoObject's ALPHA conversion is applied.
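    To illustrate that ALPHA conversion, here is a small standalone sketch. The value ' 8613' is taken from this thread; in a transfer routine the same two statements would be applied to the source field before assigning it to RESULT.
        REPORT z_alpha_sketch.

        " Sketch: make the value ' 8613' alpha-conforming for a 5-character InfoObject.
        DATA l_value TYPE c LENGTH 5 VALUE ' 8613'.

        CONDENSE l_value NO-GAPS.                  " ' 8613' -> '8613'

        CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
          EXPORTING
            input  = l_value
          IMPORTING
            output = l_value.

        WRITE: / l_value.                          " '08613', which passes the ALPHA check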
    With rgds,
    Anil Kumar Sharma .P

  • How Distinguish planning data and Actual data in a Planning cube

    Hi,
    I want to display planning data and actual data in a query. How can I do it?
    In my query it is showing all the requests; it is not distinguishing between them.

    Use InfoObject 0VTYPE to differentiate actual and plan data,
    and 0VERSION to categorise which version of plan data it is; typically you might have several planning versions.
    Example
    0VTYPE = 10 actual
    0VTYPE = 20  plan
    0VERSION = 001
    0VERSION = 002
    0VERSION = 003
    The business decides which 0VERSION value is taken as the final planning version once it is frozen. 0VTYPE = 10 for actuals is always a single value; it does not need a 0VERSION to be taken into account.
    Hope it Helps
    Chetan
    @CP..

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have the following setup in our system:
    1. A planning BEx query through which the user makes certain entries and writes data back to the planning cube.
    2. An actual reporting cube which gets its data from the planning cube above.
    Now, what we want to do is automate the data load from the planning cube to the reporting cube.
    This involves two things:
    1. Change the setting "Change real-time load behaviour" of the planning cube.
    2. Trigger the DTP which loads data from the planning cube to the reporting cube.
    We want to automate the above two steps.
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain where the event is used as the start variant, the ABAP program as the next step and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request comes and sits in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it will start the process chain automatically. Any ideas, please?
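    For what it's worth, a rough sketch of the program in step 3 follows. RSAPO_SWITCH_TRANS_TO_BATCH and BP_EVENT_RAISE are the function modules commonly used for this; the cube name and the event ID are placeholders, and the interfaces should be verified in your system before use.
        REPORT z_switch_cube_and_raise_event.

        " Sketch: switch the real-time cube to load behaviour, then raise the
        " background event that starts the process chain.
        CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
          EXPORTING
            i_infocube = 'ZPLANCUBE'.              " placeholder cube name

        CALL FUNCTION 'BP_EVENT_RAISE'
          EXPORTING
            eventid = 'Z_PLAN_LOADED'              " placeholder for the event created in SM64
          EXCEPTIONS
            OTHERS  = 1.
        IF sy-subrc <> 0.
          MESSAGE 'Background event could not be raised' TYPE 'E'.
        ENDIF.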

    Hi,
    try to do the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    Hope it helps.

  • Unable to load data into Planning cube

    Hi,
    I am trying to load data into a planning cube using a DTP,
    but the system throws a message saying that real-time data loading cannot be performed on the planning cube.
    What would be the possible reason for the same?
    Thanks & Regards,
    Surjit P

    Hi Surjit,
    To load data into the cube using a DTP, it must be switched to loading mode.
    The real-time cube behaviour is set to planning mode only when data is entered into the cube via the planning layouts or through the file upload program.
    You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-time Load Behaviour and select the first option (real-time cube can be loaded with data; planning not allowed).
    Best Rgds
    Shyam
    Edited by: Syam K on Mar 14, 2008 7:57 AM
    Edited by: Syam K on Mar 14, 2008 7:59 AM

  • IP Exit Planning function copy data from cca to pca planning cubes

    Hello All,
    I have a requirement where I have to copy the characteristics and key figures of the CCA plan cube data to the PCA plan cube. The InfoObjects in the CCA aggregation level are {0AMOUNT, 0COSTCENTER, 0COSTELEMENT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}, which need to be copied to the corresponding InfoObjects in the PCA level {0AMOUNT, 0PROFITCENTER, 0ACCOUNT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}.
    The CCA and PCA aggregation levels are built on top of the multiprovider.
    I could do it using FOX coding, but 0COSTELEMENT cannot be mapped to 0ACCOUNT as these two are different fields. Since I have to copy the values of 0COSTELEMENT to 0ACCOUNT, I was wondering how I can do it using an exit function.
    As I have never used an exit function before, I was wondering if somebody could help me out with this.
    By the way, I have read the forums and figured out that I need to create a class in SE24 and use the interface
    IF_RSPLFA_SRVTYPE_IMP_EXEC, and since I am generating some records, I will be using the method IF_RSPLFA_SRVTYPE_IMP_EXEC~INIT_EXECUTE.
    I also read in the forums that there are methods/function modules which can copy data from one aggregation level to another. Anyway, can you tell me how I can loop through the records of the CCA aggregation level and copy them to the PCA aggregation level?
    Edited by: nazeer on Feb 22, 2009 12:04 PM

    This thread might help you.
    https://forums.sdn.sap.com/click.jspa?searchID=22634973&messageID=5317176
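    Purely as an illustration of the mapping itself (not of the exit-function interface), the copy loop could look like the sketch below; all structure and field names are hypothetical placeholders, and in a real exit class this logic would sit inside the method that generates the new records.
        REPORT z_sketch_cca_to_pca_mapping.

        TYPES: BEGIN OF ty_cca,                    " hypothetical CCA-level record
                 costcenter  TYPE c LENGTH 10,
                 costelement TYPE c LENGTH 10,
                 version     TYPE c LENGTH 3,
                 calmonth    TYPE n LENGTH 6,
                 amount      TYPE p LENGTH 16 DECIMALS 2,
               END OF ty_cca,
               BEGIN OF ty_pca,                    " hypothetical PCA-level record
                 profitcenter TYPE c LENGTH 10,
                 account      TYPE c LENGTH 10,
                 version      TYPE c LENGTH 3,
                 calmonth     TYPE n LENGTH 6,
                 amount       TYPE p LENGTH 16 DECIMALS 2,
               END OF ty_pca.

        DATA: lt_cca TYPE STANDARD TABLE OF ty_cca,
              lt_pca TYPE STANDARD TABLE OF ty_pca,
              ls_pca TYPE ty_pca.

        FIELD-SYMBOLS <ls_cca> TYPE ty_cca.

        LOOP AT lt_cca ASSIGNING <ls_cca>.
          CLEAR ls_pca.
          " 1:1 derivations for illustration only; real mappings may need lookup tables
          ls_pca-profitcenter = <ls_cca>-costcenter.
          ls_pca-account      = <ls_cca>-costelement.   " 0COSTELEMENT -> 0ACCOUNT
          ls_pca-version      = <ls_cca>-version.
          ls_pca-calmonth     = <ls_cca>-calmonth.
          ls_pca-amount       = <ls_cca>-amount.
          APPEND ls_pca TO lt_pca.
        ENDLOOP.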

  • Compression in a Planning cube

    I was trying to compress 2009-2010 data in a planning cube; when I collapsed the request, the job got scheduled but the compression didn't happen.
    When I look at the log of the batch job, it says:
    No requests needing to be aggregated have been found in InfoCube DE_RBQTPL
    Compression not necessary; no requests found for compressing
    Compression not necessary; no requests found for compressing
    Job finished
    Can you please help me in resolving this issue.
    Thanks,
    Sravani

    Please check that all the requests have been loaded properly
    into the InfoCube.
    Is there a yellow (open) request in the cube?
    If so, wait for some time and then run the compression.
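    If an open (yellow) request is indeed the reason, it can also be closed programmatically before compressing. The sketch below assumes the commonly cited function module RSAPO_CLOSE_TRANS_REQUEST; please verify the name and its interface in SE37 before using it.
        " Sketch: close the open request of the real-time cube named in the job log.
        CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
          EXPORTING
            i_infocube = 'DE_RBQTPL'.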
    Thanks,
    Saveen

  • How to give planners read only access to Planning cube for ad-hoc analysis?

    Hi Everyone,
    I am trying to issue Smart View reports to the planners. Most of the reports come off the Essbase (reporting) application. However, we have to generate one report from the Planning server because of the text fields. Currently, on the Planning server, planners have access only to the forms. They don't have access to the planning cube, but to enable them to refresh this report, we need to give them access to the planning cube.
    However, we don't want them to do ad-hoc data writes on the planning cube. So my question is this: is there a way for planners to have write access on the web forms but only read access to the cube through Smart View?

    The access permissions will match between planning and smart view if a planning connection is used.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Was told not to report off transaction planning cubes

    Hello there,
    we recently implemented Hyperion Planning and have 3 Essbase cubes: Planning, Capex, and workforce. We want to hook OBIEE up to the cubes for reporting. We have been told it is not best practice to report off the same cubes we're transacting in. How should we go about building reporting cubes? Is it as simple as "backing up" or copying our transactional cubes?

    It is always better to restrict the planning application to the planning operations and to provide the required reporting applications for reporting purposes.
    This avoids unnecessary conflict between the planning and reporting cycles.
    In Planning 11.1.1.2 there is a built-in feature for mapping the planning cubes to a reporting application (ASO applications).
    Under the covers it works like this:
    it extracts the required set of data from the planning cube by generating and executing a calc script that exports to a text file;
    after that it formats the data file to match the reporting cube's dimensionality and loads it into the reporting cube.
    Instead of using this built-in feature, we can provide a better automated custom interface for our application with three regularly used components:
    1. a data export calc script
    2. a rule file to map the exported file to the reporting cube
    3. a MaxL file
    If you are using the same dimension structure for the reporting cube as for the planning cube, except for the HSP_Rates and currency dimensions,
    you can try this as well:
    1. create a BSO-to-ASO replicated partition (supported in Essbase 11)
    2. replicate the data
    3. drop the replicated data

  • IP- Planning CUBE data deletion.

    Dear Experts,
    We have a planning-enabled cube into which data is loaded and then distributed by reference (a planning function is used and the planning sequence is run on the cube).
    Unfortunately, due to an issue in CRM, we have to delete the data of the last two months from the cube and reload it.
    My doubt is this: if I just delete the data requests and also the planning requests, then reload and run my planning sequence, will it reprocess the data of the last two months again, or will there be an issue?
    In a normal cube it is quite simple: just delete all the requests from the cube and the DTP will take care of the rest.
    So please suggest whether there is anything I need to take care of.
    Thanks and Regards
    Neel

    Hi,
    If you know what data you want to delete (the selection criteria are clear to you), as you specified the last two months of data, then I suggest you go with selective deletion.
    You can follow these steps for a correct deletion:
    1) Check the data in the cube using LISTCUBE with the same selection criteria (in your case the last two months). This confirms that you know, and have checked, what you want to delete.
    2) Select the cube --> right-click --> Change Real-time Load Behavior.
    3) Select 'Planning not allowed'.
    4) Select the cube, right-click and choose the Manage option.
    5) Select the Contents tab and click on the 'Delete Selection' button.
    6) Specify the same selection which you checked in the cube and delete the contents.
    7) Change the real-time load behavior back to 'data loading not allowed' (planning allowed again).
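    The same selective deletion can also be scripted, for example as a small program or a process chain step. The sketch below uses the standard function module RSDRD_SEL_DELETION; the cube name and the 0CALMONTH range are placeholders, and the structure and parameter names should be double-checked in SE37 for your release.
        " Sketch: selective deletion of two months of data from a cube.
        DATA: l_thx_sel TYPE rsdrd_thx_sel,        " selection per InfoObject
              l_sx_sel  TYPE rsdrd_sx_sel,
              l_s_range TYPE rsdrd_s_range,
              l_t_msg   TYPE rs_t_msg.

        l_s_range-sign   = 'I'.
        l_s_range-option = 'BT'.
        l_s_range-low    = '201601'.               " placeholder: first month to delete
        l_s_range-high   = '201602'.               " placeholder: last month to delete
        l_s_range-keyfl  = 'X'.
        APPEND l_s_range TO l_sx_sel-t_sel.
        l_sx_sel-iobjnm = '0CALMONTH'.
        INSERT l_sx_sel INTO TABLE l_thx_sel.

        CALL FUNCTION 'RSDRD_SEL_DELETION'
          EXPORTING
            i_datatarget = 'ZPLANCUBE'             " placeholder cube name
            i_thx_sel    = l_thx_sel
          CHANGING
            c_t_msg      = l_t_msg.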

  • How to write data from planning folder into a planning cube in BPS

    Hi All,
    I have an issue writing data from a planning folder to the planning cube.
    I updated the Excel sheet in the planning folder in transaction UPSPL.
    After clicking the Save button in the Excel sheet, the data did not get updated into the cube.
    I have set the real-time cube to planning mode.
    Please let me know if there are any settings needed to save the data to the cube, or how this can be done.
    Thanks in advance.
    Regards,
    Lavanya.

    Hi Lavanya,
    What do you mean by "the data did not get updated into the cube"?
    Do you not see a yellow request in the cube administration transaction? What happens if you set it to green?
    Regards,
    Fred
