Reading aggregated data from a cube/multiprovider
Hi BI people
My project is currently looking for a function module that reads aggregated data from a cube/multiprovider.
I already have a function module that reads data from a cube and returns it in a flat format: "RSDRI_INFOPROV_READ_RFC". I have debugged it, but have not found any flags that enable the OLAP functionality needed to perform the aggregation.
The situation is that I need to apply the aggregation logic of a profit center hierarchy to the data I read from RSDRI_INFOPROV_READ_RFC. This would mean manually replicating the OLAP engine functionality (key figure exception aggregation, etc.), which is not an option with the available time/budget.
Please have a look at the example below:
Say that I have a profit center hierarchy as displayed below (with postable nodes).
PC1 - $10
|---- PC2 - $30
|---- PC3 - $20
The data I'm getting back from the function module RSDRI_INFOPROV_READ_RFC looks like this:
PC1 $10
PC2 $30
PC3 $20
But I need the data aggregated. An aggregation utilizing the hierarchy above will make the data look like this:
PC1 $60
PC2 $30
PC3 $20
Instead of building an aggregation program, it would be useful if it were possible to extract aggregated data directly.
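For reference, the roll-up described above is a bottom-up sum over the hierarchy: a postable node's aggregated value is its own posting plus the totals of its children. A minimal sketch in Python, using the node names and values from the example above and assuming plain summation (i.e. no exception aggregation):

```python
# Hierarchy with postable nodes: each node carries its own posted value
# and a list of child nodes (PC2 and PC3 are leaves under PC1).
children = {"PC1": ["PC2", "PC3"], "PC2": [], "PC3": []}
posted = {"PC1": 10, "PC2": 30, "PC3": 20}

def rollup(node):
    """Aggregated value = node's own posting + sum of its children's totals."""
    return posted[node] + sum(rollup(child) for child in children[node])

aggregated = {node: rollup(node) for node in posted}
print(aggregated)  # {'PC1': 60, 'PC2': 30, 'PC3': 20}
```

This reproduces the desired output (PC1 = $60) from the flat result, but as noted in the question, replicating the full OLAP behaviour (exception aggregation per key figure, unit handling, etc.) is far more involved than this sketch.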
Any comments appreciated.
Regards
Martin
Thx Olivier,
The problem is that I need a function module that can apply the OLAP aggregation for a hierarchy to the data output from RSDRI_INFOPROV_READ_RFC.
... or, as the best alternative, a function module or class that could provide me with the hierarchy aggregation of the data.
/Martin
Similar Messages
-
Hi All ,
When reading a huge number of records from a BPC cube in BADI code, performing calculations, and writing large amounts of data back into the cube, it takes a lot of time. If there are any suggestions for reading data from the cube or writing data into the cube using parallel processing methods, please share them.
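As a language-neutral illustration of the usual pattern (split the record set into chunks, process the chunks in parallel tasks, then collect the partial results), here is a sketch in Python; in ABAP the same shape is typically realized with asynchronous RFC calls dispatched into an RZ12 server group. The chunk size and the doubling inside `process_chunk` are placeholders for the real per-chunk work:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(records):
    # Placeholder for the per-chunk work (read, calculate, write back).
    return [r * 2 for r in records]

def parallel_process(records, n_chunks=4):
    """Split the record set into chunks and process them concurrently,
    then flatten the partial results back into one table."""
    size = max(1, len(records) // n_chunks)
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        results = pool.map(process_chunk, chunks)  # preserves chunk order
    return [r for chunk in results for r in chunk]
```

The key design point, which carries over to the aRFC approach discussed below, is that each task works on an independent slice of the data, so no locking is needed until the results are merged.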
Regards,
SHUBHAM

Hi Gersh,
If we have a specific server say 10.10.10.10 (abc.co.in) on which we are working, Then under RZ12 we make the following entry as :
LOGON GROUP INSTANCE
parallel_generators abc.co.in_10 (let's assume the instance number is 10)
Now in SM59, under ABAP Connections, I am giving the following technical settings:
TARGET HOST abc.co.in
IP address 10.10.10.10
Instance number 10
Now if we have a scenario of load-balancing servers with the following server details (all servers on different instance numbers):
10.10.10.11
10.10.10.13
10.10.10.10
10.10.10.15
In this case, how can we make the RZ12 settings and SM59 settings such that we don't have to hardcode any IP address?
If the request is redirected to 10.10.10.11 and not to 10.10.10.10, what should the settings be in that case?
I have raised this question on the below thread :
How to configure RZ12 and SM59 ABAP connection settings when we work with load-balancing servers rather than a specific server.
Regards,
SHUBHAM -
Any standard function module to read data from a cube
Hi,
I want to read data from a cube, say XYZ, into an internal table. Is there any standard function module to do this? If so, can anyone please tell me what changes I should make in the function module for my requirement?
Regards
BW Fresher.

Hi R,
Try function module 'RSDRI_INFOPROV_READ'
ABAP Report RSDRI_INFOPROV_READ_DEMO contains an example of how the function module can be used.
Udo -
Read and write data from / to cube in CUSTOM_LOGIC BADI
Here are the details:
BPC displays a set of details - with 5 rows and 5 columns. There are 50 additional rows for the 5 columns which are blank to start with.
User changes a cell. BPC 10 only sends that cell which is changed by the user in CUSTOM_LOGIC and WRITE_BACK BADIs.
I need to get the other 5 x 5 cells (besides the one cell that was changed and is being passed in the BADIs) to determine all the details and calculate the additional 50 rows. After calculating these values, I need to update the cube so that the BPC report refreshes with the data I have updated.
What's the best way? What are the function modules involved in reading/writing? Are there any best practices to read/write data from a cube?
Appreciate your help.

Hi Ravan,
Look at my sample write back badi here: http://scn.sap.com/message/14290977#14290977
At the end of the code I have values updated in ct_data. The contents of ct_data will be automatically written to the cube (you don't need to have special code to write data).
B.R. Vadim -
Automatically trigger the event to load data from Planning cube to Standard Cube
Hello,
We have a below set up in our system..
1. A Planning BEx query using which user makes certain entries and writes back data to the planning cube.
2. An actual reporting cube which gets data from the planning cube above.
Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
This involves 2 things..
1. Change the setting "Change Real-Time Load Behaviour" of the planning cube to Planning.
2. Trigger the DTP which loads data from Planning cube to reporting cube.
We want to automate the above two steps...
I have tried few things to achieve the same..
1. Created an event in SM64,
2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details. (Not sure if that is the correct place to provide the event details.)
3. Wrote an ABAP program which changes the setting of the planning cube ("Change Real-Time Load Behaviour" to Loading).
4. Created a process chain, where we have used the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
This is not working. I don't think the event is triggering, and even if it does, I am not sure whether it will start the process chain automatically. Any ideas, please?

Hi,
Try to do the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
hope it helps. -
Move data from one cube to another cube
Hi,
I am on BW 3.5. I have moved the data from one cube to another and found that the number of records in the original cube does not match the newly created cube. For example, the original cube contains 8,549 records while the backup cube contains 7,379 records.
Please help me on what I need to look at, and if the records are getting aggregated, how do I check the aggregated records?
Regards,
Tyson

Dear Tyson,
Check for any update rules in your transfer; if there are any, check in them.
Just go through these methods for making a transfer from one cube to another fully, without missing data.
Update rules method
If it's updated from an ODS, you can create update rules for cube2 and update from the ODS,
or you can try the datamart scenario:
1. Right-click cube1 and choose 'Generate Export DataSource'.
2. Create update rules for cube2 and assign them to cube1.
3. In RSA1 -> Source Systems -> BW Myself, right-click and choose 'Replicate DataSource'.
4. In RSA1 -> InfoSources, search for the name 8cube1.
(If you don't find it, right-click the root node 'InfoSources' -> 'Insert Lost Node(s)'.)
5. From that InfoSource you will find the assigned DataSource; right-click it, choose 'Create InfoPackage', then schedule and run.
Copy from
While creating the new cube, give the cube name in the "Copy from" section. It would copy all the characteristics and key figures, and even the dimensions and navigational attributes.
Another option is:
The steps for copying the contents of one cube to another:
1. Go to Manage -> Reconstruct on the new cube.
2. Select the "selection button" (the red, yellow, blue diamond button).
3. In the selection screen you can give the technical name of the old cube, the request IDs you want to load, and the from & to dates.
4. Execute, and the new cube will be loaded.
It's all that easy!
Refer this link:
Copying the structure of an Infocube
Reward if helpful,
Regards
Bala -
Hello Everyone,
I have a question. We developed a basic cube by building dimensions & a fact from the source table (everything in the DSV, no physical DIM and fact tables). Everything goes well. The source table will only have the current day's data.
My question is: I need to delete a particular day's data from the cube without disturbing the existing data.
E.g.: When I process the data on 31st of March, the cube will have only the 31st. But when I process the 1st and then the 2nd, the cube should have the 31st + the 2nd; the data dated the 1st should be deleted from the cube. Whenever I process the data, I need only the month-end data to be stored,
along with the current day's data; all the other data should be deleted from the cube.
If I process the cube on 1st of May, I should only have March 31st, April 30th and May 1st data.
Hope the Question is clear, please let me know.
Any help/suggestions would be appreciated.
Thanks In Advance.
Thanks, Please Help People When they need..!!! Mark as answered if your problem is solved.

Hi BKomm,
I guess the only way to handle this scenario is by using partitions.
Create partitions for every last day of month, plus one additional partition for the current date.
Your WHERE clause for the current-date partition should be somewhat like this:
WHERE Date = current_date AND Date <> last_day_of_current_month
so that it does not duplicate data for the last day of current month.
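The month-end condition behind that filter can be expressed directly; a small Python sketch of the logic (the function name is illustrative, `calendar.monthrange` is standard library):

```python
import calendar
from datetime import date

def in_current_day_partition(d: date) -> bool:
    """A row belongs to the current-day partition only if it is NOT the
    last day of its month; month-end rows live in the month-end partitions,
    so this check prevents duplication between the two."""
    last_day = calendar.monthrange(d.year, d.month)[1]
    return d.day != last_day

print(in_current_day_partition(date(2024, 3, 31)))  # False: month-end partition
print(in_current_day_partition(date(2024, 4, 15)))  # True: current-day partition
```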
Saurabh Kamath
Hello Kamath, I was looking for something else in order to delete the existing data from a cube, but your approach is far better than making it complex; it had not struck my mind. Thanks, I will implement it practically and check.
Thanks Again.
Thanks, Please Help People When they need..!!! Mark as answered if your problem is solved. -
SSRS pulls data from ESSBASE cube, data not showing
Dear Experts,
We are connecting SSRS 2012 to Essbase 11.1.3 to pull data from the cube, and are having issues with some members' data not showing in the query designer and report, while others have no problem. We couldn't find a pattern for which members won't show. We are wondering if this is a known issue and if there is a way to solve the problem.
Thanks very much.
Grace

Hi,
Check the following:
1. Check whether you have made the joins properly.
2. Check the data at the multiprovider level.
3. Is data available for reporting in both the cubes?
Regards
Rahul -
Please Help - How to Fetch the Data from a Cube
Dear All,
We created a cube containing dimensions
Customer, Product, Branch, Activity, Time dimensions
using Oracle Analytical Workspace Manager.
Once the cube is created,
how can I see the data existing in the cube using normal SQL queries? Through the Analytical Workspace Manager tool we are able to see the data, but our requirement is to see the data from the cube using SQL queries.
Regards,
S. Vamsi Krishna

You appear to have the wrong forum. This forum is for the Oracle Workspaces application, which is a component of Oracle Collaboration Suite.
Perhaps one of the forums in the database section will be more relevant:
http://forums.oracle.com/forums/category.jspa?categoryID=18
regards,
-Neil. -
Hi,
I added an InfoObject to a dimension in a cube. Now I want to delete the InfoObject from the cube. But since the cube has data in it, I cannot do it unless I delete all the data from the cube. When I select 'Delete Data', I'm given two options: delete the fact table only, or delete the fact table and the dimension tables. What is the difference between the two, and which option should I select?
I then need to load the cube back from the ODS. The cube right now is set up in the process chain to do a delta load every day. Would this be affected after I delete the data and load it again from the ODS?
Thanks
Sameer

The way I understand your question is...
1) You have data in the cube from which you have to remove an infoobject
Ans: To achieve this, you have to delete the data from the cube.
2) You have deleted the data and then deleted the IO, activated the cube and the update rules from the ODS which feeds this cube
3) Now you have to reload the deleted data from the ODS. But you already did an init, and you are loading deltas from this ODS to the cube through the process chain.
Ans: As you've already deleted all the data from the cube, do a full load from the ODS (you can do this even when delta initialization is done) to your cube by creating a new InfoPackage under the export DataSource of the ODS. This will capture all the records from the ODS. Then, to be on the safe side and avoid loading duplicate records, go to the InfoPackage that you used to do the full load (you can even create a new InfoPackage): SCHEDULER -> Initialization options for source system -> delete the existing init -> then do the init again.
Leave the delta infopackage as is and let the process chain take care of the deltas from now on...
Hope this helps. Let me know if I am missing something here... -
How to read the value from other cubes ??
Dear All,
I have a requirement to do some calculations where part of the numbers must be read from another InfoCube.
e.g.
Cube 1
Has: Ch1, Ch2, Kf1, Kf2, Kf3
Cube 2
Has: Ch3, Ch4, Kf4, Kf5
Then the requirement need to compute:
KF1 = KF4 * KF3
Let say that our planning area use Cube 1.
This means that to get the value for KF1, we need to read data from cube 2.
I've tried to use this function: RSDRI_INFOPROV_READ
But it can't fetch the data. This is because, when we go to "Manage" for
cube 1, the status of the corresponding request is yellow.
If we change the status to green, then we can get the records with that function.
My questions are:
Is there another way to fetch data from another cube?
Do you have any suggestions?
Or...
Is there something missing in my explanation above?
Regards,
Niel.

Hi Olivier,
Really, thanks for your response...
But I still have some confusion regarding your preceding posting:
in a query, yellow requests from a trans cube can be read with the variable 0S_RQTRA (most current data) on 0REQUID.
I had a look into the corresponding exit ( FM = RSVAREXIT_0S_RQTRA ) and can see that SAP is filling 0REQUID with VALUE 'REQ_TRANS'
You mean that we can check the status of the corresponding InfoProvider by using this function module, RSVAREXIT_0S_RQTRA. Am I right?
Could you clear up for me what its purpose is?
I've tried your suggested function:
RSSEM_INFOPROV_READ
And yes, it works. But why couldn't it fetch all the data?
And also in the RSSEM function group, I saw this function:
RSSEM_INFOPROV_READ_RFC
Do you know what its purpose is?
Still need your guidance.
Regards,
Niel. -
ABAP Function Module Example to move data from one Cube into Another
Hi experts,
Can any please help out in this ..?
A simple ABAP function module example to move data from one cube into another cube.
(How do I send the data from one client to another client using a function module?)
Thanks
-Upen.
Moderator message: too vague, help not possible, please describe problems in all technical detail when posting again, BI related? ("cube"), also search for information before asking.
Edited by: Thomas Zloch on Oct 29, 2010 1:19 PM

This is the start routine to duplicate records in two currencies.
DATA: datew    TYPE /bi0/oidateto,
      datew2   TYPE rsgeneral-chavl,
      fweek    TYPE rsgeneral-chavl,
      prodhier TYPE /bi0/oiprod_hier,
      market   TYPE /bic/oima_seg,
      segment  TYPE /bic/oizsegment.

* Local copy of the incoming data package
DATA: BEGIN OF s_data_pack OCCURS 0.
        INCLUDE STRUCTURE /bic/cs8zsdrev.
DATA: END OF s_data_pack.

s_data_pack[] = data_package[].
REFRESH data_package.

* For records in EUR, append the record twice: once with the group
* currency value in USD and once with the local currency value in EUR.
* All other records are appended once with the group currency value.
LOOP AT s_data_pack.
  MOVE-CORRESPONDING s_data_pack TO data_package.
  IF data_package-loc_currcy = 'EUR'.
    data_package-netval_inv = data_package-/bic/zsdvalgrc.
    data_package-currency   = 'USD'.
    APPEND data_package.
    data_package-netval_inv = data_package-/bic/zsdvalloc.
    data_package-currency   = 'EUR'.
    APPEND data_package.
  ELSE.
    data_package-netval_inv = data_package-/bic/zsdvalgrc.
    data_package-currency   = 'USD'.
    APPEND data_package.
  ENDIF.
ENDLOOP.

* Update rule for the quantity key figure:
RESULT = COMM_STRUCTURE-bill_qty.

* Update rule for the value key figure:
RESULT = COMM_STRUCTURE-netval_inv.
UNIT   = COMM_STRUCTURE-currency.
Data from transaction cubes to text file in bw directories
hi all
I have 2 transactional planning cubes in BW. The data in these cubes needs to be uploaded monthly into a single text file so that the APO system can access the data.
My question is how to upload the data from the transactional cubes to the text file. To be more detailed: is it useful to construct an ODS or a basic cube on top of the transactional cubes?
If I keep the file in a BW directory, can APO access it?
please help me out
praveen.

You can load data directly from the BW cubes to APO if the BW system is set up as a source system.
If you need the file then you may want to investigate using open hub. -
Partitioning two Years Of Data From One Cube To Another
Hi Experts,
Can any one please help me on the following issue.
I got a requirement where I need to copy the data from one cube to another with an unequal number of dimensions.
I have achieved this using transparent partitioning, but I was only able to copy one year of data.
In cube1 I have the months Jan, Feb...Dec, and in cube2 I have the structure Jan-11...Dec-11, Jan-12...Dec-12.
While creating the partition, how can I map Jan with Jan-11 and Jan-12?
Thanks In Advance,
Ram

You can map it manually; it is possible.
-
Copy Master Data from BW cube to BPC cube
Hi,
I need to copy master data from a BW cube to a BPC cube.
Can I do this from BW? That is, can I copy master data from the view of the BPC cube in BW?
I do this from BW and then I see the master data in my BPC cube from the BW view, but when I open my cube in the BPC environment, I don't see any of the master data.
What's the problem?
Regards,
Miguel.

Hi Miguel -
I think you are asking if you can copy transactional data from a BW cube into a BPC cube (cubes do not contain master data).
If this is what you want to do, yes: the BPC IMPORT package is a delivered Data Manager package that allows you to select a BW cube, a transformation file, and additional options (such as work status checking and default logic execution). The use of the Data Manager process is the "best practice", since you will need to transform the BW data model into the BPC data model, and the tools delivered in the IMPORT Data Manager package are ideally suited for the job. You can automate this process by following the guide at:
[https://wiki.sdn.sap.com/wiki/display/BPX/Enterprise%20Performance%20Management%20%28EPM%29%20How-to%20Guides|https://wiki.sdn.sap.com/wiki/display/BPX/Enterprise%20Performance%20Management%20%28EPM%29%20How-to%20Guides] The specific document is: "How to Export BPC Transaction Data Using a Custom Process Chain"
If you are actually asking for the process for loading master data into BPC dimensions, please read the following "How To" guide that describes the current best practices:
[https://wiki.sdn.sap.com/wiki/display/BPX/Enterprise%20Performance%20Management%20%28EPM%29%20How-to%20Guides|https://wiki.sdn.sap.com/wiki/display/BPX/Enterprise%20Performance%20Management%20%28EPM%29%20How-to%20Guides] The specific document is: "How to Automate Master Data Loads in BPC NW"
Regards,
Sheldon