Data change in a cube
Hi,
I have a situation.
My company has a BI reporting system in place with multiple cubes and ODS objects storing demand, billing, and other data, with periodic loading taking place.
Now, due to some organizational changes, approximately 100 customer_shipto codes have to be changed in the main ERP system, and this change should be reflected in the BI system as well.
For example: an old shipto 9000001101 should now become 9000002201 (a hypothetical example).
What would be the best strategy to make this change happen?
I was thinking along the following lines:
1. Create a copy of each cube.
2. Load the data from the original cube to the copy (new) cube.
3. While loading, do the translation (change from old shipto to new shipto) in the update rule of customer_shipto.
4. Delete the data from the original cube.
5. Copy the data from the new cube back to the original cube.
The problem with this approach is that the number of such InfoProviders is too high and the data volume is huge, so the above activity will take a lot of effort.
Is there a better approach to achieve this?
I'd appreciate a good response.
Regards,
Nagendra.
Delete the data, then write the routine for the code change...
Now, instead of creating a copy, just reconstruct those requests...
But before that, create an ODS that contains the old codes and the corresponding new codes...
While writing the routine, look up the corresponding new codes from that ODS.
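The lookup routine described above could be sketched like this (illustrative Python, not ABAP; in BW this would be a select against the mapping ODS inside the update routine, and the code pair below is the hypothetical example from the question):

```python
# Mapping table of old shipto codes to new ones, standing in for the
# mapping ODS. The single entry is the hypothetical pair from the post.
SHIPTO_MAP = {
    "9000001101": "9000002201",
}

def translate_shipto(shipto):
    """Return the new code if this shipto was remapped, else keep it as is."""
    return SHIPTO_MAP.get(shipto, shipto)
```

Codes that are not in the mapping pass through unchanged, so the same routine can run over every record safely.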
If you think of something better, let me know.
I am also thinking.
Regards
Gajendra
Similar Messages
-
How can data in a cube be updated after master data changes
Hi,
We have a Revenue cube into which we load actuals every month.
Now we have master data changes at the profit center level: a profit center is moving from Holdings to Power, effective with October reporting. We updated the master data accordingly; now we need to update the data in the cube.
I want to know how we can update all the data in the cube without dropping and reloading it again.
We have DSOs first, then the cube.
Please let me know available options.
Thanks
Sivaprasad
Hi,
Make the field that stores whether it is Holdings or Power a navigational attribute of profit center.
Then, whenever master data changes, just update the profit center master data, and use this navigational attribute in reports.
That way, whenever master data changes, the reports change automatically.
If you want this to be time-dependent (e.g. 2008 to 2009 Holdings, then 2007 Power), enable time-dependent master data for profit center.
Hope it helps.
Thanks,
Arun -
ERROR : "Exceptions in Substep: Rules" while loading data from DSO to CUBE
Hi All,
I have a scenario like this.
I was loading data from an ODS to a cube daily. Everything was working fine until two days ago, when I added some characteristic fields to my cube without changing any key figures. When I restarted loading data from the ODS to the cube, I got the error "Exceptions in Substep: Rules" after some packages of data had loaded successfully.
I deleted the added fields from the InfoCube and then tried to reload the data, but I am facing the same error again: "Exceptions in Substep: Rules".
Can anyone tell me how to get out of this situation?
Regards,
Komik Shah
Dear,
Check some related SDN posts:
zDSo Loading Problem
Re: Regarding Data load in cube
Re: Regarding Data load in cube
Re: Error : Exceptions in Substep: Rules
Regards,
Syed Hussain. -
Hi gurus,
Please tell me from which table of the DSO (change log or active data) the cube will pick up the records,
and also whether the overwrite functionality works with the DSO change log.
Awaiting a quick response.
Thank you
Hi,
Answer is already in my previous post.
i.e. delta load - from the change log.
But in the BI 7.0 data flow, when you are using a DTP, the delta behaviour will be based on your selection, as below.
BW Data Manager: Extract from Database and Archive?
The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
For Extraction from the DataStore Object, you have the following options:
Active Table (with Archive)
The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
Active Table (Without Archive)
The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
Archive (Only Full Extraction)
The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
Change Log
The data is read from the change log of the DataStore object.
For Extraction from the InfoCube, you have the following options:
InfoCube Tables
Data is only extracted from the database (E table and F table and aggregates).
Archive (Only Full Extraction)
The data is only read from the archive or from a near-line storage.
Hope this gives you a clear idea. -
Data Extraction and ODS/Cube loading: New date key field added
Good morning.
Your expert advise is required with the following:
1. A data extract was done previously from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain first clears all the data in the ODS and cube and then reloads, activates, etc.
2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period, not all the data as before.
3. Data must be appended in future.
4. The current InfoPackage for the ODS is a full upload.
5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
My questions are:
Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
Your assistance will be highly appreciated. Thanks
Cornelius Faurie
Hi Cornelius,
Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
--> Try to load data into the ODS in overwrite mode with a full update, as before (this adds new records and updates previous records with the latest values). Push the delta from this ODS to the cube.
If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and subsequently push only the delta.
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
--> Yes, that is correct. Otherwise you will lose the historic data.
Hope it Helps
Srini -
How can I update data in a DSO and cube using ABAP?
Hello experts,
I have a requirement in which I need to update/delete data from a DSO and cube based on certain keys using ABAP.
Please let me know how I can modify the contents of the cube or the active table of the DSO.
Thanks
Sudeep
Hi
I have a requirement such that I need to update certain key figures in a DSO after a certain time.
For example, say a record with key A is loaded to the DSO and the corresponding key figure was loaded with value 10.
After a few days, because of certain parameters changing in the system, I need to modify the key figure value.
Currently I am doing this using a self-transformation, i.e. by loading the same data again into the same DSO.
The amount of data is very large and is causing performance issues in the system.
Now A is not a key, but I need to modify records for a few combinations which appear multiple times in the DSO.
The same DSO data is loaded into a cube regularly.
I may need to update data after a gap of a few weeks as well.
This design will be used as a template and the whole logic needs to stay generic,
so writing a function module could make it generic.
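The keyed update described above can be sketched outside ABAP like this (illustrative Python; the field names `plant`, `material`, and `amount` are hypothetical stand-ins for the DSO characteristics and key figure):

```python
def update_key_figure(records, match, new_value, key_figure="amount"):
    """Set a key figure on every record whose characteristics match the
    given combination; returns how many records were changed."""
    changed = 0
    for rec in records:
        if all(rec.get(field) == value for field, value in match.items()):
            rec[key_figure] = new_value
            changed += 1
    return changed

# A toy stand-in for the DSO active table, with a non-key combination
# ("plant" = "A") appearing multiple times, as in the post.
dso = [
    {"plant": "A", "material": "M1", "amount": 10},
    {"plant": "A", "material": "M2", "amount": 10},
    {"plant": "B", "material": "M1", "amount": 10},
]
```

Keeping `match` as a generic field/value mapping is what would make a function-module version reusable as a template.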
Thanks
Sudeep -
Where to get BW Metadata: owner, creation date, change date of a query / WS
Hello,
I need a report over the existing queries/worksheets showing the owner, creation date, change date of each query, etc.
You can see some of this information via the query properties in the Query Designer, but only for one (the opened) query at a time, and you would have to do this for every query...
My idea is to use the BW metadata in the technical content.
There is the cube BW Metadata 0BWTC_C08
(the InfoCube 'BW Statistics - Metadata' contains metadata from the Metadata Repository).
Is this the way to do it? Or any other suggestions...
Can I get information about used structures, etc. this way?
Thanks, Markus
I had to work on another subject.
But now the source of information is clear:
RSRREPDIR - index of all queries
RSZELTDIR - index of all queries-components
RSRWORKBOOK - where-used list for reports in workbooks
RSRWBINDEX - List of binary large objects (Excel workbooks)
RSRWBINDEXT - Titles of binary objects (Excel workbooks) in InfoCatalog
The tables are joined via:
RSRREPDIR.COMPUID = RSZELTDIR.ELTUID
RSZELTDIR.TEXTLG contains the description
RSRWORKBOOK.GENUID = RSRREPDIR.GENUID
RSRWBINDEXT and RSRWBINDEX are connected over WORKBOOKID
I'd like to put the information from all of these tables into a cube and define a query on it.
I would have to define a new DataSource and InfoSource to get the data into the cube.
Right?
Now I see some existing DataSource objects in the technical content:
0TCTQUERID, 0TCTQUERID_TEXT, 0TCTQUERY, 0TCTQUERY_TEXT
I can't open them to look inside, but they might be helpful. Has anybody used them?
Markus -
Cannot Lock and Send data to an Essbase cube
Hi all,
One of our customers is executing a macro script to lock and send data to the Essbase cube from an Excel sheet.
They reported several cases where users submitted their data and later discovered that their changes were not in Essbase.
The calls to EssVRetrieve (to lock the blocks) and EssVSendData both return successfully, and no error message is received while executing the above macros.
I reviewed the application log file and found the following message:
[Mon Nov 24 18:59:43 2008]Local/Applicn///Warning(1080014)
Transaction [ 0xd801e0( 0x492b4bb0.0x45560 ) ] aborted due to status [1014031].
I analysed the above message and found that the user is trying to lock the database when a lock has already been applied to it and some operation is being performed on it. Because of that, the transaction was aborted. But the customer says no concurrent operation was being performed at that time.
Can anyone help me in this regard?
Thanks,
Raja
The error message for error 1014031 is 'Essbase could not get a lock in the specified time.' The first thought I have is that perhaps some users have the 'Update Mode' option set in their Essbase options and thus, when they retrieve data, they inadvertently lock the data blocks. If that is the case, you will probably see this issue sporadically, as the locks are automatically released when the user disconnects from Essbase.
To make it stop, you will have to go to every user's desktop and make sure they have that Essbase Option turned off. Further, you will have to look at any worksheets they may use that may have an Essbase Option name stored on it. The range name is stored as a string and includes a setting for update mode. Here is a sample that I created for this post where I first turned 'on' update mode and then turned 'off' update mode:
A1100000001121000000001100120_01-0000
A1100000000121000000001100120_01-0000
Note the 11th character in the first string is '1' which indicates that Update Mode is 'on'; in the second string, it is 'off'.
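The check on the stored option string described above can be expressed as a short helper (an illustrative Python sketch; only the 11th-character convention from the post is assumed):

```python
def update_mode_on(option_string):
    """Per the post above, the 11th character (index 10) of the stored
    Essbase option string flags Update Mode: '1' = on, '0' = off."""
    return option_string[10] == "1"
```

Running this over the two sample strings from the post flags the first as Update Mode on and the second as off.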
This behavior, particularly with update mode, is the only one of the behaviors that I disliked in Excel and pushed me to design our Dodeca product. In Dodeca, the administrator controls all Essbase options and can either set individual options to the value they want or they can allow the user to choose their own options. Most of our customers do not allow the user to set update mode.
Tim Tow
Applied OLAP, Inc -
RDA for Transactional Data Changes (But needs to load as Master Data to IO)
Hi,
I would like to use RDA to load data changes into an InfoObject (not into a DSO).
I tried this and can load data into the PSA, but from the PSA to the InfoObject no data package is getting transferred, i.e. transferred records = 0.
We considered several options, e.g. remote cubes, but we need to implement RDA.
Note: I have gone through the documentation on RDA, in which SAP mentions that RDA won't support master data if there are aggregates. We don't have any aggregates built on my cubes.
Thanks in advance!
Nagesh Ganisetti.
An RDA DTP is used for supplying data to DataStore objects, but not to master data InfoObjects...
Please follow the links to get to know more on this..
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
thanks
hope this helps.. -
Information Broadcasting(Event Data change in Info provider)
Hi All,
Does anyone have experience with the Information Broadcasting functionality? Can you please help me with where we use 'Trigger event on data change in the InfoProvider' in process chains?
I want to know, when we use 'Trigger event in the Broadcaster' in a process chain, how it takes effect on the Information Broadcasting scheduler screen when we select the particular InfoProvider and there is a data change. Can you please elaborate on the topic if anyone has used the event data change for an InfoProvider?
Actually, I executed and scheduled the process chains. When I go to the process chain log, it says that the job finished and the data change occurred for the cube, but I don't get any error message and I don't get any mail in my inbox.
Can you please throw some light on this if anyone has worked on it? Answers are always appreciated and rewarded.
Thanks.
Hi,
Usually we trigger an event with transaction SM64; if you want to create an event, use SM62.
In addition to time- and calendar-based job scheduling, the background processing system supports event-driven scheduling.
Triggering an event notifies the background processing system that a particular condition has been met. The background processing system reacts by starting any jobs that were waiting for that event.
Events are not saved. Once an event occurs and triggers any jobs that were waiting for it, the event is discarded.
You can monitor the process chain via transaction CCMS.
Information broadcasting allows you to make objects with Business Intelligence content available to a wide spectrum of users, according to your own requirements.
Go through this
Information Broadcasting:
http://help.sap.com/saphelp_nw04/helpdata/en/a5/359840dfa5a160e10000000a1550b0/content.htm
Including an Event Data Change in a Process Chain :
http://help.sap.com/saphelp_nw04/helpdata/en/ec/0d0e405c538f5ce10000000a155106/content.htm
Regards-
Siddhu
Master data InfoObject data changes report
Hi all,
I have a query regarding tracking InfoObject data changes through technical content cubes.
We have many TC cubes in BI where we can track when InfoObjects were last changed, and so on.
Now the requirement is that whenever anyone changes the InfoObject data, it should be tracked.
But we don't have a standard technical content InfoObject (one that tracks data changes of an InfoObject) to include in the TC cubes.
Can you suggest something for this issue?
Thanks
Venkat
Hi Chuan,
Attached the Error message here. We use BPC NW10
Rajesh. -
Max data pull from Virtual Cube - is this a setting?
We have a user running a query against a remote cube in our BW system, and they're hitting a "maximum data" limit on data from this remote cube. Is this a setting for this cube or a global setting, and can you modify it?
Thanks,
Ken Little
RJ Reynolds Tobacco
Hi,
MAXSIZE = Maximum size of an individual data packet in KB.
The individual records are sent in packages of varying sizes during the data transfer to the Business Information Warehouse. With this parameter you determine the maximum size of such a package and therefore how much main memory may be used for the creation of the data package. SAP recommends a data package size between 10 and 50 MB.
https://www.sdn.sap.com/irj/sdn/directforumsearch?threadid=&q=cube+size&objid=c4&daterange=all&numresults=15
MAXLINES = Upper-limit for the number of records per data packet
The default setting is 'Max. lines' = 100,000.
The maximum main memory requirement per data packet is around:
memory requirement = 2 * 'Max. lines' * 1000 bytes,
i.e. 200 MB with the default setting.
THE FORMULA FOR CALCULATING THE NUMBER OF RECORDS
The formula for calculating the number of records in a data packet is:
packet size = MAXSIZE * 1000 / transfer structure size (ABAP length),
but not more than MAXLINES.
E.g. if MAXLINES is less than the result of the formula, then only MAXLINES records are transferred into BW.
The size of the data packet is therefore the lower of MAXSIZE * 1000 / transfer structure size (ABAP length) and MAXLINES.
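The packet-size rule above can be written out as a small calculation (illustrative Python; MAXSIZE is in KB and the transfer structure length in bytes, per the formula in the post):

```python
def packet_records(maxsize_kb, structure_bytes, maxlines=100000):
    """Records per data packet: MAXSIZE * 1000 / transfer structure
    size (ABAP length), but never more than MAXLINES."""
    return min(maxsize_kb * 1000 // structure_bytes, maxlines)
```

For example, with MAXSIZE = 20,000 KB and a 500-byte transfer structure this gives 40,000 records per packet; with MAXSIZE = 100,000 KB the result would be capped at the default MAXLINES of 100,000.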
Go to transaction RSCUSTV6 and set it there.
Or go to your InfoPackage; from the toolbar, choose Scheduler > Data packet settings; here you can specify your data packet size.
In R/3, go to transaction SBIW > General Settings > Maintain Control Parameters for Data Transfer.
Here you can set the maximum number, but the same can be reduced in BW:
InfoPackage > Scheduler > DataS. Default Data Transfer -> here you can give the size, but you can only reduce the size given on the R/3 side, not increase it.
In RSCUSTV6 you can set the package size; press F1 on it for more info, and take a look at OSS Notes 409641 'Examples of packet size dependency on ROIDOCPRMS' and 417307 'Extractor Package Size Collective Note'.
Also Check SAP Note 919694.
This applies irrespective of the source system, meaning it is applicable for all DataSources:
Go to SBIW -> General Settings -> Maintain Control Parameters for Data Transfer -> enter the entries in the table.
If you want to change it at the DataSource level:
InfoSource -> InfoPackage -> Scheduler menu -> DataS. Default Data Transfer, and change the values.
Before changing the values, keep the SAP-recommended parameters in mind.
Hope this helps.
Best Regards,
VVenkat.. -
Sql query to bind data from grid and print total count and amount total when date changes
SELECT SLHD.VOUCH_CODE,SLHD.VOUCH_DATE,SLHD.VOUCH_NUM,SUM(SLTXN.CALC_NET_AMT) AS AMT,ACT.ACT_NAME,SUM(SLTXN.TOT_QTY) AS QTY
FROM SL_HEAD20132014 AS SLHD,ACCOUNTS AS ACT,SL_TXN20132014 AS SLTXN
WHERE SLHD.ACT_CODE=ACT.ACT_CODE AND SLTXN.VOUCH_CODE=SLHD.VOUCH_CODE
GROUP BY SLHD.VOUCH_CODE,SLHD.VOUCH_DATE,SLHD.VOUCH_NUM,ACT.ACT_NAME
ORDER BY SLHD.VOUCH_DATE
I want to print the total quantity and total sale in the grid when the date changes,
like
date amount quantity
01/02/2013 1200 1
01/02/2013 200 1
01/02/2013 1400 2 // date changes here
02/03/2013 100 1
02/03/2013 50 4
02/03/2013 150 5 // date changes and so on
This query only prints all the data from the table; I want the total quantity and total amount of daily sales in the same grid whenever the date changes.
You may add the date filter to Visakh's query:
SELECT SLHD.VOUCH_DATE,SUM(SLTXN.CALC_NET_AMT) AS AMT,SUM(SLTXN.TOT_QTY) AS QTY
FROM SL_HEAD20132014 AS SLHD,ACCOUNTS AS ACT,SL_TXN20132014 AS SLTXN
WHERE SLHD.ACT_CODE=ACT.ACT_CODE AND SLTXN.VOUCH_CODE=SLHD.VOUCH_CODE AND SLHD.VOUCH_DATE = @yourdate --passed from the front end application
GROUP BY SLHD.VOUCH_DATE
WITH CUBE
ORDER BY SLHD.VOUCH_DATE
Having said that, querying the table each time you select a date would be an expensive approach. You could instead filter by date within your dataset, if you already have the entire data populated in the dataset.
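The per-date totals the poster wants can also be computed client-side once the detail rows are loaded (an illustrative Python sketch over the sample rows from the question):

```python
def daily_totals(rows):
    """Aggregate (date, amount, qty) detail rows into per-date totals,
    mirroring what GROUP BY VOUCH_DATE does in the query above."""
    totals = {}
    for date, amount, qty in rows:
        amt, q = totals.get(date, (0, 0))
        totals[date] = (amt + amount, q + qty)
    return totals

# Sample rows from the post: (date, amount, quantity).
sales = [
    ("01/02/2013", 1200, 1),
    ("01/02/2013", 200, 1),
    ("02/03/2013", 100, 1),
    ("02/03/2013", 50, 4),
]
```

This reproduces the subtotal lines shown above: 1400/2 for 01/02/2013 and 150/5 for 02/03/2013.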
Broadcaster event data change process step in process chain goes on HOLD
Hi,
We have a process step of type 'event data change' in a process chain for a cube. The broadcaster has some settings built on this to send an email whenever a data update happens to that cube.
It was working fine until last week; suddenly the process started going into sleep or hold mode. I have tried deleting the old process variant and adding a new one; still the job hangs.
Also, in SOST I can see that the emails based on the broadcaster settings were sent, though users haven't received any. The job logs in SM37 say:
Job started
Step 001 started (program RSPROCESS, variant &0000000518457, user ID BIWREMOTE)
Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCAST4E9S51L2EFE7AXG4IFUKNJW3W with ID 01462600
Program RSBATCH_EXECUTE_PROZESS successfully scheduled as job BIBCAST4E9S51L2EFE7AXG4IFUKNJW3W with ID 01462601
Submit report for report RSBATCH_EXECUTE_PROZESS failed; see long text
Submit report for report RSBATCH_EXECUTE_PROZESS failed; see long text
Submit report for report RSBATCH_EXECUTE_PROZESS failed; see long text
but when i see the job mentioned in this failed step:
Job started
Step 001 started (program RSBATCH_EXECUTE_PROZESS, variant &0000001419141, user ID KRANTUL1)
Precalculation: initialization started
Precalculation: processing started
Precalculate data (fill DataStore)
Processing data provider DP, query CP_NORDIC_IAN023
Processing data provider GR4Broadcaster, query
Calculate documents (build documents)
Start Precalculation Web Templates with Precalculated Data
Implementing settings finished
Precalculation: initialization started
Precalculation: processing started
Precalculate data (fill DataStore)
Processing data provider DP, query CP_NORDIC_IAN017
Processing data provider GR4Broadcaster, query
Calculate documents (build documents)
Start Precalculation Web Templates with Precalculated Data
Implementing settings finished
Job finished
So the users should have received the email. We have tested sending out emails separately and it works fine; it's just that the process in the process chain never completes.
The SLG1 logs are all successful.
Any pointers on what could be wrong?
Use the INTERRUPT process if the last precalculation and alert messaging are successful. My guess is that it is looping on the first precalculation.
-
Can i get the data file from the cube
Hi All,
Previously I created and activated a cube using one data file, and by mistake lost that data file. Is there any chance of getting that data file back from the cube?
Thanks
Bhaskar
Hi Paul,
Yes, you can:
1) If you loaded through the PSA, go to that PSA table and, from the Settings menu, choose Change display variants -> View tab -> view as Excel, then save the file.
2) If you did not use the PSA, use LISTCUBE to view the data, change the settings to Excel, and download.
Don't forget to assign points if this is useful.
Further queries are always welcome.
Regards.
Pradeep choudhari