Data Marting from Infocube to ODS
Hi,
I am trying to extract information from an Infocube and load it into an ODS. I generated the export datasource of the infocube and created update rules on the ODS from that generated infosource (8+infocube name). Created an infopackage for the init, and the deltas. When I try to load an init with data, it brings 0 rows, but when I do a full update, it brings all the rows from the infocube. Even if I do an init w/o delta, and then do a delta afterwards, it does not bring any rows. The problem with that is obviously our development system is small compared to our larger production system, and we cannot afford to do full loads every night. Am I doing something wrong? Do additive deltas work from Infocube datamarts?
Thanks,
Hi,
If you don't roll up when you have aggregates, the requests are not available for reporting, and they are also not identified for the next delta. So if you have aggregates, please roll them up and then try pulling the delta. Hope it works this time.
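To illustrate the rollup dependency described above, here is a hypothetical sketch in Python (not the actual BW implementation; the request structure and field names are invented): with aggregates present, a data-mart delta only considers requests that have been rolled up and are therefore reporting-ready.

```python
# Hypothetical model: a data-mart delta from a cube with aggregates
# only picks up requests that have been rolled up.
requests = [
    {"id": 1, "rolled_up": True},
    {"id": 2, "rolled_up": True},
    {"id": 3, "rolled_up": False},  # loaded, but rollup still pending
]

# The delta extraction considers only reporting-ready requests.
delta_scope = [r["id"] for r in requests if r["rolled_up"]]
print(delta_scope)  # request 3 is skipped until it is rolled up
```

Once request 3 is rolled up, the next delta would pick it up as well.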
Regards
Sriram
Similar Messages
-
Error Caller 09 contains error message - Data Marts loading(cube to ODS)
Dear all,
Please help me with this problem; it is very urgent.
I have one process chain that loads data from BIW to BIW only, through data marts. In that process chain, one process loads data from one cube (created by us) to one ODS (also created by us). Data is loaded through a full update, for the selected period specified in the 'Calendar Day' field in the data selection.
Previously I was able to load data for 2 months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Then we killed that process on the server and, after another attempt, it showed a calmonth...timestamp error. After reducing the data selection period, it loaded successfully, and after that I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days, and continued. Now I can't even load data for 5 days successfully in one attempt; I have to kill the process in the background and repeat it, and then it sometimes gets loaded.
Please suggest some solutions as soon as possible. I am waiting for your reply. Points will be assigned.
Thanks,
Pankaj N. Kude
Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM
Hi Friends!
I didn't find any short dump for that in ST22.
What actually happens is that the request continues to run in the background indefinitely. At that time, the Status tab in the process monitor shows these messages:
Request still running
Diagnosis
No errors found. The current process has probably not finished yet.
System Response
The ALE inbox of BI is identical to the ALE outbox of the source system
or
the maximum wait time for this request has not yet been exceeded
or
the background job has not yet finished in the source system.
Current status
in the source system
And the Details tab shows the following messages:
Overall status: Missing messages or warnings
Requests (messages): Everything OK
Data request arranged
Confirmed with: OK
Extraction (messages): Missing messages
Data request received
Data selection scheduled
Missing message: number of sent records
Missing message: selection completed
Transfer (IDocs and tRFC): Everything OK
Info IDoc 1: Application document posted
Info IDoc 2: Application document posted
Processing (data packet): No data
This process runs indefinitely; then I have to kill it on the server, and then it shows the Caller 09 error in the Status tab.
Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem? Will it be helpful? What are the risks?
Please give your suggestions as early as possible; I am waiting for your reply.
Thanks,
Pankaj N. Kude -
Error while generating the data mart for an InfoCube
Hi Gurus,
I need to extract the APO InfoCube data into a BW InfoCube. For that, I am trying to generate the data mart for the APO InfoCube, so that I can use it as a DataSource to extract the APO InfoCube data into the BW InfoCube.
While generating it, I get errors like the ones below:
Creation of InfoSource 8ZEXTFCST for target system BW 1.2 failed
The InfoCube cannot be used as a data mart for a BW 1.2 target system.
Failed to create InfoSource &v1& for target system BW 1.2.
Please suggest what to do about this error.
Thanks a lot in advance.
Hi,
Point No : 1
What is Planning Area :
http://help.sap.com/saphelp_scm41/helpdata/en/70/1b7539d6d1c93be10000000a114084/content.htm
Point No : 2
Creation Steps for Planning Area :
http://www.sap-img.com/apo/creation-of-planning-area-in-apo.htm
Note: We will not create the planning area. This will be done by the APO team.
Point No 3: After opening the T-code /n/SAPAPO/MSDP_ADMIN in APO, you will be able to see all the planning areas.
Point No 4: Select your planning area, go to the Extras menu, and click on Generate DS.
Point No 5: The system automatically generates the DS in APO (the naming convention starts with 9). Replicate the DS in BI, map it to your cube, and load the data.
Regards
Ram. -
Unable to automate deletion of old data (requests) from InfoCube using a PC
Each week I copy open Notifications into a 'SnapShot' cube, from the Transaction Cube, using the current date to make it a unique copy/snapshot for that week. This is then reported on showing, amongst other things, the trend of open Notifications throughout the year.
In an effort to ensure this doesn't grow too big, I will only keep a rolling 52 weeks of data. I thought I could use a feature available when creating Process Chains called 'Delete Overlapping Requests from InfoCube' to manage the amount of data held (52 weeks only).
The variant used to delete the requests has been created linked to object type 'Data Transfer Process' (DTP) and the actual DTP used to load the 'SnapShot' cube ... I then click the 'delete selections' button. On the next window I select 'Request Selection Through Routine', then click 'Change Routine'. I entered the following code, which I borrowed from another thread:
data: datum(16) type c,
      date type sy-datum.

* build the cutoff timestamp (one year ago) in the same
* YYYYMMDDHHMMSS shape as the request timestamp
date = sy-datum.
date = date - 365.
concatenate date sy-uzeit into datum.

* drop the newer requests from the deletion list, so that only
* requests older than the cutoff remain marked for deletion
* (DELETE ... WHERE processes the whole table; no LOOP is needed)
delete l_t_request_to_delete where timestamp gt datum.
p_subrc = 0.
I get a message saying "No requests for deletion were found". Any ideas?
Hi Tony,
Please check whether the timestamp value you are populating in the field datum with the statement:
concatenate date sy-uzeit into datum.
has the same data format and type as the timestamp of the request.
If they are not the same, you will not find any request corresponding to the timestamp.
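The format-mismatch point can be illustrated with a small sketch, in Python rather than ABAP, and with invented request timestamps: fixed-width YYYYMMDDHHMMSS strings compare correctly lexicographically, but a differently formatted cutoff silently gives wrong matches.

```python
from datetime import datetime, timedelta

def abap_timestamp(dt: datetime) -> str:
    """Build a 14-character YYYYMMDDHHMMSS string, the shape that
    CONCATENATE sy-datum sy-uzeit INTO datum produces."""
    return dt.strftime("%Y%m%d%H%M%S")

now = datetime(2008, 7, 23, 8, 33, 0)
cutoff = abap_timestamp(now - timedelta(days=365))

# Invented request timestamps, one older and one newer than the cutoff.
requests = {
    "REQ1": abap_timestamp(now - timedelta(days=400)),
    "REQ2": abap_timestamp(now - timedelta(days=10)),
}

# Fixed-width strings in the same format compare correctly:
old_requests = [r for r, ts in requests.items() if ts <= cutoff]
print(old_requests)  # only REQ1 is older than the cutoff

# A differently formatted cutoff (day-first instead of year-first)
# breaks the ordering: now *both* requests appear "older".
wrong_cutoff = (now - timedelta(days=365)).strftime("%d%m%Y%H%M%S")
wrong_matches = [r for r, ts in requests.items() if ts <= wrong_cutoff]
print(wrong_matches)
```

The same silent-mismatch effect is what makes the routine find no requests when datum and the request timestamp differ in type or format.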
- Geetanjali -
How is data loaded from Infocube to Planning area - need technical info
I would like to find out how data is loaded from a cube to a planning area. I know that the cube has tables, but how does that data get loaded into the planning area? Where is the mapping with which I can compare the data in the cube against the same data that reaches the planning area?
Say, for example, I have the below values in the InfoCube:
Prod1 --> Loc1 --> available provisioning qty
AAA --> AB90 --> 100
Then where do I check in the planning area tables (are there any mapped tables) after running the TSINPUT program? I know it can be checked in the planning book, but I want to check in the planning area tables if they exist.
Hi,
The data is loaded from the InfoCube to the planning area using update rules. The issue you have mentioned seems to be that two requests have data for the same CVCs in the cube.
Example: for the same CVC, two requests are available in the InfoCube. When you copy the data using the TSCUBE transaction, whatever data is available in the cube for the CVC gets copied into the planning area.
CVC1 - cube - old request - 100
CVC1 - cube - actual request - 200
CVC1 - planning area = 300 (the value is supposed to be 200, but since the old request also contains data in the cube for the CVC, it gets copied as 300)
Issue: there might be two requests containing data for the same CVC.
Solution: 1. Check the data in the cube using transaction LISTCUBE. 2. Delete the old request and check the data again. 3. If it matches your requirement, run TSCUBE again.
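The numbers in the example above can be sketched as follows; this is a hypothetical Python illustration of the arithmetic, not the actual TSCUBE code, with invented request names.

```python
# Hypothetical illustration: the copy takes whatever is in the cube for
# a CVC, so an old, undeleted request inflates the planning-area value.
cube_requests = [
    ("CVC1", "old request", 100),
    ("CVC1", "actual request", 200),
]

copied = sum(qty for _cvc, _request, qty in cube_requests)
print(copied)  # 300 rather than the expected 200

# After step 2 of the solution (delete the old request), only the
# actual request remains and the copy gives the expected value.
remaining = [r for r in cube_requests if r[1] != "old request"]
copied_after_cleanup = sum(qty for _cvc, _request, qty in remaining)
print(copied_after_cleanup)  # 200
```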
Please let me know if you need additional information.
Thanks,
Jeysraj -
Data copy from infocube to infocube in two different BW systems
Dear All,
Suppose I have an InfoCube for sales in two different BW systems, and I want to send the data from one InfoCube to the other. What strategy should we follow? Do I need to send it as a flat file, or is there another mechanism with which I can send the data into the other InfoCube?
Yours answers would be rewarded.
Regards
Vijay
Hi Vijay,
No, you have no need to send it as a flat file if these two BW systems are connected. You can use the data mart scenario, where one BW acts as a source system for the other. We have had such discussions several times; you can check the steps here:
Loading data from cube1 to cube2
Loading data from one cube to another cube.
hope this helps. -
Data load from Infocube to Infocube 170 million records.
Hi All,
I want to load the data from one InfoCube to another. I have developed this in Development, but in Production, if I start the full load, will it load, or will it show a timeout error? I have to load 17 crore (170 million) records from the old InfoCube to the new one.
Please advise on any precautions I should take before starting the load from the old to the new InfoCube.
Thanks in Advance
Shivram
You need not load the entire 170 million records in one go.
Please do a selective load, i.e., based on Doc Number, Doc Date, Fiscal Period, or Calendar Month - some characteristic like this that covers all records.
This will ensure that the data is getting loaded in small amounts.
As said above, what you can do is, create a process chain.
Drop indexes from the 2nd cube. Make multiple infopackages, with different selections, placed one after the other in the process chain, loading the data into cube 2.
Then build your indexes, after the loads are complete i.e. after 170 mil records have been added to cube 2. -
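The advice above - multiple InfoPackages with disjoint selections in a process chain - amounts to partitioning the load by a characteristic such as calendar month. As a rough sketch of how the selection ranges could be generated (Python purely for illustration; the dates are invented):

```python
from datetime import date, timedelta

def month_ranges(start: date, end: date):
    """Yield (first_day, last_day) pairs, one per calendar month,
    usable as Calendar Day selections in separate InfoPackages."""
    cur = date(start.year, start.month, 1)
    while cur <= end:
        if cur.month == 12:
            nxt = date(cur.year + 1, 1, 1)
        else:
            nxt = date(cur.year, cur.month + 1, 1)
        # clamp the month boundaries to the requested overall window
        yield (max(cur, start), min(nxt - timedelta(days=1), end))
        cur = nxt

# Three months' worth of history -> three smaller loads.
ranges = list(month_ranges(date(2008, 1, 15), date(2008, 3, 10)))
for first, last in ranges:
    print(first, "-", last)
```

Each yielded range would become the data selection of one InfoPackage, so no single load has to move all 170 million records.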
Advantage of data transfer from infoobject to ODS
Hi,
1) What is the advantage of bringing data first into an InfoObject and then moving it into a DSO? Why can't we bring it directly into the DSO? Why do we need the InfoObject in between?
2) Can we use an InfoCube in an InfoSet? If not, why?
Hello,
You don't normally feed data from an InfoObject to a DSO. You normally feed the InfoObject and the DSO separately. The DSO only needs the key of the InfoObject; everything else related to that InfoObject is already loaded.
For example, you have the InfoObject employee, which can be categorized like this:
employee_key employee_name employee_addr
1 xpto bla bla bla
In the DSO you only feed the key of the employee, in this example 1. With the InfoObject loaded, you can use the relation on the key to read the other fields in the query; in this example, you could read the employee name and address in a query created over the DSO even if the DSO only has the employee key.
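The key-based lookup described above can be sketched like this; the data and field names come from the example in the post, and the Python join is purely illustrative of what the query does, not how BW implements it.

```python
# Master data loaded into the InfoObject (key -> attributes).
employee_master = {
    1: {"employee_name": "xpto", "employee_addr": "bla bla bla"},
}

# The DSO holds only the key plus its own fields.
dso_rows = [{"employee_key": 1, "sales": 500}]

# At query time the attributes are joined in via the key.
report = [
    {**row, **employee_master[row["employee_key"]]}
    for row in dso_rows
]
print(report[0]["employee_name"])  # read via the key, not stored in the DSO
```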
InfoSets previously (in BW 3.x) didn't support InfoCubes, because an InfoCube isn't a flat table like a DSO or the InfoObject tables.
SAP now supports this in the new version, NetWeaver 2004s (BI 7).
Diogo. -
No data Loaded from infocube to another infocube
Hi all,
I am trying to create a copy of an InfoCube. Here I am getting this error at the end:
Diagnosis
No request IDoc has been created in BW. A short dump has most probably been logged in BW.
Procedure
Look for the short dump belonging to your data request in the short dump overview in BW. Pay attention to the correct date and time in the selection screen.
You can get the short dump list using the Wizard or from the monitor detail screen via the menu path "Environment -> Short dump -> Data Warehouse".
Removing errors:
Follow the instructions in the short dump.
Please guide me on this.
Hi,
Go to ST22, or from the top menu go to the dump overview, and check what the error message is.
You might have run out of space or processes; it could be anything.
Gaurav
Assign points if it helps -
Data mart from two DSOs to one - Losing values - Design issue
Dear BW experts,
I'm dealing with a design issue for which I would really appreciate any help and suggestions.
I will be as brief as possible and explain further based on the doubts and questions I receive, to make it easier to go through this problem.
I have two standard DSOs (DSO #1 and #2) feeding a third, also standard, DSO (DSO #3).
Each transformation does NOT include all fields, only some of them.
One of the source DSOs (let's call it DSO #1) is loaded with a DataSource that allows reversal records (record mode = 'R'). Therefore some updates on DSO #1 come as one entry with record mode 'R' and a second entry with record mode 'N' (new).
Both feeds are in delta mode, and the same entries are not necessarily updated through both of them; the updated entries can differ (meaning a specific entry (unique key values) could be updated by one of the feeds with no updates on the second feed for that entry).
Issue we have: when 'R' and 'N' entries occur in DSO #1 for any entry, that entry is also reversed and re-created in the target DSO #3 (even though not ALL fields are mapped in the transformation), and therefore we lose ALL the values that are updated exclusively through DSO #2; they become blank.
I don't know if we are missing something in our design, or how we should fix this issue.
Hope I was more or less clear with the description.
I'd really appreciate your feedback.
Thanks!!
Gustavo
Hi Gustavo
Two things I need to know:
1. Do you have any end routine in your DSO? If yes, what is the setting under "Update behavior of End Routine Display" (the option available to the right of the Delete button after the end routine)?
2. Did you try a full load from DSO1 and DSO2 to DSO3? Do you face the same problem?
Regards
Anindya -
Get back the Data mart status in ODS and activate the delta update.
I ran into a problem when deleting requests in an ODS.
There is a cube (1st level) that gets loaded from an ODS (2nd level), which in turn gets loaded from 3 ODSs (3rd level). We wanted to delete recent requests from all the data targets and reload from the PSA. But while deleting the request in the ODS (2nd level), it displayed a window showing the following:
- The request 132185 has already been retrieved by the data target BP4CLT612.
- The delta update in BP4CLT612 must be deactivated before deleting the request.
- Do you want to deactivate the delta update in data target BP4CLT612?
I clicked on execute changes in the window. It removed the data mart status for all the requests, including those I had not deleted.
The same happened in the 3 ODSs (3rd level).
It became clear that if we load further data from the source system, it will load all the records from the beginning.
To avoid this, can anybody help me reset the data mart status and activate the delta update?
Hi Satish,
You have to make the requests RED in the cube and back them out of the cube before you can delete requests from the base targets (from which the cube gets data).
Then you have to reset the data mart status for the requests in your L2 ODS before you can delete requests from the ODS.
I think you tried to delete without resetting the data mart status, which has upset the delta sequence.
To correct this:
For the L2 ODS, do an init without data transfer from the 3 ODSs below, after removing the init request from the scheduler menu in the init InfoPackage.
Do the same from the L2 ODS to the cube.
Then reconstruct the deleted request in the ODS. It will not show the tick mark in the ODS. Do a delta load from the ODS to the cube.
See the below thread:
Urgentt !!! Help on reloading the data from the ODS to the Cube.
cheers,
Vishvesh -
Hi all,
I have created an ODS from an InfoCube. Now I am trying to load data from the InfoCube to the ODS using an export DataSource. I am getting an error while defining the update rules on the ODS:
"0RECORDMODE is missing from the InfoSource"
How can I add 0RECORDMODE in this situation? Any other solutions?
With best regards, VS
Go to the InfoSource and add 0RECORDMODE to the communication structure.
Regards
Manga -
Selective Deletion from Infocube Only (Not from Aggregates)
Hi,
For the selective deletion, I used the FM 'RSDRD_SEL_DELETION' in a program. But this FM deletes the data first from the InfoCube and then from the aggregate. Because of this, the deletion took more time to delete and adjust the aggregate.
I need a way to delete the data from the InfoCube only; once I reach my reconciliation period, I will deactivate and reactivate the aggregate, so the data will be consistent in the InfoCube and the aggregate.
Is there any way to delete (selective deletion) the data only from the InfoCube and not from its aggregates (from a performance point of view)?
Thanks in Advance.
Regards,
Himanshu.
Hi
You can try manual selective deletion, which will delete data from the cube only. Check out the below thread:
steps for selective deletion for cubes and ODS
If you want to delete data via the FM only, then try deactivating your aggregates before you start your selective deletion.
Once the deletion is done, you can rebuild the aggregates.
Regards,
Venkatesh -
Hi experts,
I am using a data mart from an ODS to a cube.
My issue is that loading data from the ODS to the cube takes a lot of time. I want to reduce the time; can anybody give me a good suggestion?
Regards,
Siva
Hi,
One way is to delete the indexes, as mentioned earlier.
This is the most common and effective method.
If you want, I can explain the step-by-step procedure for deleting the indexes; do let me know.
Also take care that no other load is taking place from the ODS to the cube at the same time.
Sometimes that also reduces the performance.
Hope it helps. -
Can we load data from an InfoCube to an ODS? If so, how? If not, why?
Hi,
Yes, you can. Right-click on the InfoCube and choose Generate Export DataSource -> create update rules by right-clicking on the ODS and doing the transformation -> create an InfoPackage under 8InfoCube and load the data.
Hope that helps.
~ Vaishnav