Load from ODS to Infoobject
How do we load data from an ODS to an InfoObject?
Hi,
In RSA1 (Modelling), right-click under your InfoArea and choose the option "Insert characteristic as a data target", then select the characteristic you want to update from the ODS.
Right-click on the ODS and choose "Generate export data source"; you will then find an InfoSource named 8<ODS name>. Create update rules for that characteristic, giving the ODS as the source.
Now you can create an InfoPackage and schedule the load from the ODS to the InfoObject.
Hope it helps.
Regards,
Siva.
Similar Messages
-
Load from ODS into InfoCube gives TIME_OUT runtime error after 10 minutes?
Hi all,
We have a full load from ODS into InfoCube, and it was working fine until last week with up to 50,000 records. Now we have around 70,000+ records and it has started failing with a TIME_OUT runtime error.
The following is from the Short Dump (ST22):
The system profile "rdisp/max_wprun_time" contains the maximum runtime of a
program. The current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
The following are from ROIDOCPRMS table:
MAXSIZE (in KB) : 20,000
Frequency : 10
Max Processes : 3
When I check the Data Packages under the 'Details' tab in the Monitor, there are four Data Packages, and the first three have 24,450 records each. I right-click on each Data Package and select 'Manual Update' to load from the PSA. When this Manual Update takes more than 10 minutes, it fails with TIME_OUT again.
How can I fix this problem, please?
Thanks,
Venkat.
Hello A.H.P,
The following is the Start Routine:
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
DATA: material(18), plant(4).
DATA: role_assignment LIKE /BIC/AZCPR_O0100-CPR_ROLE,
      resource        LIKE /BIC/AZCPR_O0200-CPR_BPARTN.
*$*$ end of global - insert your declaration only before this line *-*
* The following definition is new in BW 3.x
TYPES:
  BEGIN OF DATA_PACKAGE_STRUCTURE.
    INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
TYPES:
    RECNO LIKE sy-tabix,
  END OF DATA_PACKAGE_STRUCTURE.
DATA:
  DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
       WITH HEADER LINE
       WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
  TABLES   MONITOR       STRUCTURE RSMONITOR   "user defined monitoring
           MONITOR_RECNO STRUCTURE RSMONITORS  "monitoring with record no.
           DATA_PACKAGE  STRUCTURE DATA_PACKAGE
  USING    RECORD_ALL    LIKE SY-TABIX
           SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
  CHANGING ABORT         LIKE SY-SUBRC.  "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
* to make monitor entries
  CLEAR DATA_PACKAGE.
  LOOP AT DATA_PACKAGE.
    SELECT SINGLE /BIC/ZMATERIAL PLANT
      INTO (material, plant)
      FROM /BIC/AZCPR_O0400
      WHERE CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
        AND ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
    IF sy-subrc = 0.
      DATA_PACKAGE-/BIC/ZMATERIAL = material.
      DATA_PACKAGE-PLANT = plant.
      MODIFY DATA_PACKAGE.
      COMMIT WORK.
    ENDIF.
    SELECT SINGLE CPR_ROLE INTO role_assignment
      FROM /BIC/AZCPR_O0100
      WHERE CPR_GUID = DATA_PACKAGE-CPR_GUID.
    IF sy-subrc = 0.
      SELECT SINGLE CPR_BPARTN INTO resource
        FROM /BIC/AZCPR_O0200
        WHERE CPR_ROLE = role_assignment
          AND CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
      IF sy-subrc = 0.
        DATA_PACKAGE-CPR_ROLE = role_assignment.
        DATA_PACKAGE-/BIC/ZRESOURCE = resource.
        MODIFY DATA_PACKAGE.
        COMMIT WORK.
      ENDIF.
    ENDIF.
    CLEAR DATA_PACKAGE.
  ENDLOOP.
* if ABORT is not equal to zero, the update process will be cancelled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
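The per-record SELECT SINGLE statements and the COMMIT WORK inside the loop above are a likely contributor to the TIME_OUT as volumes grow: every record costs up to three database round trips plus a commit. As a hedged sketch (not the poster's actual fix), the first lookup could be buffered once per data package with FOR ALL ENTRIES; the table and field names are taken from the routine above, everything else is illustrative:

```abap
* Sketch only: buffer the /BIC/AZCPR_O0400 lookup once per data package
* instead of one SELECT SINGLE (plus COMMIT WORK) per record.
TYPES: BEGIN OF ty_mat,
         cpr_ext_id     TYPE /bic/azcpr_o0400-cpr_ext_id,
         /bic/zmaterial TYPE /bic/azcpr_o0400-/bic/zmaterial,
         plant          TYPE /bic/azcpr_o0400-plant,
       END OF ty_mat.
DATA: lt_mat TYPE STANDARD TABLE OF ty_mat,
      ls_mat TYPE ty_mat.

IF NOT DATA_PACKAGE[] IS INITIAL.
* One array fetch for the whole package (FOR ALL ENTRIES must never
* run against an empty driver table)
  SELECT cpr_ext_id /bic/zmaterial plant
    INTO CORRESPONDING FIELDS OF TABLE lt_mat
    FROM /bic/azcpr_o0400
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE cpr_ext_id = DATA_PACKAGE-cpr_ext_id
      AND ( matl_type = 'ZKIT' OR matl_type = 'ZSVK' ).
  SORT lt_mat BY cpr_ext_id.
ENDIF.

LOOP AT DATA_PACKAGE.
* Binary search on the buffered table instead of a database access
  READ TABLE lt_mat INTO ls_mat
       WITH KEY cpr_ext_id = DATA_PACKAGE-cpr_ext_id
       BINARY SEARCH.
  IF sy-subrc = 0.
    DATA_PACKAGE-/bic/zmaterial = ls_mat-/bic/zmaterial.
    DATA_PACKAGE-plant          = ls_mat-plant.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.
* No COMMIT WORK here: committing inside an update routine interferes
* with the load's own transaction handling.
```

The same buffering pattern applies to the other two lookups, and the COMMIT WORK calls can be dropped entirely, since the load process manages its own commits.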
Thanks,
Venkat. -
Error when loading from ODS to Cube
Hello Friends,
I am having trouble loading data from ODS to InfoCube in my 2004s system. When loading data I get these messages:
07/03/2007 13:10:25 Data target 'ODSXYZ ' removed from list of loadable targets; not loadable.
07/03/2007 13:28:42 Data target 'ODSXYZ ' is not active or is incorrect; no loading allowed
I checked for ODSXYZ in my data targets but there is nothing by that name. Even the InfoPackage doesn't have it. What needs to be done? Please help.
Thanks.
It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it grey out all the data targets they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
Having said that, this shouldn't impact your loads from ODS to cube, as these should be handled by your DTPs rather than your InfoPackages.
A few questions:
How are you loading your cube?
Did the data get through fine to the PSA with the InfoPackage in question?
How did you load your DSO (assuming the load was successful)?
Message was edited by:
voodi -
Loading from ODS to Cube in process chain
Hi Experts,
How can I do a full load from ODS to cube when using further processing in a process chain? Your help is much appreciated.
Thanks,
Bill
Hi,
You can use a DTP for this.
Create transformation between DSO and cube.
Create DTP and run it.
Loading data from one cube to another cube.
Cube to cube data loading
how to upload data from cube to cube
Can we pull data from one cube to another cube
Data Load Steps:
Reading data from another cube
Hope this helps.
Thanks,
JituK -
Loading from ODS to Master data Object
Hi Friends ,
Please let me know how to load from ODS to a master data object.
Thanks,
krishna
Hi Krishna,
If your master data InfoObject is enabled as an "InfoProvider", you can use an export DataSource on the ODS and load the data into your master data.
I am not sure whether it will work if the master data InfoObject is not enabled as an "InfoProvider".
The other mechanism would be to use an InfoSpoke to download the ODS data into a flat file, and then use the flat file to upload the master data.
Hope this helps.
Bye
Dinesh -
Problem when loading from ODS to the CUBE
Hi Experts,
I am facing an unusual problem when loading data from ODS to the cube.
I have a first-level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
I have deleted the entire data from the cube and am trying to do a full load from ODS to the cube.
I am sure that when I run a full load, the data comes from the active table of the ODS.
After the full load, the key figure values are 0. I tried testing by loading a couple of sales documents, and the key figure values are still 0.
I would expect the full load to pick up the data exactly as it is in the active table of the ODS.
I also don't have any fancy routines in the update rules.
Please help me in this regard.
Regards
Raghu
Hi,
Check whether you followed the procedure exactly. Just follow the step-by-step instructions below and let me know your issue:
o First prepare flat files in Microsoft Excel for the master data and the transaction data, save them in .CSV format, and close them.
o Under InfoObjects, create an InfoArea, then an InfoObject catalog, then the characteristics and key figures. Create an ID characteristic with a name attribute and activate it; create a number key figure and activate it.
o Under InfoSources, create an application component, then an InfoSource with direct update for the master data and one with flexible update for the transaction data. Assign the master data flat file, create an InfoPackage, and execute it.
o For the transaction data, go to InfoProvider, right-click your InfoArea, and select "Create ODS Object". Give the ODS a name; on the next screen drag the characteristics into the key fields.
o Activate the ODS, then create the update rules. If activation gives an error, go to the communication structure, add 0RECORDMODE, activate it, and then activate the update rules.
o Go to the InfoSource, create an InfoPackage, maintain the external data, processing, data target, and scheduling settings, click Start, and open the Monitor.
o If you cannot see the records, go back to the InfoProvider, right-click the ODS, and select "Activate data in ODS".
o In the screen that opens, the request's QM status will be yellow; double-click it, set it to green, click Continue, and save.
o Go back to the InfoPackage under the InfoSource, right-click it, schedule it, and open the Monitor tab. Refresh until the request turns green.
o Once it is green, click the data target and check the loaded data.
o Now go to the InfoProvider, right-click, and create an InfoCube. Give it a name and drag the key figures, time characteristics, and characteristics to their respective sections.
o Create and assign the dimensions, then activate the cube. Right-click the cube, select "Create update rules", choose the ODS option, give the ODS name, and activate.
o Go back to the main screen and refresh it. When you see object names starting with 8, you will know the data mart interface has been generated.
o Right-click the ODS and select "Update ODS data in data target". In the screen that opens you will see two options, full update and initial update; select initial update.
o In the InfoPackage that opens, check the external data, data target, and scheduler settings, select "Start later in background", choose "Immediate", save, and click Start.
o Then open the Monitor, check the contents, and verify that the data was loaded.
regards
ashwin -
Can I do Parallel Full loads from ODS to Cube.
Hi,
Usually I do a single full update load from ODS to cube. To speed up the process, can I do parallel full update loads from ODS to cube?
Please advise.
Thanks, Vijay.
Assuming that the only connection we are talking about is between a single ODS and a single cube:
I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
If the update is a delta there is really no way to do it.
How many records are we talking about? Is there logic in the update rule? -
Summing up key figure in Cube - data load from ODS
Hello Gurus,
I am doing a data load from ODS to cube.
The records in the ODS are at line-item level and all need to be summed up to header level. There is only one key figure. The ODS has header and line-item fields.
I am loading only the header field data (and not the item field data) from ODS to cube.
I am expecting only one record in the cube (covering all the item-level records) with the key figure summed up, but I don't see that.
Can anyone please explain how to achieve it.
I promise to reward points.
=====
Example to elaborate my point.
In ODS
Header-field item-field quantity
123 301 10
123 302 20
123 303 30
Expected record in Cube
Header-field Quantity
123 60
====================
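The expected behaviour above is what the cube's addition mode produces when the item field is not mapped into the cube: rows that collapse to the same characteristic key have their key figures added. ABAP's COLLECT statement mirrors this; a small self-contained sketch using the example numbers above (an illustrative report, not part of any update rule):

```abap
REPORT zsum_demo.

TYPES: BEGIN OF ty_rec,
         header TYPE c LENGTH 10,  " characteristic (key)
         qty    TYPE i,            " key figure
       END OF ty_rec.

DATA: lt_ods  TYPE STANDARD TABLE OF ty_rec,
      lt_cube TYPE STANDARD TABLE OF ty_rec,
      ls_rec  TYPE ty_rec.

* The three item-level rows for header 123, with the item field dropped
ls_rec-header = '123'. ls_rec-qty = 10. APPEND ls_rec TO lt_ods.
ls_rec-qty = 20. APPEND ls_rec TO lt_ods.
ls_rec-qty = 30. APPEND ls_rec TO lt_ods.

* COLLECT sums the numeric fields of rows sharing the same character
* key, just as the cube sums key figures over identical dimension keys.
LOOP AT lt_ods INTO ls_rec.
  COLLECT ls_rec INTO lt_cube.
ENDLOOP.

* lt_cube now contains a single row: header '123', qty 60.
```

If the cube still shows three records after removing the item field from the update rules, some other characteristic that still differs between the rows (often the item number reaching the cube via another dimension) is keeping them apart.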
Regards,
Pramod.
Hello Oscar and Paolo.
Thanks for the reply. I am using BW 7.0
Paolo suggested:
>>If you don't add the item number to the cube and set quantity to addition in the update rules from ODS to cube, it works.
I did that, and I still get 3 records in the cube.
Oscar suggested:
>>What kind of aggregation do you have for your key figure in the update rules (update or no change)?
It is "summation", and it cannot be changed (or at least I do not know how to change it).
There are other dimensions in the cube which correspond to fields in the ODS, but I just mentioned these two (i.e. header and item level) for simplicity.
Can you please help?
Thank you.
Pramod. -
Index for loads from ODS To Cube
The load from ODS to cube is taking a long time. In the start routine another ODS is looked up; the lookup keys are, say, X and Y.
There is already an index on keys X, Y & Z.
Will this index be used for the select on that ODS, or do I need to create a new index with only the X and Y keys?
Thnx
Since X and Y are the leading columns of the existing index on (X, Y, Z), the database can generally use that index for a selection on X and Y alone, so a new index is usually not needed. To confirm, run an SQL trace (transaction ST05) while the start routine is running; the trace will tell you whether the index is actually being used.
Arun -
The load from ODS - 0FIAP_O03 to cube 0FIAP_C03 is duplicating records
Hello Everyone
I need some help/advice for the following issue
SAP BW version 3.5
The Delta load for ODS - 0FIAP_O03 works correctly
The load from ODS - 0FIAP_O03 to cube 0FIAP_C03 is duplicating records
NB: another forum user has raised the same issue, but that question was not answered.
My questions are
1. Is this a known problem?
2. Is there a fix available from SAP?
3. If anyone has had this issue and fixed it, could you please share how you achieved this
I have possible solutions, but need to know whether there is a standard solution.
Thank you
Pushpa
Hello Pushpa,
I assume that you are using delta loads both into the initial ODS and then on to the cube.
If delta is used in both places, there should not be any issue when sending the data to the cube.
If you are using a full load, the data will normally get aggregated in the cube, since key figures use addition mode in the cube.
Can you post the exact error you are facing? This could also be a design issue.
Murali -
Problem while loading data from ODS to infoobject
Hi guys,
I am facing a problem while loading data from <b>ODS to InfoObject</b>.
If I load the data via the PSA, it works fine, but
if I load the data without the PSA, it gives a "duplicate records" error.
Do you have any idea why that is?
Thanks in advance
savio
Hi,
When you load the data via the PSA, what did you select: serial or parallel?
If you selected serial, you most likely don't have duplicates within the same data package, so your load can go through.
When loading directly into the InfoObject, if the same key appears in two different packages, the "duplicate records" error is raised; you can perhaps flag your InfoPackage with the option "ignore duplicate records" as suggested.
Hope this helps.
Olivier. -
Hi All,
I have loaded data from ODS to cube. Now I have a requirement to add some fields to the standard cube, so for testing purposes I created a copy of the original cube and created a transformation. When I try to load data from the ODS, it says no more data is available, while the data is clearly there in the ODS.
What should I do now? I don't want to delete the data from the original cube. Is there any other way to load the data through the transformation?
Regards,
Komik Shah
Hi,
Check the DTP of the old cube and see whether it is delta. If yes, check whether either of the following is checked:
1) Get delta only once
2) Get data by request
If 1) is checked, no delta will come for the second cube, because the delta is extracted only once and it has already gone to the first cube.
Generally both should be unchecked, but this can vary per requirements.
For your new DTP, I don't think it will allow you to change to FULL.
If it does allow you to select FULL, select it, choose extraction from the active table,
try the load, and see.
regds,
Shashank -
Automatic loading from ODS to Cube in 3.5
Hi All
I was under the impression that in version 3.5, in order to load a delta from ODS to cube, you had to run the 8* series InfoPackage.
However, I have recently noticed that this InfoPackage runs automatically after a delta load into the ODS, even when the load is not via a process chain.
Can somebody tell me where and how this setting is maintained?
Regards
A
Hi,
Go to the ODS in display mode and check whether "Update Data Automatically" is ticked under Settings.
Regards,
Kams -
Debugging data load from ODS to InfoCube?
Hi Expert,
I got some errors when loading data from ODS to cube. I tried to simulate the load, but got "No data in PSA table". I checked my InfoPackage: it only updates the data target directly, so my only choice is "Data targets only". How can I debug in this situation? How can I select an update to the cube through the PSA? Thanks in advance!
Weidong
Hi robert,
Thanks for your reply! Do I have to delete the old one? It looks like the system generated two InfoPackages: one full update and one initial update. I think I have to delete both of them and re-create them, right? Thanks a lot!
Weidong -
Hi All,
I have a source ODS and a target ODS. Can you explain when to go for:
1. Full repair
2. Full load
3. Delta load
Also, please give the loading procedure for each of the three scenarios above.
Points will be awarded suitably for the answers.
Thanks in advance gurus.
Regards...
BW World
Hi,
For loading from one ODS to another you use an export DataSource; this is called a data mart load.
A full load is generally performed once, to load all the historical data.
After that you do an init load, and deltas follow, which load only the changed or new records at regular intervals.
A full repair can be described as a full load with selections, but its main advantage is that it won't affect the delta loads in the system. If you load a full into a target with deltas running, you will have to initialize again for the deltas to continue; a full repair load won't affect the deltas.
This is normally done when we lose some data or there is a data mismatch between the source system and BW.
Check the OSS Note 739863 'Repairing data in BW' for all the details
You need to create an export datasource (8<Source ODS>) and the update rules from Source ODS to target ODS for loading.
Also InfoPackages for Init, Full/Full repair and Delta loads.
The loading procedure has been discussed previously on SDN; you will find detailed information there.
Thanks,
JituK