Can I do Parallel Full loads from ODS to Cube.
Hi,
Usually I do a single full update load from ODS to Cube. To speed up the process, can I do parallel full update loads from ODS to Cube?
Please advise.
Thanks, Vijay,
Assuming that the only connection we are talking about is between a single ODS and a single Cube.
I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
If the update is a delta, there is really no way to do it.
How many records are we talking about? Is there logic in the update rule?
Similar Messages
-
Loading from ODS to Cube in process chain
Hi Experts,
How can I do a full load from ODS to cube when using further processing in process chain? Your help is much appreciated.
Thanks,
Bill

Hi,
You can use a DTP for this.
Create transformation between DSO and cube.
Create DTP and run it.
Loading data from one cube to another cube.
Cube to cube data loading
how to upload data from cube to cube
Can we pull data from one cube to another cube
Data Load Steps:
Reading data from another cube
Hope this helps.
Thanks,
JituK -
Error when loading from ODS to Cube
Hello Friends,
I am having trouble loading data from ODS to InfoCube in my 2004s system. When loading data I get these messages:
07/03/2007 13:10:25 Data target 'ODSXYZ ' removed from list of loadable targets; not loadable.
07/03/2007 13:28:42 Data target 'ODSXYZ ' is not active or is incorrect; no loading allowed
I checked for ODSXYZ in my data targets, but there is nothing by that name. Even the InfoPackage doesn't have it. What needs to be done? Please help.
Thanks.

It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it will grey out all the data targets that they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
Having said this, this shouldn't impact your loads from ODS to Cube, as these should be taken care of by your DTPs rather than your InfoPackages.
A few questions:
How are you loading your cube?
Did the data get through fine to the PSA with the InfoPackage in question?
How did you load your DSO (assuming the load was successful)?
Message was edited by:
voodi -
Index for loads from ODS To Cube
The load from ODS to cube is taking a long time. In the start routine another ODS is being looked up; the keys for the lookup are, say, X and Y.
There is already an index on keys X, Y & Z.
Will this index be used while doing the select on that ODS, or do I need to create a new index with only the X and Y keys?
Thnx

When you run the start routine, run an SQL trace (ST05); that will tell you if the index is being used.
Arun -
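As a general rule, a composite index can serve any selection that constrains its leading columns, so an index on (X, Y, Z) can normally be used for a lookup on X and Y alone; a separate (X, Y) index is usually unnecessary. A hypothetical sketch of the kind of lookup being discussed (the active table name /BIC/AZLOOKUP00 and the field names X and Y are invented for illustration):

```abap
* Hypothetical lookup against the active table of the lookup ODS.
* An existing index on (X, Y, Z) can serve this WHERE clause because
* X and Y are its leading columns; confirm with an ST05 SQL trace
* that the database optimizer actually chooses that index.
DATA: lt_lookup TYPE STANDARD TABLE OF /bic/azlookup00,
      lv_x      TYPE /bic/azlookup00-x,
      lv_y      TYPE /bic/azlookup00-y.

SELECT * FROM /bic/azlookup00
  INTO TABLE lt_lookup
  WHERE x = lv_x
    AND y = lv_y.
```

Whether the optimizer actually picks the (X, Y, Z) index can depend on the database and on statistics, which is why the ST05 trace suggested above is the right check.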
Hi All,
I have loaded data from ODS to CUBE. Now I have a requirement to add some fields to the standard cube, so for testing purposes I created a copy of the original cube and created a transformation. Now when I try to load data from the ODS, it says no more data is available, while data is already there in the ODS.
What should I do now? I don't want to delete data from the original cube. Is there any other way to load data through the transformation?
Regards,
Komik Shah

Hi,
Check the DTP of the old cube and see whether it is delta. If yes, check whether either of the following is checked:
1) Get delta only once
2) Get data by request
If 1 is checked, the delta won't come for the second cube, as it says to get the delta only once and that delta is already in one of the cubes.
Generally both should be unchecked, but this can vary as per requirements.
Now for your new DTP, I don't think it will allow you to change to FULL.
If it does allow you to select FULL, select it, choose extraction from the active table, try the load, and see.
Regards,
Shashank -
Error while loading from ODS to CUBE
Hi guys,
I am loading data from the source system to an ODS and from the ODS to a CUBE. The data came successfully from the source system to the ODS, but on the way from ODS to CUBE it shows an error at data packet 13. In the CUBE, I loaded the data by using "Update ODS data in data target"; at that point it offered two options, FULL UPDATE and INIT UPDATE, and I selected FULL UPDATE. In the Processing tab there was only one option enabled, "Only data targets", and I loaded with that. Now, after getting the error, where can I correct it? There is no PSA option in monitoring.
Otherwise, how can I change the PSA option in the Processing tab of the InfoPackage? I know that when we load data from one target to another, the only option available in the Processing tab is "Only data targets". How can I change that option, and how can I correct the error?
Thanks
Rajesh

Hi,
I solved my question as follows:
Go to the monitor of the CUBE load and select the option "Read everything manually"; it then shows another screen for correcting the records. Correct all the records and load the data again.
Thanks
Rajesh -
Automatic loading from ODS to Cube in 3.5
Hi All
I was under the impression that in version 3.5, in order to load a delta from ODS to Cube, you had to run the 8* series InfoPackage.
However, I have recently noticed that this InfoPackage runs automatically after a delta load into the ODS, even when the load is not via a process chain.
Can somebody explain where and how this setting is maintained?
Regards
A

Hi,
Go to ODS display mode and check if "Update Data Automatically" is ticked in Settings.
Regards,
Kams -
Hi,
I have delta loads coming into the ODS. I do a full update from the ODS to the cube by date range for material movement numbers. Last time when I loaded the data, only a few records loaded for the date range; the rest did not load and are sitting in the ODS. This is a full load and I tried to load again. Any suggestions?
sp

Hi Srinivas,
Check your update rules between the ODS and the cube, whether they are mapped properly (check your date range for the cube load).
Do an init load and then do the delta load.
hope this will help. -
I have a process chain for GL.
Data was getting loaded first to ODS and then to CUBE.
Now no update to the cube is scheduled. I removed the 'Further Update' process from the process chain.
During loading I am getting a warning message as follow:
There must be a type "Update ODS Object Data (Further Update)" process behind process "Activate ODS Object Data" var. ACTIVATE_ODS
Please suggest a way to remove this warning message.
Thanx,
Vishal

Hi Vishal,
Since you removed the 'Further Update' process from the process chain, this message is coming. I hope that instead of this process you are loading data from ODS to Cube using an InfoPackage in the process chain.
In the ODS settings, uncheck "Further update to data targets" (I assume you are loading data from ODS to Cube using an InfoPackage in the process chain).
Best Regards. -
Hi,
I am loading a delta initialization from an ODS to a cube. The load goes fine but never ends. I looked in transaction SM37 and the job has finished, but with these messages at the end:
tRFC: Data Package = 6, TID = 0A11010C066C45438FC70221, Duration = 01:01:03, <b>ARFCSTATE = SYSFAIL</b>
tRFC: Start = 28.10.2006 12:10:34, End = 28.10.2006 13:11:37
tRFC: Data Package = 7, TID = 0A11010C066C45438FE00222, Duration = 01:01:38, <b>ARFCSTATE = SYSFAIL</b>
tRFC: Start = 28.10.2006 12:10:59, End = 28.10.2006 13:12:37
tRFC: Data Package = 8, TID = 0A11010C066C45438FE90223, Duration = 01:01:32, <b>ARFCSTATE = SYSFAIL</b>
tRFC: Start = 28.10.2006 12:11:07, End = 28.10.2006 13:12:39
tRFC: Data Package = 10, TID = 0A11010C04BC454390020000, Duration = 01:01:12, <b>ARFCSTATE = SYSFAIL</b>
tRFC: Start = 28.10.2006 12:11:32, End = 28.10.2006 13:12:44
Synchronized transmission of info IDoc 5 (0 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:03, ARFCSTATE =
tRFC: Start = 28.10.2006 13:12:45, End = 28.10.2006 13:12:48
Job finished
What does this ARFCSTATE = SYSFAIL mean, and how do I correct it?
Thanks for your help!!!

Hi Miguel Sanchez,
I hope you are loading with the PSA option in the InfoPackage. If not, run with the PSA option, go to the Details tab of the monitor, expand the Extraction node, and check for the message 'Data selection ended'.
If you find this message, you can update from the PSA.
Otherwise, go for a repeat of the delta.
Also refer to note 516251.
The error you are reporting could also have several causes. Could you therefore please check the following:
1) check the RFC destination in SM59 for both your BW and source systems
2) in SM59, in BW source system, go to menu path "test"
-> "connection" and test this for errors
3) in SM59, in your BW source system, go to menu path "test"
-> "authorization" and check if the user and password are o.k.
4) check the port definitions in WE20 and WE21
5) finally check that you have sufficient DIA processes defined in
your BW and R/3 source system (you should have at least one more
DIA process than all other work processes combined; this is described
in more detail in notes 561880 and 74141).
Please check if this solves the problem for you.
Hope it helps.
Regards,
Srikanth. -
Delta load from ODS to cube failed - Data mismatch
Hi all
We have a scenario where the data flow is like
R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
The cube has an additional field called "monthly version", and since it is a history cube, it is supposed to hold snapshots of all the data in the current cube for each month.
We are facing the problem that the data for the current month is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab I can see only one red request, and that with 0 records.
However, in the Cube -> Manage -> Reconstruction tab, I can see 2 red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
Please guide me as to how I can solve this problem.
thanks all
annie

Hi
Thanks for the reply.
Load to Cube is Delta and goes daily .
Load to ODS is a full on daily basis .
Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
Thanks
annie -
We have some data loads from ODS to Cube (Deltas)
Dear Experts,
We can see the request status green in the DSO, but in the cube it is yellow and not available for reporting (the request got compressed).
I want to set the status to green and make it available for reporting, but I couldn't do it: I see a popup message saying the request is already activated and changing the QM status is not possible.
When I run the delta again, that request also gets yellow status because the previous request was in yellow status.
Please suggest how to change the status of a compressed request and why this is happening.
Thank you in advance,
Kiran.

Hi,
Check whether, under automatic request processing, "Set quality status to OK" is checked or not.
It might be in the Manage screen of the cube, somewhere in the menu under Goto.
As I am not in front of the system, I can't guide you exactly.
rgds,
Edited by: Krishna Rao on Mar 30, 2010 3:47 PM -
No data loaded from ODS to Cube
Hi,
I am trying to do an init with data transfer from 0FIAP_O03 to 0FIAP_C03.
Though there is data in the ODS, nothing is loaded into the cube. It just says:
No data available
Diagnosis
The data request was a full update.
In this case, the corresponding table in the source system does not
contain any data.
System Response
Info IDoc received with status 8.
Procedure
Check the data basis in the source system.
Any suggestions?
Thanks
Shalini

Hi
"Info IDoc received with status 8" implies that there is no data in the source.
But since, as you say, the data is there in the source (ODS), there is some inconsistency in the DataSource, and you need to replicate the DataSource and reactivate it.
IDoc status 8 signifies that there is no data in the source; it is not an error as such.
For more info on IDocs, check the link below:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/0087780d-81b0-2a10-eb82-86e1062f1a2d
Also go to SM58 and clear the IDoc there.
Hope this helps
Regards
Shilpa -
Problem when loading from ODS to the CUBE
Hi Experts,
I am facing an unusual problem when loading data from ODS to the cube.
I have a first level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
I have deleted the entire data from the Cube and trying to do a full load from ODS to the CUBE.
I am sure when I run a full load the data goes from Active table of the ODS.
After the full load, the key figure values are 0. I tried testing by loading a couple of sales documents, and still the key figure values are 0.
I would expect that when I run the full load, the data is picked up exactly the way it is in the active table of the ODS.
I also don't have any fancy routines in the update rules.
Please help me in this regard.
Regards
Raghu

Hi,
Check whether you followed the procedure exactly. Just follow the layman's steps here and let me know your issue:
o First prepare flat files in Microsoft Excel for master data and transaction data, save them in .CSV format, and close them.
o Select the InfoObjects option and create an InfoArea, then an InfoObject catalog, then the characteristics and key figures. For the characteristic, create the ID with the name as an attribute and activate it; for the key figures, create the number and activate it.
o In InfoSources, create an application component, then create an InfoSource with direct update for master data and flexible update for transaction data. For the master data flat file, create an InfoPackage and execute it.
o For transaction data, go to InfoProvider, right-click, and select "Create ODS object". In the screen that opens, give the name of the ODS; in the next screen, drag the characteristics to the key fields.
o Activate it, then go to the update rules; they will show an error. Go to the communication structure, add 0RECORDMODE and activate it, then go back to the update rules and activate them.
o Then go to the InfoSource and create an InfoPackage, maintaining the External data, Processing, Update, Data targets, and Scheduling tabs; click Start, then open the monitor.
o If you cannot see the records there, go back to the InfoProvider, right-click the ODS, and select "Activate data in ODS"; a new screen opens.
o Select the QM status option, which will be yellow; double-click it, select the green option in the screen that opens, click Continue, and back in the previous screen click Save, then close it.
o Then go back to the InfoSource and the InfoPackage, right-click the package, click Schedule without changing any of the tabs, and go directly to the Monitor tab. Refresh until the records turn green.
o Once it is green, click the data target and check the data that was loaded.
o Now go to InfoProvider, right-click, and create an InfoCube. In the new screen, give the name and activate it. Drag the key figures, time characteristics, and characteristics to their respective sections.
o Create the dimension tables, assign the characteristics, and activate the cube. Then right-click the cube, select "Create update rules", choose the ODS option, give the ODS name, and activate.
o Then come back to the main screen and refresh. When you see the 8* InfoSource names, you will know the data mart concept is in play.
o In the main screen, right-click the ODS and, in the popup, select "Update ODS data in data target". In the next screen you see two options, full update and initial update; select initial update.
o Another InfoPackage screen opens with External data, Data targets, and Scheduler tabs; select "Start later in background", click Immediate in the screen that opens, click Save (it closes automatically), then click Start.
o Then select the Monitor option, then the data target contents, and check the data that was loaded.
regards
ashwin -
Load from ODS into InfoCube gives TIME-OUT runtime error after 10 minutes ?
Hi all,
We have a full load from an ODS into an InfoCube, and it was working fine until last week with up to 50,000 records. Now we have around 70,000+ records and it started failing with a TIME_OUT runtime error.
The following is from the Short Dump (ST22):
The system profile "rdisp/max_wprun_time" contains the maximum runtime of a
program. The current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
The following are from ROIDOCPRMS table:
MAXSIZE (in KB) : 20,000
Frequency : 10
Max Processes : 3
When I check the data packages under the 'Details' tab in the monitor, there are four data packages and the first three have 24,450 records each. I right-click each data package and select 'Manual Update' to load from the PSA. When this manual update takes more than 10 minutes, it fails with TIME_OUT again.
How could I fix this problem, PLEASE ??
Thanks,
Venkat.

Hello A.H.P,
The following is the Start Routine:
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line -
TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
DATA: material(18), plant(4).
DATA: role_assignment like /BIC/AZCPR_O0100-CPR_ROLE, resource like
/BIC/AZCPR_O0200-CPR_BPARTN.
*$*$ end of global - insert your declaration only before this line -
* The following definition is new in the BW3.x
TYPES:
BEGIN OF DATA_PACKAGE_STRUCTURE.
INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
TYPES:
RECNO LIKE sy-tabix,
END OF DATA_PACKAGE_STRUCTURE.
DATA:
DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
WITH HEADER LINE
WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
TABLES MONITOR STRUCTURE RSMONITOR "user defined monitoring
MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
DATA_PACKAGE STRUCTURE DATA_PACKAGE
USING RECORD_ALL LIKE SY-TABIX
SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line -
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
clear DATA_PACKAGE.
loop at DATA_PACKAGE.
select single /BIC/ZMATERIAL PLANT
into (material, plant)
from /BIC/AZCPR_O0400
where CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
and ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
if sy-subrc = 0.
DATA_PACKAGE-/BIC/ZMATERIAL = material.
DATA_PACKAGE-plant = plant.
modify DATA_PACKAGE.
commit work.
endif.
select single CPR_ROLE into (role_assignment)
from /BIC/AZCPR_O0100
where CPR_GUID = DATA_PACKAGE-CPR_GUID.
if sy-subrc = 0.
select single CPR_BPARTN into (resource)
from /BIC/AZCPR_O0200
where CPR_ROLE = role_assignment
and CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
if sy-subrc = 0.
DATA_PACKAGE-CPR_ROLE = role_assignment.
DATA_PACKAGE-/BIC/ZRESOURCE = resource.
modify DATA_PACKAGE.
commit work.
endif.
endif.
clear DATA_PACKAGE.
endloop.
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
*$*$ end of routine - insert your code only before this line -
Thanks,
Venkat.
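For reference, the usual fix for this kind of timeout is to replace the per-record SELECT SINGLEs inside the loop with lookups buffered into internal tables once per data package, and to drop the COMMIT WORK calls, which are neither needed nor safe inside an update routine. A sketch of that rework for the first lookup only (table and field names are taken from the routine above and assumed unchanged; the role/resource lookups against /BIC/AZCPR_O0100 and /BIC/AZCPR_O0200 would be rebuilt the same way):

```abap
* Sketch: buffer the material/plant lookup once, then read from memory.
DATA: BEGIN OF ls_o04,
        cpr_ext_id     LIKE /bic/azcpr_o0400-cpr_ext_id,
        /bic/zmaterial LIKE /bic/azcpr_o0400-/bic/zmaterial,
        plant          LIKE /bic/azcpr_o0400-plant,
      END OF ls_o04.
DATA: lt_o04 LIKE STANDARD TABLE OF ls_o04.

IF NOT data_package[] IS INITIAL.
* One database round trip for the whole data package instead of one
* SELECT SINGLE per record.
  SELECT cpr_ext_id /bic/zmaterial plant
    FROM /bic/azcpr_o0400
    INTO TABLE lt_o04
    FOR ALL ENTRIES IN data_package
    WHERE cpr_ext_id = data_package-cpr_ext_id
      AND ( matl_type = 'ZKIT' OR matl_type = 'ZSVK' ).
  SORT lt_o04 BY cpr_ext_id.
  DELETE ADJACENT DUPLICATES FROM lt_o04 COMPARING cpr_ext_id.
ENDIF.

LOOP AT data_package.
  READ TABLE lt_o04 INTO ls_o04
       WITH KEY cpr_ext_id = data_package-cpr_ext_id
       BINARY SEARCH.
  IF sy-subrc = 0.
    data_package-/bic/zmaterial = ls_o04-/bic/zmaterial.
    data_package-plant          = ls_o04-plant.
    MODIFY data_package.
* No COMMIT WORK here: the load process manages its own commits.
  ENDIF.
ENDLOOP.
```

Note that SELECT SINGLE picks an arbitrary row when several match, so the SORT plus DELETE ADJACENT DUPLICATES above only approximates the original behavior; whether that is acceptable depends on the data.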