ODS update to CUBE hangs in yellow status?
Hi
I have a process chain that updates a CUBE from an ODS every day. It fails, and if I try to run the update manually by right-clicking the ODS and selecting Update Data Targets -
> Select DELTA -
> Start
it hangs at yellow status. If I check the details, the last message is
Start of processing update rules for data target ZSRV_C02
and it hangs at that stage; after that no message is available.
Can someone give me a solution?
My data mart status is OK.
Babu
Hi,
You can reset them via Edit --> Reset Status --> Yes, and then push them manually by selecting each one and pressing F6.
Once all the IDocs are pushed, go back to the monitor, right-click each data packet and choose manual update. This will take some time, but it should finish.
Afterwards, refresh everything. To prevent this next time, ask the BASIS team to increase the dialog process time limit: IDoc processing runs in dialog work processes, and if it exceeds the time limit an error occurs.
Or
If it is a full load, you can delete the old request and run it again. Delete the indexes first, increase the dialog time limit, and monitor the SM58 session to see how fast the IDocs are moving while the load is running.
rgds
Edited by: Krishna Rao on Jun 17, 2009 8:51 PM
Similar Messages
-
Error in data update from ODS to CUBE
hello Experts
We are working on BW 3.5
For FI-GL line items we load the data to an ODS and then to a cube.
There was a process chain failure 2 months back, so deltas were stuck for 2 months. About 10 million records have now been uploaded into the ODS, but when I try to upload these records to the CUBE via delta upload, it gives the error:
Value 'Ã
 ABC100711396 ' (hex. 'C30DC2A04C4344313030373131333936 ') of characteristic 0REF_DOC_NO c
I would like to know how to correct this and where this error comes from, since the same data is loaded into the ODS and there it is OK. Secondly, as there is no PSA between the ODS and the cube, do I need to delete the request and upload again?
What if the same error appears again - should I write a start routine in the update rules between ODS and CUBE to handle special characters?
Thanks for the replies.
Hi,
Goto All Elementary Tests in RSRV
Goto Master data
Select the required test
Select your Infoobject
Execute the test
You will get the result with errors or no errors
If you get errors, then you click "Correct Error" button.
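If a cleanup routine is needed (as the original poster suggested, a start routine between ODS and cube), the core logic is simply filtering out characters BW does not permit. Here is a rough Python sketch of that logic; the permitted set shown is approximately BW's default for uppercase characteristics, the function name is my own, and a real implementation would be an ABAP routine, with RSKC governing which extra characters your system allows:

```python
# Sketch: clean a characteristic value the way a BW start routine might,
# keeping only characters from a permitted set (roughly BW's default for
# uppercase-only characteristics, plus space).
ALLOWED = set("!\"%&'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ ")

def clean_value(value: str) -> str:
    """Uppercase the value and drop control characters and anything
    outside the permitted set (e.g. carriage returns, non-breaking
    spaces, accented letters like those in the error's hex dump)."""
    return "".join(ch for ch in value.upper() if ch in ALLOWED)

# A value polluted with a leading accented character, a carriage return
# and a non-breaking space, similar in spirit to the failing record:
raw = "\u00c3\r\u00a0ABC100711396"
print(clean_value(raw))  # -> ABC100711396
```

The same filter could be applied to every characteristic field in the start routine's data package loop.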
Regards,
Suman -
Updating the last date request from ODS to CUBE
Dear Friends,
Please can someone explain this to me?
I always update requests from the ODS to the cube. Sometimes there are 3 to 4 requests in the ODS that do not yet have the data mart status (tick), and when I update to the cube I can see only 1 request, the current date's request. I don't know whether the previous days' requests have been updated to the cube.
But in the ODS I see that the data mart tick is available for all the requests.
Also, if I have many requests in the ODS (current date, previous day and so on), is there any way to update such that I can see all the dates in the cube, instead of only 1 request with the current date?
When I delete the request from the cube, remove the tick from the ODS and refresh, then suddenly the tick disappears from the current request and all the previous requests.
Thanks for your help.
Will assign complete points.
Thank you so so much.
Hi,
If you know how many requests get updated into the infocube each day as one request, you can check the added records in the infocube: it should be equal to the sum of the records of all those requests.
The transferred count, however, can be greater than or equal to the added count; it also depends on how the update rules are designed.
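The reconciliation described here can be sketched as a quick check (request IDs and record counts below are invented for illustration):

```python
# Sketch of the reconciliation above: the "added" count of the cube
# request should equal the sum of the records of the ODS requests it
# condensed, while "transferred" may be larger than or equal to "added"
# (records can be filtered or aggregated away by the update rules).
ods_requests = {"REQU_001": 1200, "REQU_002": 800, "REQU_003": 500}

def check_counts(transferred: int, added: int, source_counts: dict) -> bool:
    """True if the cube request is consistent with its source requests."""
    expected_added = sum(source_counts.values())
    return added == expected_added and transferred >= added

print(check_counts(transferred=2500, added=2500, source_counts=ods_requests))  # True
```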
Hope this helps you ..
Regards,
shikha -
Updating data from ODS to cube
hi all,
I have 8 activated requests in the ODS. Now I want to load data into a cube
from this ODS.
Is it possible to update the requests one by one, so that I have the same number of requests
in the cube as in the ODS?
Thanks in advance.
Hi,
If you want to do that, you have to create an InfoPackage that updates data from the ODS to the cube. Go to the data mart ODS under your InfoSources, create an InfoPackage, and you can give the request ID in the selection and load them one by one.
Assign points if useful
Thanks
N Ganesh -
Problem with update from ODS to Cube
Hi All,
I had an issue when loading from ODS to cube: everything was fine except one data package that failed because an object was locked (User: myself, though I wasn't running anything else except this). But when I checked SM12, no locks existed for that particular time.
Apparently the load failed because of this single package. Since it is a load from ODS to cube, there is no chance of a manual update from the PSA.
Any suggestion? Do I have to repeat the whole load all over again? (Painful - it is 40 million records.)
Regards,
Robyn.
Hi Hoggard,
Check the job log for your job in SM37 and see whether anything failed for that data package, like
"ARFCSTATE-SYSFAIL". If it is there, it means there was a deadlock; please check the ST22 dump.
You can use the PSA:
1) Go to the 8ODS InfoSource and copy the delta InfoPackage.
2) Check the processing option "PSA and then subsequently into data targets".
3) Schedule the InfoPackage manually.
hope it helps
regards
AK -
Error in updating data from ODS to CUBE.
Hi,
I am trying to load data manually from ODS to CUBE in NW2004s.
This is a flat file load from the datasource to the ODS and then from the ODS to the CUBE.
In the CUBE, I am trying to populate fields by using the ODS fields.
For eg.
In the ODS, a CHAR InfoObject holds the data in timestamp format (i.e. mm/dd/yyyy hh:mm). I need to split this data and assign it to two individual DATE and TIME InfoObjects in the CUBE.
For this, I have done the coding in the Transfer Structure in the Rule Group.
The time field is getting populated, but the date field is not.
I get an error as Eg:
<b>Value '04052007' for CHAR 0DATE is not plausible</b>
Due to this, the corresponding record is not getting displayed.
Also, for the records where the time is displayed, the date is not getting displayed in spite of the date being correct.
Please help me with a solution for this.
<b><u><i>REMOVED</i></u></b>
Thanks In Advance.
Hitesh Shetty
Hello Hitesh,
SAP accepts the date format YYYYMMDD, so in the routine where you concatenate the day, month and year, just do it in reverse order: year, month, day.
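The fix described in the reply (reassemble in year, month, day order) amounts to the following transformation. It is sketched here in Python for clarity, though in BW it would be an ABAP transfer routine; the function name is mine:

```python
# Sketch of the split described above: take a CHAR value in
# "mm/dd/yyyy hh:mm" format and derive the internal-format date
# (YYYYMMDD, which is what 0DATE expects) and time (HHMMSS).
def split_timestamp(ts: str) -> tuple[str, str]:
    date_part, time_part = ts.strip().split(" ")
    month, day, year = date_part.split("/")
    hour, minute = time_part.split(":")
    # Reassemble in reverse order: year first, then month, then day.
    # Concatenating day+month+year instead yields values like
    # '04052007', which triggers the "not plausible" error.
    return year + month + day, hour + minute + "00"

date_val, time_val = split_timestamp("04/05/2007 13:45")
print(date_val, time_val)  # 20070405 134500
```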
Thanks
Tripple k -
Cube hang up is happening during Dimension Build process
We are using an Excel VBA tool for the dimension build process. The tool builds the members in 3 applications (Atlaunch, ModeLife, Textcube). Most of the time the cube hang-up happens at the Text cube. The dimension build process is described below.
Dimension Build Process :-
The dimension build process is performed by the Calculation and Dimension Build Tool.
Before executing the dimension build, logging in to the Essbase server through the tool is required.
Once logged in, the particular region where the dimension build is to be carried out is selected from NML, NA, NE and NS.
Per region there are 3 applications namely <Region>LNCH, <Region>LIFE and <Region>TEXT and per application there is only 1 database namely AtLaunch, ModeLife and TextData respectively. The cube structure per region is given below:
NML Region:
NMLLNCH.AtLaunch
NMLLIFE.ModeLife
NMLTEXT.TextData
NA Region:
NALNCH.AtLaunch
NALIFE.ModeLife
NATEXT.TextData
NE Region:
NELNCH.AtLaunch
NELIFE.ModeLife
NETEXT.TextData
NS Region:
NSLNCH.AtLaunch
NSLIFE.ModeLife
NSTEXT.TextData
The Dimension Build Process is carried out in 2 steps:
1. Project Master Code Build –
2. Grades, Line Options, Project Description and Reference Currency Build –
Project Master Code Build:-
In the Project Master Code Build, the Calculation and Dimension Build Tool performs the following processes:
• Some members are dynamically created in <Region>LNCH.AtLaunch, <Region>LIFE. ModeLife and <Region>TEXT. TextData cubes.
• Some data is loaded in <Region>LNCH.AtLaunch Cube.
Grades, Line Option, Project Description and Reference Currency Build:-
Grades – Some members are dynamically created in <Region>LNCH.AtLaunch and <Region>LIFE. ModeLife cubes.
Line Options - Some members are dynamically created in <Region>LNCH.AtLaunch
Project Description - Some members are dynamically created in <Region>TEXT.TextData cube. After that some data is loaded in <Region>TEXT.TextData cube.
Reference Currency – UDA is updated in <Region>LNCH.AtLaunch, <Region>LIFE.ModeLife cube and UDA and Alias is updated in <Region>TEXT.TextData cube.
Note: Most of the time during the Project Description build we experience a cube hang-up / no response for the NMLTEXT.TextData, NATEXT.TextData, NETEXT.TextData and NSTEXT.TextData cubes.
Text cube dimension build process flow :-
1. Introduction
The Dimension Build process in CDB system is used for adding new project information in the cubes (AtLaunch, ModelLife and TextData). The text files which are needed for performing dimension build are generated from the child packages and stored in the local system of the user (<LocalDrive>:\GCEP4\Input\CDB_DimensionBuild). The Dimension Build process is carried out in 2 stages:
1. Project Master Code Build
2. Grades, Line Option, Project Description and Reference Currency Build
Project Master Code Build:-
In the Project Master Code Build, the Calculation and Dimension Build Tool performs the following processes:
• The MsPr_DIM rule is used with the text file F0P_<Project>-<Milestone>-<Version>.txt to build the Project Master Code in the AtLaunch, Model Life and Text Data cubes.
• The MsPr_FLG rule is used to load some data from the text file F0P_<Project>-<Milestone>-<Version>.txt into the AtLaunch cube.
Grades, Line Option, Project Description and Reference Currency Build:-
Dimension Build w.r.t. Grades and Line Options are carried out in AtLaunch and Model Life cubes which is not in scope of this document. Dimension Build w.r.t. Project Description and Reference Currency change are carried out in Text Data cube, so only those parts are discussed in this document.
2. Project Description Build (Detailed process flow)
The Project Description Build is carried out in the following steps:
• First, any lock on the particular PMV is checked. If it is not locked, the next operation is carried out.
• Database availability is checked next (whether any other dimension build is happening at that time or not), and the availability flag is updated accordingly in the TextData cube.
• Next, the cache is set from high to low to carry out the dimension build.
• After the cache setting, one rule file at a time is taken (as given in the cfgDL sheet) and the dimension build is performed using the rule file and text file (created locally by the input packages) on the intended cube.
• For the particular rule file “GCM_FDIM”, which is used to build CCM information in the TextData cube, the following steps are followed by the tool itself:
o From the TextData cube, all the child members under the CCM member of the Grades dimension are fetched by the “GetMbrInfo” function (GetMbrInfo is called around 2000 to 3000 times).
o The child members under CCM for that particular PMV combination are deleted (if previously built).
• New child members under CCM are built based on the information provided in the text file F1G_<Project>-<Milestone>-<Version>.txt
• Once CCM is built, the rest of the rule files are used to carry out the dimension build operation.
• The dimension build process is carried out by building dimensions as well as updating some flags (loading data).
• Reference Currency is updated (if required) last.
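The per-PMV flow above can be sketched as a small program. The toy in-memory "cube" below stands in for Essbase purely to make the step order concrete; every class, method and name here is invented for illustration (the real tool is Excel VBA against the Essbase API):

```python
# Structural sketch of the Project Description build flow described
# above, against a toy in-memory cube (not real Essbase calls).
class ToyCube:
    def __init__(self):
        self.locked_pmvs = set()
        self.available = True
        self.cache = "high"
        self.ccm_children = {}  # pmv -> list of member names

    def get_mbr_info(self, pmv):
        # In the real flow this lookup happens 2000-3000 times per build;
        # here it just returns the current CCM children for the PMV.
        return list(self.ccm_children.get(pmv, []))

def project_description_build(cube, pmv, new_members):
    """Run the steps listed above against the toy cube."""
    if pmv in cube.locked_pmvs:            # step 1: lock check
        return "skipped: PMV locked"
    if not cube.available:                 # step 2: availability check
        return "skipped: another build running"
    cube.cache = "low"                     # step 3: cache high -> low
    old = cube.get_mbr_info(pmv)           # step 4a: fetch CCM children
    cube.ccm_children[pmv] = []            # step 4b: delete previous build
    cube.ccm_children[pmv] = new_members   # step 5: build new children
    cube.cache = "high"
    return f"built {len(new_members)} members (replaced {len(old)})"

cube = ToyCube()
print(project_description_build(cube, "P1-M1-V1", ["G1", "G2", "G3"]))
```

Laying the flow out this way makes the expensive step visible: the thousands of per-member lookups in step 4a are where a hang is most likely to surface.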
Why is this cube hang-up happening?
Thanks in advance.
53 questions, all still open. Would you mind marking them as closed? Without at least the "question is closed" tag, there's no way for future readers to know whether your issues have been resolved. I personally use this board for answers quite often, and it really helps me when I see that a question has been adequately answered. Points assigned to those who helped you are a nice touch.
Now to your question: did you by any chance change from one Essbase server to another? You didn't state whether this build issue happens all the time, or only when you do all three databases, or much of anything about the actual process of failure, so I'm guessing wildly here. If a server change is involved, you might want to look at: http://timtows-hyperion-blog.blogspot.com/2007/12/essbase-api-error-fix-geeky.html
Regards,
Cameron Lackpour
P.S. MMIC, GlennS, and I were just discussing your kind of build-hierarchy-in-Excel approach last week and how infrequently it's seen nowadays. Not a criticism, I personally like these old-school approaches, probably because I've built one or two myself. -
How can I activate the transfer rules for the ODS updating a data target.
We are on BW 3.5 and I'm loading data into the 0FIGL_O10 ODS and then uploading the data into the cube 0FIGL_C10. The data loads just fine into the ODS, but when I try to <u><b>'update the data target'</b></u> I get a 'date & time stamp' error on the InfoPackage transfer rules.
I then replicate the DataSource 80FIGL_O01.
I must then <u><b>'activate' the transfer rules</b></u>.
However, I cannot get the transfer rules for 80FIGL_O10 into CHANGE mode to activate them.
How can I activate the transfer rules for the ODS updating a data target?
The error text is as follows:
DataSource 80FIGL_O10 has to be replicated (time stamp, see long text)
Message no. R3016
Diagnosis
DataSource 80FIGL_O10 does not have the same status as the source system in the Business Information Warehouse.
The time stamp in the source system is 02/15/2007 10:42:33.
The time stamp in the BW system is 11/07/2006 13:11:54.
System response
The load process has been terminated.
<b>Procedure
Copy the DataSource again and then activate the transfer rules that belong to it. You have to activate the transfer rules in every case, even if they are still active after the DataSource has been copied.</b>
Thanks for your assistance.
Denny
Hi Dennis,
Try using Business Content to activate your DataSource.
Hope this will help you.
How to activate Business Content:
http://help.sap.com/saphelp_nw04/helpdata/en/80/1a66d5e07211d2acb80000e829fbfe/frameset.htm -
Error while loading data from write optimized ODS to cube
Hi All,
I am loading data from a write-optimized ODS to a cube.
I have done Generate Export DataSource and
scheduled the InfoPackage with 1 selection for a full load.
Then it gave me the following error under Transfer IDocs & TRFC:
Info IDOC 1: IDOC with errors added
Info IDOC 2: IDOC with errors added
Info IDOC 3: IDOC with errors added
Info IDOC 4: IDOC with errors added
Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
The Processing section below is green and
shows an update of 4 new records to data package 1.
Please provide inputs for the resolution
Thanks & Regards,
Rashmi.
Please let me know - what more details do you need?
If I click F1 for error details, I get the following message:
Messages from source system
see also Processing Steps Request
These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
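The check the help text describes - comparing the record count announced by the source system via the status-2 Info IDoc against what actually arrived in BI - can be modelled like this (a toy illustration, not SAP code; function name and verdict strings are mine):

```python
# Toy model of the consistency check behind "selected number does not
# agree with transferred number": the source system announces, via an
# Info IDoc with status 2, how many records it extracted and sent; BI
# compares that against the number of records it actually received.
def check_data_package(announced: int, received: int) -> str:
    """Return a monitor-style verdict for one data package."""
    if announced == received:
        return "green: counts agree"
    return (f"red: selected number ({announced}) does not agree "
            f"with transferred number ({received})")

print(check_data_package(announced=4, received=4))
print(check_data_package(announced=10, received=4))
```

A mismatch therefore points at records being lost (or duplicated) between extraction in the source system and arrival in BI, which is why the error surfaces on the BI side even though the extraction itself reported success.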
Thanks & Regards,
Rashmi. -
Error while uploading data from ODS to Cube
Hi All,
Will you please help with this issue? As this is a high-priority issue, please reply if you know the answer.
I am facing an error while loading the data from ODS to CUBE; the errors
are:
<b>1. Name is not in the namespace for generated BW Metaobjects
2. Error 18 in the update</b>
And this error occurs only on the quality server.
Thanks,
Ram.
Hi Ram Siva,
Make sure that the DataSource is replicated and the transfer rules are active.
Also check whether all transport requests were imported properly.
Hope it helps.
bhaskar -
Error while loading data from ODS to CUBE.
Hi friends,
When I am loading data from ODS to cube via the data mart, I get an error in the QA system; in the DM system everything went well. If I check the Details tab in the monitor, under Processing,
it shows the following:
Transfer rules: missing message.
Update PSA: missing message.
Processing end: missing message.
I have checked the coding in the update rules; everything is OK.
Please, any inputs?
hari
Message was edited by:
hari reddy
It might mean that the IDoc flow is not defined properly in your QA system for the myself source system.
Regards,
Vitaliy -
Transformation from ODS to cube in BI 7.0
Hi there, I would appreciate it if anybody can help. I want to know: if I want to assign a constant value to a field in a transformation from ODS to cube, what is the way to do it? I can choose "constant" in the update rules in BW 3.5, but what is the corresponding step in BI 7.0?
thanks
Soniya
Go to the rule type column, right-click and select "Rule Details". Under rule type you can find the option "Constant".
http://help.sap.com/saphelp_nw04s/helpdata/en/e5/f913426908ca7ee10000000a1550b0/content.htm
KJ!!! -
ODS to CUBE loading - taking too much time
Hi Experts,
I am loading data from R/3(4.7) to BW (3.5).
I am loading with option --> PSA and then Data Target (ODS ).
I have a selection criteria in Infopackage while loading from standard Datasource to ODS.
It takes me 20 mins to load 300K records.
But, from ODS to Infocube ( update method: Data Target Only), it is taking 8 hours.
The data packet size in Infopackage is 20,000 ( same for ODS and Infocube).
I also tried changing the data packet size, tried with full load , load with initialization,..
I tried scheduling it as a background job too.
I do not have any selection criteria in the infopackage from ODS to Cube.
Please let me know how I can decrease this loading time from ODS to InfoCube.
Hi,
To improve data load performance:
1. If they are full loads, see if you can make them delta loads.
2. Check if there are complex routines/transformations being performed in any layer. In that case, see if you can optimize that code with the help of an ABAPer.
3. Ensure that you are following the standard procedures in the chain, like deleting indexes/secondary indexes before loading, etc.
4. Check whether system processes are free when this load is running.
5. Try making the load as parallel as possible if it is currently running serially. Remove the PSA step if it is not needed.
6. When the load is not getting processed due to a huge volume of data, or a high number of records per data packet, try the steps below:
1) Reduce the IDoc size to 8000 records and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
2) Run the load only to the PSA.
3) Once the load is successful, push the data to the targets.
In this way you can overcome the issue.
Ensure proper data packet sizing, and also check number range buffering, PSA partition size, and the upload sequence: always load master data first, perform the change run, and then run the transaction data loads.
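The packet-sizing arithmetic behind point 6 can be checked quickly. The record counts below come from the question earlier in the thread (300K records, 20,000-record packets); the helper function is just arithmetic, not an SAP API:

```python
import math

# How many data packets and IDocs a load produces for a given packet
# size and packets-per-IDoc setting.
def load_shape(total_records: int, packet_size: int, packets_per_idoc: int):
    packets = math.ceil(total_records / packet_size)
    idocs = math.ceil(packets / packets_per_idoc)
    return packets, idocs

# 300K records with the original 20,000-record packets, 10 packets/IDoc:
print(load_shape(300_000, 20_000, 10))   # (15, 2)
# The same load with the suggested 8,000-record packets:
print(load_shape(300_000, 8_000, 10))    # (38, 4)
```

Smaller packets mean more of them, but each one is cheaper to process and commit, which is the point of the tuning advice above.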
Check this doc on BW data load performance optimization:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
BI Performance Tuning
FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI&Business Objects Roadmap
Thanks,
JituK -
Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).
Dear All,
I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in 3.5. I am in the development process.
My loading process is:
Start - 2 Delta Loads - 1 Full Load - ODS Activation - Delete Index - Further Update - Delete overlapping requests from InfoCube - Create Index.
My question is:
When I load for the first time I get some data, and for the next load I should get zero, as there is no new data; but I am getting the same number of records for the next load. Maybe it is taking the data from the full upload, I guess. Please guide me.
Krishna.
Hi,
The reason you are getting the same number of records is, as you said, the full load: after running the deltas you got all the changed records, but after those two deltas you again have a full load step, which picks up the whole of the data all over again.
Other reasons you may get the same number of records:
1> You are running the chain for the first time.
2> You ran these delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", then the first delta run picks up the whole of the data; running a full load after that will also pick up the same number of records.
If the two deltas run one after another, the second only gets data if there were changes in between. Since you are loading from a single ODS to a cube, both your delta and full loads will pick up the same data "for the first time" during data marting, as they have the same DataSource (the ODS).
Hopefully this serves your purpose.
Thanks & Regards
Vaibhave Sharma
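The behaviour described above (the first delta after an "initialization without data transfer" returning everything, the same as a full load) can be illustrated with a toy model. This is invented for illustration, not SAP internals:

```python
# Toy model: "initialization without data transfer" marks the whole
# source as not-yet-sent, so the FIRST delta reads everything - the
# same records a full load would read. A later delta with no changes
# returns nothing.
records = ["R1", "R2", "R3", "R4"]

class DeltaQueue:
    def __init__(self):
        self.pointer = None       # None = delta not initialized

    def init_without_data_transfer(self):
        self.pointer = 0          # everything counts as "not yet sent"

    def delta(self):
        sent = records[self.pointer:]
        self.pointer = len(records)
        return sent

q = DeltaQueue()
q.init_without_data_transfer()
print(q.delta())   # first delta: all 4 records
print(records)     # a full load would read the same 4 records
print(q.delta())   # next delta with no new changes: []
```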
Edited by: Vaibhave Sharma on Sep 3, 2008 10:28 PM -
Automatic loading from ODS to Cube in 3.5
Hi All
I was under the impression that in version 3.5, in order to load a delta from ODS to cube, you had to run the 8-series InfoPackage.
However, I have recently noticed that this InfoPackage runs automatically after a delta load into the ODS, even when the load is not via a process chain.
Can somebody tell me where and how this setting is maintained?
Regards
A
Hi,
Go to the ODS display mode and check whether "Update Data Automatically" is ticked in the Settings.
Regards,
Kams