Deadlock error while updating data into a cube
We have a daily truncate-and-load scenario for a cube, with volumes arriving at about 2 million records per day. To speed up the load, the InfoPackage is set to parallel processing (PSA and data targets in parallel), and the whole process runs through a process chain.
We are facing a deadlock issue every day. How can we avoid it?
In general, deadlocks occur because of degenerated indexes when volumes are very high. So my question is: will deleting the cube's indexes every day, along with the 'delete data target contents' process, help avoid the deadlock?
Another observation: updating values into one InfoObject takes a long time, roughly 3 minutes per data packet. That InfoObject is placed in its own dimension, which is defined as a line-item dimension because the volumes for that specific object are very high.
So that is the overall scenario.
Two questions:
1) Will deleting and recreating the indexes help avoid the deadlock?
2) Any idea why the insertion into the InfoObject takes so long? (The SQL trace shows a direct read on the SID table of that object.)
Regards.
Hello,
1) Will deleting and recreating the indexes help avoid the deadlock?
Ans:
To avoid this problem, drop the indexes of the cube before uploading the data, and rebuild them after the load.
Also:
In SM12, find the entry that is causing the lock and delete it.
In SM66, find any process that has been running for a very long time and stop it.
Check transaction SM50 for the number of work processes available in the system. If they are not adequate, have the Basis team increase them.
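As a rough sketch of what the drop-and-rebuild approach does at the database level on an Oracle-based BW system (the cube, fact table, and index names below are hypothetical; BW F fact tables follow the /BIC/F&lt;cube&gt; naming convention):

```sql
-- Hypothetical cube ZSALES: its F fact table would be /BIC/FZSALES.
-- Before the load: mark the secondary (bitmap) indexes unusable so that
-- parallel inserts do not contend on index blocks.
ALTER INDEX "/BIC/FZSALES~010" UNUSABLE;
ALTER INDEX "/BIC/FZSALES~020" UNUSABLE;

-- ... data load runs here ...

-- After the load: rebuild the indexes once, in bulk.
ALTER INDEX "/BIC/FZSALES~010" REBUILD;
ALTER INDEX "/BIC/FZSALES~020" REBUILD;
```

In practice you would not issue this SQL by hand; you would add the 'Delete Index' and 'Create Index' process types to the process chain, before and after the load step.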
2) Any idea why the insertion into the InfoObject takes so long? (The SQL trace shows a direct read on the SID table of that object.)
Ans:
A line-item dimension is one way to improve both data load and query performance, because it eliminates the need for a dimension table. So while loading or reading, there is one less table to deal with.
Check the transformation mapping of that characteristic: if any routine or formula is written there, it can add processing time for that InfoObject.
Storing mass data in InfoCubes at document level is generally not recommended, because a huge SID table is created for the document-number line-item dimension when data is loaded.
Check whether your InfoObject is similar to a document number.
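To illustrate the "one less table" point, here is a hedged sketch of the two join paths; all table and column names are hypothetical, following the usual /BIC/ naming:

```sql
-- Ordinary dimension: fact table -> dimension table -> SID table (two joins).
SELECT s."/BIC/ZDOCNO", f."/BIC/ZQUANTITY"
FROM "/BIC/FZSALES" f
JOIN "/BIC/DZSALES1" d ON d."DIMID" = f."KEY_ZSALES1"
JOIN "/BIC/SZDOCNO"  s ON s."SID"   = d."SID_ZDOCNO";

-- Line-item dimension: the SID is stored in the fact table itself,
-- so the dimension table disappears (one join).
SELECT s."/BIC/ZDOCNO", f."/BIC/ZQUANTITY"
FROM "/BIC/FZSALES" f
JOIN "/BIC/SZDOCNO" s ON s."SID" = f."KEY_ZSALES2";
```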
Regards,
Dhanya
Similar Messages
-
Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)
Hi all,
I'm getting the following error while loading the data into the cube:
"Time conversion from 0CALDAY to 0FISCPER (fiscal year V3) failed with value 10081031"
amit shetye
Hi Amit,
This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for year 1008. Maintain the calendar for year 1008 and transfer the global settings from the source (R/3):
RSA1 --> Source systems --> context menu --> Transfer global settings --> choose fiscal year variants and calendar --> Execute
Hope it Helps
Srini -
Error while loading data into cube
hi BW gurus,
Whenever I try to load data into the cube from a flat file after scheduling, I get a short dump in the BW system. I checked ST22 and it shows the exception ADD_PARTITION_FAILED. Please help me sort out this problem; if you know the error recovery steps, please answer in detail.
I will assign points for good answers.
This is what the note says:
Symptom
The process of loading transaction data fails because a new partition cannot be added to the F fact table. The loading process terminates with a short dump.
Other terms
RSDU_TABLE_ADD_PARTITION_ORA, RSDU_TABLE_ADD_PARTITION_FAILED, TABART_INCONSITENCY, TSORA, TAORA , CATALOG
Reason and Prerequisites
The possible causes are:
SQL errors when creating the partition
Inconsistencies in the Data Dictionary control tables TAORA and TSORA
Solution
BW 3.0A & BW 3.0B
In the case of SQL errors: analyze the SQL code in the system log or short dump and, if possible, eliminate the cause. The cause is often a disk-space problem or lock situations on the database catalog, or, less frequently, the partitioning option of the ORACLE database not being installed.
The most common cause of the problem is inconsistencies in the TAORA and TSORA tables. As of Support Package 14 for BW 3.0B / Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case. The reason is almost always missing entries in TSORA for the tablespaces of the DDIM, DFACT and DODS data classes.
The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
Data class   Tablespace
DDIM         PSAPDIMD
DFACT        PSAPFACTD
DODS         PSAPODSD
For each data tablespace, the TSORA table must contain an entry for the corresponding index tablespace, for example:
TABSPACE INDSPACE
PSAPDIMD PSAPDIMD
PSAPFACTD PSAPFACTD
PSAPODSD PSAPODSD
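A quick way to verify this diagnosis is to query the control table directly; the column names follow the note's example layout, and the schema owner is installation-specific, so treat this as a sketch:

```sql
-- Do the index-tablespace entries exist for the three data classes?
SELECT tabspace, indspace
FROM tsora
WHERE tabspace IN ('PSAPDIMD', 'PSAPFACTD', 'PSAPODSD');

-- If a row is missing, add it; per the note, the index tablespace
-- is simply the data tablespace again.
INSERT INTO tsora (tabspace, indspace) VALUES ('PSAPFACTD', 'PSAPFACTD');
```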
In most cases, these entries are missing and have to be added. See also Notes 502989 and 46272. -
Error while uploading data into cube
I am trying to upload data into my content cube but I got an error. It says:
"Time conversion from 0CALDAY to 0FISCPER (fiscal year S1) failed with value 20040303"
I checked the data in the PSA and it is there, but the first record has a red light rather than a green one. Could you please give me some idea how to solve this problem?
Thank you in advance
sajita
If you don't know whether you want to take over all global settings (exchange rates in particular can be critical), just take over the fiscal year variants; the problem is most likely in the fiscal year variant.
If the problem remains you could check the following things:
In SPRO -> Global Settings -> Fiscal Year Variants (or similar), check:
Does a fiscal year variant S1 exist?
Is it time-dependent? If yes, is it valid for March 3rd, 2004?
If it is a self-defined variant, check whether a period is defined for March 3rd, 2004.
Best regards
Dirk -
Error while updating data from DataStore object
Hi,
Currently we are upgrading from BW 3.5 to BI 7.0 (a technical upgrade only),
and we found errors during the process chain run in the further-processing step. This step is basically a delta load from a DSO to a cube.
The error message are:
Error while updating data from DataStore object 0GLS_INV
Message no. RSMPC146
Job terminated in source system --> Request set to red
Message no. RSM078
That's all; the system gives no further error messages that can be explained clearly.
I have applied SAP Note 1152453 and reactivated the DataSource, InfoSource, and data target.
Still no help here!?
Please advise if you encountered these errors before.
Thanks in advance.
Regards,
David
Edited by: David Tai Wai Tan on Oct 31, 2008 2:46 PM
Edited by: David Tai Wai Tan on Oct 31, 2008 2:50 PM
Edited by: David Tai Wai Tan on Oct 31, 2008 2:52 PM
Hi Vijay,
I got this error:
Runtime Errors MESSAGE_TYPE_X
Date and Time 04.11.2008 11:43:08
To process the problem further, contact your SAP system administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Short text of error message:
No start information on process LOADING
Long text of error message:
Diagnosis
For process LOADING, variant ZPAK_0SKIJ58741F4ASCSIYNV1PI9U, the
end should be logged for instance REQU_D4FIDCEKO82JUCJ8RWK6HZ9KX
under the log ID D4FIDCBHXPLZMP5T71JZQVUWX. However, no start has
been logged for this process.
System Response
No log has been written. The process (and consequently the chain)
has been terminated.
Procedure
If possible, restart the process.
Procedure for System Administration
Technical information about the message:
Message class....... "RSPC"
Number.............. 004
Variable 1.......... "D4FIDCBHXPLZMP5T71JZQVUWX"
Variable 2.......... "LOADING"
Variable 3.......... "ZPAK_0SKIJ58741F4ASCSIYNV1PI9U"
Variable 4.......... "REQU_D4FIDCEKO82JUCJ8RWK6HZ9KX"
Any idea? -
Error while updating data from PSA to ODS
Hi Sap Gurus,
I am facing an error while updating data from PSA to ODS in BI 7.0.
The exact error message is:
The argument 'TBD' cannot be interpreted as a number
The error was triggered at the following point in the program:
GP44QSI5RV9ZA5X0NX0YMTP1FRJ 5212
Please suggest how to proceed on this issue.
Points will be awarded.
Hi,
Try to simulate the update; that can give you the exact error location.
It seems that, while updating, a few records may not be in the format of the field into which they are loaded.
Regards
Rahul Bindroo -
Background job finishes, but the log shows "Error While Updating Material into Standard SAP5678"
Dear
We run a background job which finishes successfully, but when we see the logs it shows
Error While Updating Material into Standard SAP5678
Kindly share the reasons
Regards
This is an ERP Upgrade space, and you should consider raising threads in the right space for prompt replies. Next time, consider using the SAP NetWeaver Administrator space for issues like these. Also, consider closing your previous thread with the correct answer for future reference.
What I see is a custom job. You should check what the job does, as well as the trace file of the work process, and consult the application team or the developer for more information. Unfortunately, with that screenshot there is not much we can advise.
Regards
RB -
Getting error while loading data into ASO cube from a flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM
Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? Also, which versions of Essbase and ODI are you using?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while updating to target CUBE
Hi, gurus.
I upload data from an ODS into a cube via DTP. The error happens:
Error while updating to target ZICMM0200 (type INFOCUBE)
Message no. RSBK241
Package 3 / 2011.11.21 07:05:07 / Status 'Processed with Errors'
Message no. RSBK257
I've implemented Notes 0001148007 and 0001159978, but they don't work.
Thanks for your help.
Hi,
Please try this....
Data is not updated to Infocube (BI 7.0) when i load from flat file
RSBK257 - Status 'Processed with Errors' message in DTP -
Update data into Cube from 2 ODS
Hello guys!
I have an error updating data from my 2 ODS objects to my InfoCube.
I get this error:
10 records sent (0 received)
Does anyone have an idea why it doesn't work? I have 2 ODS objects and need to update my InfoCube with both of them.
On the ODS context menu: Update ODS data into Target...
I get a yellow icon on the monitor.
On the Details tab under Extraction, I get a yellow warning:
It says: 10 records sent (0 records received)
I'll check the update rules between the ODS objects and my cube.
thanks for your help. -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we encounter the following error:
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works)... please check these links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
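For comparison, a minimal external-table definition looks like the sketch below (directory path, file name, and columns are all hypothetical). ORA-29913/KUP-04063 during ODCIEXTTABLEOPEN usually means the access driver could not open the data file or write its log files, so check each of the commented points:

```sql
-- The directory object must point to a path that exists on the DB server
-- and is readable (and writable, for the driver's log/bad files) by the
-- oracle OS user.
CREATE OR REPLACE DIRECTORY ext_dir AS '/data/feeds';

CREATE TABLE table_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('input.bcp')
)
REJECT LIMIT UNLIMITED;
```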
HTH,
Thierry -
Error while Inserting data into flow table
Hi All,
I am very new to ODI, and I am facing a lot of problems with my first interface. So I have many questions here; please forgive me if this irritates you.
========================
I am developing a simple project to load data from an input source (csv) file into a staging table.
My plan is to achieve this in 3 interfaces:
1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
Question-1 : Is this approach correct?
========================
I don't have any key columns in the staging table (stg_1). When I tried to execute the flow control for this, I got an error:
Flow Control not possible if no Key is declared in your Target Datastore
Based on one of the responses in this forum ("FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID", made it the primary-key column of my staging table (stg_1), and my problem was resolved.
Question-2 : Is a key column compulsory in the target table? I work with BO Data Integrator, where there is no such compulsion... I am a little confused.
========================
Next, I defined one project-level sequence and mapped the newly introduced key column Record_Id (primary key) to it. Now I got another error: "CKM not selected".
For this, I inserted the "Insert Check (CKM)" knowledge module into my project, which resolved the "CKM not selected" problem.
Question-3 : When is this CKM knowledge module required?
========================
After this, the flow/interface fails while loading data into the intermediate ODI-created flow table (I$):
1 - Loading - SS_0 - Drop work table
2 - Loading - SS_0 - Create work table
3 - Loading - SS_0 - Load data
5 - Integration - FTE Actual data to Staging table - Drop flow table
6 - Integration - FTE Actual data to Staging table - Create flow table I$
7 - Integration - FTE Actual data to Staging table - Delete target table
8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
The error is at step 8 above. When I opened the "Execution" tab for this step, I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
Question-4 : What/why is this error? Did I make any mistake while creating the sequence?
Everyone is new and starts somewhere, and the community is here to help you.
1.) What is the idea of moving data from stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from the source file to the target DB?
Otherwise, it is simpler to move data from SourceFile -> Target Table directly.
2.) Does your Target table have a Key ?
3.) A CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data flowing from the source file to the target table using the CKM. All the records that do not satisfy the constraints are added to the E$ (error) table and are not added to the target table.
4.) Try to avoid ODI sequences; they are slow and aren't scalable. Use a database sequence wherever possible, and reference the DB sequence in the target mapping as
<%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
where MY_DB_Sequence_Row is the oracle sequence in the target schema.
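Assuming the target schema does not already have that sequence, it could be created as follows (the name matches the placeholder above; the sizing options are assumptions):

```sql
-- Database sequence referenced from the ODI target mapping via
-- <%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
CREATE SEQUENCE my_db_sequence_row
  START WITH 1
  INCREMENT BY 1
  CACHE 1000;
```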
HTH -
Error while loading data into clob data type.
Hi,
I have created an interface to load data from an Oracle table into another Oracle table. The target table has an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
java.lang.NumberFormatException: For input string: "4294967295"
at java.lang.NumberFormatException.forInputString(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
Let me know if anyone come across and resolved this kind of issue.
Thanks much,
Nishit Gajjar
Mr. Gajjar,
You didn't mention which KMs you are using.
Have a read of
Re: Facing issues while using BLOB
and
Load BLOB column in Oracle to Image column in MS SQL Server
Try again.
And can you please mark the Correct/Helpful points to the answers too.
Edited by: actdi on Jan 10, 2012 10:45 AM -
Error while load data into Essbase using ODI
Hi ,
I'm getting the following error while loading measures into Essbase using ODI. I used the same log and error file and file path for all my dimensions, and this worked well, but I am not sure why it is not working for measures... need help.
File "<string>", line 79, in ?
com.hyperion.odi.common.ODIHAppException: c:/temp/Log1.log (No such file or directory)
Thanks
Venu
Are you definitely running it against an agent where that path exists?
Have you tried using a different location and filename? Have you restarted the agent to make sure there is not a lock on the file?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while inserting data into a table.
Hi All,
I created a table. While inserting data into the table I am getting an error telling me to "Create data Processing Function Module". Can anyone help me with this?
Thanx in advance
anirudh
Hi Anirudh,
It seems there is already an entry in the table with the same primary key.
An INSERT statement will give a short dump if you try to insert data with the same key.
Why don't you use the MODIFY statement to achieve the same result?
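ABAP's MODIFY is effectively an upsert: insert when the key is new, update when it already exists. At the database level the same idea can be sketched as an Oracle MERGE (the table and column names here are purely hypothetical):

```sql
-- Upsert one row into a hypothetical table ZTABLE keyed on ID.
MERGE INTO ztable t
USING (SELECT 1 AS id, 'value' AS descr FROM dual) src
ON (t.id = src.id)
WHEN MATCHED THEN
  UPDATE SET t.descr = src.descr
WHEN NOT MATCHED THEN
  INSERT (id, descr) VALUES (src.id, src.descr);
```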
Reward points if this Helps.
Manish