ODI data load fails every time with different reasons
Hi All,
We are loading data from Oracle Views to MSSQL tables using ODI 11g.
One of the interfaces in our package fails every time, at a different step and with a different error: temp space, agent failed, connection closed.
It has 15 sources (Oracle views) that are unioned to load the target table, and one of the views takes 2 hours to query the database.
Based on our previous loads, this interface should take approximately 4 hours, but it hangs, runs for many hours, and then errors out.
Can anyone help me with this?
Hi,
What you could do is create a new procedure, then add a new command, set the technology to O/S
Then use something like
essmsh -D /<enterlocation>/ordclract.mxls 3416683,1342131001
You might need to include the full path to essmsh as well.
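As a sketch of what the O/S-technology command in that ODI procedure would end up executing, the call can be assembled programmatically. This is not ODI or Essbase API code, just an illustration; the `/hypscripts/` path is a made-up placeholder (the original post deliberately leaves the location blank), and the two trailing numbers are the ones from the post above.

```python
import shutil

def build_essmsh_command(maxl_script, *args):
    """Assemble the argv list for an essmsh call like the one an ODI
    procedure's O/S-technology command would run.

    Prefer the full path to essmsh when it is on PATH, since the ODI
    agent's shell often lacks the Essbase environment variables.
    """
    essmsh = shutil.which("essmsh") or "essmsh"  # fall back to the bare name
    # -D runs the MaxL script with the trailing keys, as in the post above
    return [essmsh, "-D", maxl_script] + [str(a) for a in args]

# Hypothetical script location; substitute your real MaxL script path
cmd = build_essmsh_command("/hypscripts/ordclract.mxls", 3416683, 1342131001)
```

The resulting list can be passed to `subprocess.run(cmd)` or joined into the command string pasted into the ODI procedure step.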
Cheers
John
http://john-goodwin.blogspot.com/
Similar Messages
-
Data load failing with a short-dump
Hello All,
There is a data load failing with the following short-dump:
Runtime Errors UNCAUGHT_EXCEPTION
Except. CX_FOEV_ERROR_IN_FUNCTION
Full text:
"UNCAUGHT_EXCEPTION" CX_FOEV_ERROR_IN_FUNCTIONC
"CL_RSAR_FUNCTION==============CP" or "CL_RSAR_FUNCTION==============CM004"
"DATE_WEEK"
It seems as if it is failing in a function module - please advise.
Regards,
Keith Kibuuka

Did you read OSS [Note 872193 - Error in the formula compiler|https://service.sap.com/sap/support/notes/872193] or OSS [Note 855424 - Runtime error UNCAUGHT_EXCEPTION during upload|https://service.sap.com/sap/support/notes/855424]?
Regards -
Hi All,
The ODI data loading step is too slow.
I am using;
LKM=LKM SQL to Oracle,
CKM=CKM SQL,
IKM=IKM SQL Incremental Update for replication SQL Server to Oracle.
I don't use Flow control in interfaces.
SQL Server and Oracle database are installed on the same server.
How can I make it faster?

If the two database servers are on the same machine and you are dealing with bulk data, you should use an LKM that uses bulk methods (BCP to extract the data from SQL Server and external tables to get the data into Oracle) - something like this KM: https://s3.amazonaws.com/Ora/KM_MSSQL2ORACLE.zip (which is actually an IKM, but the LKM is not so different).
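To illustrate the bulk path that such a KM automates, here is a sketch that only *assembles* the two pieces involved: a BCP export command from SQL Server and the Oracle external-table DDL that would read the resulting file. The table names, directory object, and columns below are invented examples, not anything taken from the linked KM.

```python
def build_bcp_export(table, out_file, server, user, password):
    """BCP command to bulk-export a SQL Server table to a flat file
    (-c: character mode, -t: tab delimiter)."""
    return [
        "bcp", table, "out", out_file,
        "-S", server, "-U", user, "-P", password,
        "-c", "-t", "\t",
    ]

def build_external_table_ddl(table, columns, directory, data_file):
    """Oracle external-table DDL over the exported file, so the target
    database reads it with set-based SQL instead of row-by-row inserts."""
    cols = ",\n  ".join(columns)
    return (
        f"CREATE TABLE {table}_EXT (\n  {cols}\n)\n"
        f"ORGANIZATION EXTERNAL (\n"
        f"  TYPE ORACLE_LOADER\n"
        f"  DEFAULT DIRECTORY {directory}\n"
        f"  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE "
        f"FIELDS TERMINATED BY X'09')\n"
        f"  LOCATION ('{data_file}')\n)"
    )

# Example with hypothetical names
bcp_cmd = build_bcp_export("dbo.ORDERS", "orders.dat", "SQLHOST", "odi", "secret")
ddl = build_external_table_ddl(
    "ORDERS", ["ORDER_ID NUMBER", "AMOUNT NUMBER"], "ODI_DIR", "orders.dat"
)
```

The point of the KM is exactly this combination: export once in bulk, then let Oracle read the file as a table, instead of the generic LKM SQL to Oracle shipping rows over JDBC one array at a time.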
Hope it helps -
Essbase Error(1003050): Data Load Transaction Aborted With Error (1220000)
Hi
We are using 10.1.3.6, and got the following error for just one of the thousands of transactions last night.
cannot end dataload. Essbase Error(1003050): Data Load Transaction Aborted With Error (1220000)
The data seems to have loaded, regardless of the error. Should we be concerned, and does this suggest something is not right somewhere?
Your assistance is appreciated.
Cheers

Hi John
Not using a load rule.
There were two other records that were rejected due to missing members. The error message was different for them, and easily fixed.
But this error doesn't tell us much. I will monitor the next run to see if the problem persists.
Thanks -
Data load failed while loading data from one DSO to another DSO..
Hi,
The data load failed on SID generation while loading data from the source DSO to the target DSO.
The following errors occur:
Value "External Ref # 2421-0625511EXP " (HEX 450078007400650072006E0061006C0020005200650066
Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
So I don't understand why the load succeeded in one DSO (the source) but failed in the other (the target).
While analyzing, I found that 'SID Generation upon Activation' is checked in the source DSO but not in the target DSO. Is that the reason it failed?
Please explain..
Thanks,
Sneha

Hi,
I assume your data flow is designed with the 1st DSO as a staging layer, with all transformation rules and routines maintained between the 1st and 2nd DSO, and 'SID Generation upon Activation' set on the 2nd DSO. That way the 1st DSO holds the same data as your source system, since no transformation rules or routines are applied to it, which helps avoid data load failures.
Please analyze the following
Have you loaded master data before transaction data? If not, please do that first.
Go to the properties of the first DSO and check whether 'SID Generation upon Activation' is maintained there (I guess it may not be).
Go to the properties of the 2nd DSO and check the same setting (I expect it is maintained).
This may be the reason.
Also check whether there are any special characters in your transaction data (even lowercase letters).
Regards
BVR -
Alert Message When Data Load Fails
Hi Friends,
I have three process chains (p1, p2, p3), all for transactional data loading (PSA -> ODS -> cube). Whenever a data load fails, I want it to send a message to my company e-mail ID. Is there any functionality to do this?
I am not very familiar with RSPCM (Process Chain Monitor). If anyone knows it, please tell me how well it supports my requirement.
thank in advance
Sam

Hi Sam,
welcome to sdn ...
you can add a 'create message' by right-clicking the process type; this will lead you to the mail sending settings.
see thread
Getting Failed process chain info through email
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/b0afcd90-0201-0010-b297-9184845346ca
you may need to setup sapconnect for email, take a look
http://help.sap.com/saphelp_webas620/helpdata/en/2b/d925bf4b8a11d1894c0000e8323c4f/frameset.htm
http://help.sap.com/saphelp_webas620/helpdata/en/af/73563c1e734f0fe10000000a114084/content.htm
hope this helps. -
Load fails everytime if it has zero records
Hi friends,
1. The load fails every time it has zero records. How can I overcome this issue?
2. How can I load a field more than 60 characters long, and how do I represent it in web reports?
It's very urgent.
Thanks,
Basava Raju

<i>1. The load fails every time it has zero records. How can I overcome this issue?</i>

Check this: open up a load in RSMO - Menu - Settings - Evaluation of Requests - set it to green. This will act globally.
To make it specific to a load: InfoPackage - Scheduler - just look for the similar option (I guess it is the penultimate one) -
Segmentation fault error during data load in parallel with multiple rules
Hi,
I'm trying to do a SQL data load in parallel with multiple rules (maybe 4 or 5), and I'm getting a "segmentation fault" error. I tested with 3 rules files and it worked fine. We're using Essbase System 9.3.2, with UDB (v8) as the SQL data source. The ODBC driver is DataDirect 5.2 DB2 Wire Protocol Driver (ARdb222). Please let me know if you have any information on this.
thx.
Y

Hi Thad,
I was wondering - whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, since currencies are defined by SAP and the 3-character currency code is plain English.
Could this be caused by some inconsistency in the data?
I would like to know which currency had special characters in that particular record.
Hope that helps.
Regards
Mr Kapadia -
Infocube data loads fail with UNCAUGHT_EXCEPTION dump after BI 7.01 upgrade
Hi folks,
We're in the middle of upgrading our BW landscape from 3.5 to 7.01. The support package we have in the system is SAPKW70103.
Since the upgrade, all data loads going to InfoCubes are failing with the following
ABAP dump UNCAUGHT_EXCEPTION
CX_RSR_COB_PRO_NOT_FOUND.
Error analysis
An exception occurred which is explained in detail below.
The exception, which is assigned to class 'CX_RSR_COB_PRO_NOT_FOUND', was not
caught and
therefore caused a runtime error.
The reason for the exception is:
0REQUEST is not a valid characteristic for InfoProvider
Has anyone come across this issue, and if so, what resolutions did you adopt to fix it? I appreciate any help in this regard.
Thanks
Mujeeb

Hi experts,
i have exactly the same problem.
But I can't find a test for InfoObjects in RSRV. How can I repair the InfoObject 0REQUEST?
Thx for all answers!
kind regards
Edited by: Marian Bielefeld on Jul 7, 2009 8:04 AM
Data load failed at infocube level
Dear Experts,
I have data loads from the ECC source system for DataSource 2LIS_11_VAITM to 3 different data targets in the BI system. The data load is successful up to the PSA; when loading to the data targets, it succeeds for 2 of them but fails at one data target, an InfoCube. I got the following error message:
Error 18 in the update
Diagnosis
The update delivered the error code 18 .
Procedure
You can find further information on this error in the error message of
the update.
I tried to activate the update rules again by executing the program, and tried to reload using the reconstruction facility, but I get the same error message.
Kindly, please help me to analyze the issue.
Thanks&Regards,
Mannu

Hi,
I tried to trigger a repeat delta, expecting that the error would not recur, but then I encountered issues like:
1. The data load status in RSMO is red, whereas in the data target the status shows green.
2. When I try to analyze the PSA from RSMO, transaction PSA gives me a dump with the following.
The following analysis is from transaction ST22:
Runtime Errors GETWA_NOT_ASSIGNED
Short text
Field symbol has not yet been assigned.
What happened?
Error in the ABAP Application Program
The current ABAP program "SAPLSLVC" had to be terminated because it has
come across a statement that unfortunately cannot be executed.
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact your SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
You attempted to access an unassigned field symbol
(data segment 32821).
This error may occur if
- You address a typed field symbol before it has been set with
ASSIGN
- You address a field symbol that pointed to the line of an
internal table that was deleted
- You address a field symbol that was previously reset using
UNASSIGN or that pointed to a local field that no
longer exists
- You address a global function interface, although the
respective function module is not active - that is, is
not in the list of active calls. The list of active calls
can be taken from this short dump.
How to correct the error
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"GETWA_NOT_ASSIGNED" " "
"SAPLSLVC" or "LSLVCF36"
"FILL_DATA_TABLE"
I have activated the include LSLVCF36, reactivated the transfer rules and update rules, and retriggered the data load.
But I still get the same error...
Could anyone please help me resolve this issue?
Thanks a lot,
Mannu -
Multiple data loads in PSA with write optimized DSO objects
Dear all,
Could someone tell me how to deal with this situation?
We are using write optimized DSO objects in our staging area. These DSO are filled with full loads from a BOB SAP environment.
The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and resolution. This also provides the opportunity to see the differences between two data loads.
For the normal operation the most recent package in the PSA should be loaded into these DSO-objects (as normal data staging in BW 3.5 and before) .
As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
We already tried the 'all new records' functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
Does any of you have a solution for this?
Thanks in advance.
Harald

Hi Ajax,
I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with PSA table IDs after they have changed: if you use the option to delete the content of a PSA table via a process chain, it will fail when the DataSource is changed, because a new PSA table ID is generated.
Regards,
Harald -
Data load fails from DB table - No SID found for value 'MT ' of characteris
Hi,
I am loading data into BI from an external system (an Oracle database).
This system has different units such as BG, ROL, and MT (for meter). These units are not maintained in R/3/BW; there they are BAG, ROLL, and M respectively.
Now the user wants a "Z table" maintained in BW that maps external-system units to BW units, so that the data load does not fail. The source system keeps its own units, but at load time the BW units are loaded.
For example: input unit (BG) -> loaded unit in BW (BAG)
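The Z table described here is essentially a lookup applied during the load. As a language-neutral sketch of the idea (the three mappings are the ones from this post; the function name and fallback behaviour are illustrative assumptions, not a BW API):

```python
# Hypothetical unit-mapping Z table: external-system unit -> BW unit
UNIT_MAP = {"BG": "BAG", "ROL": "ROLL", "MT": "M"}

def to_bw_unit(external_unit):
    """Return the BW unit for a source-system unit.

    Trailing padding is stripped first (the 'MT ' in the error message
    above has a trailing space); unmapped units pass through unchanged.
    """
    key = external_unit.strip().upper()
    return UNIT_MAP.get(key, key)
```

In BW the same lookup would live in a transfer/update rule or routine that reads the Z table, so the PSA keeps the source values while the targets receive the mapped ones.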
Regards,
Saurabh T.

Hello,
The table T006 (BW side) will have all the units of measure; you only need to make sure that all the source-system units are maintained in it. It also has fields for source and target units, so, as you mentioned, BG from the source will become BAG. See the fields MSEHI and ISOCODE in the T006 table.
If you want to convert to other units then you need to implement Unit Conversion.
Also see
[How to Report Data in Alternate Units of Measure|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b7b2aa90-0201-0010-a480-a755eeb82b6f]
Thanks
Chandran -
Material Master Data load failed
Hi All,
I am loading material master data. I have around 8055 records in R/3. The load failed with the error "Non-updated IDocs found in Business Information Warehouse", asking me to process the IDocs manually. I checked WE05 for the IDocs and found 3 IDocs with status 64 (IDoc ready to be transferred to application).
But when I checked the manage screen of 0MATERIAL, I found both the transferred and updated record counts at 8055. I even checked the data in 0MATERIAL and found that all 8055 records have already been uploaded.
Why is it still showing an error (load failed) when all the data has been uploaded? What should I do now?
Best Regards,
Nene.

Hi Nene/Sankar,
For material texts with no language selection, please check Note 546346 - Material texts: no selection on languages.
For the IDoc problem, check e.g. Note 561880 - Requests hang because IDocs are not processed, and Note 555229 - IDocs hang in status 64 for tRFC with immediate processing.
hope this helps.
Note 546346 - Material texts: no selection on languages
Summary
Symptom
When loading material texts from R/3 into the Business Information Warehouse, you cannot select languages.
Other terms
DataSource, 0MATERIAL_TEXT, InfoSource, InfoObject, 0MATERIAL, SPRAS, LANGU, 0LANGU, MAT_BW, extraction, selection field, delta extraction, ALE, delta, change pointer
Reason and Prerequisites
As of PI/PI-A 2001_2 Support Package 6, the selection option for the 'SPRAS' field in the OLTP system was undone in the 0MATERIAL_TEXT DataSource. (Refer here to note 505952).
Solution
As of PI/PI-A 2002_1 Support Package 4, the 'SPRAS' field is provided for selection again with the 0MATERIAL_TEXT DataSource in the OLTP system. When loading the material texts from BW, the language is still not provided for selection in the scheduler, instead all languages of the language vector in BW are implicitly requested from the source system. However, during the delta update in the source system, the change pointers for all languages in the source system are now set to processed, regardless of whether the language was requested by BW or not.
Import
Support Package 4 for PI/PI-A 2002_1_31I - PI/PI-A 2002_1_45B
Support Package 4 for PI 2002_1_46B - PI 2002_1_46C
Support Package 3 for PI 2002_1_470
In transaction RSA5, copy the D version of the 0MATERIAL_TEXT DataSource to the A version.
Then replicate the DataSources for 0MATERIAL in the BW system. The 'SPRAS' field is then flagged again as a selection field in the transfer structure of the 0MATERIAL InfoSource. The transfer rules remain unchanged. Activate the transfer rules and perform a delta initialization again.
Note 561880 - Requests hang because IDocs are not processed
Symptom
Data extraction in a BW or SEM BW system from an external OLTP System (such as R/3) or an internal (via DataMart) OLTP System hangs with the 'Yellow' status in the load monitor.
After a timeout, the request status finally switches to 'Red'.
Information IDocs with the status '64' are displayed in the 'Detail' tab.
Other terms
IDoc, tRFC, ALE, status 64
Reason and Prerequisites
Status information on load requests is transferred in the form of IDocs.
IDocs are processed in the BW ALE model using tRFC in online work processes (DIA).
IDoc types for BW (RSINFO, RSRQST, RSSEND) are processed immediately.
If no free online work process is available, the IDocs remain and must then be restarted to transfer the request information. With the conversion to asynchronous processing, it can often happen that no DIA is available for tRFC for a short period of time (see note 535172).
The IDoc status 64 can be caused by other factors such as a rollback in the application updating the IDocs. See the relevant notes.
Furthermore, you can also display these IDocs after the solution mentioned below, however, this is only intended as information.
You must therefore analyze the status text.
Solution
We recommend asynchronous processing for Business Warehouse.
To do this, you need the corrections from note 535172 as well as note 555229 or the relevant Support Packages.
The "BATCHJOB" entry in the TEDEF table mentioned in note 555229 is generated automatically in the BW system when you import Support Package 08 for BW 3.0B (Support Package 2 for 3.1 Content). For other releases and Support Package levels, you must manually implement the entry via transaction SE16.
Depending on the Basis Support Package imported, you may also have to implement the source code corrections from note 555229.
The following basic recommendations apply in avoiding bottlenecks in the dialog processing and checking of IDocs for BW:
1. Make sure there is always sufficient DIA, that is, at least 1 DIA more than all other work processes altogether, for example, 8 DIA for a total of 15 work processes (see also note 74141).
TIP: 2 UPD process are sufficient in BW, BW does not need any UP2.
2. Unprocessed Info IDocs should be processed manually within the request in BW; in the 'Detail' tab, you can start each IDoc again by selecting 'Update manually' (right mouse button).
3. Use BD87 to check the system daily (or whenever a problem arises) for IDocs that have not yet been processed and reactivate if necessary.
However, make sure beforehand that these IDocs can actually be assigned to the current status of requests.
TIP: Also check transaction SM58 for problematic tRFC entries.
IMPORTANT: Notes 535172, 555229 and the above recommendations are relevant (unless otherwise specified) both for BW and for SAP source systems. -
Data Load Fails due to duplicate records from the PSA
Hi,
I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
Is there any setting that I can configure by which even though I have duplicate records in PSA, I can be successfully able to load only one set of data (without duplicates) in to the InfoProvider?
How can I set up the process chains to do so?
Your answer to the above two questions is appreciated.
Thanks,

Hi Sesh,
There are 2 places where the DTP checks for duplicates.
In the first, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In this case you will first have to clean up the previous error stack.
The second stage will clean up duplicates across datapackages, provided the option is set in your DataSource. But note that this will not solve the problem if you have duplicates within the same datapackage. In that case you can do the filtering yourself in the start routine of your transformation.
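The last point above - filtering duplicates within one datapackage in a start routine - amounts to keeping a single record per key. The routine itself would be ABAP; this is only a sketch of the logic in Python, with an illustrative key field and sample records, not BW code:

```python
def drop_duplicates(package, key_fields):
    """Keep only the last record per key, mirroring what a start
    routine would do before master data is activated."""
    by_key = {}
    for rec in package:  # later records overwrite earlier ones with the same key
        key = tuple(rec[f] for f in key_fields)
        by_key[key] = rec
    return list(by_key.values())

# Hypothetical datapackage with a duplicate key
package = [
    {"MATERIAL": "M1", "PRICE": 10},
    {"MATERIAL": "M2", "PRICE": 20},
    {"MATERIAL": "M1", "PRICE": 15},  # duplicate key; this record wins
]
deduped = drop_duplicates(package, ["MATERIAL"])
```

Whether "last record wins" is the right rule depends on the DataSource; a delta-capable source may instead want the record with the highest timestamp.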
Hope this helps,
Pieter -
Issue with Data Load to InfoCube with Nav Attrivutes Turned on in it
Hi,
I am having an issue loading data to an InfoCube. When I turn on the navigational attributes in the cube, the data load fails and just says "PROCESSED WITH ERRORS". When I turn them off, the data load runs fine. I have run an RSRV test on both the InfoObject and the cube, and it shows no errors. What could be the issue, and how do I solve it?
Thanks
Rashmi.

Hi,
To activate a navigation attribute in the cube, the data need not be dropped from the cube.
You can always activate the navigation attribute with data in the cube.
I think you may have tried to activate it in the master data as well and then in the cube, or something like that?
Follow the correct procedure and try again.
Thanks
Ajeet