DTP Error for Data Loads
Hello,
I'm using BI 7.x, and we have a custom DataSource and a custom transformation/DTP process. My transformation brings the data into the PSA, and then the DTP should load the PSA data into the InfoCube. Instead, my DTP is going against the DataSource itself, thereby bringing in duplicate values.
Please let me know if i'm doing anything wrong here.
Thanks
Sameer
Sameer,
I'm a bit confused by your explanation. A transformation doesn't bring data into the PSA (DataSource); an InfoPackage does this. Then you have a transformation from the DataSource (PSA) to an InfoProvider, and you use a DTP to move the data. Assuming all of this is true in your example, and you don't have two identical data packages in the PSA, then the issue may be that your DataSource is a 3.x DataSource, not 7.0. If so, your InfoPackage can load the data to the target and the PSA at the same time, just like in BW 3.x. In that case, when you then run the DTP, you will double your data.
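To see why that 3.x-style double load inflates the key figures, here is a toy Python sketch (purely illustrative; the record keys and amounts are made up): an InfoCube aggregates additively, so the same request loaded once by the InfoPackage and once again by the DTP simply sums.

```python
# Toy illustration of why loading the same request through two paths
# doubles key figures: the cube aggregates additively, so identical
# records from the InfoPackage path and the DTP path sum up.
from collections import defaultdict

def load(cube, records):
    for key, amount in records:
        cube[key] += amount      # additive update, as in an InfoCube

cube = defaultdict(int)
delta = [("MAT01", 100), ("MAT02", 250)]
load(cube, delta)   # 3.x-style InfoPackage loads the data target directly
load(cube, delta)   # DTP then loads the same request from the PSA again
```

After both loads, every amount is exactly twice the delta, which matches the doubled values described above.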
Regards,
Dae
Similar Messages
-
Can anybody tell me what the common errors are that we come across in BW/BI for the following:
1] Data Loading in Infocube/ODS/DSO
2] Aggregates
3] PSA
4] DTP
5] Transformations
Thanks in advance.
Hi,
Here is a list of common issues we face while loading data into BW:
Data Loading in Infocube/ODS/DSO:
1) Missing SID issue or missing master data
2) Data load cancelled because of various exceptions, e.g. Message Type 'X', DBIF SQL error
3) Data records found in duplicate
4) Alpha conversion value error
5) Invalid time interval
6) Time overlap error
7) Job Cancelled in source system
8) RFC Connection Error
9) Data Source Replication error
10) Data load locked by active change run
11) Attribute load to target locked by user 'xyz'
12) Job cancelled as the previous load is still running
13) Target locked by a change run
(Sometimes data loads don't fail but run much longer than usual without any progress, because of stuck tRFCs in the source system, poor source system performance, the job staying in released state for a long time, etc.)
Aggregate Rollups:
1) No filled aggregates available; rollup not possible
2) Rollup locked by active change run
3) Job cancelled because of various exceptions
PSA Updates:
1) Job cancelled because of various exception
2) Missing SID value
ODS Activations:
1) Request to be activated should be green (loading of data into the ODS is still going on)
2) Key exists in duplicate
3) Job cancelled because of various exceptions (Short Dumps)
Attribute Change Run:
1) Locked by a rollup
2) Job cancelled because of various exceptions
3) Locked by another change run
Hope it helps....
Cheers,
Habeeb -
Hi Gurus,
I am getting an error during data loading. On the BI side, the error says the background job was cancelled, and when I check on the R/3 side I get the following error:
Job started
Step 001 started (program SBIE0001, variant &0000000065503, user ID R3REMOTE)
Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
DATASOURCE = 2LIS_11_V_ITM
Current Values for Selected Profile Parameters *
abap/heap_area_nondia......... 2000683008 *
abap/heap_area_total.......... 4000317440 *
abap/heaplimit................ 40894464 *
zcsa/installed_languages...... ED *
zcsa/system_language.......... E *
ztta/max_memreq_MB............ 2047 *
ztta/roll_area................ 6500352 *
ztta/roll_extension........... 3001024512 *
4 LUWs confirmed and 4 LUWs to be deleted with function module RSC2_QOUT_CONFIRM_DATA
ABAP/4 processor: DBIF_RSQL_SQL_ERROR
Job cancelled
Please help me out what should I do.
Regards,
Mayank
Hi Mayank,
The log says it went to short dump due to a temp tablespace issue. As it is a source system job, check the temp tablespace on the source system side, and check on the BI side as well.
Check with your Basis team regarding the temp PSA tablespace; if it is out of space, ask them to increase the tablespace and try to repeat the load.
Check the below note
Note 796422 - DBIF_RSQL_SQL_ERROR during deletion of table BWFI_AEDAT
Regards
KP
Edited by: prashanthk on Jul 19, 2010 10:42 AM -
Essbase Error(1003050): Data Load Transaction Aborted With Error (1220000)
Hi
We are using 10.1.3.6, and got the following error for just one of the thousands of transactions last night.
cannot end dataload. Essbase Error(1003050): Data Load Transaction Aborted With Error (1220000)
The data seems to have loaded, regardless of the error. Should we be concerned, and does this suggest something is not right somewhere?
Your assistance is appreciated.
Cheers
Hi John,
Not using a load rule.
There were two other records that were rejected because of missing members. The error message for those was different and easily fixed.
But this error doesn't tell us much. I will monitor the next run to see if the problem persists.
Thanks -
Missing Standard Dimension Column for data load (MSSQL to Essbase Data)
This is similar to an error posted by Sravan; however, I'm sure I have all dimensions covered, going from MS SQL through SunOpsys staging to Essbase. It tells me a standard dimension is missing, but all dimensions are accounted for:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last): File "<string>", line 23, in ? com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
I'm using multiple time period inputs: BegBalance, Jul, Aug, Sep, Oct, Nov, Dec, Jan, Feb, Mar, Apr, May, Jun (the target has all of those in place of Time Periods).
I'm using hard-coded input mappings for Metric, Scenario, Version, HSP_Rates and Currencies: 'Amount', 'Actual', 'Final', 'HSP_InputValue' and 'Local', respectively.
The only thing I can think of is that since I'm loading to each of the months in the Time Periods dimension (the reversal was set up to accommodate that), it is somehow still looking for that dimension. Time Periods as a dimension does not show up in the reversal; only the individual months named above do.
Any ideas on this one?
John, I extracted the data to a file and created a data load rule in Essbase to load the data. All dimensions are present and accounted for (five header items, as here) and everything loads fine.
So I'm not sure what else is wrong; I am still getting the missing dimension error.
Any other thoughts? Here's the entire error message. Thanks for all your help on this.
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
at org.python.pycode._pyx8.f$0(<string>:23)
at org.python.pycode._pyx8.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java)
at org.python.core.PyCode.call(PyCode.java)
at org.python.core.Py.runCode(Py.java)
at org.python.core.Py.exec(Py.java)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.k(e.java)
at com.sunopsis.dwg.cmd.g.A(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns(Unknown Source)
... 32 more
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.k(e.java)
at com.sunopsis.dwg.cmd.g.A(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
Short dump error for delta load to 0CFM_C10 via 0CFM_DELTA_POSITIONS flow
Hi all,
I am getting a short dump error for a delta load to 0CFM_C10 via the 0CFM_DELTA_POSITIONS flow.
I am not able to figure out the actual issue or how to solve it.
Can anyone suggest something?
Below are the details of the short dump:
Short text
Exception condition "UNKNOWN_TRANSCAT" raised.
What happened?
The current ABAP/4 program encountered an unexpected
situation.
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact your SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
A RAISE statement in the program "CL_SERVICE_TRG================CP" raised the
exception
condition "UNKNOWN_TRANSCAT".
Since the exception was not intercepted by a superior
program, processing was terminated.
Short description of exception condition:
For detailed documentation of the exception condition, use
Transaction SE37 (Function Library). You can take the called
function module from the display of active calls.
How to correct the error
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"RAISE_EXCEPTION" " "
"CL_SERVICE_TRG================CP" or "CL_SERVICE_TRG================CM003"
"SORT_TRANSACTIONS"
or
"CL_SERVICE_TRG================CP" "UNKNOWN_TRANSCAT"
or
"SBIE0001 " "UNKNOWN_TRANSCAT"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a program of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
Hi,
It seems there are some routines involved; check your routines, transformations, and so on.
Regards
sivaraju -
Sample SOAP request for Data Loader API
Hi
Can anyone please help me out by giving a sample SOAP request for the Data Loader API? This is to import 1K records from my system to the CRM instance I have.
Log into the application and then click on Training and Support; there is a WS Library of Information within the application.
-
Error regarding data load into Essbase cube for Measures using ODI
Hi Experts,
I am able to load metadata for dimensions into the Essbase cube using ODI, but when we try the same for loading data for Measures, we encounter the following errors:
Time,Item,Location,Quantity,Price,Error_Reason
'07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
'28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
'28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
Can anyone look into this and reply quickly, as it's an urgent requirement?
Regards,
Rohan
We are having a similar problem. We're using the 'IKM SQL to Hyperion Essbase (DATA)' knowledge module. We map the actual data to the field called 'Data' in the model, but it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is data, not a dimension member. Has anyone else encountered this?
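One way to catch this class of error before the load runs is to pre-validate records against the dimension members. The following Python sketch is only an illustration; the member lists, field names, and sample records are hypothetical, not taken from the actual outline:

```python
# Hypothetical pre-validation of load records against known dimension
# members, mimicking Essbase's "Unknown Member" rejection before the
# load runs. Member lists and record layout are illustrative only.

def validate_records(records, dimension_members):
    """Split records into loadable rows and rejects, each reject paired
    with the (dimension, value) combinations that were not found."""
    good, rejects = [], []
    for rec in records:
        unknown = [(dim, rec[dim]) for dim in dimension_members
                   if rec[dim] not in dimension_members[dim]]
        if unknown:
            rejects.append((rec, unknown))
        else:
            good.append(rec)
    return good, rejects

members = {
    "Item": {"0011500200", "0011500300"},   # '0011500100' deliberately absent
    "Location": {"0000001001~1000~12~00", "0000001300~1000~12~00"},
}
records = [
    {"Item": "0011500100", "Location": "0000001001~1000~12~00", "Quantity": 20},
    {"Item": "0011500200", "Location": "0000001300~1000~12~00", "Quantity": 30},
]
good, rejects = validate_records(records, members)
```

Running a check like this against an export of the outline makes it obvious whether the failure is a genuinely missing member or, as suspected here, a column being misread as a dimension.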
Sabrina -
Error while data loading in real time cube
Hi experts,
I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube it gives an error.
The cube is a real-time cube for PLANNING. I have changed the status to allow data loading, but the DTP still gives an error.
It shows the error "error while extracting from DataStore", along with an RSBK 224 error and an RSAR 051 error.
What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 error with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and have even opened a message with SAP.
-
Error in Data Load - Transformation inactive
Hi
While running DTPs to a DSO or cube, I am getting an error saying that a transformation called up by the request is inactive and hence the request cannot be executed.
But when I checked, I found all the transformations in the flow for the Data Load were active with Executable version = Active version.
I had been getting this kind of error before as well, but at that time I executed the DTP using a process chain and it ran fine without any errors. Now even the process chain run is giving the error.
Does anyone have an idea about what this error exactly is or the solution for the same?
Does this have anything to do with authorization issues?
Thanks,
Ninad
I had an experience with a similar issue:
Transformation Inactive;request cannot be executed
Message no. RSBK257
By debugging that DTP load, we found the root cause of the problem, which was the following:
Error when writing the SID 10033950 (value "10033950") in table /BIC/SZPERSON
Since a particular record can't be deleted from the PSA, I modified that record in the PSA and tried the load again, but to no avail.
To fix this, I deleted (via SE14) the complete data from the above SID table and scheduled the main process chain in Dev, and it worked fine this time.
0HAP_SUBSTA_TEXT - Error in data load
Hi All,
I am trying to load HR master data through 0HAP_SUBSTA_TEXT.
When I execute the DTP, I get an error:
0HAP_SUBSTA - Data Record 2 (' 0D '): Duplicate data record
The details say:
There are duplicates of the data record 2 & with the key '0D &' for
characteristic 0HAP_SUBSTA &.
How come the language has duplicate records?
I tried all possible ways: deleting the request in the PSA, the InfoObject, and so on, but in vain.
Please advise,
Thanks in advance,
Neela
Hi Subray,
Actually, I am maintaining these in BI 7, hence I tried to maintain the following:
In the InfoPackage, on the Processing tab, select the "Only PSA" option. Then check the "Update subsequently in data target" option and also the "Ignore double data records" option.
When I migrate the DataSource, I get the information message "Default for deleting duplicate data records has changed", which means this option is not carried over in the migration.
Also, I tried the "Handle duplicate data records" option in the DTP, but even then not all the records are updated: I have 4 records to be updated with EN, but only 2 records get updated to the target.
Please advise,
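For what it's worth, the effect of the DTP's "Handle duplicate data records" option on master data can be pictured as a last-record-wins de-duplication per key. This Python sketch is only an analogy; the field names and sample rows are made up:

```python
# Illustrative analogy for duplicate-key handling in a master data load:
# keep only the last record per key, so a duplicate-key error does not
# abort the load. Field names and values are hypothetical.

def dedupe_last_wins(records, key_fields):
    seen = {}
    for rec in records:          # later records overwrite earlier ones
        seen[tuple(rec[f] for f in key_fields)] = rec
    return list(seen.values())

rows = [
    {"HAP_SUBSTA": "0D", "LANGU": "EN", "TXTSH": "old text"},
    {"HAP_SUBSTA": "0D", "LANGU": "EN", "TXTSH": "new text"},  # duplicate key
    {"HAP_SUBSTA": "0E", "LANGU": "EN", "TXTSH": "other"},
]
clean = dedupe_last_wins(rows, key_fields=("HAP_SUBSTA", "LANGU"))
```

If 4 source records collapse to 2 after de-duplication on the full key, that would explain why only 2 records reach the target.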
Neela -
Error in data load from application server
Well, this problem again!
In the DataSource for the flat-file data load, on the Extraction tab I selected "Load Text-Type File From Application Server".
On the Proposal tab:
Converter: Separated with Separator (for Example, CSV).
No. of Data Records: 9198
The data load was successful.
Then I added one record:
Converter: Separated with Separator (for Example, CSV).
No. of Data Records: 9199
Now the error message "Cannot convert character sets for one or more characters" appears.
I looked at line 9199 in the file and it does not contain any special characters. I used AL11 and the debugger to inspect this line.
When I load the file from the local workstation, this error does not occur.
What is happening?
Hi Rodrigo,
What type of logical file path have you created on the application server for loading this file?
Is it a UNIX file path or an ASCII file path?
Did you check how the file looks on the application server in AL11?
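To hunt down the offending character, one option is to scan the raw file bytes and report every line that fails to decode in the expected codepage. A minimal Python sketch follows; the encoding and sample bytes are assumptions, so substitute your actual system codepage:

```python
# Locate lines that fail codepage conversion, the kind of thing behind
# "Cannot convert character sets for one or more characters".
# The encoding name is an assumption; adjust to your system codepage.

def find_bad_lines(raw_bytes, encoding="ascii"):
    """Return (line_number, line) pairs that cannot be decoded."""
    bad = []
    for n, line in enumerate(raw_bytes.splitlines(), start=1):
        try:
            line.decode(encoding)
        except UnicodeDecodeError:
            bad.append((n, line))
    return bad

sample = b"MAT01;100\nMAT02;200\nMAT03;3\xdc0\n"   # 0xDC is not valid ASCII
bad = find_bad_lines(sample)
```

A byte that is invisible in AL11's rendering can still be invalid for the configured codepage, which is why the line looks clean on screen yet fails conversion.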
Prathish -
Error in data loading from 3rd-party source system with DBConnect
Hi,
We have just finished an upgrade from SAP BW 3.10 to SAP NW 7.0 EHP1.
After the upgrade, we are facing a problem with data loads from a third-party Oracle source system using DBConnect.
The connection is working OK and we can see the tables in the source system, but we cannot load the data.
The error in the monitor is as follows:
'Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.'
But, unfortunately, the error message has no further information.
If we look at the job log in SM37, the job finished with the following log:
27.10.2009 12:14:19 Job started 00 516 S
27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA) 00 550 S
27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG RSM1 797 S
27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container OL 356 S
27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G RSM1 796 S
27.10.2009 12:14:24 Job finished 00 517 S
In a BW 3.10 system, there is no message related to the element NOAUTHORITYCHECK, so I am wondering if this is something new in NW 7.0.
Thanks in advance,
Rajib
There are several things that can cause errors like this:
1. RFC connection failed.
2. A problem in the source system.
3. Check with the Oracle consultants whether they are filling up the loads; if so, ask them to stop.
4. IDoc processing errors.
5. Memory issues.
6. If needed, change the DataSource, reactivate it, and then run the load again.
Also check the RFC connection in SM59. If it is OK, then
check SAP Note 692195 for authorization.
Santosh -
Hi all,
Could anyone tell me what the possible errors are that occur in data extraction and reporting,
and how we rectify them in real time?
Can anyone give me some tips, please? All experts, please advise me on some solutions based on real-time experience.
Regards
Swetha
Hi Swetha,
I guess you are new to BW. Most of these problems are best learnt on the job; the more time you spend on the system, the more you will learn. This forum has lots of info on the possible errors, and you can search for the error you face.
The common errors during extraction & reporting are:
1. Timestamp error.
2. Invalid values
3. Tablespace Issue
4. Job getting cancelled due to lack of resources.
5. Deadlocks
6. No data available for reporting
7. Job running for a long time
etc.
The list is too long to be discussed here. You could refer to the blog by Siggi: /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
You can search through the forum using the keywords I mentioned.
Hope this helps.
Bye
Dinesh -
Need help with Rollback data if there is error in Data load
Hi All,
We are trying to load data into an Oracle 11g database. We want a trigger, procedure, or something like that to roll back the data if there are errors in the load. Is it possible to roll back after all the records have been parsed? So if we try to load 100 records and there are 30 records with errors, we want to roll back after all 100 records have been parsed.
Please advise.
>
Thanks for the suggestion. I'll try that option. Currently we only load data that is validated, and erroneous records are rejected using a trigger, so we don't get any invalid data in the table. But now the users are saying that all the records should be rejected if there is even one error in the data load.
>
I generally use a much simpler solution for such multi-stage ETL processes.
Each table has a IS_VALID column that defaults to 'N'. Each step of the process only pulls data with the flag set to 'Y'.
That allows me to leave data in the table but guarantee that it won't be processed by subsequent stages. Since most queries that move data from one stage to another ultimately have to read table rows (i.e. they can't just use indexes), it is not a performance issue to add a predicate such as AND IS_VALID = 'Y' to the query that accesses the data.
1. Add new unvalidated data; rows are automatically flagged as invalid by the default of 'N' on the IS_VALID column.
2. Run audit step #1: capture the primary key/rowid of any row failing the audit.
3. Run audit steps #2 through #n: capture the error row IDs.
4. Final step: update the data, setting IS_VALID to 'Y' only if a row passes ALL audits, that is, only if there are NO fatal errors for that row captured in the error table.
That process also allows me to capture every single problem that any row has, so that I can produce reports for the business users showing everything that is wrong with the data. There are some problems the business wants to ignore, others that can be fixed in the staging tables and reprocessed, and others that are rejected since they must be fixed in the source system, which can take several days.
For data that can be fixed in the staging tables, the data is fixed and then the audit is rerun, which sets the IS_VALID flag to 'Y', allowing those fixed rows to be included in the data that feeds the next processing stage.
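A minimal runnable sketch of that staged-validation pattern, using SQLite purely for illustration (table and column names are made up; a real implementation would live in the Oracle staging schema):

```python
# Staged-validation sketch: rows default to IS_VALID = 'N', audit steps
# record failing rows in an error table, and the final step flags only
# rows with no captured errors. Schema and checks are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE staging (id INTEGER PRIMARY KEY, amount INTEGER,
                          is_valid TEXT DEFAULT 'N');
    CREATE TABLE audit_errors (id INTEGER, reason TEXT);
""")
con.executemany("INSERT INTO staging (id, amount) VALUES (?, ?)",
                [(1, 50), (2, -10), (3, 70)])

# Audit step: capture the key of any row failing the check.
con.execute("""INSERT INTO audit_errors
               SELECT id, 'negative amount' FROM staging WHERE amount < 0""")

# Final step: mark valid only the rows with no recorded errors.
con.execute("""UPDATE staging SET is_valid = 'Y'
               WHERE id NOT IN (SELECT id FROM audit_errors)""")

# Downstream stages read only validated rows (AND is_valid = 'Y').
valid_ids = [r[0] for r in con.execute(
    "SELECT id FROM staging WHERE is_valid = 'Y' ORDER BY id")]
```

The failing row stays in the table for reporting and repair, but never reaches the next stage until a rerun of the audits clears it.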