Error while setting data into table
I am getting an error while inserting a new record into a table.
The error is:
JBO-27010: Attribute set with value 9991431 for AckmntInd in EdMsgHeaderDetailsEO has invalid precision/scale oracle.jbo.AttrSetValException: JBO-27010: Attribute set with value 9991431 for AckmntInd in EdMsgHeaderDetailsEO has invalid
and followed by
java.lang.ArrayIndexOutOfBoundsException: 20 at oracle.jbo.server.ViewRowStorage.getViewAttributeDef
Help to fix this error is appreciated!
Thanks,
Message was edited by:
user447047
You mentioned the following error:
JBO-27010: Attribute set with value 9991431 for AckmntInd in EdMsgHeaderDetailsEO has invalid precision/scale oracle.jbo.AttrSetValException:
It is raised when you try to set a value whose precision exceeds the width of the underlying DB column.
For example, if you try to set 123 into an attribute backed by a NUMBER(2) column, JBO-27010 will be thrown.
Fix it on either side: widen the database column (and the EO attribute's precision), or supply a value that fits.
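The rule described above can be sketched in plain Java. This is only an illustration of how a precision/scale check works, not the actual OA Framework validation code; the class name, method name, and the NUMBER(6) width are assumptions for the example:

```java
import java.math.BigDecimal;

public class PrecisionCheckDemo {

    // Returns true if value fits a NUMBER(precision, scale) column:
    // its integer digits must fit in (precision - scale) and its
    // fractional digits must fit in scale.
    static boolean fits(BigDecimal value, int precision, int scale) {
        int integerDigits = value.precision() - value.scale();
        return integerDigits <= precision - scale && value.scale() <= scale;
    }

    public static void main(String[] args) {
        // 9991431 has 7 digits, so it cannot fit a (hypothetical) NUMBER(6)
        System.out.println(fits(new BigDecimal("9991431"), 6, 0)); // false
        // 123 overflows NUMBER(2), as in the example above
        System.out.println(fits(new BigDecimal("123"), 2, 0));     // false
        System.out.println(fits(new BigDecimal("99"), 2, 0));      // true
    }
}
```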
Atal
Similar Messages
-
Error While Inserting Data into table using OAF
Hi Experts,
I am learning OAF; I am trying to insert data into a table using OAF. I followed the procedure below.
My table(OLF_TEST_TBL) Columns:
EmpID (NUMBER), Ename (VARCHAR2(100)), Sal (NUMBER), and the WHO columns.
1. created Application Module (AM).
package: oracle.apps.mfg.simplepg.server
name: oaf_test_tbl_am
2. created simple page
name:EmployeePG
package:oracle.apps.mfg.simplepg.webui
3. Assigned the Application Module to Page
4. Created Entity Object(EO)
name:oaf_test_tbl_eo
package:oracle.apps.mfg.simplepg.schema.server
schema:apps
table:OLF_TEST_TBL
note:
1. EMPID column is selected as primary key
2. selected create method, remove method and validation method.
3. Checked 'Generate Default View Object'
VO:
name:olf_test_tbl_vo
note: Entity Object was assigned to VO
Coming to the page:
Page main region: EmployeeMainRN
1. Under the main region I created one more region using the wizard,
selected the AM and VO, region style - defaultSingleColumn.
2. Under the main region I created one more region,
region style - pageButtonBar, ID: pageButtonsRN.
3. Under pageButtonsRN, created two submit buttons (ID: SUBMIT, ID: CANCEL).
In the AM Java class:
I created a method to insert a row and one to commit.
Insert Method:
public void insertrow() {
    OAViewObject vo = (OAViewObject)getoaf_test_tbl_vo1();
    if (!vo.isPreparedForExecution()) {
        vo.executeQuery();
    }
    Row row = vo.createRow();
    vo.insertRow(row);
    row.setNewRowState(Row.STATUS_INITIALIZED);
}
Commit Method:
public void savaDataTooaftesttable() {
    getDBTransaction().commit();
}
In EmployeeMainRN, I created a controller.
In this controller's processRequest method, the 'insertrow' method is called.
import oracle.apps.fnd.framework.OAApplicationModule;
public void processRequest(OAPageContext pageContext, OAWebBean webBean) {
    super.processRequest(pageContext, webBean);
    if (!pageContext.isFormSubmission()) {
        OAApplicationModule am = pageContext.getApplicationModule(webBean);
        am.invokeMethod("insertrow");
    }
}
To commit the transaction when the SUBMIT button is pressed, the commit method is called in processFormRequest.
import oracle.apps.fnd.framework.OAApplicationModule;
public void processFormRequest(OAPageContext pageContext, OAWebBean webBean) {
    super.processFormRequest(pageContext, webBean);
    OAApplicationModule am = pageContext.getApplicationModule(webBean);
    if (pageContext.getParameter("SUBMIT") != null) {
        am.invokeMethod("savaDataTooaftesttable");
    }
}
Error after clicking the Submit button:
I ran the page and it opened successfully. Once I enter data and click the Submit button, it gives the following error.
The requested page contains stale data. This error could have been caused through the use of the browser's navigation buttons (the browser Back button, for example). If the browser's navigation buttons were not used, this error could have been caused by coding mistakes in application code. Please check Supporting the Browser Back Button developer guide - View Object Primary Key Comparison section to review the primary causes of this error and correct the coding mistakes.
Cause:
The view object oaf_test_tbl_am.oaf_test_tbl_vo1700_oaf_test_tbl_vo1_practice_test_prc1_oracle_apps_mfg_simplepg_server_oaf_test_tbl_am.oaf_test_tbl_vo1 contained no record. The displayed records may have been deleted, or the current record for the view object may not have been properly initialized.
To proceed, please select the Home link at the top of the application page to return to the main menu. Then, access this page again using the application's navigation controls (menu, links, and so on) instead of using the browser's navigation controls like Back and Forward.
Experts, kindly help me understand why I am getting this error.
Awaiting your replies.
Thanks in advance.
If you don't want to create a message, you can throw an exception like the one below as well:
throw new OAException("Emp Id is " + empId + " and employee name is " + empName, OAException.CONFIRMATION);
Thanks
--Anil -
Error while creating data warehouse tables.
Hi,
I am getting an error while creating data warehouse tables.
I am using OBIA 7.9.5.
The contents of the generate_clt log are as below.
>>>>>>>>>>>>>>>>>>>>>>>>>>
Schema will be created from the following containers:
Oracle 11.5.10
Universal
Conflict(s) between containers:
Table Name : W_BOM_ITEM_FS
Column Name: INTEGRATION_ID.
The column properties that are different :[keyTypeCode]
Success!
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
There are two rows in the DAC repository schema for the column and the table.
The w_etl_table_col.KEY_TYPE_CD value for DW application is UNKNOWN and for the ORA_11i application it is NULL.
Could this be the cause of the issue? If yes, why could the values be different and how to resolve this?
If not, then what could be the problem?
Any responses will be appreciated.
Thanks and regards,
Manoj.
Strange. The OBIA 7.9.5 Installation and Configuration Guide says the following:
4.3.4.3 Create ODBC Database Connections
Note: You must use the Oracle Merant ODBC driver to create the ODBC connections. The Oracle Merant ODBC driver is installed by the Oracle Business Intelligence Applications installer. Therefore, you will need to create the ODBC connections after you have run the Oracle Business Intelligence Applications installer and have installed the DAC Client.
Several other users are getting the same message creating DW tables. -
Error while loading data into External table from the flat files
Hi,
We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we are encountering the following error:
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works).... please check those links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
Error while Inserting data into flow table
Hi All,
I am very new to ODI and facing a lot of problems with my first interface. So I have many questions here; please bear with me.
========================
I am developing a simple Project to load a data from an input source file (csv) file into a staging table.
My plan is to achieve this in 3 interfaces:
1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
Question-1 : Is this approach correct?
========================
I don't have any key columns in the staging table (stg_1). When I tried to execute the Flow Control of this I got an error:
Flow Control not possible if no Key is declared in your Target Datastore
Following one of the responses in this forum ("FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID", made it the primary key of my staging table (stg_1), and my problem was resolved.
Question-2 : Is a key column compulsory in the target table? I work with BO Data Integrator, where there is no such requirement ... I am a little confused.
========================
Next, I defined one project-level sequence and mapped the newly introduced key column Record_Id (primary key) to it. Now I got another error: "CKM not selected".
For this, I inserted the "Insert Check (CKM)" knowledge module in my project, which resolved the "CKM not selected" problem.
Question-3 : When is this CKM knowledge module required?
========================
After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
1 - Loading - SS_0 - Drop work table
2 - Loading - SS_0 - Create work table
3 - Loading - SS_0 - Load data
5 - Integration - FTE Actual data to Staging table - Drop flow table
6 - Integration - FTE Actual data to Staging table - Create flow table I$
7 - Integration - FTE Actual data to Staging table - Delete target table
8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
The error is at Step 8 above. When I opened the "Execution" tab for this step I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
Question-4 : What/why is this error? Did I make a mistake while creating the sequence?
Everyone is new and starts somewhere, and the community is here to help you.
1.) What is the idea of moving data from stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from the source file to the target DB?
Otherwise, it is simpler to move the data directly: SourceFile -> Target Table.
2.) Does your Target table have a Key ?
3.) A CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data flowing from the source file to the target table using the CKM. All records that do not satisfy the constraints are added to the E$ (error) table and are not added to the target table.
4.) Try to avoid ODI sequences. They are slow and aren't scalable. Use a database sequence wherever possible, and use the DB sequence in the target mapping as
<%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
where MY_DB_Sequence_Row is the oracle sequence in the target schema.
HTH -
Error while inserting data into a table.
Hi All,
I created a table. While inserting data into the table I am getting an error telling me "Create data Processing Function Module". Can anyone help me with this?
Thanks in advance
Anirudh
Hi Anirudh,
Seems there is already an entry in the Table with the same Primary Key.
An INSERT statement will give a short dump if you try to insert a row whose primary key already exists.
Why don't you use the MODIFY statement instead? MODIFY inserts the row if the key is new and updates the existing row otherwise.
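The INSERT-versus-MODIFY difference is ordinary insert-versus-upsert semantics, and can be illustrated outside ABAP. A hypothetical Java sketch using a map as the "table":

```java
import java.util.HashMap;
import java.util.Map;

public class UpsertDemo {
    public static void main(String[] args) {
        Map<Integer, String> table = new HashMap<>();
        table.put(1, "first");

        // INSERT-like semantics: a duplicate key is an error
        if (table.containsKey(1)) {
            System.out.println("INSERT would fail: key 1 already exists");
        }

        // MODIFY-like semantics (upsert): update if the key exists,
        // insert if it does not -- no duplicate-key failure
        table.put(1, "second");
        System.out.println(table.get(1)); // prints "second"
    }
}
```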
Reward points if this Helps.
Manish -
Error while loading data into clob data type.
Hi,
I have created an interface to load data from one Oracle table into another Oracle table. The target table has an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
java.lang.NumberFormatException: For input string: "4294967295"
at java.lang.NumberFormatException.forInputString(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
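The number in the exception explains the failure: 4294967295 is 2^32 - 1, which does not fit in a Java int (Integer.MAX_VALUE is 2147483647), so the Integer.parseInt call in the stack trace must throw. This is likely a length reported for the CLOB column. A minimal reproduction (the helper below is illustrative, not ODI code):

```java
public class ParseLengthDemo {

    // Parse a numeric string as an int, falling back to long when the
    // value is too large -- 4294967295 fits a long but not an int.
    static long parseLength(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            // "For input string: \"4294967295\"" is thrown here
            return Long.parseLong(s);
        }
    }

    public static void main(String[] args) {
        System.out.println(parseLength("4294967295")); // prints 4294967295
    }
}
```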
Let me know if anyone has come across and resolved this kind of issue.
Thanks much,
Nishit Gajjar
Mr. Gajjar,
You didn't mention which KMs you are using.
Have a read of
Re: Facing issues while using BLOB
and
Load BLOB column in Oracle to Image column in MS SQL Server
Try again.
And can you please mark answers as Correct/Helpful too.
Edited by: actdi on Jan 10, 2012 10:45 AM -
Error while uploading data in table t_499s through BDC Prog
Hi
I am facing a problem while uploading data into table t_499s through a BDC program. If there are more than 15 records in the file, it does not allow the upload. Kindly suggest what to do.
Thanks
Mukesh S
Hi,
If you want to update only a single table that has user maintenance allowed, use the MODIFY statement.
EX:
LOOP AT ITAB INTO WA_TAB.
  MOVE-CORRESPONDING WA_TAB TO T499S.
  MODIFY T499S.
  CLEAR T499S.
ENDLOOP.
It will update the table. To verify, go to SM30 and check view V_T499S.
Rgds
Aeda -
Getting error while loading Data into ASO cube by flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM
Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you using a load rule? Also, which versions of Essbase and ODI are you using?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while load data into Essbase using ODI
Hi ,
I'm getting the following error while loading measures into Essbase using ODI. I used the same log and error file and file path for all my dimensions and that worked well, so I am not sure why it is not working for measures... need help.
File "<string>", line 79, in ?
com.hyperion.odi.common.ODIHAppException: c:/temp/Log1.log (No such file or directory)
Thanks
Venu
Are you definitely running it against an agent where that path exists?
Have you tried using a different location and filename? Have you restarted the agent to make sure there is not a lock on the file?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while loading data into BW (BW as Target) using Data Services
Hello,
I'm trying to extract data from SQL Server 2012 and load into BW 7.3 using Data Services. Data Services shows that the job is finished successfully. But, when I go into BW, I'm seeing the below / attached error.
Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
Please let me know what this means and how to fix it. I am not sure if I have given sufficient information; please let me know if you need anything else.
Thanks
Pradeep
Hi Pradeep,
Regarding your query, please refer to the SCN threads below for the same issue:
SCN Thread:
FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
Error in loading data from BOFC to BW using FIM 10.0
Thanks,
Daya -
Error while importing data into Oracle 11gr2 with arcsde 9.3.1
I am getting an error while importing data into Oracle 11gR2. We are using ArcSDE 9.3.1.
It seems to be a problem with spatial index creation.
Kindly help.
IMP-00017: following statement failed with ORACLE error 29855:
"CREATE INDEX "A3032_IX1" ON "DGN_POLYLINE_2D" ("SHAPE" ) INDEXTYPE IS "MDS"
"YS"."SPATIAL_INDEX""
IMP-00003: ORACLE error 29855 encountered
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AAAT5pAA9AACIy5AAQ] in spatial indexing.
ORA-13206: internal error [] while creating the spatial index
ORA-13033: Invalid data in the SDO_ELEM_INFO_ARRAY in SDO_GEOMETRY object
ORA-06512: at "MDSYS
Guys,
I am also getting the same error, and I cannot even tell which indexes it is failing for. There is no index name before the error.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
ORA-12801: error signaled in parallel query server P000
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AA
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
ORA-12801: error signaled in parallel query server P002
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AA
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
stack cnt....
How can I find for which indexes it is failing?
Thank you,
Myra -
Dead lock error while updating data into cube
We have a scenario of a daily truncate and upload of data into a cube, with volumes arriving at 2 million records per day. We have the parallel-processing setting (PSA and data targets in parallel) in the InfoPackage to speed up the data load. This entire process runs through a process chain.
We are facing a deadlock issue every day. How can we avoid this?
In general, deadlocks occur because of degenerated indexes when volumes are very high. So my question is: does deleting the cube's indexes every day, along with the 'delete data target contents' process, help avoid the deadlock?
Also observed: updating values into one InfoObject is taking a long time, approximately 3 minutes per data packet. That InfoObject is placed in a dimension defined as line item, as the volumes are very high for that specific object.
So this is the overall scenario.
two things :
1) will deletion of indexes and recreation help to avoid dead lock ?
2) any idea why the insertion into the infoobject is taking longer time (there is a direct read on sid table of that object while observed in sql statement).
Regards.
Hello,
1) will deletion of indexes and recreation help to avoid dead lock ?
Ans:
To avoid this problem, drop the cube's indexes before uploading the data, and rebuild them afterwards.
Also:
Find out in SM12 which process is causing the lock, and delete it.
Find out in SM66 which process has been running for a very long time, and stop it.
Check transaction SM50 for the number of processes available in the system. If they are not adequate, increase them with the help of the Basis team.
2) any idea why the insertion into the infoobject is taking longer time (there is a direct read on sid table of that object while observed in sql statement).
Ans:
A line-item dimension is one way to improve both data load and query performance, by eliminating the need for a dimension table; while loading/reading, there is one less table to deal with.
Check in the transformation mapping of that characteristic whether any routine/formula is written. If so, this can add processing time for that InfoObject.
Storing mass data in InfoCubes at document level is generally not recommended, because when data is loaded, a huge SID table is created for the document-number line-item dimension.
Check whether your InfoObject is similar to a document number...
Regards,
Dhanya -
Error while fetching data into collection type.
Hi all,
I'm facing a problem while fetching data into a table type.
open c1;
open c2;
loop
    fetch c1 into partition_name_1;
    fetch c2 into partition_name_2;
    exit when c1%notfound or c2%notfound;
    open c1_refcursor for 'select a.col1, b.col2 from table1 partition(' || partition_name_1 || ') a, table2 partition(' || partition_name_2 || ') b where a.col2 = b.col2';
    loop
        -- the next fetch is the line raising "ORA-01858: a non-numeric
        -- character was found where a numeric was expected"
        fetch c1_refcursor bulk collect into v1, v2 limit 100000;
        exit when v1.count = 0;
        forall i in 1 .. v1.count
        -- (the FORALL DML statement was not included in the post)
    end loop;
end loop;
I also checked the data type of the table variable; it is the same as the column selected in the ref cursor.
Please help me out.
Message was edited by:
Sumit Narayan
OK, I see that, but I don't think you can use associative arrays in this manner, because you cannot select data directly into an associative array.
As far as I'm aware, they must be assigned an index, and the error suggests to me that it's missing an index.
You can select directly into records and maybe this is where you need to go. -
Error while loading data into persisted column
Hi All,
I have a table and I am trying to load some test data into it, such as:
Insert Into [Test]
values (000, 'abc')
Below is my table structure
CREATE TABLE [Test]
[Id] [int] NOT NULL,
[Desc] [varchar](256) NOT NULL,
[IdFormal] AS CONVERT(varchar(5),[Id]) PERSISTED NOT NULL
CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED
[Id] ASC
Now I am trying to insert the above data such that the first column gives me 0 and the computed column gives me the value 000, but when I run the INSERT statement I receive the following error:
INSERT failed because the following SET options have incorrect settings: 'ANSI_PADDING'
Can someone please help me with this.
Thanks
Take a look at this article:
SET ANSI_PADDING Setting and Its Importance
Perhaps you need to re-create the table, first specifying the correct SET ANSI_PADDING ON setting.
For every expert, there is an equal and opposite expert. - Becker's Law