Easier INSERT INTO in 11g?
In Oracle, we can pass parameters to a procedure by name. Example:
procedure_name( param1 => 'some_value', param2 => 'some value', param3 => 'some value' );
Is there a similar way in the INSERT statement? Maybe something like:
INSERT INTO table_name (column1 = 'some values', column2 = 'some values', column3 = 'some values');
This way, it's easier to read INSERT statements, especially if there are many columns to be inserted (e.g. more than 30 columns!). Any ideas if 11g has a similar feature now? Or maybe this exists already in previous versions as well?
Hmmm, actually, all I am asking is: maybe INSERT should have syntax similar to UPDATE?
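One readability workaround that works in every version: use INSERT ... SELECT with column aliases, so each value sits next to its target column name. A hedged sketch with invented table and column names:

```sql
-- Hypothetical table; the aliases pair each value with its target column,
-- which keeps long column lists readable.
INSERT INTO employees (emp_id, emp_name, hire_date)
SELECT 100               AS emp_id,
       'Scott'           AS emp_name,
       DATE '2010-06-30' AS hire_date
FROM dual;
```

The aliases are purely documentary (Oracle still matches values to columns by position, not by alias), but they make a 30-column INSERT much easier to audit.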
Edited by: dongzky on Jun 30, 2010 3:06 PM
You could create your own procedure and work through that.
create procedure insert_mytable (p_col1 in varchar2, p_col2 in varchar2, ...) is
begin
  insert into mytable (col1, col2, ...) values (p_col1, p_col2, ...);
end;
/
exec insert_mytable(p_col1 => 'text', p_col2 => 'text', ...)
I'd say it's very good to have an API like that for development, to control all the inserts.
Nicolas.
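Expanding that suggestion into something compilable (the table and column names are invented for illustration; DEFAULT lets callers omit columns, which is the main readability win of the wrapper approach):

```sql
-- Hypothetical table for illustration.
CREATE TABLE mytable (col1 VARCHAR2(50), col2 VARCHAR2(50), col3 VARCHAR2(50));

CREATE OR REPLACE PROCEDURE insert_mytable (
  p_col1 IN mytable.col1%TYPE DEFAULT NULL,
  p_col2 IN mytable.col2%TYPE DEFAULT NULL,
  p_col3 IN mytable.col3%TYPE DEFAULT NULL
) IS
BEGIN
  INSERT INTO mytable (col1, col2, col3)
  VALUES (p_col1, p_col2, p_col3);
END insert_mytable;
/

-- Named notation, in any order, omitting what you don't need:
EXEC insert_mytable(p_col3 => 'c', p_col1 => 'a')
```

With %TYPE the parameters track any later column changes, and the procedure becomes the single point of control for inserts into the table.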
Similar Messages
-
All,
I'm getting an error when I execute this statement:
INSERT INTO target_table@database_link( column_list )
SELECT list_of_columns
FROM tables_on_local_database
WHERE some_conditions.
The only thing I've noticed is that I'm executing it on an 11g database against a 10g database.
Is that really the problem?
Thanks.
Yoy
Hi,
Apps Devel wrote:
Is that really the problem?
No. -
Best Practice to fetch SQL Server data and Insert into Oracle Tables
Hello,
I want to read SQL Server data every half an hour and write it into Oracle tables (in two different databases). What is the best practice for doing this?
We do not have any database dblinks from oracle to sqlserver and vice versa.
Any help is highly appreciated.
Thanks
Well, that's easy:
use a TimerTask to do the following every half an hour:
- open a connection to sql server
- open two connections to the oracle databases
- for each row you read from the sql server, do the inserts into the oracle databases
- commit
- close all connections -
Insert into using a select and dynamic sql
Hi,
I've got a hopefully easy question. I have a procedure that updates 3 tables with 3 different update statements. The procedure goes through and updates through ranges I pass in. I am hoping to create another table which will take those updates as an insert statement and append the data onto the existing data.
I am thinking of using dynamic SQL, but I am sure there is an easy way to do it using plain PL/SQL as well. I have pasted my procedure below; at the bottom is the insert statement I want to use. I am fairly sure I can do it using dynamic SQL, but I am not familiar with the syntax.
CREATE OR REPLACE PROCEDURE ACTIVATE_PHONE_CARDS (min_login in VARCHAR2, max_login in VARCHAR2, vperc in VARCHAR2) IS
BEGIN
UPDATE service_t SET status = 10100
WHERE poid_id0 in
(SELECT poid_id0 FROM service_t
WHERE poid_type='/service/telephony'
AND login >= min_login AND login <= max_login);
DBMS_OUTPUT.put_line( 'Service Status:' || sql%rowcount);
UPDATE account_t SET status = 10100
WHERE poid_id0 IN
(SELECT account_obj_id0 FROM service_t
WHERE poid_type = '/service/telephony'
AND login >= min_login AND login <= max_login);
DBMS_OUTPUT.put_line( 'Account Status:' || sql%rowcount);
UPDATE account_nameinfo_t SET title=Initcap(vperc)
WHERE obj_id0 IN
(SELECT account_obj_id0 FROM service_t
WHERE poid_type='/service/telephony'
AND login >=min_login AND login <= max_login);
DBMS_OUTPUT.put_line('Job Title:' || sql%rowcount);
INSERT INTO phone_card_activation values which = 'select a.status, s.status, s.login, to_char(d.sysdate,DD-MON-YYYY), ani.title
from account_t a, service_t s, account_nameinfo_t ani, dual d
where service_t.login between service_t.min_login and service_t.max_login
and ani.for_key=a.pri_key
and s.for_key=a.pri_key;'
END;
Thanks for any advice, and have a good weekend.
Geordie
Correct me if I am wrong, but aren't these equal?
UPDATE service_t SET status = 10100
WHERE poid_id0 in
(SELECT poid_id0 FROM service_t
WHERE poid_type='/service/telephony'
AND login >= min_login AND login <= max_login);
(update all the records where there id is in the sub-query that meet the WHERE Clause)
AND
UPDATE service_t SET status = 10100
WHERE poid_type='/service/telephony'
AND login >= min_login AND login <= max_login;
(update all the records that meet the WHERE Clause)
This should equate to the same record set, in which case the second update would be quicker without the sub-query. -
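On the original question, no dynamic SQL is needed: a plain INSERT ... SELECT inside the procedure can append the rows. A hedged sketch reusing the procedure's parameters; the column list of phone_card_activation is assumed (the original didn't give it), and the join conditions follow the subqueries in the updates above rather than the draft's for_key/pri_key names, which don't appear elsewhere in the procedure:

```sql
-- Assumes phone_card_activation(account_status, service_status, login,
-- activation_date, title); adjust to the real column list.
INSERT INTO phone_card_activation
       (account_status, service_status, login, activation_date, title)
SELECT a.status,
       s.status,
       s.login,
       SYSDATE,           -- no need to select SYSDATE from dual in a query
       ani.title
FROM   account_t          a,
       service_t          s,
       account_nameinfo_t ani
WHERE  s.poid_type = '/service/telephony'
AND    s.login BETWEEN min_login AND max_login
AND    s.account_obj_id0 = a.poid_id0
AND    ani.obj_id0       = a.poid_id0;
```

Since min_login and max_login are procedure parameters, they can be referenced directly in the static statement; dynamic SQL would only be needed if the table or column names themselves varied at run time.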
ORA-00604 error when trying to insert into a XMLTYPE stored as BINARY
Hi. Here's the scenario.
Here's my Oracle version:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
The database is encoded as AL32UTF8.
First I create the table...
create table binary_table (the_field XMLTYPE) XMLTYPE COLUMN the_field STORE AS BINARY XML;
Now I try and do an insert like this...
insert into binary_table values (xmltype('<?xml version="1.0" encoding="AL32UTF8"?>' || chr(10) || '<a>b</a>' || chr(10)));
and I get this error:
SQLState: 60000
ErrorCode: 604
Position: 122
Error: ORA-00604: error occurred at recursive SQL level 1
ORA-00942: table or view does not exist
If I create the table with a CLOB storage option for the XMLTYPE, the insert works fine. If I repeat these steps in another database instance, same Oracle version, that's encoded as WE8ISO8859P1, it also works fine. It behaves the same in several clients. I also tried it with several different values for NLS_LANG and that didn't help.
I do want to say that this database instance has just been set up especially for me so I can do some R&D on AL32UTF8 and XMLTYPE to see if it fits our needs. So it might be a problem with the database instance.
Thanks for taking a look at this.
Ralph
Edited by: stryder100 on Jul 24, 2009 12:11 PM
Hi,
Use this:
load data
append into table HS_HRMIG_EMP_PER_20MAR07
fields terminated by "," optionally enclosed by '"'
trailing nullcols
Here 'optionally enclosed by' is for double quotes, which need to be placed inside single quotes, like '"'.
Try with this.
--Basava.S -
Why doesn't this insert into XMLTYPE work?
Hi again. Hopefully I'll be answering questions soon, but meanwhile I've got another one.
I'm working in this environment...
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
The encoding for the database is WE8ISO8859P1.
in SQL Plus. I created a table with an XMLTYPE column stored as binary. Here's the desc...
PS: BWDSTG> desc bwddoc;
Name Null? Type
SUNAME VARCHAR2(100)
SOURCE_DOC_TEXT CLOB
DOC_TEXT SYS.XMLTYPE STORAGE BINARY
LAST_UPDATE_DATE DATE
PS: BWDSTG>
The following error also occurred when I created the same table with a storage type of CLOB for DOC_TEXT. Here's the error I can't figure out...
PS: BWDSTG> insert into bwddoc (doc_text) values ('<?xml version="1.0" encoding="UTF-8"?>
2 <a>–</a>
3 ');
insert into bwddoc (doc_text) values ('<?xml version="1.0" encoding="UTF-8"?>
ERROR at line 1:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00217: invalid character 8211 (U+2013)
Error at line 2
It accepts the command if I replace the – with plain text. Why does it care what the character entity reference is? It's changing the encoding pseudo-attribute in the XML declaration to US-ASCII anyway, and this character entity should be perfectly acceptable. I'd appreciate it if anyone knows the reason for this (or what I'm not understanding, which as always is a distinct possibility).
Sorry, let me try again. SQL*Plus doesn't have a problem with the multiple lines, so I'm just trying to insert the XML.
PS: BWDSTG> insert into bwddoc (doc_text) values (xmltype('<?xml version="1.0" encoding="US-ASCII"?>
2 <a>—</a>
3 '));
insert into bwddoc (doc_text) values (xmltype('<?xml version="1.0" encoding="US-ASCII"?>
ERROR at line 1:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00217: invalid character 8212 (U+2014)
Error at line 2
ORA-06512: at "SYS.XMLTYPE", line 310
ORA-06512: at line 1
My problem is that...
<?xml version="1.0" encoding="US-ASCII"?>
<a>—</a>
should be perfectly good XML. libxml2 and expat both have no problem parsing it; they just leave the — (an em dash) alone. But Oracle XMLType doesn't like it for some reason. I need to load a lot of data that has numeric character entities like this but I can't 'til I get this resolved. -
How can I INSERT INTO from Staging Table to Production Table
I’ve got a Bulk Load process which works fine, but I’m having major problems downstream.
Almost everything is Varchar(100), and this works fine.
Except for these fields:
INDEX SHARES, INDEX MARKET CAP, INDEX WEIGHT, DAILY PRICE RETURN, and DAILY TOTAL RETURN
These five fields must be some kind of numeric, because I need to perform sums on them.
Here’s my SQL:
CREATE TABLE [dbo].[S&P_Global_BMI_(US_Dollar)] (
    [CHANGE]             VARCHAR(100),
    [EFFECTIVE DATE]     VARCHAR(100),
    [COMPANY]            VARCHAR(100),
    [RIC]                VARCHAR(100),
    Etc.
    [INDEX SHARES]       NUMERIC(18, 12),
    [INDEX MARKET CAP]   NUMERIC(18, 12),
    [INDEX WEIGHT]       NUMERIC(18, 12),
    [DAILY PRICE RETURN] NUMERIC(18, 12),
    [DAILY TOTAL RETURN] NUMERIC(18, 12),
From the main staging table, I’m writing data to 4 production tables.
CREATE TABLE [dbo].[S&P_Global_Ex-U.S._LargeMidCap_(US_Dollar)] (
    [CHANGE]             VARCHAR(100),
    [EFFECTIVE DATE]     VARCHAR(100),
    [COMPANY]            VARCHAR(100),
    [RIC]                VARCHAR(100),
    Etc.
    [INDEX SHARES]       FLOAT(20),
    [INDEX MARKET CAP]   FLOAT(20),
    [INDEX WEIGHT]       FLOAT(20),
    [DAILY PRICE RETURN] FLOAT(20),
    [DAILY TOTAL RETURN] FLOAT(20),
INSERT INTO [dbo].[S&P_Global_Ex-U.S._LargeMidCap_(US_Dollar)]
SELECT [CHANGE],
       Etc.
       [DAILY TOTAL RETURN]
FROM [dbo].[S&P_Global_BMI_(US_Dollar)]
WHERE ISNUMERIC([EFFECTIVE DATE]) = 1
  AND [CHANGE] IS NULL
  AND [COUNTRY] <> 'US'
  AND ([SIZE] = 'L' OR [SIZE] = 'M')
The Bulk Load is throwing errors like this (unless I make everything Varchar):
Bulk load data conversion error (truncation) for row 7, column 23 (INDEX SHARES).
Msg 4863, Level 16, State 1, Line 1
When I try to load data from the staging table to the production table, I get this.
Msg 8115, Level 16, State 8, Line 1
Arithmetic overflow error converting varchar to data type numeric.
The statement has been terminated.
There must be an easy way to overcome this, right?
Please advise!
Thanks!!
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
Nothing is returned. Everything is VARCHAR(100). The problem is this:
If I use FLOAT(18) or REAL, I get exponential numbers, which is useless to me.
If I use DECIMAL(18,12) or NUMERIC(18,12), I get errors.
Msg 4863, Level 16, State 1, Line 41
Bulk load data conversion error (truncation) for row 7, column 23 (INDEX SHARES).
Msg 4863, Level 16, State 1, Line 41
Bulk load data conversion error (truncation) for row 8, column 23 (INDEX SHARES).
Msg 4863, Level 16, State 1, Line 41
Bulk load data conversion error (truncation) for row 9, column 23 (INDEX SHARES).
There must be some data type that fits this!
Here's a sample of what I'm dealing with.
-0.900900901
9.302325581
-2.648171501
-1.402805723
-2.911830584
-2.220960866
2.897762349
-0.219640074
-5.458448607
-0.076626094
6.710940231
0.287200186
0.131682908
0.124276221
0.790818723
0.420505119
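Since the staging columns are all VARCHAR(100), one way to find and convert the offending rows is TRY_CONVERT, which yields NULL instead of raising a conversion error. A hedged sketch with the table and column names from above (TRY_CONVERT assumes SQL Server 2012 or later; on older versions a CASE around ISNUMERIC is the usual fallback):

```sql
-- Find rows whose INDEX SHARES value will not convert to DECIMAL(28, 12):
SELECT [INDEX SHARES]
FROM   [dbo].[S&P_Global_BMI_(US_Dollar)]
WHERE  TRY_CONVERT(DECIMAL(28, 12), [INDEX SHARES]) IS NULL
  AND  [INDEX SHARES] IS NOT NULL;

-- Load with a wider precision: DECIMAL(18, 12) leaves only 6 digits before
-- the decimal point, so larger magnitudes overflow it.
INSERT INTO [dbo].[S&P_Global_Ex-U.S._LargeMidCap_(US_Dollar)]
       ([INDEX SHARES] /* , ... other columns ... */)
SELECT TRY_CONVERT(DECIMAL(28, 12), [INDEX SHARES]) /* , ... */
FROM   [dbo].[S&P_Global_BMI_(US_Dollar)];
```

The precision 28 here is an assumption chosen to comfortably fit the sample values; pick the smallest precision/scale that covers the real data.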
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it. -
Insert into table error - ora-01722 invalid number
Need some assistance with inserting data into a table. The date column keeps failing to insert.
here is my insert statement
insert into tab_mod_history (TABLE_OWNER, TABLE_NAME, PARTITION_NAME, SUBPARTITION_NAME, INSERTS, UPDATES, DELETES, TIMESTAMP, TRUNCATED)
values ('$i_owner','$i_table','$i_part_name','$i_subpart_name','$i_ins','$i_upd','$i_del','$time','$trunc');
The script loads data for partitioned tables, but not for normal tables with the timestamp column.
I select the data using this select statement:
select table_owner, table_name, partition_name, subpartition_name, inserts, updates, deletes, timestamp, truncated
from dba_tab_modifications
where table_owner in ('scott','MAC')
order by table_name;
OK, here are the errors:
values ('MAC','WC_MST','11','1','1','12/04/2011','NO','','')
ERROR at line 2:
ORA-01722: invalid number
Session altered.
values ('MAC','WF_05A','208','128','208','18/02/2011','NO','','')
ERROR at line 2:
ORA-01722: invalid number
Here is the table structure:
SQL> desc tab_mod_history
Name Null? Type
TABLE_OWNER VARCHAR2(30)
TABLE_NAME VARCHAR2(30)
PARTITION_NAME VARCHAR2(30)
SUBPARTITION_NAME VARCHAR2(30)
INSERTS NUMBER
UPDATES NUMBER
DELETES NUMBER
TIMESTAMP DATE
TRUNCATED VARCHAR2(3)
DROP_SEGMENTS NUMBER
I used the column names to create the variables; that is why $time was used.
How else could I have done it?
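A hedged guess at the cause, based on the table structure above: in the failing rows the generated VALUES list appears shifted, so the date string '12/04/2011' lands in a NUMBER column (hence ORA-01722), and even when aligned, a DATE column should get an explicit TO_DATE rather than a bare string. A sketch of the aligned insert using the values from the first failing row (the DD/MM/YYYY mask is an assumption; match it to the script's actual format):

```sql
-- Columns listed explicitly, empty partition names as NULL,
-- and the date converted with a fixed mask.
INSERT INTO tab_mod_history
       (table_owner, table_name, partition_name, subpartition_name,
        inserts, updates, deletes, timestamp, truncated)
VALUES ('MAC', 'WC_MST', NULL, NULL,
        11, 1, 1, TO_DATE('12/04/2011', 'DD/MM/YYYY'), 'NO');
```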
I thought this was easy, but to my dismay... -
Sql Script containing INSERT INTO TABLE_NAME taking very long time
Version:11g
I have a .sql file which contains insert statements for the table ZIP_CODES like.
INSERT INTO ZIP_CODES (ZIP_CODE, CITY, PROV, COUNTRY_CODE, LONGITUDE, LATITUDE)
VALUES (..........);
This .sql file contains over 800,000 INSERT statements like these! Executing the file takes around 20 minutes.
Our client insists that they need a script to create this table and not a dump file (export dump of just this table)
Is there any way I could speed up the INSERTs in this script? I have added a commit halfway through the file because I was worried about UNDO tablespace usage.
This table (ZIP_CODES) is not dependent on any other table (no FKs, no FK references, ...).
Edited by: Steve_74 on 03-Sep-2009 05:53
One possible option is to use external tables:
1. Create a CSV file with the values to be stored in the table.
2. Create a directory object (the location where the CSV file will be stored).
3. Create an external table pointing to the CSV file.
4. Do an INSERT INTO ZIP_CODES SELECT * FROM <external table> (maybe try an APPEND hint).
5. Drop the directory object and external table. -
INSERT command in 11g not working
I'm executing the following command in Oracle 11g, but it's saying 'column not allowed here'. Is there any problem in the command? INTEREST_ID is the primary key. I tried all sorts of things, but it isn't working. Please someone tell me the solution. Thank you.
INSERT INTO INTEREST_X(INTEREST_ID VARCHAR2(10),DESCRIPTION VARCHAR2(40)) VALUES (10,"General Crafts");
user11302075 wrote:
I'm executing the following command in Oracle 11g, but it's saying 'column not allowed here'. Is there any problem in the command? INTEREST_ID is the primary key. I tried all sorts of things, but it isn't working. Please someone tell me the solution. Thank you.
INSERT INTO INTEREST_X(INTEREST_ID VARCHAR2(10),DESCRIPTION VARCHAR2(40)) VALUES (10,"General Crafts");
Why are you giving data types with the columns? What happens when you run this:
INSERT INTO INTEREST_X (INTEREST_ID, DESCRIPTION) VALUES (10, 'General Crafts');
(Note the single quotes: in Oracle, double quotes denote identifiers, not string literals.)
HTH
Aman.... -
Data from 2 database blocks insert into the same base table?
Hello,
I have canvas C_NEW. On this canvas, items from 3 blocks are usable by the user.
1. Block NEW_HEAD: 3 items say X1,X2,X3
2. Block NEW : multi record text fields say Y1 thru Y10. Also a scrollbar.
3. Block NEW_ACTION: 6 buttons say B1 thru B6 (One of them is N_OK which is the item in question)
Both blocks NEW and NEW_HEAD are DB blocks and have the same base table, say BT. When the user clicks N_OK (after filling out the data in NEW_HEAD and then NEW, in that order), I need the data from both NEW and NEW_HEAD to go into BT. Currently only the data from NEW is going into BT; the fields in BT which correspond to X1, X2, X3 in NEW_HEAD remain null after clicking N_OK. I put COMMIT_FORM in the N_OK code since both blocks are DB blocks (and, as suggested by folks, it is easier to issue a COMMIT_FORM than do a lot more work writing my own SQL).
How can I achieve this?
Thanks,
Chiru
I tried doing what you suggested by putting the following code in the program unit which gets called WHEN-BUTTON-PRESSED.
PROCEDURE P_SAVE IS
v_lc_no number;
v_lc_seq number;
v_dmg_allow number;
BEGIN
Go_Block('B_new_head');
v_lc_no:= :B_new_head.N_LC_NO;
v_lc_seq:= :B_new_head.N_LC_SEQ;
v_dmg_allow:= :B_new_head.N_LC_DMG_ALLOW;
Go_Block('B_new');
FIRST_RECORD;
LOOP
emessage('before insert'||v_lc_no||'-'||v_lc_seq||'-'||:B_new.order_no);
INSERT INTO ct_lc_import(
LC_NUMBER,
LC_SEQ,
DAMAGE_ALLOWANCE,
ORDER_NO,
QTY_SHIPPED,
UNIT_PRICE,
DRAFT_AMOUNT,
TICKET_DEDUCTION,
SHIPMENT_PENALTY,
TOTAL_DEBIT,
DEBIT_DATE)
VALUES (v_lc_no,
v_lc_seq,
v_dmg_allow,
:B_new.order_no,
:B_new.qty_shipped,
:B_new.unit_price,
:B_new.draft_amount,
:B_new.ticket_deduction,
:B_new.shipment_penalty,
:B_new.total_debit,
:B_new.debit_date);
commit;
NEXT_RECORD;
if :SYSTEM.LAST_RECORD='TRUE' then
emessage('last record');
EXIT;
end if;
--NEXT_RECORD;
END LOOP;
EXCEPTION
when FORM_TRIGGER_FAILURE then
raise;
when OTHERS then
emessage(SQLERRM);
raise FORM_TRIGGER_FAILURE;
END;
But I can't see the records in the table even though the message pops up with the values before inserting the first record. It then exits. I would think it would at least insert the very first record into the table.
Thanks,
Chiru -
I need to make a copy of an entire email, including the header with the sender's info and time, etc., to insert into a letter. How can I do this without cutting and pasting the header separately from the text? I know there is a way besides a screen shot, but I've spent hours trying to find it.
Smurfslayer wrote:
For the particularly persnickety types you might want to expose the full headers in the email message as well. It's easy enough to do, from mail's 'menu' select "view"; then "message"; then all headers.
Another option you could use is a screen capture using Grab.
Activate Grab, then shift + command + w (for a window screen shot). Then confirm the window selection and click the mail message.
Dave
Why are you addressing this to me...?
I am not the OP... -
Use of sequence numbers while inserting into tables
Hi,
Consider i have a stored procedure where i insert records into 30 to 40 tables.
for example i have a table A and a sequence SEQ_A for that table.
Is there any difference in performance in the following two ways of
insertion,
1. select SEQ_A.NEXTVAL into variable_a FROM DUAL;
INSERT INTO A VALUES(variable_a);
2.INSERT INTO A VALUES (SEQ_A.NEXTVAL).
I have 30 to 40 insert statements like that in method 1.
If I replace it with method 2, will there be any improvement in performance?
Thanks & Regards
Sundar
This could be interesting. I agree that triggers should be used for data integrity, though even better is to only allow write access to tables via stored procedures.
Anyway while you are waiting for the test case that shows triggers are faster, which could take some time, it is quick and easy to come up with a test case that shows the opposite.
SQL> create table t (id number, dummy varchar2(1));
Table created.
real: 40
SQL> insert into t select s.nextval, 'a' from all_objects;
29625 rows created.
real: 5038
SQL> rollback;
Rollback complete.
real: 40
SQL> create trigger tt before insert on t
2 for each row
3 begin
4 select s.nextval into :new.id from dual;
5 end;
6 /
Trigger created.
real: 291
SQL> insert into t select null, 'a' from all_objects;
29626 rows created.
real: 13350
SQL> rollback;
Rollback complete.
real: 1082
SQL>
Note that with triggers even the rollback took longer.
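On the original question: the two methods are equivalent in effect, but the SELECT ... FROM dual round trip per row is pure overhead, so referencing the sequence directly in the INSERT is the cheaper form. As of 11g you can also assign a sequence value directly in PL/SQL. A hedged sketch using the names from the question (table A is assumed to have just the one sequence column, as written there):

```sql
-- Method 2: no round trip to dual.
INSERT INTO a VALUES (seq_a.NEXTVAL);

-- 11g and later: direct assignment in PL/SQL, no SELECT ... INTO needed.
DECLARE
  variable_a NUMBER;
BEGIN
  variable_a := seq_a.NEXTVAL;
  INSERT INTO a VALUES (variable_a);
END;
/
```

Across 30 to 40 insert statements the saving per statement is small, but it removes one recursive query per row and reads more simply.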
Martin -
Commit for every 1000 records in Insert into select statment
Hi I've the following INSERT into SELECT statement .
The SELECT statement (which has joins ) has around 6 crores fo data . I need to insert that data into another table.
Please suggest me the best way to do that .
I'm using the INSERT into SELECT statement , but i want to use commit statement for every 1000 records .
How can i achieve this ..
insert into emp_dept_master
select e.ename ,d.dname ,e.empno ,e.empno ,e.sal
from emp e , dept d
where e.deptno = d.deptno ------ how to use commit for every 1000 records.
Thanks
Smile wrote:
Hi I've the following INSERT into SELECT statement .
The SELECT statement (which has joins ) has around 6 crores fo data . I need to insert that data into another table.Does the another table already have records or its empty?
If it's empty then you can drop it and create it as:
create table your_another_table
as
<your select statement that returns 60000000 records>
Please suggest me the best way to do that .
I'm using the INSERT INTO SELECT statement, but I want to use a commit statement for every 1000 records.
That is not the best way. Frequent commits may lead to ORA-01555 errors.
[url http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:275215756923]A nice artical from ASKTOM on this one
How can i achieve this ..
insert into emp_dept_master
select e.ename ,d.dname ,e.empno ,e.empno ,e.sal
from emp e , dept d
where e.deptno = d.deptno ------ how to use commit for every 1000 records .
It depends on the reason behind your wanting to split the transaction into small chunks. Most of the time there is no good reason for that.
If you are trying to improve performance by doing so, then you are wrong: it will only degrade the performance.
To improve the performance you can use the APPEND hint in the insert, you can try PARALLEL DML, and if you are on 11g and above you can use [url http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_parallel_ex.htm#CHDIJACH]DBMS_PARALLEL_EXECUTE to break your insert into chunks and run it in parallel.
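The single-transaction direct-path form recommended here can be sketched with the tables from the question (APPEND makes it a direct-path insert, which loads above the high-water mark and generates minimal undo for the table data):

```sql
-- One transaction, one commit at the end; no chunking needed.
INSERT /*+ APPEND */ INTO emp_dept_master
SELECT e.ename, d.dname, e.empno, e.empno, e.sal
FROM   emp e, dept d
WHERE  e.deptno = d.deptno;

COMMIT;  -- a direct-path insert must be committed before the table is re-read
```

APPEND only takes effect under the usual direct-path restrictions (for example, no enabled row-level triggers on the target); if Oracle silently falls back, the statement still runs as a conventional insert.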
So if you can tell the actual objective we could offer some help. -
ORA-01461: can bind a LONG value only for insert into a LONG column
Good Afternoon,
I was just wondering if my issue is database related rather than SQL related, because my process works and the code has not been updated for some time.
I receive the above error when trying to insert into a table. I'm running Oracle 11g.
here is my insert command definition:
With insCmd
.CommandType = CommandType.Text
.CommandText = "INSERT INTO ORIGINAL_CLAIMS VALUES (orig_claim_seq.nextval,:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,xmltype(:12),:13,:14,:15)"
With .Parameters
.Add(":1", OracleDbType.Int64, ParameterDirection.Input) 'BATCH_ID
.Add(":2", OracleDbType.Varchar2, ParameterDirection.Input) 'REC_ID
.Add(":3", OracleDbType.Varchar2, ParameterDirection.Input) 'USER_ID
.Add(":4", OracleDbType.Varchar2, ParameterDirection.Input) 'FORM_NBR
.Add(":5", OracleDbType.Varchar2, ParameterDirection.Input) 'PATIENT_NBR
.Add(":6", OracleDbType.Date, ParameterDirection.Input) 'ADMIT_DATE
.Add(":7", OracleDbType.Varchar2, ParameterDirection.Input) 'TAX_ID
.Add(":8", OracleDbType.Varchar2, ParameterDirection.Input) 'CLAIM_TYPE
.Add(":9", OracleDbType.Date, ParameterDirection.Input) 'SUBMIT_DATE
.Add(":10", OracleDbType.Date, ParameterDirection.Input) 'RECEIVE_DATE
.Add(":11", OracleDbType.Varchar2, ParameterDirection.Input) 'CLAIM_SOURCE
.Add(":12", OracleDbType.Clob, ParameterDirection.Input) 'CLAIM_XML * XML Data
.Add(":13", OracleDbType.Date, ParameterDirection.Input) 'LOAD_DATE
.Add(":14", OracleDbType.Date, ParameterDirection.Input) 'UPDATE_DATE
.Add(":15", OracleDbType.Varchar2, ParameterDirection.Input) 'DOC_NO
End With
.Prepare()
End With
Here is the insert command to insert the record:
With insCmd.Parameters
.Item(":1").Value = batchID
.Item(":2").Value = RecID
.Item(":3").Value = UserID
.Item(":4").Value = ""
.Item(":5").Value = PatientNbr
.Item(":6").Value = AdmitDate
.Item(":7").Value = Left(TaxId, 9)
.Item(":8").Value = Left(ClaimType, 1)
.Item(":9").Value = SubmitDate
.Item(":10").Value = ReceiveDate
.Item(":11").Value = ClaimSource
.Item(":12").Value = claimNode.OuterXml
.Item(":13").Value = Now
.Item(":14").Value = Now
.Item(":15").Value = docNo
End With
insCmd.ExecuteNonQuery()
What is strange is that this will work for some records and not for others.
For example, I have a batch that has 50 records; I should have the same records in my table, but only 1 record gets inserted before the error occurs.
So in record 2 there is something that is preventing the insert from executing.
There is either some 'junk' in the XML or some junk data in the other 14 fields that would cause the insert to fail.
As I step through the process, the data looks valid to me.
Any insight would be appreciated,
thanks.
Mike
Here's the table DDL:
CREATE TABLE CCBH_TEST.ORIGINAL_CLAIMS (
    CLAIM_SEQ     NUMBER(18),
    BATCH_ID      NUMBER(18),
    REC_ID        VARCHAR2(5 BYTE),
    USER_ID       VARCHAR2(10 BYTE),
    FORM_NBR      VARCHAR2(8 BYTE),
    PATIENT_NBR   VARCHAR2(50 BYTE),
    ADMIT_DATE    DATE,
    TAX_ID        VARCHAR2(20 BYTE),
    CLAIM_TYPE    VARCHAR2(1 BYTE),
    SUBMIT_DATE   DATE,
    RECEIVE_DATE  DATE,
    CLAIM_SOURCE  VARCHAR2(120 BYTE),
    CLAIM_XML     SYS.XMLTYPE,
    LOAD_DATE     DATE,
    UPDATE_DATE   DATE,
    DOC_NO        VARCHAR2(30 BYTE)
)
I got tied up in some production issues; I will look at record 2 some time next week.
Since this is a TEST environment, I'm not that worried about it, but I would like to find out.
It could be another field with 'bad' data.
thanks,