OWB error log table
Hi All,
I am using OWB 11.2. In the target table property inspector, under Error Table, I have specified an error table, and when I deploy my mapping it gets created in my target schema. Is there any way I can change this setting so the error table is created in some other schema? We don't want to have the error table and the reporting/target tables in the same reporting/target schema.
Please suggest.
Thanks
Kanwar
Hi Kanwar,
you cannot create the error table in another schema using OWB. You have to do it manually:
as user1:
begin
  dbms_errlog.create_error_log('DIM_DEMO', 'DIM_DEMO_ERR', 'USER2');
end;
/
as user2:
grant select, insert, update, delete on USER2.dim_demo_err to USER1;
In the mapping, you can specify the error table with the schema prefix: USER2.dim_demo_err
Regards,
Carsten.
Similar Messages
-
DML error logging table in set-based mode
Hi all,
In ETL, data errors can be handled through an error logging table, as in this example from the documentation:
INSERT /*+ APPEND PARALLEL */ INTO sales
SELECT product_id, customer_id, TRUNC(sales_date), 3,
       promotion_id, quantity, amount
FROM sales_activity_direct
LOG ERRORS INTO sales_activity_errors('load_20040802')
REJECT LIMIT UNLIMITED
In this example, bulk loading is possible even in the presence of errors; the data errors go into the table sales_activity_errors.
I want to generate this kind of code in OWB in set-based mode. For that I have to create an error logging table in the target schema. How can I reference this table in a mapping? Can it be done with a shadow table? It's a very important feature in ETL for bulk loading. Is it available in OWB in set-based mode?
Please check this out and help me.
Regards,
Sumanta

Hi,
I am not sure whether the DML error logging method is available in OWB 10gR2. You can use the data rule method instead. Create the data rules on the target table and deploy it from OWB. Create your mappings and then execute them. Two insert statements will be generated: one loads the target table with records where no rules are violated; the other loads <target table>_ERR with records where one or more rules are violated.
Both statements do bulk loading, so your purpose will be served.
Again you can use the splitter method that has already been mentioned.
Regards
-AP -
Error log table and output type
Hi All,
While creating an invoice (while saving the billing document), is it possible to capture the error in an error log table?
And how is this error log table related to the output type?
Thanks

You seem to be confusing the saving of the billing document with the creation of the output. These are two independent updates (LUWs). A document may be saved without any output.
If there are any errors in billing document creation, the document just won't be created. Such errors should be captured in the log of the Billing Due List, if it has been run properly.
To update the processing log for the output, use FM NAST_PROTOCOL_UPDATE. -
DML Error Logging Tables?
How can I use DML Error Logging Tables with OWB10gR2 (in tab-definition and mappings)?
btw: what really is a shadow table?

Hi,
What you can do to solve this is to add a pre-mapping to your mapping which calls a procedure that enables the constraint on the target table using the "EXCEPTIONS INTO <error_table>" clause, like:
PROCEDURE "ENABLE_CONSTR_WITH_EXCEPTIONS"("P_TABLE" IN VARCHAR2, "P_CONSTRAINT" IN VARCHAR2) IS
  v_table      varchar2(30) := p_table;
  v_constraint varchar2(30) := p_constraint;
  v_command    varchar2(200);
  e_CannotValidate exception;
  PRAGMA EXCEPTION_INIT(e_CannotValidate, -2437);
-- main body
BEGIN
  /* Enable the constraint and write errors into the exceptions table */
  BEGIN
    v_command := 'ALTER TABLE ' || v_table || ' ENABLE CONSTRAINT ' || v_constraint || ' EXCEPTIONS INTO exceptions';
    execute immediate (v_command);
    commit;
  END;
EXCEPTION
  WHEN e_CannotValidate THEN
    -- In my case, when unique constraints are violated, I delete the duplicates.
    DELETE_DUPLICATES(v_table, v_constraint);
  WHEN OTHERS THEN
    NULL; -- enter any exception handling code here
END;
-- End of ENABLE_CONSTR_WITH_EXCEPTIONS

Greetz,
Ilona Tielke -
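A note for anyone trying the approach above: the EXCEPTIONS INTO clause requires an exceptions table, and Oracle ships a script (utlexcpt.sql) that creates one named EXCEPTIONS. A sketch of how the pre-mapping call and follow-up query might look; the table and constraint names here are placeholders, not from the original post:

```sql
-- Create the default EXCEPTIONS table once (script shipped with Oracle):
-- @?/rdbms/admin/utlexcpt.sql

-- Re-enable the constraint, collecting offending ROWIDs into EXCEPTIONS:
BEGIN
  ENABLE_CONSTR_WITH_EXCEPTIONS('DIM_DEMO', 'UK_DIM_DEMO');
END;
/

-- Inspect the rows that violated the constraint:
SELECT t.*
FROM   dim_demo t
WHERE  t.rowid IN (SELECT row_id FROM exceptions);
```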
Error log table how to track the run details
Hi ,
I am using the LOG ERRORS concept in Oracle to record the errors.
My table is EMPLOYEES, and I used the below script to create the error log table.
BEGIN
DBMS_ERRLOG.create_error_log (dml_table_name => 'employees');
END;
It has created a table called err$_employees.
I have run the procedure, and it has inserted some records into the table for a unique constraint error.
But when I rerun the same job, it inserts the same records again, which has now created a problem.
How will I figure out which records are from the first run and which are from the second run?
Also, I want to fail my job based on entries in the error log table I created.
Is it possible to track which records belong to which run in the error log table?
I hope I am clear about the requirement I'm looking for.
thanks
sri

If you look at the documentation, you will see you can tag records that are inserted into the error log table, e.g.
insert into employees
select ...
from ...
where ...
log errors into err$_employees('My Run Identifier')
You need to construct the string used as a tag so you can identify your individual runs. -
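To make that concrete, here is a sketch of tagging each run and then filtering by tag; the staging table name and the timestamp-style run identifier are just illustrative choices, not from the original post:

```sql
-- Tag each execution with a unique run identifier, e.g. a timestamp:
INSERT INTO employees
SELECT *
FROM   employees_staging
LOG ERRORS INTO err$_employees (TO_CHAR(SYSDATE, 'YYYYMMDD_HH24MISS'))
REJECT LIMIT UNLIMITED;

-- Afterwards, pick out the rows rejected by a specific run:
SELECT ora_err_number$, ora_err_mesg$, empno
FROM   err$_employees
WHERE  ora_err_tag$ = '20240101_120000';

-- Failing the job when a given run produced errors could be as simple as:
DECLARE
  l_errors NUMBER;
BEGIN
  SELECT COUNT(*) INTO l_errors
  FROM   err$_employees
  WHERE  ora_err_tag$ = '20240101_120000';
  IF l_errors > 0 THEN
    RAISE_APPLICATION_ERROR(-20001, l_errors || ' rows rejected in this run');
  END IF;
END;
/
```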
Hi,
In Oracle 10g, and maybe in older versions too, there is the capability to create an error log table using the dbms_errlog package and its create_error_log procedure.
For example:
exec dbms_errlog.create_error_log('EMP','EMP_ERROR')
How can I find out afterwards that the table EMP has a related table EMP_ERROR, where errors during inserts/updates on table EMP are registered?
Of course the column names in the EMP_ERROR table are the same as in EMP, but searching that way is time consuming!
SQL> exec dbms_errlog.create_error_log('EMP','EMP_ERROR');
PL/SQL procedure successfully completed
SQL> DESC EMP_ERROR;
Name Type Nullable Default Comments
ORA_ERR_NUMBER$ NUMBER Y
ORA_ERR_MESG$ VARCHAR2(2000) Y
ORA_ERR_ROWID$ UROWID(4000) Y
ORA_ERR_OPTYP$ VARCHAR2(2) Y
ORA_ERR_TAG$ VARCHAR2(2000) Y
EMPNO VARCHAR2(4000) Y
ENAME VARCHAR2(4000) Y
JOB VARCHAR2(4000) Y
MGR VARCHAR2(4000) Y
HIREDATE VARCHAR2(4000) Y
SAL VARCHAR2(4000) Y
COMM VARCHAR2(4000) Y
DEPTNO VARCHAR2(4000) Y
SQL> DESC EMP;
Name Type Nullable Default Comments
EMPNO NUMBER(4)
ENAME VARCHAR2(10) Y
JOB VARCHAR2(9) Y
MGR NUMBER(4) Y
HIREDATE DATE Y
SAL NUMBER(7,2) Y
COMM NUMBER(7,2) Y
DEPTNO NUMBER(2) Y
Thanks...
Sim

exec dbms_errlog.create_error_log('EMP','EMP_ERROR')
How to find afterwards that the table EMP has a related table EMP_ERROR
System@Elic10> create table bla_bla (i int);
Table created.
System@Elic10> exec dbms_errlog.create_error_log('BLA_BLA', 'BLA_BLA_ERRS')
PL/SQL procedure successfully completed.
System@Elic10> select * from user_tab_comments where table_name = 'BLA_BLA_ERRS';
TABLE_NAME TABLE_TYPE COMMENTS
BLA_BLA_ERRS TABLE DML Error Logging table for "BLA_BLA" -
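Building on that, a single dictionary query can list every DML error logging table you own, since create_error_log stamps each one with that comment; the LIKE pattern below simply matches the comment text shown above:

```sql
SELECT table_name, comments
FROM   user_tab_comments
WHERE  table_type = 'TABLE'
AND    comments LIKE 'DML Error Logging table for%';
```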
I am supposed to insert the records that were not uploaded to the main table into an error log table and email the users about the error records that were not inserted. How am I supposed to do it?
I have a few more questions.
What is the best way to upload the data from a file?
1. I could do batch processing, or
2. I could browse and upload the file through the APEX application and insert the records.
I want to know which tutorial would be the best to read for the two methods above, and how do I create and insert records into the error log table and send the user a CSV or txt file containing the error records, in both methods?
Would following the below method be the right way for the 2nd method?
http://oraexplorer.blogspot.com/2007/11/apex-to-upload-text-file-and-write-into.html

Ok,
I am trying to insert the records into an existing table from a CSV file.
I am using the below post to do so:
Re: File Browse, File Upload
I get some errors executing the htmldb_tools package:
Error at line 27: PLS-00103: Encountered the symbol "/"
create or replace PACKAGE htmldb_tools
AS
-- Utility functions --{{{
PROCEDURE parse_textarea ( --{{{
-- Parse a HTML textarea element into the specified HTML DB collection
-- The c001 element from the collection is used
-- The parser splits the text into tokens delimited by newlines, spaces
-- and commas
p_textarea IN VARCHAR2,
p_collection_name IN VARCHAR2
);
PROCEDURE parse_file( --{{{
-- Generic procedure to parse an uploaded CSV file into the
-- specified collection. The first line in the file is expected
-- to contain the column headings, these are set in session state
-- for the specified headings item.
p_file_name IN VARCHAR2,
p_collection_name IN VARCHAR2,
p_headings_item IN VARCHAR2,
p_columns_item IN VARCHAR2,
p_ddl_item IN VARCHAR2,
p_table_name IN VARCHAR2 DEFAULT NULL
);
END htmldb_tools;
/
create or replace PACKAGE BODY htmldb_tools
AS
TYPE varchar2_t IS TABLE OF VARCHAR2(32767) INDEX BY binary_integer;
-- Private functions --{{{
PROCEDURE delete_collection ( --{{{
-- Delete the collection if it exists
p_collection_name IN VARCHAR2
)
IS
BEGIN
IF (htmldb_collection.collection_exists(p_collection_name))
THEN
htmldb_collection.delete_collection(p_collection_name);
END IF;
END delete_collection; --}}}
PROCEDURE csv_to_array ( --{{{
-- Utility to take a CSV string, parse it into a PL/SQL table
-- Note that it takes care of some elements optionally enclosed
-- by double-quotes.
p_csv_string IN VARCHAR2,
p_array OUT wwv_flow_global.vc_arr2,
p_separator IN VARCHAR2 := ','
)
IS
l_start_separator PLS_INTEGER := 0;
l_stop_separator PLS_INTEGER := 0;
l_length PLS_INTEGER := 0;
l_idx BINARY_INTEGER := 0;
l_quote_enclosed BOOLEAN := FALSE;
l_offset PLS_INTEGER := 1;
BEGIN
l_length := NVL(LENGTH(p_csv_string),0);
IF (l_length <= 0)
THEN
RETURN;
END IF;
LOOP
l_idx := l_idx + 1;
l_quote_enclosed := FALSE;
IF SUBSTR(p_csv_string, l_start_separator + 1, 1) = '"'
THEN
l_quote_enclosed := TRUE;
l_offset := 2;
l_stop_separator := INSTR(p_csv_string, '"', l_start_separator + l_offset, 1);
ELSE
l_offset := 1;
l_stop_separator := INSTR(p_csv_string, p_separator, l_start_separator + l_offset, 1);
END IF;
IF l_stop_separator = 0
THEN
l_stop_separator := l_length + 1;
END IF;
p_array(l_idx) := (SUBSTR(p_csv_string, l_start_separator + l_offset,(l_stop_separator - l_start_separator - l_offset)));
EXIT WHEN l_stop_separator >= l_length;
IF l_quote_enclosed
THEN
l_stop_separator := l_stop_separator + 1;
END IF;
l_start_separator := l_stop_separator;
END LOOP;
END csv_to_array; --}}}
PROCEDURE get_records(p_blob IN blob,p_records OUT varchar2_t) --{{{
IS
l_record_separator VARCHAR2(2) := chr(13)||chr(10);
l_last INTEGER;
l_current INTEGER;
BEGIN
-- Sigh, stupid DOS/Unix newline stuff. If HTMLDB has generated the file,
-- it will be a Unix text file. If user has manually created the file, it
-- will have DOS newlines.
-- If the file has a DOS newline (cr+lf), use that
-- If the file does not have a DOS newline, use a Unix newline (lf)
IF (NVL(dbms_lob.instr(p_blob,utl_raw.cast_to_raw(l_record_separator),1,1),0)=0)
THEN
l_record_separator := chr(10);
END IF;
l_last := 1;
LOOP
l_current := dbms_lob.instr( p_blob, utl_raw.cast_to_raw(l_record_separator), l_last, 1 );
EXIT WHEN (nvl(l_current,0) = 0);
p_records(p_records.count+1) := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob,l_current-l_last,l_last));
l_last := l_current+length(l_record_separator);
END LOOP;
END get_records; --}}}
-- Utility functions --{{{
PROCEDURE parse_textarea ( --{{{
p_textarea IN VARCHAR2,
p_collection_name IN VARCHAR2
)
IS
l_index INTEGER;
l_string VARCHAR2(32767) := TRANSLATE(p_textarea,chr(10)||chr(13)||' ,','@@@@');
l_element VARCHAR2(100);
BEGIN
l_string := l_string||'@';
htmldb_collection.create_or_truncate_collection(p_collection_name);
LOOP
l_index := instr(l_string,'@');
EXIT WHEN NVL(l_index,0)=0;
l_element := substr(l_string,1,l_index-1);
IF (trim(l_element) IS NOT NULL)
THEN
htmldb_collection.add_member(p_collection_name,l_element);
END IF;
l_string := substr(l_string,l_index+1);
END LOOP;
END parse_textarea; --}}}
PROCEDURE parse_file( --{{{
p_file_name IN VARCHAR2,
p_collection_name IN VARCHAR2,
p_headings_item IN VARCHAR2,
p_columns_item IN VARCHAR2,
p_ddl_item IN VARCHAR2,
p_table_name IN VARCHAR2 DEFAULT NULL
)
IS
l_blob blob;
l_records varchar2_t;
l_record wwv_flow_global.vc_arr2;
l_datatypes wwv_flow_global.vc_arr2;
l_headings VARCHAR2(4000);
l_columns VARCHAR2(4000);
l_seq_id NUMBER;
l_num_columns INTEGER;
l_ddl VARCHAR2(4000);
BEGIN
IF (p_table_name is not null)
THEN
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN NULL;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P149_DEBUG',l_ddl);
execute immediate l_ddl;
l_ddl := 'insert into '||p_table_name||' '||
'select '||v(p_columns_item)||' '||
'from htmldb_collections '||
'where seq_id > 1 and collection_name='''||p_collection_name||'''';
htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
execute immediate l_ddl;
RETURN;
END IF;
BEGIN
select blob_content into l_blob from wwv_flow_files
where name=p_file_name;
EXCEPTION
WHEN NO_DATA_FOUND THEN
raise_application_error(-20000,'File not found, id='||p_file_name);
END;
get_records(l_blob,l_records);
IF (l_records.count < 3)
THEN
raise_application_error(-20000,'File must have at least 3 ROWS, id='||p_file_name);
END IF;
-- Initialize collection
htmldb_collection.create_or_truncate_collection(p_collection_name);
-- Get column headings and datatypes
csv_to_array(l_records(1),l_record);
csv_to_array(l_records(2),l_datatypes);
l_num_columns := l_record.count;
if (l_num_columns > 50) then
raise_application_error(-20000,'Max. of 50 columns allowed, id='||p_file_name);
end if;
-- Get column headings and names
FOR i IN 1..l_record.count
LOOP
l_headings := l_headings||':'||l_record(i);
l_columns := l_columns||',c'||lpad(i,3,'0');
END LOOP;
l_headings := ltrim(l_headings,':');
l_columns := ltrim(l_columns,',');
htmldb_util.set_session_state(p_headings_item,l_headings);
htmldb_util.set_session_state(p_columns_item,l_columns);
-- Get datatypes
FOR i IN 1..l_record.count
LOOP
l_ddl := l_ddl||','||l_record(i)||' '||l_datatypes(i);
END LOOP;
l_ddl := '('||ltrim(l_ddl,',')||')';
htmldb_util.set_session_state(p_ddl_item,l_ddl);
-- Save data into specified collection
FOR i IN 2..l_records.count
LOOP
csv_to_array(l_records(i),l_record);
l_seq_id := htmldb_collection.add_member(p_collection_name,'dummy');
-- Use a distinct loop variable to avoid shadowing the outer loop's i
FOR j IN 1..l_record.count
LOOP
htmldb_collection.update_member_attribute(
p_collection_name=> p_collection_name,
p_seq => l_seq_id,
p_attr_number => j,
p_attr_value => l_record(j));
END LOOP;
END LOOP;
DELETE FROM wwv_flow_files WHERE name=p_file_name;
END;
BEGIN
NULL;
END;
/ -
Error log table for LSMW-Direct input
Can anyone tell me, what is the table for storing the error messages in LSMW direct input?
Regards,
Nagesh

For the LSMW direct input method you can use EXECUTE IN BACKGROUND. The job runs; check the job name in SM37. After your job completes, select your job name and press the SPOOL button; there you can find your job's error log.
-
Use global temp table for DML error logging
our database is 11.2.0.4 enterprise edition on solaris 10
we are wondering if anyone has an opinion on, or has tried, using a global temporary table for DML error logging. We have a fairly busy transactional database with 2 hot tables for inserts. The regular error table created with dbms_errlog has caused many deadlocks which we don't quite understand yet. We thought of using a global temporary table for the purpose, and that seemed to work, but we can't read the errors from the GTT; the table is empty even when reading from the same session as the inserts. Does anyone have an idea why?
Thanks

The insert into the error logging table is done with a recursive transaction, so it is private from your session which is doing the actual insert.
Adapted from http://oracle-base.com/articles/10g/dml-error-logging-10gr2.php
INSERT INTO dest
SELECT *
FROM source
LOG ERRORS INTO err$_dest ('INSERT') REJECT LIMIT UNLIMITED;
99,998 rows inserted.
select count(*) from dest;
COUNT(*)
99998
SELECT *
FROM err$_dest
WHERE ora_err_tag$ = 'INSERT';
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 1000 Description for 1000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 10000 Description for 10000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 1000 Description for 1000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 10000 Description for 10000
rollback;
select count(*) from dest;
COUNT(*)
0
SELECT *
FROM err$_dest
WHERE ora_err_tag$ = 'INSERT';
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 1000 Description for 1000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 10000 Description for 10000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 1000 Description for 1000
1400 "ORA-01400: cannot insert NULL into ("E668983_DBA"."DEST"."CODE")" I INSERT 10000 Description for 10000 -
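A plausible explanation for the empty GTT, consistent with the recursive-transaction point above: the error rows are written and committed by an internal transaction separate from yours, so a transaction-specific GTT has already been purged by the time your session queries it. A sketch of the failure mode; the table definition is illustrative (a real error log table would also carry the data columns of DEST):

```sql
-- Hypothetical setup mirroring the situation above: an error log table
-- defined as a GTT (standard DBMS_ERRLOG control columns shown only):
CREATE GLOBAL TEMPORARY TABLE err$_dest (
  ora_err_number$ NUMBER,
  ora_err_mesg$   VARCHAR2(2000),
  ora_err_rowid$  UROWID,
  ora_err_optyp$  VARCHAR2(2),
  ora_err_tag$    VARCHAR2(2000)
) ON COMMIT DELETE ROWS;

INSERT INTO dest
SELECT * FROM source
LOG ERRORS INTO err$_dest ('INSERT') REJECT LIMIT UNLIMITED;

-- The recursive transaction that logs the errors commits independently,
-- so the transaction-specific GTT is empty when your session reads it:
SELECT COUNT(*) FROM err$_dest;
```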
EAI Error logging - Log input/output XML to table
Hi Gurus,
In our project we are designing error handling mechanism for our EAI workflows.
Along with the common fields, generally we use in such scenarios (Object type, Row_id, User Login, Workflow Process,Error Code, Error Message..etc) we would like to save the input and output XML to the EAI log table.
Here I have couple of questions.
1) Is it a good approach to store all I/O messages in the table? The XML message size is not huge, of course; it is just customer profile information without any child objects.
2) Which data type in Siebel and in the Oracle DB table column will support storing the XML message? Do we need to convert the XML to a normal string to store it?
3) How do others (other projects) make use of this kind of error log table? Do I need to design a BC, BO, applet and view for this?
Please provide me your valuable suggestions.
Regards
user4619223

Please have a look at the Metalink (metalink3.oracle.com) HowTo:
Are there vanilla tables existing in Siebel schema that log system errors? (Doc ID 860443.1)
If you are looking into doing this.
Axel -
Limitation on DML error logging
Can someone please advise what is not supported by DML error logging which was introduced in Oracle 10g?
One Oracle document says "You cannot track errors in the error logging table for LONG, LOB, or object type columns".
Another says only non-scalar columns, like datetime, are not supported.
Another says it doesn't support nested tables.
Does anyone have a more complete list of limitations?
Thanks!
Edited by: user611482 on Feb 9, 2010 3:32 PM

Please read the following:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#sthref2234
HTH
Aman.... -
BULK INSERT into View w/ Instead Of Trigger - DML ERROR LOGGING Issue
Oracle 10.2.0.4
I cannot figure out why I cannot get bulk insert errors to aggregate and allow the insert to continue when bulk inserting into a view with an Instead of Trigger. Whether I use LOG ERRORS clause or I use SQL%BULK_EXCEPTIONS, the insert works until it hits the first exception and then exits.
Here's what I'm doing:
1. I'm bulk inserting into a view with an Instead Of Trigger on it that performs the actual updating on the underlying table. This table is a child table with a foreign key constraint to a reference table containing the primary key. When the Instead Of Trigger attempts to insert a record into the child table, I get the following exception: 5:37:55 ORA-02291: integrity constraint (FK_TEST_TABLE) violated - parent key not found, which is expected, but the error should be logged in the table and the rest of the inserts should complete. Instead the bulk insert exits.
2. If I change this to bulk insert into the underlying table directly, it works, all errors get put into the error logging table and the insert completes all non-exception records.
Here's the "test" procedure I created to test my scenario:
View: V_TEST_TABLE
Underlying Table: TEST_TABLE
PROCEDURE BulkTest
IS
TYPE remDataType IS TABLE of v_TEST_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
varRemData remDataType;
begin
select /*+ DRIVING_SITE(r)*/ *
BULK COLLECT INTO varRemData
from TEST_TABLE@REMOTE_LINK
where effectiveday < to_date('06/16/2012 04','mm/dd/yyyy hh24')
and terminationday > to_date('06/14/2012 04','mm/dd/yyyy hh24');
BEGIN
FORALL idx IN varRemData.FIRST .. varRemData.LAST
INSERT INTO v_TEST_TABLE VALUES varRemData(idx) LOG ERRORS INTO dbcompare.ERR$_TEST_TABLE ('INSERT') REJECT LIMIT UNLIMITED;
EXCEPTION WHEN others THEN
DBMS_OUTPUT.put_line('ErrorCode: '||SQLCODE);
END;
COMMIT;
end;
I've reviewed Oracle's documentation on both DML logging tools and neither has any restrictions (at least that I can see) that would prevent this from working correctly.
Any help would be appreciated....
Thanks,
Steve

Thanks, obviously this is my first post; I'm desperate to figure out why this won't work.
The code I sent is only a test proc to try to troubleshoot the issue. The debug statement is only there to capture the insert failing and not aggregating the errors; it won't be in the real proc.
Thanks,
Steve -
Error logging using DBMS_ERRLOG package
Hi All,
We have the following tables, which are growing day by day and causing performance problems.
There are 25 tables in total (1 parent and 24 child tables, with different structures).
Here I give some samples, NOT the actual table structures.
Actually, we don't require all the data for our regular activities,
so we thought of moving part of the data into other (archive) tables on a daily basis.
Using some criteria, we find the records eligible to be moved into the archive tables.
All child records follow the parent.
Original Tables
==================
create table customer (c_id number(5), c_name varchar2(10),c_address varchar2(10));
create table orders (o_id number(5),c_id number(5),o_info clob);
create table personal_info (p_id number(5),c_id number(5), age number(3), e_mail varchar2(25), zip_code varchar2(10));
Archive Tables
==============
create table customer_arch (c_id number(5), c_name varchar2(10),c_address varchar2(10));
create table orders_arch (o_id number(5),c_id number(5),o_info varchar2(100));
create table personal_info_arch (p_id number(5),c_id number(5), age number(3), e_mail varchar2(25), zip_code varchar2(10));
Temp table
==========
create table C_temp (rnum number(5), ids number(5));
Sample Code
============
PROCEDURE payment_arch
IS
l_range_records NUMBER (4) := 2000;
l_total_count NUMBER(10) := 0;
l_prev_count NUMBER(10) := 0;
l_next_count NUMBER(10) := 0;
BEGIN
--Finding eligible records to be moved into Archive tables.
INSERT INTO C_TEMP
SELECT ROWNUM,c_id FROM customer;
SELECT NVL(MAX(rnum),0) INTO l_total_count FROM C_TEMP;
IF l_total_count > 0 -- Start Count check
THEN
LOOP -- Insert Single payments
IF ((l_total_count - l_prev_count) >= l_next_count )
THEN
l_next_count := l_prev_count + l_range_records;
ELSE
l_next_count := l_total_count;
END IF;
l_prev_count := l_prev_count ;
INSERT INTO customer_ARCH
SELECT * FROM customer a
WHERE c_id in (SELECT c_id
FROM C_TEMP b WHERE rnum BETWEEN l_prev_count AND l_next_count);
INSERT INTO orders_ARCH
SELECT * FROM orders a
WHERE c_id in (SELECT c_id
FROM C_TEMP b WHERE rnum BETWEEN l_prev_count AND l_next_count);
INSERT INTO personal_info_ARCH
SELECT * FROM personal_info a
WHERE c_id in (SELECT c_id
FROM C_TEMP b WHERE rnum BETWEEN l_prev_count AND l_next_count);
-- Delete Archived Single Payments
DELETE customer a
WHERE c_id in (SELECT c_id
FROM C_TEMP b WHERE rnum BETWEEN l_prev_count AND l_next_count);
COMMIT;
IF l_next_count = l_total_count
THEN
EXIT;
else
l_prev_count := l_next_count;
END IF;
END LOOP; -- Insert Single payments
END IF; -- End Count check
EXCEPTION
WHEN NO_DATA_FOUND
THEN
NULL;
WHEN OTHERS
THEN
RAISE_APPLICATION_ERROR('-20002','payment_arch: ' || SQLCODE ||': ' || SQLERRM);
END Payment_Arch;
In production, we may require to archive 25000 Parent records and 25000*4*3 child records per day.
Now the problem is:
By any chance, if a record fails, we want to know the exact record (just the c_id) for the particular table
where the error was raised, and the error message.
We thought of using the DBMS_ERRLOG package, but we CANNOT log errors from different tables into a SINGLE error log table.
In the above case it requires 3 different tables, and each logs all columns from the original table.
It's an unnecessary burden on the database.
We would be very glad if anyone can help us with some good thoughts.
Thanks in advance.
srinivas

duplicate post
Insufficient privilege error when executing DBMS_ERRLOG through PLSQL -
When I tried
EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('table');
in Oracle Database XE, I got error: ORA-00900: invalid SQL statement.
Error logging tables are a new feature in 10g. Does XE have this feature?
Thanks for the help!

It works!!!
The issue only happens in Oracle SQL Developer and in the Oracle console opened in an IE browser.
The issue also happens in SQL Command Line when "begin" and "end" are used as the beginning and ending of the statement:
SQL> begin
2 EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('MIGR_STOCK')
3 end;
4 /
EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('MIGR_STOCK')
ERROR at line 2:
ORA-06550: line 2, column 9:
PLS-00103: Encountered the symbol "DBMS_ERRLOG" when expecting one of the
following:
:= . ( @ % ; immediate
The symbol ":=" was substituted for "DBMS_ERRLOG" to continue.
ORA-06550: line 3, column 1:
PLS-00103: Encountered the symbol "END" when expecting one of the following:
. ( * % & = - + ; < / > at in is mod remainder not rem
<an exponent (**)> <> or != or ~= >= <= <> and or like LIKE2_
LIKE4_ LIKEC_ between || multiset m
SQL>
However, it works well when executing a one-line command:
SQL> DROP TABLE ERR$_MIGR_STOCK CASCADE CONSTRAINTS PURGE;
Table dropped.
SQL> EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('MIGR_STOCK');
PL/SQL procedure successfully completed.
Don't know why, but thank you very much! -
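For the record, this is standard SQL*Plus behaviour rather than anything specific to XE: EXECUTE is a SQL*Plus client command, shorthand for wrapping a single call in an anonymous block. It is not PL/SQL syntax, so it cannot appear inside a BEGIN...END block; there you call the procedure directly:

```sql
-- At the SQL*Plus prompt, EXECUTE is client-side shorthand:
EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('MIGR_STOCK');

-- Inside a PL/SQL block, call the procedure directly instead:
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG('MIGR_STOCK');
END;
/
```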
DML Error logging with delete restrict
Hi,
I am trying to log all DML errors while performing an ETL process. We encountered a problem when one of the ON DELETE CASCADE clauses was missing on a child table, but I was curious why that exception reached the calling environment, because we are logging all DML errors in ERR$_ tables. Our expectation was that when we get a child-record-found violation, the error would be logged into the ERR$_ tables and the process would carry on without interruption, but it was interrupted in the middle and terminated. I can illustrate with the example below.
T1 -> T2 -> T3
T1 is the parent and it is the root.
Create table t1 (id number primary key, id2 number);
Create table t2(id number references t1(id) on delete cascade, id2 number);
create table t3 (id number references t2(id)); -- Missing on delete cascade
insert into t1 as select level, level from dual connect by level < 20;
insert into t2 as select level, level from dual connect by level < 20;
insert into t3 as select level from dual connect by level < 20;
exec dbms_errlog(t1);
exec dbms_errlog(t2);
exec dbms_errlog(t3);
delete from t1 where id = 1 log errors into err$_t1 reject limit unlimited; -- Child record found violation due to t3 raised but I am expecting this error will be trapped in log tables.
delete from t2 where id =1 log errors into err$_t2 reject limit unlimited; -- Got the same error child record violation. My expectation error will be logged into log tables.
I am using Oracle 11gR2.
Also, please let me know if there are any restrictions on using DML error logging with DBMS_PARALLEL_EXECUTE.
Please advise
Thanks,
Umakanth

What is the error you want me to fix? The missing ON DELETE CASCADE?
The code you posted has multiple syntax errors and can't possibly run. You should post code that actually works.
My expectation is all the DML errors will be logged into error logging tables even if it is child record found violation.
delete from t1 where id = 1 log errors into err$_t1 reject limit unlimited; -- Child record found violation due to t3 raised but I am expecting this error will be trapped in log tables.
delete from t2 where id =1 log errors into err$_t2 reject limit unlimited; -- Got the same error child record violation. My expectation error will be logged into log tables.
DML error logging logs DATA. When you delete from T1 there is an error because the T2 child record can NOT be deleted. So the T1 row that was being deleted is logged into the T1 error log table. The request was to delete a T1 row so that is the only request that failed; the child rows in T2 and T3 will not be put into log tables.
Same when you try to delete from T2. The T3 child record can NOT be deleted so the T2 row that was being deleted is logged into the T2 error log table.
The exceptions that occur are NOT logged, only the data that the DML could not be performed on.
After I fixed your code your example worked fine for me and logged into the DML error tables as expected. But I wasn't doing it from a client.
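For anyone reproducing this, a corrected version of the setup might look like the following. It is a sketch adapted from the original post: the INSERT syntax and the DBMS_ERRLOG calls are fixed, and a primary key is added to t2.id (required for t3 to reference it); the responder reports that with such fixes the failing delete is logged rather than raised.

```sql
CREATE TABLE t1 (id NUMBER PRIMARY KEY, id2 NUMBER);
-- t2.id needs a key of its own so that t3 can reference it:
CREATE TABLE t2 (id NUMBER PRIMARY KEY REFERENCES t1(id) ON DELETE CASCADE,
                 id2 NUMBER);
CREATE TABLE t3 (id NUMBER REFERENCES t2(id));  -- missing ON DELETE CASCADE

-- INSERT ... SELECT takes no AS keyword:
INSERT INTO t1 SELECT level, level FROM dual CONNECT BY level < 20;
INSERT INTO t2 SELECT level, level FROM dual CONNECT BY level < 20;
INSERT INTO t3 SELECT level FROM dual CONNECT BY level < 20;

-- Error log tables are created via DBMS_ERRLOG.CREATE_ERROR_LOG:
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG('T1');
  DBMS_ERRLOG.CREATE_ERROR_LOG('T2');
END;
/

-- The cascade from t1 to t2 is blocked by t3's plain FK, so the delete
-- of the t1 row fails; that t1 row is logged in err$_t1 and execution
-- continues instead of raising ORA-02292 to the caller:
DELETE FROM t1 WHERE id = 1
LOG ERRORS INTO err$_t1 REJECT LIMIT UNLIMITED;
```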