Using an on update or insert trigger on a SAP table
Hi all,
A question for you regarding using a database trigger in an SAP system.
We need to export data for our data warehouse. Currently we export all the data. That data is getting quite large, though, and time is becoming an issue.
One suggestion has been to add a custom field to an existing SAP table to hold a flag indicating that the record has changed but has not yet been extracted, then put a trigger on the table in question so that the new field is updated upon update or insert of the record. When the extract runs, it will only extract those records with the flag set, then reset the flag.
We are using ERP 6 and SQL Server 2005 (or soon will be, as we are upgrading next month).
Any suggestions regarding triggers? Or other options for flagging changed records in large tables so we don't have to extract all records every time?
Thanks
Laurie
Edited by: Laurie McGinley on Oct 27, 2008 10:51 PM
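For illustration only, the kind of trigger described above might look roughly like this on SQL Server 2005. The table name, key columns and the ZZEXTRACT_FLAG custom field are all hypothetical, and the reply below argues against doing this at all in an SAP system:

```sql
-- Hypothetical sketch: flag changed rows for delta extraction.
-- ZZEXTRACT_FLAG is an assumed custom column; MANDT/KEYFIELD are placeholder keys.
-- Assumes the RECURSIVE_TRIGGERS database option is OFF (the default),
-- so the UPDATE below does not re-fire this trigger.
CREATE TRIGGER trg_mark_changed ON dbo.SOMETABLE
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE t
       SET t.ZZEXTRACT_FLAG = 'X'
      FROM dbo.SOMETABLE AS t
      JOIN inserted AS i
        ON i.MANDT = t.MANDT AND i.KEYFIELD = t.KEYFIELD;
END;
```

The extract job would then read only rows WHERE ZZEXTRACT_FLAG = 'X' and reset the flag in the same transaction.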
Hi Laurie,
You just really shouldn't consider putting triggers into your application.
The problem here is the very nature of triggers: they do things 'in the background'.
If you have problems due to the trigger, you won't see them anywhere in the SAP environment.
After a while you will probably forget about the trigger altogether, and then nobody can see what is happening to the database, because something is being changed "in the background".
Besides the fact that you won't get support from SAP for this, you will make your life a lot harder.
Table definitions may change - your trigger might break.
Put application logic where it belongs - in the application layer!
regards,
Lars
Similar Messages
-
Insert the data in sap tables from C SHARP application
Hi All,
I want to save some data in an SAP table from my C# (.NET) Windows application. I tried with the help of the SAP .NET Connector, but that connector is not supported in Visual Studio 2005, and I do not have Visual Studio 2003. So please, can anyone help me with how I can do this?
If anyone has an idea, then please also give me an example of how to do it.
thanks
regards
sandeep Dabral
You have to use the SAP .NET Connector to make an interface with SAP. This is the better way of doing an interface between .NET and SAP.
You can create a web service (WSDL) in .NET 2003 and try to use that in .NET 2005.
You're getting the two technologies confused ......
.NET Connector is used for RFC-type interfaces. It calls BAPIs directly. Web services are not involved in this type of interface.
WSDL files are used to generate proxies for the .NET client to call web services (typically web-enabled BAPIs). This type of interface uses the SOAP protocol, not RFC, and does not use the .NET Connector. Enterprise Web Services may be discovered using the .NET Enterprise Service Explorer, which is a different component from the .NET Connector.
Regards,
D. -
Using Merge to Insert, Update, and also delete obsolete records
Hi all!!!
Suppose that a have the following table, called Orders:
ORDERID STATUS PRICE
1 0 100
2 0 200
3 0 300
4 0 350
5 0 390
Also, there is a procedure that will generate some data in a global temporary table, called Orders_Temp. This temporary table will be used to update the Orders table. Orders has a trigger that will perform different operations when an Insert, Delete or Update command is performed.
Consider that the Orders_Temp table has the following data after running the procedure that is responsible for updating orders data:
ORDERID STATUS PRICE
1 1 100
2 2 200
3 0 300
5 0 390
6 0 350
As you can see, orders 1 and 2 were updated. Orders 3 and 5 were untouched, order 4 was deleted and order 6 was inserted. Merge will correctly handle orders 1, 2, 3, 5 and 6, though it generates an update for orders 3 and 5, which were not changed.
So, here are the questions:
1) How could I delete the order 4 from Orders, since this order was not present on the Orders_Temp? Is there a way to use the merge command to do this?
2) Is there any way to not generate a update operation for orders 3 and 5, that were not changed?
I'm using Oracle 10.2 SE.
Thanks a lot!
Regis
Hello G!
Thank you so much for your help.
Your code needed only one modification:
MERGE INTO orders tgt
USING (SELECT A.*, 1 mask
FROM orders_temp A
UNION ALL
SELECT A.*, 0
FROM orders A
WHERE NOT EXISTS
(SELECT 1
FROM orders_temp b
WHERE A.order_id = b.order_id)) src
ON (tgt.order_id = src.order_id)
WHEN MATCHED
THEN
UPDATE SET tgt.status = src.status, tgt.price = src.price
/*WHERE src.mask = 1
AND tgt.status != src.status
AND tgt.price != src.price*/
DELETE
WHERE src.mask = 0
WHEN NOT MATCHED
THEN
INSERT VALUES (src.order_id, src.status, src.price);
Without doing this, the DELETE WHERE clause was never called. After reading the docs, I found out why:
Specify the DELETE where_clause to clean up data in a table while populating or updating it. The only rows affected by this clause are those rows in the destination table that are updated by the merge operation. That is, the DELETE WHERE condition evaluates the updated value, not the original value that was evaluated by the UPDATE SET ... WHERE condition. If a row of the destination table meets the DELETE condition but is not included in the join defined by the ON clause, then it is not deleted. Any delete triggers defined on the target table will be activated for each row deletion.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_9016.htm
From what I understand, if we restrict the records in the UPDATE set_clause, the DELETE where_clause will not be able to process them because they were not touched. So, as a workaround, I'm checking whether the record was changed inside my trigger.
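For question 2, one hedged sketch (using the same columns and mask as the MERGE above): keep rows with mask = 0 eligible for the UPDATE so the DELETE clause can still reach them, and use DECODE for a null-safe change test, since the plain != comparisons in the commented-out WHERE never match when either side is NULL:

```sql
WHEN MATCHED THEN
  UPDATE SET tgt.status = src.status, tgt.price = src.price
  -- DECODE(a, b, 1, 0) = 0 means "a differs from b", treating NULL = NULL as equal.
  -- mask = 0 rows are "updated" to their own values so the DELETE clause can see them.
  WHERE src.mask = 0
     OR decode(tgt.status, src.status, 1, 0) = 0
     OR decode(tgt.price, src.price, 1, 0) = 0
  DELETE
  WHERE src.mask = 0
WHEN NOT MATCHED THEN
  INSERT VALUES (src.order_id, src.status, src.price)
```

This is only a sketch of the WHEN MATCHED / WHEN NOT MATCHED clauses; as the documentation quoted above notes, DELETE WHERE only evaluates rows the merge actually updated.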
Thanks a lot!
Regards.
Regis -
Use of DBMS_LOB.LoadCLOBFromFile on insert trigger
We have a system where most of the data is stored on disk as BFILEs. However, in certain circumstances I'd like to read some of the files into the database as CLOBs as well. I'd like to do this on record creation and thought the easiest way would be via a trigger on insert (of the row containing the BFILE) using DBMS_LOB.LoadCLOBFromFile. However, dbms_lob.fileexists(bfile) returns 0 when executed in the trigger (below). The same command returns 1 after record creation. Can I not reference a BFILE until after its creation?
create or replace
TRIGGER "MADAM2"."TRIGGER_CLOB"
AFTER INSERT ON DATAFILES
FOR EACH ROW
declare
dest_clob CLOB;
src_clob BFILE;
dst_offset number := 1;
src_offset number := 1;
lang_ctx number := DBMS_LOB.DEFAULT_LANG_CTX;
warning number;
begin
if lower(:new.mime_type) like '%text%' and (:new.doc_size/ (1024 *1024)) < 1 then
src_clob := :new.path;
IF (dbms_lob.fileexists(src_clob)) = 1 THEN
insert into clob_data (fileid, file_text) values (:new.fileid, empty_clob()) returning file_text into dest_clob;
DBMS_LOB.OPEN(src_clob, DBMS_LOB.LOB_READONLY);
DBMS_LOB.LoadCLOBFromFile(
DEST_LOB => dest_clob
, SRC_BFILE => src_clob
, AMOUNT => DBMS_LOB.GETLENGTH(src_clob)
, DEST_OFFSET => dst_offset
, SRC_OFFSET => src_offset
, BFILE_CSID => DBMS_LOB.DEFAULT_CSID
, LANG_CONTEXT => lang_ctx
, WARNING => warning
);
DBMS_LOB.CLOSE(src_clob);
end if;
end if;
end;
Please ignore - I think the problem is with some Java code which is moving the file after the insert. Apologies.
-
Use of the "updlock" hint with update and insert statements
I have inherited some stored procedures and am trying to figure out why the developers decided to use the "updlock" hint on many of the update and insert statements. I have looked around everywhere and have found only one explanation of why "update...with (updlock)" can be useful, namely when a table has no clustered index:
http://www.sqlnotes.info/2012/10/10/update-with-updlock/
I have found nothing yet that mentions why "insert into...with (updlock)" might be used. I understand why the hint might be useful on select statements in some cases, but if all of the tables have clustered indexes, is there any good reason to use it on update and insert statements?
Thanks,
Ron
Ron Rice
This form of deadlock error can occur on a table which has a clustered index.
If you are doing updates on a table which has a clustered index and that table also has a nonclustered index, and the nonclustered index is used to find the row to update, you can see this type of deadlock. For example, create a table with a clustered primary key index and a nonclustered index by running
Create Table Foo(PK int primary key identity, OtherKey varchar(10), OtherData int);
go
Insert Foo Default Values;
go 10000
Update Foo Set OtherKey = 'C' + Cast(PK As varchar(10))
Create Unique Index FooIdx On Foo(OtherKey);
That creates a table with 10000 rows, a clustered index and a nonclustered index. Then run
Begin Transaction
Update Foo Set OtherData = 1 Where OtherKey = 'C5'
That will use the FooIdx index to find the row that needs to be updated. It will get a U lock on the index row in the FooIdx index, then an X lock on the row in the clustered index, update that row, then free the U lock on FooIdx, but keep the X lock
on the row in the clustered index. (There is other locking going on, but to simplify things, I'm only showing the locks that lead to the deadlock).
Then in another window, run
Begin Transaction
Update Foo Set OtherData = 2 Where OtherKey = 'C5'
This will get a U lock on the index row in the FooIdx index, then try to get an X lock on the row in the clustered index. But that row is already exclusively locked, so this second window will wait holding a U lock on FooIdx row and is waiting for
an X lock on the clustered index row.
Now go back to the first window and run
Update Foo Set OtherData = 3 Where OtherKey = 'C5'
This will once again try to get the U lock on the FooIdx row, but it is blocked by the U lock the second window holds. Of course the second window is blocked by the X lock on the clustered index row and you have a deadlock.
All that said, I certainly do not routinely code my updates with UPDLOCK. I try to design databases and write code so that deadlocks will be rare without holding excessive locks. The more locks you hold and the longer you hold them, the more blocking you will get and the slower your system will run. So I write code such that if a deadlock exception occurs, it is properly handled. Then if too many deadlocks occur, that is the time to go back to the code to see what changes are needed to decrease the number of deadlocks (one way to do that may be to get locks earlier and/or hold them longer).
But I wouldn't worry much about this form of deadlock. It is, in my experience, very rare. I don't recall ever seeing it in a production environment.
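For completeness, the case where UPDLOCK genuinely helps is the read-then-update pattern on a select, as mentioned in the question. A sketch using the same Foo table as above (not a claim about the inherited procedures):

```sql
BEGIN TRANSACTION;
-- Taking UPDLOCK on the read makes a second session block here,
-- instead of both sessions reading under shared locks and then
-- deadlocking when each tries to convert to an exclusive lock.
SELECT OtherData FROM Foo WITH (UPDLOCK) WHERE OtherKey = 'C5';
UPDATE Foo SET OtherData = OtherData + 1 WHERE OtherKey = 'C5';
COMMIT;
```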
Tom -
Pre-insert-trigger ignoring assignments
Hi,
I have this code in a PRE-INSERT-TRIGGER of a database table block:
Select emp_seq.Nextval Into :EMP.EMPNO From dual;
Select Sysdate Into :EMP.LASTCHANGED From dual;
And i have this code in a PRE-UPDATE-TRIGGER of the same block:
Select Sysdate Into :EMP.LASTCHANGED From dual;
given scenario:
1) query records from the table into a block:
empno ename lastchanged
1 Smith 01.12.2008
2 Johnson 01.12.2008
2) change empname in any record except no. 1:
empno ename lastchanged
1 Smith 01.12.2008
2 Johannson 01.12.2008
3) create a new record somewhere above the changed record
empno ename lastchanged
1 Smith 01.12.2008
<null> <null> <null>
2 Johannson 01.12.2008
4) insert ename in new record
empno empname lastchanged
1 Smith 01.12.2008
<null> Obama <null>
2 Johannson 01.12.2008
5) do_key('commit_form')
with Forms 6.0.8.23.2 -> working fine
with Forms 6.0.8.27.0 -> ORA-01400:: cannot insert NULL into ("EMP"."EMPNO")
Any assignment in the pre-insert-trigger is ignored! Can anyone help me with this bug? Thanks in advance
message(:EMP.EMPNO) is always Null ...
Everything works if there's no update of a record with a higher record number in the transaction. But the scenario in post 1 doesn't work with Forms 6.0.8.27.0
Ugh -- that's ugly!
Unfortunately, opening a Service Request with Oracle will get you nowhere, since Oracle no longer supports Forms 6i.
Yesterday, I experienced something very similar with the same version of Forms, specifically this part: "Everything works if there's no update of a record with a higher record-number in the transaction."
If I updated a higher record number in the block, I could NOT get Forms to subsequently store a value in a column in a prior record. I would set the value in the column, and immediately display the value, and it was null! Fortunately in my case, the stored value was only to enable skipping a database lookup in a subsequent pass, so I just skipped working on a solution.
However, in your case, the problem is a show-stopper.
What I found was that if I navigated back to the first record in the block, the problem went away. So maybe try this in your commit process:
Go_block('ABC');
First_Record;
Synchronize;
Commit_Form;
Let us know if that works for you. -
Hi...I have a custom after insert trigger on the PO_DISTRIBUTIONS_ALL table
It is having an issue with getting the req_distribution_id.
I took out all the code and left in just a test, which only inserts the id into a test table. But it is always null! Can someone tell me why? There is an ID when the record is created, and I see it after I query the table when it is finished. But I figured that after insert it would be there!
Below is the trigger...
create or replace
TRIGGER APPS.xxmc_po_distributions_auir
AFTER INSERT OR UPDATE ON PO_DISTRIBUTIONS_ALL
REFERENCING NEW AS new OLD AS old
FOR EACH ROW
DECLARE
v_trigger_location VARCHAR2(2000):= 'Declaration';
v_error_message VARCHAR2(2000);
BEGIN
delete from test_table;
insert into test_table values(to_char(:new.req_distribution_id));
END;
This enters null every time into the table. Why is that?
-J
OK, my new trigger has the DML statements in a procedure and it still fails. The ID is always null. Below is the trigger... why is the id coming up null???
:new.req_distribution_id should never be a null value in my opinion, right?
create or replace
TRIGGER APPS.xxmc_po_distributions_auir
AFTER INSERT OR UPDATE ON PO_DISTRIBUTIONS_ALL
REFERENCING NEW AS new OLD AS old
FOR EACH ROW
DECLARE
v_trigger_location VARCHAR2(2000):= 'Declaration';
v_error_message VARCHAR2(2000);
/* Error Tracking local variables */
v_user_id NUMBER := FND_GLOBAL.User_Id;
v_inv_record_id_s NUMBER := 0;
v_mycall_status VARCHAR2(20);
v_myerror_msg VARCHAR2(200);
v_ric_code VARCHAR2(10);
v_tracking_id VARCHAR2(200);
v_source_system xxmc_track_summary.source_system%TYPE := 'XXMC_CRM';
v_destination_system xxmc_track_summary.destination_system%TYPE := 'XXMC_CRM';
v_rice_name xxmc_track_summary.rice_object_name%TYPE := 'E120';
v_failure_record_count xxmc_err_track_det.error_count%TYPE := 0;
v_module_name xxmc_err_track_det.procedure_name%TYPE := 'XXMC_PO_DISTRIBUTIONS_AUIR';
v_stg_tablename xxmc_track_summary.staging_table_name%TYPE := '';
v_processed INTEGER := 0;
v_failed INTEGER := 0;
v_total INTEGER := 0;
v_commit INTEGER := 0;
e_error_tracking_exception EXCEPTION;
CURSOR c_get_req_line_info(p_req_distribution_id IN NUMBER) IS
SELECT PORH.SEGMENT1
, porh.attribute1 po_heading_att1
, porl.line_num
, porl.destination_context
, porl.attribute1
, porl.attribute3
, porl.attribute4
, porl.attribute5
, porl.attribute6
, porl.attribute7
, porl.attribute8
, porl.attribute9
, porl.attribute10
, porl.attribute11
, porl.attribute12
, porl.requisition_line_ID
, to_char(porl.creation_date, 'mm/dd/yyyy hh24:mi:ss') format_create_date_line
FROM PO_REQ_DISTRIBUTIONS_ALL PORD
, PO_REQUISITION_LINES_ALL PORL
, PO_REQUISITION_HEADERS_ALL PORH
WHERE PORD.DISTRIBUTION_ID = p_REQ_DISTRIBUTION_ID
AND PORL.REQUISITION_LINE_ID = PORD.REQUISITION_LINE_ID
AND PORH.REQUISITION_HEADER_ID = PORL.REQUISITION_HEADER_ID;
i number := 0;
TEMP NUMBER;
BEGIN
v_trigger_location := 'Beginning of trigger body';
KMG_WRITE('INSERTING IN XXMC TRIGGER ON PO DISTRIBUTIONS AT '
|| TO_CHAR(SYSDATE, 'MM/DD/YYYY HH24:MI:SS')
|| ' REQ_DISTRIBUTION_ID = '
|| NVL(TO_CHAR(:new.req_distribution_id), 'VALUE IS NULL') );
xxmc_tracking_pkg.create_source_tracking_record(
p_tracking_id => v_tracking_id
,p_source_system => 'XXMC_CRM'
,p_destination_system => 'XXMC_CRM'
,p_rice_object_name => v_rice_name
,p_creation_date => sysdate
,p_created_by => v_user_id
,p_transaction_type => 'XXMC_CRM'
,p_status => 'INFLIGHT'
,p_if_direction => 'O'
,p_sub_status => 'XXMC_PROCESSING_IN_EBS'
,p_src_total_records => 0
,p_call_status => v_mycall_status
,p_error_msg => v_myerror_msg);
IF v_mycall_status <> 'S' THEN
RAISE e_error_tracking_exception;
END IF;
TEMP:='AAA';
i := 0;
FOR r IN c_get_req_line_info(:new.req_distribution_id) LOOP
i := i + 1;
UPDATE po_lines_all pol
SET attribute_category = r.destination_context
, Attribute1 = r.attribute1
, Attribute3 = r.attribute3
, Attribute4 = r.attribute4
, Attribute5 = r.attribute5
, Attribute6 = r.attribute6
, Attribute7 = r.attribute7
, Attribute8 = r.attribute8
, Attribute9 = r.attribute9
, Attribute10 = r.attribute10
, Attribute11 = r.attribute11
, Attribute12 = r.attribute12
WHERE pol.po_line_id = :new.po_line_id;
UPDATE po_headers_all poh
SET attribute1 = r.po_heading_att1
WHERE poh.po_header_id = :new.po_header_id;
END LOOP;
EXCEPTION
WHEN e_error_tracking_exception THEN
kmg_write('Could not create source tracking record in XXMC PO DISTRIBUTIONS trigger. Status = '
|| v_mycall_status || ' Msg = ' || v_myerror_msg);
RAISE_APPLICATION_ERROR (
num=> -20003,
msg=> 'XXMC PO DISTRIBUTIONS trigger error in '
|| ' create_source_tracking_record '
|| v_myerror_msg);
WHEN OTHERS THEN
v_error_message := SQLERRM;
kmg_write('Error in XXMC PO DISTRIBUTIONS trigger. ' || v_error_message);
xxmc_tracking_pkg.log_then_report_errors(
p_tracking_id => v_tracking_id
,p_source_system => v_source_system
,p_destination_system => v_destination_system
,p_det_record_id => v_inv_record_id_s
,p_message_name => 'XXMC_PO_DISTRIBUTIONS'
,p_message_description => 'XXMC_PO_DISTRIBUTIONS'
,p_message_text => ltrim(rtrim(v_error_message))
,p_last_update_date => sysdate
,p_procedure_name => v_module_name
,p_staging_table_name => v_stg_tablename
,p_status => 'FAILED'
,p_sub_status => 'XXMC_PROCESSING_IN_EBS'
,p_src_success_records => v_processed
,p_src_failed_records => v_failed
,p_dest_success_records => v_processed
,p_dest_failed_records => v_failed
,p_error_count => v_failed
,p_call_status => v_mycall_status
,p_error_msg => v_myerror_msg
,p_key_field_value => v_tracking_id);
RAISE_APPLICATION_ERROR (
num=> -20001,
msg=> 'Error encountered in XXMC PO DISTRIBUTIONS AUIR Trigger '
|| ' Date = ' || to_char(sysdate, 'mm/dd/yyyy hh24:mi:ss')
|| ' SQLERRM = ' || v_error_message);
END; -
Can we insert data directly in standard table
hi
how can we insert data into a standard table directly?
Pls reply urgently.
Hi Sapna, yes you can insert data into the SAP tables, as shown below...
Using the table mara as an example:
REPORT ZTEST_INSERT.
TABLES: mara.
DATA: wa_mara TYPE mara.
wa_mara-ernam = 'ZTEST'.
INSERT INTO mara VALUES wa_mara.
IF sy-subrc = 0.
  WRITE / 'Records inserted successfully'.
ENDIF.
Reward if useful,
Vishnu. R
Edited by: vishnu ramanathan on Sep 18, 2008 2:17 PM -
Retrieving latest updated or inserted records without using a trigger
I have to retrieve the latest updated/inserted records from a set of database tables. Say, if 'n' sets of different records have been updated or inserted into one or more database tables, then all 'n' records should be retrieved using Java code, without using a trigger.
helpmeplz wrote:
Thanks for your reply.
But I don't know when, from where, or what kind of data gets inserted/updated into the tables. I need a listener or a component which can handle events occurring on the particular set of database tables and get the event data. The Java code should get the updated/inserted rows that have been inserted into a set of database tables by a third party.
Please lemme know how I can do this.
Realistically you can't.
If and only if the tables have a modification timestamp then you could use that. Every table would need it.
Other than that, the only other possibility would require that you keep an entire copy of each table in memory, poll at a set interval, and then do an entire comparison for each table. For very small data volumes (in the target tables) that is practical. For larger volumes it isn't.
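A minimal sketch of the timestamp approach, assuming every target table carries a LAST_MODIFIED column (a hypothetical name) that the writing application maintains:

```sql
-- Poll for rows changed since the last successful run.
-- :last_run is a bind variable holding the previous high-water mark.
SELECT *
  FROM some_table            -- hypothetical table name
 WHERE last_modified > :last_run
 ORDER BY last_modified;
-- After processing, store MAX(last_modified) from this batch
-- as the :last_run value for the next poll.
```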
Using Database Change Notification instead of After Insert Trigger
Hello guys! I have an after insert trigger that calls a procedure, which in turn does an update or insert on another table. Due to mutating table errors I declared the trigger and procedure as autonomously transactional. The problem is that old values of my main tables are inserted into the subtable, since the after insert/update trigger is fired before the commit.
My question is how I can solve that, and how I could use the change notification package to call my procedure. I know that this notification is only started after a DML/DDL action has been committed on a table.
If you could show me how to carry out the following code with a Database Change Notification I'd be delighted. Furthermore, I need to know whether it suffices to set up this notification only once, or for each client separately.
Many thanks for your help and expertise!
Regards,
Sebastian
declare
cnumber number (6);
begin
select count(*) into cnumber from (
select case when (select date_datum
from
(select f.date_datum,
row_number() over (order by f.objectid desc) rn
from borki.fangzahlen f
where lng_falle = :new.lng_falle
and int_fallennummer = :new.int_fallennummer
and lng_schaedling = :new.lng_schaedling
and date_datum > '31.03.2010')
where rn=1) < (select date_datum
from
(select f.date_datum,
row_number() over (order by f.objectid desc) rn
from borki.fangzahlen f
where lng_falle = :new.lng_falle
and int_fallennummer = :new.int_fallennummer
and lng_schaedling = :new.lng_schaedling
and date_datum > '31.03.2010')
where rn=2) then 1 end as action from borki.fangzahlen
where lng_falle = :new.lng_falle
and int_fallennummer = :new.int_fallennummer
and lng_schaedling = :new.lng_schaedling
and date_datum > '31.03.2010') where action = 1;
if cnumber != 0 then
delete from borki.tbl_test where lng_falle = :new.lng_falle
and int_fallennummer = :new.int_fallennummer
and lng_schaedling = :new.lng_schaedling
and date_datum > '31.03.2010';
commit;
pr_fangzahlen_tw_sync_sk(:new.lng_falle, :new.int_fallennummer, :new.lng_schaedling);
end if;
end;
It looks like you have an error in line 37 of your code. Once you fix that the problem should be resolved.
-
After update insert trigger is not working correctly
Hello experts!
I created an after insert/update trigger and what strikes me is that it is not working as expected.
The trigger launches a procedure that does an insert in a second table if values in the triggered table ("my_table") are altered.
The problem is that the values in my second table, which are correlated to "my_table", are not updated with the correct values right away. The trigger and insert trail behind!
I have to update twice for the values to appear in my second table. Only then is the data of the first update inserted into the second table, whereas the parent table ("my_table") holds the latest values.
Do you have an idea what could be wrong?
create or replace
trigger myscheme.after_update_insert_set_tw
after update or insert
on myscheme.my_table
for each row
declare
begin
pr_my_table_tw_sync_sk(:new.lng_falle, :new.int_fallennummer, :new.lng_schaedling, :new.objectid);
end;
Brgds,
Seb
Okay, I'll give my best to explain what my procedure is supposed to do and what the table structure is like. I hope it'll help you to help me :-)
My parent table is called fangzahlen and is created as follows:
CREATE TABLE "BORKI"."FANGZAHLEN"
( "OBJECTID" NUMBER(10,0) NOT NULL ENABLE,
"LNG_FALLE" NUMBER(10,0) NOT NULL ENABLE,
"LNG_BEARBEITER" NUMBER(10,0),
"DATE_DATUM" DATE DEFAULT SYSDATE NOT NULL ENABLE,
"INT_FALLENNUMMER" NUMBER(4,0) NOT NULL ENABLE,
"LNG_SCHAEDLING" NUMBER(2,0) NOT NULL ENABLE,
"INT_VOLUMEN" NUMBER(10,0),
"INT_ANZAHL" NUMBER(10,0),
"INT_ANTEIL_JUNGKAEFER" NUMBER(3,0),
"BOOL_KOEDERWECHSEL" NUMBER(1,0),
CONSTRAINT "PK_FANGZAHLEN" PRIMARY KEY ("OBJECTID")
USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "USERS" ENABLE,
CONSTRAINT "CHECK_DATE_DATUM" CHECK ("DATE_DATUM" >= '1.apr.2006') ENABLE
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "USERS" ;
It holds values such as:
OBJECTID,"LNG_FALLE","LNG_BEARBEITER","DATE_DATUM","INT_FALLENNUMMER","LNG_SCHAEDLING","INT_VOLUMEN","INT_ANZAHL","INT_ANTEIL_JUNGKAEFER","BOOL_KOEDERWECHSEL"
97548,"39","1081","08.04.10","1","2","","220","","0"
97534,"39","1081","06.04.10","1","2","","100","","-1"
My subtable is called tbl_test and is created with:
CREATE TABLE "BORKI"."TBL_TEST"
( "OBJECTID" NUMBER(12,0) NOT NULL ENABLE,
"LNG_FALLE" NUMBER(10,0),
"DATE_DATUM" DATE,
"INT_FALLENNUMMER" NUMBER(4,0),
"LNG_SCHAEDLING" NUMBER(2,0),
"INT_VOLUMEN" NUMBER(10,0),
"INT_ANZAHL" NUMBER(10,0),
"LNG_FANGZAHLEN" NUMBER(12,0)
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "USERS" ;
Okay, these were the prerequisites!
Let's concentrate on the trigger and procedure:
The purpose of the procedure is to insert data into tbl_test once a record in "fangzahlen" has been updated or inserted.
Tbl_test holds the mean average values (int_anzahl) for a number of collected items (lng_schaedling).
I need to insert date values for the combination of each lng_fangzahlen, lng_schaedling, lng_falle and int_fallennr into tbl_test.
If I had the following data in table fangzahlen:
97548,"39","1081","08.04.10","1","2","","220","","0"
97534,"39","1081","06.04.10","1","2","","100","","-1"
the query in the insert section of my procedure returns the following data when run manually:
97534 31.03.10 16,667 (null) 1 39 2
97548 06.04.10 110 (null) 1 39 2
Now, I need to generate the dates between the first and the second objectid (column 1 in the data above).
Hence,
97534 31.03.10 16,667 (null) 1 39 2
97534 01.04.10 16,667 (null) 1 39 2
97534 02.04.10 16,667 (null) 1 39 2
97534 03.04.10 16,667 (null) 1 39 2
97534 04.04.10 16,667 (null) 1 39 2
97534 05.04.10 16,667 (null) 1 39 2
97534 06.04.10 16,667 (null) 1 39 2
97548 07.04.10 110 (null) 1 39 2
97548 08.04.10 110 (null) 1 39 2
My problem is that the values above only appear after I do another update to table fangzahlen or insert a new record into it. Theoretically, the result above should be inserted right away, shouldn't it?
Below is my procedure, which might give you a better idea of what I am trying to do! Please excuse this rather awkward explanation. If you need more info please don't hesitate to ask for it!
create or replace
Procedure "PR_FANGZAHLEN_TW_SYNC_SK" (
Pr_falle Number,
Pr_fallennummer Number,
Pr_schaedling Number,
Pr_objectid Number)
Is
Y Number (10);
pragma autonomous_transaction;
Begin
-- first all records should be deleted from the subtable i.e. tbl_test
delete from borki.tbl_test where lng_falle = pr_falle
and int_fallennummer = pr_fallennummer
and lng_schaedling = pr_schaedling
and date_datum > '31.03.2010';
commit;
For Rec In
(select objectid lng_fangzahlen,
date_datum,
prev_date,
(date_datum - prev_date) difference_in_days,
round((
case
when nvl(int_volumen,0) > 0
and lng_schaedling = 1
then int_volumen * 40
when nvl(int_volumen,0) > 0
and lng_schaedling = 2
then int_volumen * 550
when nvl(int_anzahl,0) > 0
then int_anzahl
end ) / (date_datum - prev_date),3) counted_bugs,
int_fallennummer,
lng_falle,
lng_schaedling
from
(select objectid,
date_datum,
case
when lag(date_datum) over(order by date_datum) is null
then to_date('31.03.2010')
else lag(date_datum) over(order by date_datum)
end as prev_date,
int_volumen,
int_anzahl,
int_fallennummer,
lng_falle,
lng_schaedling
from borki.fangzahlen
where lng_falle = pr_falle
and int_fallennummer = pr_fallennummer
and lng_schaedling = pr_schaedling
and date_datum > '31.03.2010'
-- and objectid = pr_objectid
order by date_datum desc)
order by date_datum asc)
Loop
Y := 1;
While Y < rec.difference_in_days + 1
Loop
Insert
Into tbl_test (
lng_fangzahlen,
date_datum,
int_anzahl,
int_volumen,
int_fallennummer,
lng_falle,
lng_schaedling)
select objectid lng_fangzahlen,
prev_date +Y,
round((
case
when nvl(int_volumen,0) > 0
and lng_schaedling = 1
then int_volumen * 40
when nvl(int_volumen,0) > 0
and lng_schaedling = 2
then int_volumen * 550
when nvl(int_anzahl,0) > 0
then int_anzahl
end ) / (date_datum - prev_date),3) counted_bugs,
int_volumen,
int_fallennummer,
lng_falle,
lng_schaedling
from
(select objectid,
date_datum,
case
when lag(date_datum) over(order by date_datum) is null
then to_date('31.03.2010')
else lag(date_datum) over(order by date_datum)
end as prev_date,
int_volumen,
int_anzahl,
int_fallennummer,
lng_falle,
lng_schaedling
from borki.fangzahlen
where lng_falle = pr_falle
and int_fallennummer = pr_fallennummer
and lng_schaedling = pr_schaedling
and date_datum > '31.03.2010'
order by date_datum desc)
order by date_datum asc;
commit;
Y := Y + 1;
End Loop;
End Loop; -- end of cursor
Exception
When No_data_found Then
Null;
When Others Then
-- Consider logging the error and then re-raise
Raise;
End "PR_FANGZAHLEN_TW_SYNC_SK"; -
Handling update operation in Post Insert Trigger
Hi All
We are recording all the data that is being updated or inserted into a table (say Table A) by inserting into a custom table, from a trigger, whenever an insert or update occurs. Now suppose a user, while updating the data in Table A, just fetches the record on the front end and clicks Apply directly without making any changes. This causes an update on the base table and hence an insert into our custom table, even though this data is of no use to us.
Is there any way to stop this data being inserted into our table.
I have come across two solutions till now. Please suggest which could be better in terms of performance:
1) When the user updates any column in the base table and the trigger fires, we can check the new value of each column (around 50) against its old value. If there is a change, we insert the data into the custom table; otherwise not.
2) When the user updates any column, we pick all the new values into a record-type variable, likewise the latest values from the custom table, then perform a MINUS between these two records by querying from dual.
Please let me know if there could be any other way around..
Please Note: My base table could have around 200 columns too
Thanks
AJ
Option 3.
Fix the front end to disallow the operation - that is, disallow submitting an update statement which doesn't actually change any data.
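If the front end can't be fixed, option 1 can be sketched roughly as below (hypothetical table and column names; DECODE treats two NULLs as equal, which a plain comparison of :OLD and :NEW values would miss):

```sql
create or replace trigger trg_audit_table_a
after update on table_a            -- hypothetical base table
for each row
begin
  -- Audit only if at least one column really changed.
  -- decode(a, b, 1, 0) = 0 means "a differs from b", NULL-safely.
  if decode(:old.col1, :new.col1, 1, 0) = 0
     or decode(:old.col2, :new.col2, 1, 0) = 0 then
    insert into custom_audit (pk_id, col1, col2, changed_on)
    values (:new.pk_id, :new.col1, :new.col2, sysdate);
  end if;
end;
```

With ~200 columns this condition gets long, but it avoids the per-row MINUS query of option 2.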
Abort inserting a record in a table using a trigger
Hi there,
Is there any way to abort inserting a record in a table using a trigger?
For full details, I have the following table ("myTable"):
BSC INTEGER NOT NULL,
BTS VARCHAR2(20) NOT NULL,
INFO1 INTEGER,
INFO2 INTEGER
myTable_PK = PRIMARY KEY (BSC,BTS)
I also have a stored procedure that imports data from a text file and inserts it into the specified table (using the UTL_FILE package). The stored procedure works great.
The thing is that, due to a third-party report-generation bug, the text file itself may contain lines that violate the primary key or have a null value in the BSC/BTS fields. In such cases I just want to ignore the insert statement, using a trigger.
Thanks
OK Jens, could you tell me what exception I could use?
Below is a portion of my stored procedure.
CREATE OR REPLACE PROCEDURE update_myTable (FILENAME IN VARCHAR2) IS
  FHANDLE UTL_FILE.FILE_TYPE;
  STR     VARCHAR2(4000);
  BSC     INTEGER;
  BTS     VARCHAR2(20);
  INFO1   INTEGER;
  INFO2   INTEGER;
BEGIN
  FHANDLE := UTL_FILE.FOPEN('LOG_FILE_DIR', FILENAME, 'R', 4000);
  LOOP
    UTL_FILE.GET_LINE(FHANDLE, STR);
    -- Process the line STR and generate the BSC, BTS, INFO1 and INFO2 values
    EXECUTE IMMEDIATE 'INSERT INTO myTable VALUES(:1,:2,:3,:4)' USING BSC, BTS, INFO1, INFO2;
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN UTL_FILE.FCLOSE(FHANDLE);
END update_myTable;
Remember that I am already using the NO_DATA_FOUND exception to detect the end of the file and then close it.
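One way to make the load skip the bad lines without a trigger (a sketch based on the loop above; DUP_VAL_ON_INDEX is the predefined exception Oracle raises when an insert violates a unique or primary-key constraint) is to test for NULL keys up front and wrap each insert in its own block, so one bad line doesn't abort the whole run:

```sql
-- Inside the GET_LINE loop, after BSC/BTS/INFO1/INFO2 are populated:
IF BSC IS NOT NULL AND BTS IS NOT NULL THEN
  BEGIN
    EXECUTE IMMEDIATE 'INSERT INTO myTable VALUES(:1,:2,:3,:4)'
      USING BSC, BTS, INFO1, INFO2;
  EXCEPTION
    WHEN DUP_VAL_ON_INDEX THEN
      NULL;  -- (BSC,BTS) already exists: ignore this line and carry on
  END;
END IF;
```

This leaves the existing NO_DATA_FOUND end-of-file handler untouched.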
Thanks for your reply.
-
How to use the mirrored and log shipped secondary database for update or insert operations
Hi,
I am doing a DR test where I need to exercise the mirrored and log-shipped secondary databases without stopping the mirroring or log shipping. Is there a way to get the data out of the mirrored and log-shipped database into another database for update or insert operations?
A database snapshot can be used only on the mirrored database, and even there updates cannot be done. The secondary database of log shipping cannot be used for a database snapshot at all. Any ideas on how this can be implemented?
Thanks,
Preetha
Hmm, in this case I think you need Merge Replication; otherwise it defeats the purpose of DR... again, in that case..
Best Regards, Uri Dimant, SQL Server MVP
http://sqlblog.com/blogs/uri_dimant/
-
Capturing value in after insert or update row level trigger
Hi Experts,
I'm working on EBS 11.5.10 and database 11g. I have trigger A on the wip_discrete_jobs table and trigger B on the wip_requirement_operations table. Whenever I create a discrete job, records are inserted into both wip_discrete_jobs and wip_requirement_operations.
Note: the two tables have a master-child relation.
Trigger A: after-insert, row-level trigger on wip_discrete_jobs
Trigger B: after-insert, row-level trigger on wip_requirement_operations
In trigger A I'm capturing wip_entity_id and holding it in a global variable:
package.variable := :new.wip_entity_id;
In trigger B I'm using that global variable.
Issue: say I create a discrete job whose wip_entity_id is 27; the global variable is still holding the previous wip_entity_id (26), not the current value. It looks like trigger B fires before trigger A's event is complete, which I think is why the current wip_entity_id is not being stored in the global variable.
I need your help getting the current value into the global variable so that I can use it in trigger B.
Awaiting your response.
Thanks
798616 wrote:
Hi Experts,
I'm working on EBS 11.5.10 and database 11g. I have trigger A on the wip_discrete_jobs table and trigger B on the wip_requirement_operations table. Whenever I create a discrete job, records are inserted into both wip_discrete_jobs and wip_requirement_operations.
Note: the two tables have a master-child relation.
Trigger A: after-insert, row-level trigger on wip_discrete_jobs
Trigger B: after-insert, row-level trigger on wip_requirement_operations
In trigger A I'm capturing wip_entity_id and holding it in a global variable:
package.variable := :new.wip_entity_id;
In trigger B I'm using that global variable.
Issue: say I create a discrete job whose wip_entity_id is 27; the global variable is still holding the previous wip_entity_id (26), not the current value. It looks like trigger B fires before trigger A's event is complete, which I think is why the current wip_entity_id is not being stored in the global variable.
I need your help getting the current value into the global variable so that I can use it in trigger B.
Awaiting your response.
Thanks
My head hurts just thinking about how this is being implemented.
What's stopping you from creating a nice and simple procedure to perform all this magic?
Continue with the global/trigger magic at your own peril, as you can hopefully already see ... nothing good will come from it.
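A rough sketch of that procedure approach: one procedure that captures both the master and the detail rows for a given job, so the wip_entity_id is passed explicitly instead of travelling through package state (the xx_ audit tables and their columns are invented here for illustration):

```sql
CREATE OR REPLACE PROCEDURE capture_discrete_job (
  p_wip_entity_id IN NUMBER
) IS
BEGIN
  -- Capture the master row.
  INSERT INTO xx_job_audit (wip_entity_id, captured_on)
  SELECT wdj.wip_entity_id, SYSDATE
    FROM wip_discrete_jobs wdj
   WHERE wdj.wip_entity_id = p_wip_entity_id;

  -- Capture the child rows with the same explicit key: no hidden
  -- ordering between two triggers, no stale global variable.
  INSERT INTO xx_requirement_audit (wip_entity_id, operation_seq_num)
  SELECT wro.wip_entity_id, wro.operation_seq_num
    FROM wip_requirement_operations wro
   WHERE wro.wip_entity_id = p_wip_entity_id;
END capture_discrete_job;
/
```

Calling this once per job from the application layer makes the whole capture visible and debuggable in one place.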
Cheers,