Merge statement and insert into a third table

I'm wondering if the merge statement will work in my case. I'm loading a file into a temporary table. I want to compare this temporary table with an existing source table (there are multiple source tables, I've thought of breaking them up into separate statements) and write out any differences to a third table. The third table would be used for future processing on the rows that are found to be different. Do you think this is possible using the merge command?

Why use the merge command for that purpose?
If you want to find the differences between two tables and put the result into a third one, then you should do
insert into table_c
query_a
minus
query_b
or
insert into table_c
(query_a
minus
query_b)
union all
(query_b
minus
query_a)
hth
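
To make that concrete, here is a minimal sketch of the symmetric-difference approach, assuming the temporary table is called TEMP_LOAD, the source table SRC, and the difference table DIFF_ROWS, and that all three share the same column list (placeholder names, not from the post):

insert into diff_rows
(select col1, col2, col3 from temp_load
 minus
 select col1, col2, col3 from src)
union all
(select col1, col2, col3 from src
 minus
 select col1, col2, col3 from temp_load);

Rows that exist in only one of the two tables end up in DIFF_ROWS, which can then drive the follow-up processing.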

Similar Messages

  • Select from two tables and insert into a third

    I'm trying to do a select from two tables and do an insert into a third table from the two resulting columns.
    I have the following....
    DECLARE
    tempsid number;
    temphostid number;
    BEGIN
    select "DBSID_ID","ID" into tempsid,temphostid from "DBSIDS","SERVERS"
    where "HOST_SID" like '%'||"DBSID_NAME"||'%'
    and "HOST_NAME" not like 'vio%'
    and exists (select "DBSID_NAME" from DBSIDS)
    order by "DBSID_NAME";
    insert into "DBSID_LOOKUP" ("SIDLOOKUP_ID", "SERVERLOOKUP_ID")
    values(tempsid, temphostsid);
    END;
    run;
    I get the error ....
    ORA-06550: line 11, column 18:
    PL/SQL: ORA-00984: column not allowed here
    ORA-06550: line 10, column 1:
    PL/SQL: SQL Statement ignored
    1. DECLARE
    2. tempsid number;
    3. temphostid number;

    okay ... I tried a different way ...
    DECLARE
    a number;
    b number;
    BEGIN
    select "DBSID_ID","ID" into a,b from "DBSIDS","SERVERS"
    where "HOST_SID" like '%'||"DBSID_NAME"||'%'
    and "HOST_NAME" not like 'vio%'
    and exists (select "DBSID_NAME" from DBSIDS)
    order by "DBSID_NAME";
    insert into "DBSID_LOOKUP" (SIDLOOKUP_ID, SERVERLOOKUP_ID) values (a, b);
    END;
    and now it whines about ...
    ORA-01422: exact fetch returns more than requested number of rows
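
    Since the join returns many rows, one way around the ORA-01422 is to drop the scalar variables and fold the query straight into the insert. A rough sketch using the same column names (the ORDER BY and the always-true EXISTS are omitted):

    INSERT INTO "DBSID_LOOKUP" ("SIDLOOKUP_ID", "SERVERLOOKUP_ID")
    SELECT "DBSID_ID", "ID"
    FROM "DBSIDS", "SERVERS"
    WHERE "HOST_SID" LIKE '%'||"DBSID_NAME"||'%'
    AND "HOST_NAME" NOT LIKE 'vio%';

    Each matching pair is inserted directly, so no exact-fetch limit applies.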

  • Syntax for Merge statement to insert into target and update source

    Hello All,
    I want to use a MERGE statement to insert records into my target when not matched, and update records in the source when matched. Is it possible to do this using a MERGE statement?
    create table a (aa number)
    create table b (bb number)
    alter table a add flg char(1)
    merge b as target
    using a as source
    on (target.bb = source.aa)
    when matched then
    update set source.flg = 'Y'
    WHEN NOT MATCHED THEN
    insert (target.bb)
    values
    (source.aa)
    Thanks.

    Hi,
    You have not mentioned the database version; otherwise version-specific features could have been suggested. Just for informative purposes, please include the DB version in future posts.
    Coming to your issue and requirement.
    If you check the syntax and functionality, MERGE operates on the target table only: UPDATE for matched rows and INSERT for unmatched rows. The source, as the term implies, is only read, so updating the source table from within the MERGE will not work.
    - Pavan Kumar N
    - ORACLE OCP - 9i/10g
    https://www.oracleinternals.blogspot.com
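
    To illustrate the target-only restriction, here is a rough sketch (recast in Oracle syntax, using the table and column names from the example above) of a MERGE that only touches table b, with the flag update moved to a separate statement since the source cannot be modified inside the MERGE:

    MERGE INTO b target
    USING a source
    ON (target.bb = source.aa)
    WHEN NOT MATCHED THEN
      INSERT (bb)
      VALUES (source.aa);

    -- the source flag has to be maintained outside the MERGE, e.g.:
    UPDATE a
    SET    flg = 'Y'
    WHERE  aa IN (SELECT bb FROM b);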

  • Read a csv file, fill a dynamic table and insert into a standard table

    Hi everybody,
    I have a problem here and I need your help:
    I have to read a csv file and insert the data of it into a standard table.
    1 - On the parameter screen I have to indicate the standard table and the csv file.
    2 - I need to create a dynamic table. The same type of the one I choose at parameter screen.
    3 - Then I need to read the csv and put the data into this dynamic table.
    4 - Later I need to insert the data from the dynamic table into the standard table (the one on the parameter screen).
    How do I do this job? Do you have an example? Thanks.

    Here is an example program which shows how to upload a csv file from the frontend to a dynamic internal table. You can of course modify this to update your database table.
    report zrich_0002.
    type-pools: slis.
    field-symbols: <dyn_table> type standard table,
                   <dyn_wa>,
                   <dyn_field>.
    data: it_fldcat type lvc_t_fcat,
          wa_it_fldcat type lvc_s_fcat.
    type-pools : abap.
    data: new_table type ref to data,
          new_line  type ref to data.
    data: xcel type table of alsmex_tabline with header line.
    selection-screen begin of block b1 with frame title text .
    parameters: p_file type  rlgrap-filename default 'c:\Test.csv'.
    parameters: p_flds type i.
    selection-screen end of block b1.
    start-of-selection.
    * Add X number of fields to the dynamic itab catalog
      do p_flds times.
        clear wa_it_fldcat.
        wa_it_fldcat-fieldname = sy-index.
        wa_it_fldcat-datatype = 'C'.
        wa_it_fldcat-inttype = 'C'.
        wa_it_fldcat-intlen = 10.
        append wa_it_fldcat to it_fldcat .
      enddo.
    * Create dynamic internal table and assign to FS
      call method cl_alv_table_create=>create_dynamic_table
                   exporting
                      it_fieldcatalog = it_fldcat
                   importing
                      ep_table        = new_table.
      assign new_table->* to <dyn_table>.
    * Create dynamic work area and assign to FS
      create data new_line like line of <dyn_table>.
      assign new_line->* to <dyn_wa>.
    * Upload the excel
      call function 'ALSM_EXCEL_TO_INTERNAL_TABLE'
           exporting
                filename                = p_file
                i_begin_col             = '1'
                i_begin_row             = '1'
                i_end_col               = '200'
                i_end_row               = '5000'
           tables
                intern                  = xcel
           exceptions
                inconsistent_parameters = 1
                upload_ole              = 2
                others                  = 3.
    * Reformat to dynamic internal table
      loop at xcel.
        assign component xcel-col of structure <dyn_wa> to <dyn_field>.
        if sy-subrc = 0.
         <dyn_field> = xcel-value.
        endif.
        at end of row.
          append <dyn_wa> to <dyn_table>.
          clear <dyn_wa>.
        endat.
      endloop.
    * Write out data from table.
      loop at <dyn_table> into <dyn_wa>.
        do.
          assign component  sy-index  of structure <dyn_wa> to <dyn_field>.
          if sy-subrc <> 0.
            exit.
          endif.
          if sy-index = 1.
            write:/ <dyn_field>.
          else.
            write: <dyn_field>.
          endif.
        enddo.
      endloop.
    Regards,
    Rich Heilman

  • How to read data from flat file and insert into other relevant tables? Please suggest me the query?

    Hi to all,
    I have flat files in different locations; through FTP I need to fetch those files and load them into the relevant tables of the database.
    Please share the query to do it.

    You would need a ForEach Loop to iterate through the files. Initially the FTP task will pull the files from the locations to a landing folder. Once that's done, the ForEach Loop will iterate through the files in the folder and will have a Data Flow Task inside to transfer the file data to tables.
    If you want a more secure option you can also use SFTP (Secure FTP) and can implement it using the free WinSCP client. I've explained a method of doing it for dynamic files here
    http://visakhm.blogspot.in/2012/12/implementing-dynamic-secure-ftp-process.html
    for iterating through files see this example
    http://visakhm.blogspot.in/2012/05/package-to-implement-daily-processing.html
    you may not need the validation step inside the loop in your case
    Visakh

  • Read data from E$ table and insert into staging table

    Hi all,
    I am new to ODI. I want your help in understanding how to read data from an "E$" table and insert into a staging table.
    Scenario:
    Two columns in a flat file, employee id and employee name, need to be loaded into a datastore EMP+. A check constraint is added so that only data with employee names in upper case is loaded into the datastore. Check control is set to flow and static. Right click on the datastore, select control and then check. The rows that have violated the check constraint are kept in the E$_EMP+ table.
    Problem:
    The issue is that I want to read the data from the E$_EMP+ table, transform the employee name into uppercase, and move the corrected data from E$_EMP+ into EMP+. Please advise me on how to automatically handle "soft" exceptions in ODI.
    Thank you

    Hi Himanshu,
    I have already tried the approach you suggested. That works. However, the scenario I described was very basic. Based on the error logged into the E$_EMP table, I will have to design the interface to apply more business logic before the data is actually merged into the target table.
    Before the business logic is applied, I need to read each row from the E$_EMP table first, and I do not know how to read from the E$_EMP table.
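
    As far as the plain SQL part goes, reading the error table back into the target is just an INSERT ... SELECT with the correction applied. A rough sketch, assuming the error table is named E$_EMP, the target EMP, and the two columns are EMPLOYEE_ID and EMPLOYEE_NAME (adjust to the real ODI-generated names):

    INSERT INTO emp (employee_id, employee_name)
    SELECT employee_id, UPPER(employee_name)
    FROM   e$_emp;

    In ODI itself this would typically be wrapped in an interface or procedure that also deletes the recycled rows from the E$ table afterwards.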

  • How to extract data from xml and insert into Oracle table

    Hi,
    I have a large xml file, which will have hundreds of the following transaction tags with column names and their values.
    There is a table in one of the schemas with columns "actualCostRate", "billRate", ...etc.
    I need to extract the values of these columns and insert them into the table
    <Transaction actualCostRate="0" billRate="0" chargeable="1" clientID="NikuUK" chargeCode="LCOCD1" externalID="L-RESCODE_UK1-PROJ_UK_CNT_GBP-37289-8" importStatus="N" projectID="TESTPROJ" resourceID="admin" transactionDate="2002-02-12" transactionType="L" units="11" taskID="5017601" inputTypeCode="SALES" groupId="123" voucherNumber="ABCVDD" transactionClass="ABCD"/>
    <Transaction actualCostRate="0" billRate="0" chargeable="1" clientID="NikuEU" chargeCode="LCOCD1" externalID="L-RESCODE_US1-PROJ_EU_STD2-37291-4" importStatus="N" projectID="TESTPROJ" resourceID="admin" transactionDate="2002-02-04" transactionType="L" units="4" taskID="5017601" inputTypeCode="SALES" groupId="124" voucherNumber="EEE222" transactionClass="DEFG"/>

    Re: Insert from XML to relational table
    http://www.google.ae/search?hl=ar&q=extract+data+from+xml+and+insert+into+Oracle+table+&btnG=%D8%A8%D8%AD%D8%AB+Google&meta=
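
    One common approach (a sketch, not from the thread) is XMLTABLE, which projects the attributes as relational columns that can feed an INSERT ... SELECT. The table name, the wrapping root element and the bind variable holding the XML are assumptions here:

    INSERT INTO transactions (actual_cost_rate, bill_rate, project_id, resource_id, transaction_date)
    SELECT x.actual_cost_rate, x.bill_rate, x.project_id, x.resource_id,
           TO_DATE(x.transaction_date, 'YYYY-MM-DD')
    FROM   XMLTABLE('/Transactions/Transaction'
                    PASSING XMLTYPE(:xml_clob)
                    COLUMNS actual_cost_rate NUMBER       PATH '@actualCostRate',
                            bill_rate        NUMBER       PATH '@billRate',
                            project_id       VARCHAR2(30) PATH '@projectID',
                            resource_id      VARCHAR2(30) PATH '@resourceID',
                            transaction_date VARCHAR2(10) PATH '@transactionDate') x;

    The remaining attributes follow the same pattern, one COLUMNS entry per attribute.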

  • Fetch data from one table and insert into two tables in desired format

    I have similar to the following data in a table and it is not normalized. The groupID is being used to group two records of similar nature.
    DECLARE @OldDoc TABLE (oldDocID INT, groupID INT, deptID INT)
    INSERT INTO @OldDoc (oldDocID, groupID, deptID) VALUES (1, NULL, 111),(2,NULL,111),(3,1,111),(4,NULL,333),(5,1,222),(6,NULL,333),(7,2,222),(8,2,333),(9,NULL,111),(10,3,222),(11,NULL,333),(12,3,444)
    I need to process the data from the above table (@OldDoc) and write into two new tables (@NewDoc and @NewDocGroup) as follows.
    oldDocID should be stored as newDocID when inserting to @NewDoc table. Only records with groupID NULL and one record (first one) per group should be considered (For example, oldDocID 5 is not considered as 3 and 5 belong to the same groupID 1) for insertion. 
    DECLARE @NewDoc TABLE (newDocID INT)
    INSERT INTO @NewDoc (newDocID) VALUES (1),(2),(3),(4),(6),(7),(9),(10),(11)
    All records from @OldDoc should be considered for insertion into @NewDocGroup table. OldDocID is inserted as NewDocID and deptID is as-is. Instead of groupID, the ID of the first record in the 
    group should be considered as parentNewDocID (For example, 3 is considered as parentNewDocID for newDocID 5 as 3 and 5 belong to the same groupID in @OldDoc table) for the newDocID.
    DECLARE @NewDocGroup TABLE (newDocID INT, parentNewDocID INT, deptID INT)
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID) VALUES (1,1,111),(2,2,111),(3,3,111),(4,4,333),(5,3,222),(6,6,333),(7,7,222),(8,7,333),(9,9,111),(10,10,222),(11,11,333),(12,10,444)
    How do I accomplish the above using SQL ? Thanks for the help.

    >> I have similar to the following data in a table and it is not normalized. The group_id is being used to group two records [sic] of similar nature. <<
    Rows are not records. Tables have to have a key by definition. You do not do math with identifiers, so they should not be numeric. Let's ignore that error for now. In short, you are posting garbage. If you had followed Forum Netiquette, would you have posted
    this? 
    CREATE TABLE Old_Documents
    (old_doc_id INTEGER NOT NULL PRIMARY KEY, 
     group_id INTEGER, 
     dept_nbr INTEGER NOT NULL
       REFERENCES Departments (dept_nbr));
    INSERT INTO Old_Documents(old_doc_id, group_id, dept_nbr) 
    VALUES  (1, NULL, 111), 
    (2, NULL, 111), 
    (3, 1, 111), 
    (4, NULL, 333), 
    (5, 1, 222), 
    (6, NULL, 333), 
    (7, 2, 222), 
    (8, 2, 333), 
    (9, NULL, 111), 
    (10, 3, 222), 
    (11, NULL, 333), 
    (12, 3, 444);
    >> I need to process the data from the above table (Old_Documents) and write into two new tables (New_Documents and New_Documents_Groups) as follows. <<
    Just like punch cards and mag tape data processing! Being old and being new are a status, not another kind of entity. But that is how mag tapes work. And you even use the verb "fetch" from tape files. This design flaw is called  attribute splitting.
    Do you have a Male_Personnel and Female_Personnel table? NO! It is just Personnel! 
    >> old_doc_id should be stored as new_doc_id when inserting to New_Documents table. Only records [sic] with group_id NULL and one record [sic] (first [sic; no ordering in a table] one) per group should be considered (For example, old_doc_id 5 is not considered
    as 3 and 5 belong to the same group_id =1) for insertion. <<
    Think about your punch card mindset. Why did you physically materialize that redundant New_Documents table? Let me answer that: this is how you work with punch cards! In SQL we use a VIEW:
    CREATE VIEW New_Documents (new_doc_id)
    AS 
    SELECT old_doc_id 
      FROM Old_Documents;
    >> All records [sic] from Old_Documents should be considered for insertion into New_Documents_Groups table. The old_doc_id is inserted as new_doc_id and dept_nbr is as-is. Instead of group_id, the ID [sic: which identifier??] of the first [sic: tables
    have no ordering like a deck of punch cards] record [sic] in the group should be considered as parent_new_doc_id (For example, 3 is considered as parent_new_doc_id for new_doc_id 5 as 3 and 5 belong to the same group_id in Old_Documents table) for the new_doc_id.
    <<
    Why not use 5 as the parent? My guess is that you are trying to form equivalence classes. See:
    https://www.simple-talk.com/content/print.aspx?article=2020
    --CELKO--
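
    Setting the modelling debate aside, the mechanics asked for can be written directly against the sample data. A rough T-SQL sketch, assuming "first record per group" means the lowest oldDocID within each groupID:

    -- one row per NULL-group document plus the first document of each group
    INSERT INTO @NewDoc (newDocID)
    SELECT o.oldDocID
    FROM @OldDoc o
    WHERE o.groupID IS NULL
       OR o.oldDocID = (SELECT MIN(o2.oldDocID) FROM @OldDoc o2 WHERE o2.groupID = o.groupID);

    -- every row, with the group's first document as the parent
    INSERT INTO @NewDocGroup (newDocID, parentNewDocID, deptID)
    SELECT o.oldDocID,
           COALESCE((SELECT MIN(o2.oldDocID) FROM @OldDoc o2 WHERE o2.groupID = o.groupID), o.oldDocID),
           o.deptID
    FROM @OldDoc o;

    Running this against the sample @OldDoc rows reproduces the expected @NewDoc and @NewDocGroup contents from the question.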

  • Looking for an insert statement to insert into a table around 500 rows

    Hi  Gurus,
    Your help is greatly appreciated !!,
    I need a query to insert into the mer_spec_feature table as described below. Please suggest something other than the UTL_FILE/input-file approach, as the data is different in each region, and I am looking to see whether it works out with an INSERT and SELECT query.
    1) I have to create a script for inserting records into mer_spec_feature for appl_id 'VXRR' with feature ID 786, and I have to exclude TIDs which already have the 786 feature added for this appl_id, as in the select query below.
    2) There are 509 such TIDs that do not have the 786 feature with appl_id 'VXRR' in the mer_spec_feature table.
    3)
    select distinct terminal_id, appl_id from mer_spec_feature
    where appl_id = 'VXRR'
    minus
    select distinct terminal_id, appl_id from mer_spec_feature
    where appl_id = 'VXRR'
    and terminal_feature_id = 786
    desc mer_spec_feature :
    terminal_id         -- varchar2(9)
    appl_id             -- varchar2(10)
    terminal_feature_id -- number(8)
    Sample data:
    TERMINAL_FEATURE_ID  TERMINAL_APPL_ID  TERMINAL_ID
    299                  405T330a          1004665
    786                  VXRR              1004665

    @Frank Kulash !!
    Thanks as always for your replies!
    I have tried the above SQL, and am having an issue with the PK constraint after inserting 232 rows.
    A terminal_id will have 100 feature_ids for one appl_id, because it has a PK constraint on TERMINAL_FEATURE_ID, APPL_ID, TERMINAL_ID:
    TERMINAL_FEATURE_ID  APPL_ID   TERMINAL_ID
    299                  405T330a  1004665
    786                  VXRR      1004665
    Can you please suggest any other option?
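
    If an INSERT ... SELECT is acceptable, a sketch along these lines (column names taken from the desc above) adds the 786 feature only to VXRR terminals that do not already have it, so the primary key on (terminal_feature_id, appl_id, terminal_id) is never violated:

    INSERT INTO mer_spec_feature (terminal_id, appl_id, terminal_feature_id)
    SELECT t.terminal_id, t.appl_id, 786
    FROM  (SELECT DISTINCT terminal_id, appl_id
             FROM mer_spec_feature
            WHERE appl_id = 'VXRR') t
    WHERE NOT EXISTS (SELECT 1
                        FROM mer_spec_feature f
                       WHERE f.terminal_id = t.terminal_id
                         AND f.appl_id = t.appl_id
                         AND f.terminal_feature_id = 786);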

  • How to read excel sheet and insert into oracle table

    hi all,
    I am working on Forms Builder 6i, and I want, from a trigger, to read the data from an Excel sheet file and insert it into a table that I have already created.
    I have written code that can, so far, open the Excel file, but I cannot read the data to insert it into the table.
    I am using TEXT_IO (IS_OPEN) to open the file, TEXT_IO.GET_LINE to take each line, and SUBSTR on a variable to read the first line, but the SUBSTR returns nothing.
    Any solution??

    There's already a topic made on this: how to copy data from excel to oracle forms
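
    If the file is really a character-delimited export (Forms TEXT_IO reads plain text, not a native .xls file), the usual pattern is FOPEN / GET_LINE in a loop plus SUBSTR/INSTR to split each line. A rough sketch with hypothetical file, table and column names:

    DECLARE
      in_file  TEXT_IO.FILE_TYPE;
      linebuf  VARCHAR2(4000);
      v_col1   VARCHAR2(100);
      v_col2   VARCHAR2(100);
    BEGIN
      in_file := TEXT_IO.FOPEN('c:\data.csv', 'r');
      LOOP
        BEGIN
          TEXT_IO.GET_LINE(in_file, linebuf);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;   -- end of file
        END;
        -- split the line on the first comma
        v_col1 := SUBSTR(linebuf, 1, INSTR(linebuf, ',') - 1);
        v_col2 := SUBSTR(linebuf, INSTR(linebuf, ',') + 1);
        INSERT INTO my_table (col1, col2) VALUES (v_col1, v_col2);
      END LOOP;
      TEXT_IO.FCLOSE(in_file);
      COMMIT;
    END;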

  • Reg: read excel column and insert into table.

    hi Friends,
    I want to read the data from Excel and insert it into my Oracle tables.
    Can you provide a link or an example script?
    How do I read the column values from Excel and insert them into a table?
    Please help.

    < unnecessary reference to personal blog removed by moderator >
    Here are the steps:
    1) First create a directory and grant read, write, execute on it to the user from which you want to access the flat files and load them.
    2) Write a generic function to load PIPE-delimited flat files:
    CREATE OR REPLACE FUNCTION TABLE_LOAD ( p_table in varchar2,
    p_dir in varchar2 DEFAULT 'YOUR_DIRECTORY_NAME',
    P_FILENAME in varchar2,
    p_ignore_headerlines IN INTEGER DEFAULT 1,
    p_delimiter in varchar2 default '|',
    p_optional_enclosed in varchar2 default '"' )
    return number
    is
    -- FUNCTION TABLE_LOAD
    -- PURPOSE: Load the flat files i.e. only text files to Oracle
    -- tables.
    -- This is a generic function which can be used for
    -- importing any text flat files to oracle database.
    -- PARAMETERS:
    -- P_TABLE
    -- Pass name of the table for which import has to be done.
    -- P_DIR
    -- Name of the directory where the file has been placed.
    -- Note: The grant has to be given for the user to the directory
    -- before executing the function
    -- P_FILENAME
    -- The name of the flat file (a text file)
    -- P_IGNORE_HEADERLINES
    -- By default we are passing 1 to skip the first line of the file,
    -- which holds the headers in the flat files.
    -- P_DELIMITER
    -- Default "|" pipe is passed.
    -- P_OPTIONAL_ENCLOSED
    -- Optionally enclosed by '"' characters are ignored.
    -- AUTHOR:
    -- Slobaray
    l_input utl_file.file_type;
    l_theCursor integer default dbms_sql.open_cursor;
    l_lastLine varchar2(4000);
    l_cnames varchar2(4000);
    l_bindvars varchar2(4000);
    l_status integer;
    l_cnt number default 0;
    l_rowCount number default 0;
    l_sep char(1) default NULL;
    L_ERRMSG varchar2(4000);
    V_EOF BOOLEAN := false;
    begin
    l_cnt := 1;
    for TAB_COLUMNS in (
    select column_name, data_type from user_tab_columns where table_name=p_table order by column_id
    ) loop
    l_cnames := l_cnames || tab_columns.column_name || ',';
    l_bindvars := l_bindvars || case when tab_columns.data_type in ('DATE', 'TIMESTAMP(6)') then 'to_date(:b' || l_cnt || ',''YYYY-MM-DD HH24:MI:SS''),' else ':b'|| l_cnt || ',' end;
    l_cnt := l_cnt + 1;
    end loop;
    l_cnames := rtrim(l_cnames,',');
    L_BINDVARS := RTRIM(L_BINDVARS,',');
    L_INPUT := UTL_FILE.FOPEN( P_DIR, P_FILENAME, 'r' );
    IF p_ignore_headerlines > 0
    THEN
    BEGIN
    FOR i IN 1 .. p_ignore_headerlines
    LOOP
    UTL_FILE.get_line(l_input, l_lastLine);
    END LOOP;
    EXCEPTION
    WHEN NO_DATA_FOUND
    THEN
    v_eof := TRUE;
    end;
    END IF;
    if not v_eof then
    dbms_sql.parse( l_theCursor, 'insert into ' || p_table || '(' || l_cnames || ') values (' || l_bindvars || ')', dbms_sql.native );
    loop
    begin
    utl_file.get_line( l_input, l_lastLine );
    exception
    when NO_DATA_FOUND then
    exit;
    end;
    if length(l_lastLine) > 0 then
    for i in 1 .. l_cnt-1
    LOOP
    dbms_sql.bind_variable( l_theCursor, ':b'||i,
    ltrim(rtrim(rtrim(
    regexp_substr(l_lastLine,'([^|]*)(\||$)',1,i),p_delimiter),p_optional_enclosed),p_optional_enclosed));
    end loop;
    begin
    l_status := dbms_sql.execute(l_theCursor);
    l_rowCount := l_rowCount + 1;
    exception
    when OTHERS then
    L_ERRMSG := SQLERRM;
    insert into BADLOG ( TABLE_NAME, ERRM, data, ERROR_DATE )
    values ( P_TABLE,l_errmsg, l_lastLine ,systimestamp );
    end;
    end if;
    end loop;
    dbms_sql.close_cursor(l_theCursor);
    utl_file.fclose( l_input );
    commit;
    end if;
    insert into IMPORT_HIST (FILENAME,TABLE_NAME,NUM_OF_REC,IMPORT_DATE)
    values ( P_FILENAME, P_TABLE,l_rowCount,sysdate );
    UTL_FILE.FRENAME(
    P_DIR,
    P_FILENAME,
    P_DIR,
    REPLACE(P_FILENAME,
    '.txt',
    '_' || TO_CHAR(SYSDATE, 'DD_MON_RRRR_HH24_MI_SS_AM') || '.txt' ) );
    commit;
    RETURN L_ROWCOUNT;
    end TABLE_LOAD;
    Note: when you run the function it will also rename the source flat file with a timestamp, so that we can keep track of which file was loaded.
    3) Check whether the user has UTL_FILE privileges:
    SQL> SELECT OWNER,
    OBJECT_TYPE
    FROM ALL_OBJECTS
    WHERE OBJECT_NAME = 'UTL_FILE'
    AND OWNER = <>;
    If the user does not have the privilege, then grant execute on UTL_FILE to the user from the SYS user:
    SQL> GRANT EXECUTE ON UTL_FILE TO <>;
    4) In the function I have used two tables:
    the import_hist table to track the history of the loads, and the badlog table to record any bad rows that occur while doing the load.
    Under the same user, create an error log table to log the errored-out records while doing the import:
    SQL> CREATE TABLE badlog (
    table_name VARCHAR2(200),
    errm VARCHAR2(4000),
    data VARCHAR2(4000),
    error_date TIMESTAMP );
    Under the same user, create a load history table to log the details of the files and tables that are imported, with a count of records loaded:
    SQL> create table IMPORT_HIST (
    FILENAME varchar2(200),
    TABLE_NAME varchar2(200),
    NUM_OF_REC number,
    IMPORT_DATE DATE );
    5) Finally, run the PL/SQL block and check whether it is loading properly; if not, check the badlog:
    Execute the PL/SQL block to import the data as the user:
    SQL> declare
    P_TABLE varchar2(200) := <>;
    P_DIR varchar2(200) := <>;
    P_FILENAME VARCHAR2(200) := <>;
    P_IGNORE_HEADERLINES integer := 1;
    P_DELIMITER varchar2(5) := '|';
    P_OPTIONAL_ENCLOSED varchar2(5) := '"';
    v_Return NUMBER;
    BEGIN
    v_Return := TABLE_LOAD(
    P_TABLE => P_TABLE,
    P_DIR => P_DIR,
    P_FILENAME => P_FILENAME,
    P_IGNORE_HEADERLINES => P_IGNORE_HEADERLINES,
    P_DELIMITER => P_DELIMITER,
    P_OPTIONAL_ENCLOSED => P_OPTIONAL_ENCLOSED );
    DBMS_OUTPUT.PUT_LINE('v_Return = ' || v_Return);
    end;
    6) Once the PL/SQL block has been executed, check the error log table and also the target table to see whether the records have been successfully imported.

  • Retrieving unique records and insert into cust and cust_det table

    Hi,
    Table: EMP (contains only unique records)
    columns: name, addr, city, addr1, city1
    We need to get the records from EMP and insert them into CUST and CUST_DET accordingly.
    Table: CUST
    empno : auto generated
    name
    addr
    Table: CUST_DET
    empno : references CUST
    addr1
    city1
    Please help me to do this.

    user10069916 wrote:
    can some one help me to resolve this plz. bit urgent..
    My boss says my work is more urgent than yours.
    As I said, please read: {message:id=9360002} (especially point 2)
    If you post sufficient information, then people can help you. The better the information you provide, the faster people will be able to help... but still, if it's "urgent" then we can only assume you have a live production system that is failing, in which case you need to raise an SR with Oracle Support, as the forums are not the place for logging "urgent" production system issues.
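
    For the mechanics of the split itself, a rough PL/SQL sketch (assuming a sequence CUST_SEQ generates empno and the column lists shown above):

    DECLARE
      v_empno cust.empno%TYPE;
    BEGIN
      FOR r IN (SELECT name, addr, addr1, city1 FROM emp) LOOP
        -- parent row first, capturing the generated key
        INSERT INTO cust (empno, name, addr)
        VALUES (cust_seq.NEXTVAL, r.name, r.addr)
        RETURNING empno INTO v_empno;

        -- child row referencing the new empno
        INSERT INTO cust_det (empno, addr1, city1)
        VALUES (v_empno, r.addr1, r.city1);
      END LOOP;
      COMMIT;
    END;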

  • Trigger on Insert into a custom table from external application

    This is the problem I am facing.
    An external application uses a different schema, let's say extdb, on a different server. This writes into a table in a schema called appscustom. The appscustom schema is under Oracle Apps.
    A trigger in Oracle Apps is to be executed on insert into the appscustom schema table.
    This trigger in turn will run a concurrent job to create the invoice and GL interface tables.
    The problem I am facing is that the trigger gets executed but the concurrent job does not get triggered/run.
    Here is the trigger. Let me know what might be wrong.
    The values for vRESP_APPL_ID, vRESP_ID, vUSER_ID become null in the test_pims table.
    Whereas when I manually insert from the backend into the appscustom schema table, the concurrent job gets submitted.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    -- Start of DDL Script for Trigger APPS.RWJF_PIMS_ORACLE_INT
    -- Generated 5/25/2005 22:10:52 from APPS@HENRY_DEVL
    CREATE OR REPLACE TRIGGER rwjf_pims_oracle_int
    AFTER
    INSERT
    ON rwjf_pimstxnbatch
    REFERENCING NEW AS NEW OLD AS OLD
    FOR EACH ROW
    WHEN (new.status = 'U' )
    Declare
    ReturnCode BOOLEAN;
    ConcReqID NUMBER := 0;
    vRESP_APPL_ID NUMBER;
    vRESP_ID NUMBER;
    vUSER_ID NUMBER;
    LoadToAp NUMBER :=0;
    LoadToGl NUMBER :=0;
    Begin
    -- vRESP_APPL_ID := apps.fnd_profile.value(200); -- AP --('RESP_APPL_ID');
    -- vRESP_ID := apps.fnd_profile.value(20639); -- 20639 for payables mgr. for rwjf_payables Mgr (50001); --('RESP_ID');
    -- vUSER_ID := apps.fnd_profile.value(1247); -- 1247 for sury 1065 for interface -- ('USER_ID');
    LoadToAp := 0;
    SELECT COUNT(*)
    INTO LoadToAp
    FROM rwjf.RWJF_PimsTxnBatchDtl
    WHERE GL_OR_AP = 'A'
    AND pims_txn_batch_id = :NEW.Pims_Txn_Batch_id;
    LoadToGl := 0;
    SELECT COUNT(*)
    INTO LoadToGl
    FROM rwjf.RWJF_PimsTxnBatchDtl
    WHERE GL_OR_AP = 'G'
    AND pims_txn_batch_id = :NEW.Pims_Txn_Batch_id;
    IF LoadToAp > 0 THEN
    vRESP_APPL_ID := 200; -- Application Id 200 Account payables
    vRESP_ID := 50001; -- RWJF_Payables Mgr for user interface
    vUSER_ID := 1065; -- user id for user name interface
    apps.fnd_global.apps_initialize(vUSER_ID,vRESP_ID,vRESP_APPL_ID);
    ConcReqID := 0;
    ReturnCode := FND_REQUEST.SET_MODE(TRUE);
    ConcReqID := FND_REQUEST.SUBMIT_REQUEST('RWJF',
    'RWJF_PIMS_INT',
    '', '', FALSE,
    :NEW.Pims_Txn_Batch_id,'AP', chr(0));
    END IF;
    IF LoadToGl > 0 THEN
    vRESP_APPL_ID := 20003;
    vRESP_ID := 50003;
    vUSER_ID := 1065;
    apps.fnd_global.apps_initialize(vUSER_ID,vRESP_ID,vRESP_APPL_ID);
    ConcReqID := 0;
    ReturnCode := FND_REQUEST.SET_MODE(TRUE);
    ConcReqID := FND_REQUEST.SUBMIT_REQUEST('RWJF',
    'RWJF_PIMS_INT',
    '', '', FALSE,
    :NEW.Pims_Txn_Batch_id,'GL',chr(0));
    END IF;
    insert into test_pims
    values
    ('Test3',:New.Pims_Txn_Batch_id,vUSER_ID,vRESP_ID,vRESP_APPL_ID);
    insert into test_pims
    values
    ('Test3',ConcReqID,vUSER_ID,vRESP_ID,vRESP_APPL_ID);
    IF ConcReqID = 0 THEN
    DBMS_OUTPUT.PUT_LINE('Problem Submitting Program to get pims txn batch'); /* Handle Error */
    END IF;
    End;
    -- End of DDL Script for Trigger APPS.RWJF_PIMS_ORACLE_INT

    I don't quite understand the structure of your tables, but it sounds similar to a merge statement
    (conditionally insert or update records depending on whether or not corresponding records exist):
    create table custom_table
    (cust_no number,
    line_no number,
    party_id number
    );
    create table hz_parties
    as (
    select 1 party_id, 'A' bill_to_location, 10 cust_no, 100 line_no from dual union
    select 2, 'B', 20, 200 from dual union
    select 3, 'C', 30, 300 from dual union
    select 4, 'D', 40, 400 from dual
    );
    merge into custom_table t1
    using (select * from hz_parties) t2
    on (t1.party_id = t2.party_id)
    when not matched then insert (t1.cust_no, t1.line_no, t1.party_id)
    values (t2.cust_no, t2.line_no, t2.party_id);

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader). But I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) in the case of 10000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic; as I said, I'm not going to use external tables. This post is about the performance difference between SQL*Loader and a simple insert statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file, as they don't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information, obviously). It also eliminates the need to have a 'staging' table on the database to load the data into, as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table, automatically, without the need to re-run any SQL*Loader process.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.
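
    For reference, an external table definition is a sketch like this (the directory object, file name and columns here are hypothetical, not from the thread):

    CREATE TABLE staging_ext (
      id        NUMBER,
      name      VARCHAR2(100),
      load_date VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('staging.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- the file can then be queried, or loaded into a real table, like any other table:
    INSERT INTO staging_table SELECT * FROM staging_ext;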

  • MERGE statement without INSERT clause....is possible...?

    Hi everybody...
    Is a MERGE statement without the UPDATE or INSERT clause possible or not?
    I want to select from one table and update another table. So I don't want an insert clause, and I want to do it in a single query.... possible solutions are requested.
    Thanks in advance
    pal

    Hi..
    Thanks for your reply. For the MERGE statement, we have to give the UPDATE and INSERT clauses; does the MERGE statement work without the INSERT or UPDATE clause?
    The reason I am asking is that I want to select how many rows (count(*)) there are in table1 and, based on this count, I have to update table2.
    Both tables are different, and the where clauses for the select and for the update are different. The two tables are totally unrelated.
    I want to do it in a single query, so I asked whether this is possible with a MERGE statement without the INSERT clause.
    Thanks for your reply
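
    For what it's worth, since Oracle 10g the WHEN NOT MATCHED clause of MERGE is optional, so an update-only MERGE is possible. A rough sketch with hypothetical table and column names for the count-then-update case described above:

    MERGE INTO table2 t
    USING (SELECT dept_id, COUNT(*) AS cnt
             FROM table1
            GROUP BY dept_id) s
    ON (t.dept_id = s.dept_id)
    WHEN MATCHED THEN
      UPDATE SET t.row_count = s.cnt;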
