Insert into a temp table is too slow

I'd like to know why, when I create a temp table outside my procedure, inserting into it from inside the procedure is slower than inserting into an identical temp table directly in an ad hoc batch. Here is an example:
create table #Test (col1 varchar(max))
go
create proc dbo.test
as
begin
truncate table #Test
insert into #Test
select 'teste'
FROM sys.tables
cross join sys.columns
end
go
exec dbo.test
go
create table #Test2 (col1 varchar(max))
go
truncate table #Test2
insert into #Test2
select 'teste'
FROM sys.tables
cross join sys.columns
For test, we get duration 71700, reads 45220, CPU 26052. For Test2, we get duration 49636, reads 45166, CPU 24960.
best regards

There should be no difference.
You would have to repeat the test you designed a few times to take readings, and then reverse the order of the two runs.
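For instance, a minimal sketch of one way to take repeatable readings (reusing the #Test tables and dbo.test from the example above; this is just a measurement harness, not part of the original post):
SET STATISTICS TIME ON;
SET STATISTICS IO ON;
-- run 1: through the procedure
EXEC dbo.test;
-- run 2: the same insert as an ad hoc batch
TRUNCATE TABLE #Test2;
INSERT INTO #Test2
SELECT 'teste'
FROM sys.tables
CROSS JOIN sys.columns;
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
-- repeat several times and also swap the order of the two runs before comparing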
BOL: "
Benefits of Using Stored Procedures
The following list describes some benefits of using procedures.
Reduced server/client network traffic          
The commands in a procedure are executed as a single batch of code. This can significantly reduce network traffic between the server and client because only the call to execute the procedure is sent across the network. Without the code encapsulation provided
by a procedure, every individual line of code would have to cross the network.
Stronger security          
Multiple users and client programs can perform operations on underlying database objects through a procedure, even if the users and programs do not have direct permissions on those underlying objects. The procedure controls what processes and activities
are performed and protects the underlying database objects. This eliminates the requirement to grant permissions at the individual object level and simplifies the security layers.
The EXECUTE AS
clause can be specified in the CREATE PROCEDURE statement to enable impersonating another user, or enable users or applications to perform certain database activities without needing direct permissions on the underlying objects and commands. For example,
some actions such as TRUNCATE TABLE, do not have grantable permissions. To execute TRUNCATE TABLE, the user must have ALTER permissions on the specified table. Granting a user ALTER permissions on a table may not be ideal because the user will effectively
have permissions well beyond the ability to truncate a table. By incorporating the TRUNCATE TABLE statement in a module and specifying that module execute as a user who has permissions to modify the table, you can extend the permissions to truncate the table
to the user that you grant EXECUTE permissions on the module.
When calling a procedure over the network, only the call to execute the procedure is visible. Therefore, malicious users cannot see table and database object names, embed Transact-SQL statements of their own, or search for critical data.
Using procedure parameters helps guard against SQL injection attacks. Since parameter input is treated as a literal value and not as executable code, it is more difficult for an attacker to insert a command into the Transact-SQL statement(s) inside
the procedure and compromise security.
Procedures can be encrypted, helping to obfuscate the source code. For more information, see SQL Server Encryption.
Reuse of code          
The code for any repetitious database operation is the perfect candidate for encapsulation in procedures. This eliminates needless rewrites of the same code, decreases code inconsistency, and allows the code to be accessed and executed by any user or application
possessing the necessary permissions.
Easier maintenance          
When client applications call procedures and keep database operations in the data tier, only the procedures must be updated for any changes in the underlying database. The application tier remains separate and does not have to know about any changes
to database layouts, relationships, or processes.
Improved performance          
By default, a procedure compiles the first time it is executed and creates an execution plan that is reused for subsequent executions. Since the query processor does not have to create a new plan, it typically takes less time to process the procedure.
If there has been significant change to the tables or data referenced by the procedure, the precompiled plan may actually cause the procedure to perform slower. In this case, recompiling the procedure and forcing a new execution plan can improve performance.
LINK: http://technet.microsoft.com/en-us/library/ms190782.aspx
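As a minimal illustration of the recompilation point above (using the dbo.test procedure from the original example; this sketch is mine, not part of the quoted documentation):
-- mark the procedure so a new plan is compiled on its next execution
EXEC sp_recompile 'dbo.test';
-- or request a fresh plan for a single call
EXEC dbo.test WITH RECOMPILE;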
Kalman Toth Database & OLAP Architect
Free T-SQL Scripts
New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

Similar Messages

  • Query is taking too much time for inserting into a temp table and for spooling

    Hi,
    I am working on a query optimization project where I have found a query which takes a very long time to execute.
    Temp table is defined as follows:
    DECLARE @CastSummary TABLE (CastID INT, SalesOrderID INT, ProductionOrderID INT, Actual FLOAT,
    ProductionOrderNo NVARCHAR(50), SalesOrderNo NVARCHAR(50), Customer NVARCHAR(MAX), Targets FLOAT)
    INSERT INTO @CastSummary (CastID, SalesOrderID, ProductionOrderID, Actual,
    ProductionOrderNo, SalesOrderNo, Customer, Targets)
    SELECT
    C.CastID,
    SO.SalesOrderID,
    PO.ProductionOrderID,
    F.CalculatedWeight,
    PO.ProductionOrderNo,
    SO.SalesOrderNo,
    SC.Name,
    SO.OrderQty
    FROM
    CastCast C
    JOIN Sales.Production PO ON PO.ProductionOrderID = C.ProductionOrderID
    join Sales.ProductionDetail d on d.ProductionOrderID = PO.ProductionOrderID
    LEFT JOIN Sales.SalesOrder SO ON d.SalesOrderID = SO.SalesOrderID
    LEFT JOIN FinishedGoods.Equipment F ON F.CastID = C.CastID
    JOIN Sales.Customer SC ON SC.CustomerID = SO.CustomerID
    WHERE
    (C.CreatedDate >= @StartDate AND C.CreatedDate < @EndDate)
    The execution plan shows almost 33% for the Table Insert when I insert the data into the temp table, and 67% for spooling. I removed 2 LEFT JOINs from the query above, changing them to JOINs, and tried again. Query execution became a bit faster, but it still needs improvement.
    How can I improve it further? Would it be good enough to create indexes on the columns of the temp table, or what if I use derived tables? Please suggest.
    -Pep

    How can I improve it further? Would it be good enough to create indexes on the columns of the temp table, or what if I use derived tables?
    I suggest you start with index tuning.  Specifically, make sure columns specified in the WHERE and JOIN columns are properly indexed (ideally clustered or covering, and unique when possible).  Changing outer joins to inner joins is appropriate
    if you don't need outer joins in the first place.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com
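    As a rough illustration, index tuning along those lines might look like the sketch below (the index names and column choices are assumptions read off the WHERE and JOIN clauses above, not tested recommendations):
    -- filter column used in the WHERE clause
    CREATE INDEX IX_CastCast_CreatedDate
        ON CastCast (CreatedDate) INCLUDE (CastID, ProductionOrderID);
    -- join columns
    CREATE INDEX IX_ProductionDetail_ProductionOrderID
        ON Sales.ProductionDetail (ProductionOrderID, SalesOrderID);
    CREATE INDEX IX_Equipment_CastID
        ON FinishedGoods.Equipment (CastID) INCLUDE (CalculatedWeight);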

  • Inserts into Global Temporary Table

    I'm working on using a global temporary table in one of my apps. I have a small test run here to isolate the problem. It simply creates the global temporary table, inserts a row, commits and then does a select to see if the insert worked. No data shows in the table when running this. I don't know much about global temp tables, so any help would be appreciated.
    CREATE GLOBAL TEMPORARY TABLE AGENT_SILO.AS_TEMP_VALIDATE (
    SBI_EMPLOYEE_ID NUMBER,
    CURRENT_FLAG char(1),
    EFFECTIVE_START date,
    EFFECTIVE_END date
    ) ON COMMIT DELETE ROWS;
    INSERT INTO AGENT_SILO.AS_TEMP_VALIDATE(SBI_EMPLOYEE_ID, CURRENT_FLAG, EFFECTIVE_START, EFFECTIVE_END)
    VALUES(0, '', SYSDATE, SYSDATE);
    commit;
    SELECT * FROM AGENT_SILO.AS_TEMP_VALIDATE;

    So I wonder what else I'm doing wrong that's really obvious. Here's what i'm trying to accomplish and maybe there's a better way of going about it.
    I have a trigger that is supposed to do some validation before the insert is allowed to go through. So here's my approach: I have a trigger fired when there's an insert into the AS_Employee_history table. This passes some of the fields from this insert into a proc (the id, a flag and a couple of dates). Within the proc, I create a global temp table and insert these passed values into the temp table. Then I have a cursor to copy the rows from the as_employee_history table that have the same id. Then I can do some selects on the temp table to see if it passes the validation.
    I have outputs throughout for debugging, and it gets to right after the inserts into the temp table; the rest of the code doesn't appear to be executed. So it looks like it's failing at the execution of the select statements on the temp table. Anything else obvious that I'm missing here?
    Here's my proc.
    PROCEDURE "PAS_VALIDATE" (STATUS OUT VARCHAR2, v_status OUT BOOLEAN, NEW_SBI_EMPLOYEE_ID IN NUMBER,
    NEW_CURRENT_FLAG IN CHAR, NEW_EFFECTIVE_START IN DATE,
    NEW_EFFECTIVE_END IN DATE)
    IS
    v_prev_effective_end date;
    v_flag_count number;
    v_flag_count_date number;
    --variables to store dynamic sql returns
    v_sql_flag_count_date varchar2(255);
    v_sql_flag_count varchar2(255);
    v_sql_prev_eff_end varchar2(255);
    cursor c_row is
    select * from AGENT_SILO.AS_EMPLOYEE_HISTORY EMP
    where (EMP.SBI_EMPLOYEE_ID = NEW_SBI_EMPLOYEE_ID);
    r_row c_row%ROWTYPE;
    BEGIN
    Status := 'Started';
    v_status := true;
    DBMS_OUTPUT.PUT_LINE('Creating temporary table...');
    execute immediate 'CREATE GLOBAL TEMPORARY TABLE AGENT_SILO.AS_TEMP_VALIDATE (
    SBI_EMPLOYEE_ID NUMBER,
    CURRENT_FLAG char(1),
    EFFECTIVE_START date,
    EFFECTIVE_END date
    ) ON COMMIT PRESERVE ROWS';
         DBMS_OUTPUT.PUT_LINE('Validating the data...');
         --DBMS_OUTPUT.PUT_LINE('Inserting submitted row into temp table');
    --Insert the new row being submitted from user into the temp table
    execute immediate 'INSERT INTO AGENT_SILO.AS_TEMP_VALIDATE(SBI_EMPLOYEE_ID, CURRENT_FLAG, EFFECTIVE_START, EFFECTIVE_END)
    VALUES(' || NEW_SBI_EMPLOYEE_ID || ',
    ''' || NEW_CURRENT_FLAG || ''',
    to_date(''' || to_char(NEW_EFFECTIVE_START, 'mm/dd/yyyy hh:mi:ss') || ''', ''mm/dd/yyyy hh:mi:ss''),
    to_date(''' || to_char(NEW_EFFECTIVE_END, 'mm/dd/yyyy hh:mi:ss') || ''', ''mm/dd/yyyy hh:mi:ss''))';
    --Insert the other rows to we end up with a subset of the employee history table
    --with only rows that match the sbi_employee_id of the submitted row
         --DBMS_OUTPUT.PUT_LINE('Inserting into temp table...');
    open c_row;
    loop
    fetch c_row into r_row;
    exit when c_row%NOTFOUND;
    execute immediate 'INSERT INTO AGENT_SILO.AS_TEMP_VALIDATE(SBI_EMPLOYEE_ID, CURRENT_FLAG, EFFECTIVE_START, EFFECTIVE_END)
    VALUES(' || r_row.SBI_EMPLOYEE_ID || ',
    ''' || r_row.CURRENT_FLAG || ''',
    to_date(''' || to_char(r_row.EFFECTIVE_START, 'mm/dd/yyyy hh:mi:ss') || ''', ''mm/dd/yyyy hh:mi:ss''),
    to_date(''' || to_char(r_row.EFFECTIVE_END, 'mm/dd/yyyy hh:mi:ss') || ''', ''mm/dd/yyyy hh:mi:ss''))';
    end loop;
    close c_row;
    DBMS_OUTPUT.PUT_LINE('After inserts');
    -----Store queries to determine values for validation--------------------------
    v_sql_prev_eff_end := 'SELECT to_char(max(effective_end), ''dd-mon-yy'')
    FROM AGENT_SILO.AS_TEMP_VALIDATE
    where to_char(EFFECTIVE_END, ''dd-mon-yy'') != ''31-dec-99'' AND
    SBI_EMPLOYEE_ID = NEW_SBI_EMPLOYEE_ID';
    --Find the largest effective_end, besides the 9999 value
    execute immediate v_sql_prev_eff_end into v_prev_effective_end;
    DBMS_OUTPUT.PUT_LINE('The highest previous end date: ' || v_prev_effective_end);
    --...........Validation testing...........
    execute immediate 'DROP TABLE AGENT_SILO.AS_TEMP_VALIDATE'; --Drop temp table
    DBMS_OUTPUT.PUT_LINE('Validation Procedure Complete');
    COMMIT;
    status:='Success';
    EXCEPTION
    When Others Then
    ROLLBACK;
    Status := SQLERRM;
    END;
    Thanks a bunch for helping a noob out.
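    A minimal sketch of the more conventional pattern for this kind of validation (my assumption, not something stated in the thread): create the global temporary table once, outside the procedure, and use static SQL with bind values inside it. Note also that with ON COMMIT DELETE ROWS the rows vanish at COMMIT, which is why the stand-alone test at the top of this post finds nothing after committing.
    -- one-time DDL, run once, not inside the procedure
    CREATE GLOBAL TEMPORARY TABLE agent_silo.as_temp_validate (
      sbi_employee_id NUMBER,
      current_flag    CHAR(1),
      effective_start DATE,
      effective_end   DATE
    ) ON COMMIT PRESERVE ROWS;
    -- inside the procedure: static SQL, no EXECUTE IMMEDIATE needed
    INSERT INTO agent_silo.as_temp_validate
      (sbi_employee_id, current_flag, effective_start, effective_end)
    VALUES
      (new_sbi_employee_id, new_current_flag, new_effective_start, new_effective_end);
    INSERT INTO agent_silo.as_temp_validate
      (sbi_employee_id, current_flag, effective_start, effective_end)
    SELECT sbi_employee_id, current_flag, effective_start, effective_end
    FROM   agent_silo.as_employee_history
    WHERE  sbi_employee_id = new_sbi_employee_id;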

  • How to insert into two different tables at the same time

    Hi
    I'm new to using JDev (version 3.1.1.2, because the OAS seems to support only JSP 1.0),
    and I want to insert into two different tables at the same time using one view.
    How can I do that ?
    TIA
    Edgar

    Oracle 8i supports 'INSTEAD OF' triggers on object views so you could use a process similar to the following:
    1. Create an object view that joins your two tables: 'CREATE OR REPLACE VIEW test AS SELECT d.deptno, d.deptname, e.empname FROM DEPT d, EMP e WHERE d.deptno = e.deptno'.
    2. Create an INSTEAD OF trigger on the view.
    3. Put code in the trigger that looks at the :NEW values being processed and determines which columns should be used to INSERT or UPDATE for each table. Crude pseudo-code might be:
    IF :NEW.deptno NOT IN (SELECT deptno FROM DEPT) THEN
    INSERT INTO dept VALUES(:NEW.deptno, :NEW.deptname);
    INSERT INTO emp VALUES (:NEW.deptno, :NEW.empname);
    ELSE
    IF :NEW.deptname IS NOT NULL THEN
    UPDATE dept SET deptname = :NEW.deptname
    WHERE deptno = :NEW.deptno;
    END IF;
    IF :NEW.empname IS NOT NULL THEN
    UPDATE emp SET empname = :NEW.empname
    WHERE deptno = :NEW.deptno;
    END IF;
    END IF;
    Try something along those lines.
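    For reference, the surrounding trigger shell might look like the sketch below (names are taken from the pseudo-code above; the body is only a placeholder):
    CREATE OR REPLACE TRIGGER test_instead_of_insert
    INSTEAD OF INSERT ON test
    FOR EACH ROW
    BEGIN
      -- put the INSERT/UPDATE logic from the pseudo-code above here
      NULL;
    END;
    /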

  • Inserting into a base table of a materialized view takes lot of time.......

    Dear All,
    I have created a materialized view which refreshes on commit. The materialized view has query rewrite enabled, and I have also created a materialized view log on the base table. While inserting into the base table it takes a lot of time. Can you please tell me why?

    Dear Rahul,
    Here is my materialized view..........
    create materialized view mv_test on prebuilt table refresh force on commit
    enable query rewrite as
    SELECT P.PID,
    SUM(HH_REGD) AS HH_REGD,
    SUM(INPRO_WORKS) AS INPRO_WORKS,
    SUM(COMP_WORKS) AS COMP_WORKS,
    SUM(SKILL_WAGE) AS SKILL_WAGE,
    SUM(UN_SKILL_WAGE) AS UN_SKILL_WAGE,
    SUM(WAGE_ADVANCE) AS WAGE_ADVANCE,
    SUM(MAT_AMT) AS MAT_AMT,
    SUM(DAYS) AS DAYS,
    P.INYYYYMM,P.FIN_YEAR
    FROM PROG_MONTHLY P
    WHERE SUBSTR(PID,5,2)<>'PP'
    GROUP BY PID,P.INYYYYMM,P.FIN_YEAR;
    Please also help me with this: does ENABLE QUERY REWRITE cause any performance degradation?
    Thanks & Regards
    Kris
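    As background: with REFRESH ON COMMIT the refresh work runs inside the committing transaction, so every insert into the base table pays for the refresh at commit time. For the aggregate query above to be fast refreshable, the base table normally also needs a materialized view log along these lines (a sketch only; the column list is inferred from the query and is an assumption):
    CREATE MATERIALIZED VIEW LOG ON prog_monthly
      WITH ROWID, SEQUENCE
        (pid, inyyyymm, fin_year, hh_regd, inpro_works, comp_works,
         skill_wage, un_skill_wage, wage_advance, mat_amt, days)
      INCLUDING NEW VALUES;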

  • Multi-table mapping is not inserting into the primary table first.

    I have an inheritance mapping where the children are mapped to the parent table oids with the "Multi-Table Info" tab.
    One of children is not inserting properly. Its insert is attempting to insert into one of the tables from the "Additional Tables" of "Multi-Table Info" instead of the primary table that it is mapped to.
    The other children insert correctly. This child is not much different from those.
    I looked through the forums, but found nothing similar.

    I would expect the Children to be inserted into both the primary table and the Additional Table? Is the object in question inserted into the primary table at all? Is the problem that it is being inserted into the Additional Table first? If it is, are the primary key names different? Is it a foreign key relationship?
    If the object in question has no fields in the additional table is it mapped to the additional table? (it should not be)
    Perhaps providing the deployment XML will help determine the problem,
    --Gordon

  • How to select data from 3rd row of Excel to insert into Sql server table using ssis

    Hi,
    I have Excel files with headers in the first two rows. I want to skip those two rows and select data from the 3rd row onward to insert into a SQL Server table using SSIS. The 3rd row contains the column names.

    Row 1 (merged cells): CUSTOMER DETAILS
    Row 2 (merged cells): REGION
    Row 3 (column names): COL1  COL2  COL3  COL4  COL5  COL6  COL7  COL8  COL9  COL10  COL11
    Row 4 onward (data):
    1  XXX  yyyy  zzzz
    2  XXX  yyyy  zzzzz
    3  XXX  yyyy  zzzzz
    4  XXX  yyyy  zzzzz
    The first two rows have merged cells with headings in Excel. I want to skip the first two rows, select the data from the 3rd row onward, and insert it into SQL Server using SSIS.
    Set a range within the Excel source command, as per the link below.
    See
    http://www.joellipman.com/articles/microsoft/sql-server/ssis/646-ssis-skip-rows-in-excel-source-file.html
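    For instance, switching the Excel Source data access mode to a SQL command and restricting the sheet to a range skips the first two rows (the sheet name and column range here are assumptions):
    SELECT * FROM [Sheet1$A3:K65536]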
    Please Mark This As Answer if it solved your issue
    Please Mark This As Helpful if it helps to solve your issue
    Visakh
    My MSDN Page
    My Personal Blog
    My Facebook Page

  • Nologging direct-path insert into an indexed table

    Hello,
    Does anyone have an idea how I can suppress generation of undo logs for direct-path insert into an indexed table on 11.2.0.1.0:
    CREATE TABLE TBL(ID NUMBER) NOLOGGING;
    CREATE INDEX IDX ON TBL(ID) NOLOGGING;
    INSERT /*+ APPEND */ INTO TBL SELECT /*+ APPEND */ ROWNUM FROM ...; -- Source table has 400,000,000+ rows
    Regards,
    Angel Tsankov

    Please do not post duplicates - "Why does Oracle not use direct-path insert when instructed to do so" - please continue the discussion in your original thread.

  • XSQL Insert into Schema based Table

    Hi,
    I am struggling with this one:
    My test XML is well formed:
    <?xml version="1.0"?>
    <SC_ESP>
    <ResultSet>
         <ApplicationID>App0001</ApplicationID>
         <Reason>Sucess</Reason>
    </ResultSet>
    </SC_ESP>
    When I try to POST this XML data via the XSQL Servlet
    I get the following error:
    "Character '$' is not allowed in a XML tag name."
    Anyone (Mark ?) know what it means ?
    Thanks
    Martin

    Hello Mark,
    I don't know which time zone you are in but thanks for the very prompt replies.
    All I am trying to do with this test is provide an XML input test facility via HTTP. The users are supposed to paste the XML into a TextArea using their browser and have that data inserted into the corresponding table(s) via the XSQL servlet.
    The XML schema have already been registered but I note from your other reply that I have not added the annotation xdb:storeVarrayAsTable="true" to the schema tags.
    It is worth noting that the whole thing works if I create a simple table that is NOT related to an XML Schema.
    You say that you have not had to use the XSQL Servlet. Would you care to say what approach you would use for the above requirement? Shall I set about writing my own Servlet to do this?

  • How to execute a Java method when row inserted into a database table?

    I have the need to fire off a java method when a row is inserted into a database table. I am unfortunately working with MySQL which just recently supported triggers but these new triggers can not execute a Java application on any event.
    What I am looking for is an event driven approach such that when a row is inserted into a specific table I can fire off a java method (sitting in a tomcat container) that will take the contents and send it to a web service.
    It has been mentioned that JMS may have the ability to poll and monitor a database table. Just wondering if anyone could point me in the right direction.
    thanks
    JavaTek

    A service handler might be the right way to run some code at the end of a service call (another way would be to make use of filters).
    First, make sure your static table is merged with ServiceHandlers.
    Secondly, change your custom method name into one that is not already in the service definition of COLLECTION_COPY_LOT (preferably a unique method name like collectionCopyLotLastAction that describes its purpose) and remove the following line from your code:
    m_service.doCodeEx("", this);
    Now create a service definition for COLLECTION_COPY_LOT in your custom component based on the original COLLECTION_COPY_LOT (copy/paste from the original service definition) and add your own method collectionCopyLotLastAction as the last step in the service. Play with the load order to make sure CS is using your service definition of COLLECTION_COPY_LOT instead of the original.
    regards,
    Fabian

  • Need to increase performance-bulk collect in cursor with limit and in the for loop inserting into the trigger table

    Hi all,
    I have a performance issue in the code below, where I am trying to insert data from table_stg into the target_tab and parent_tab tables and then into child tables, via a cursor with BULK COLLECT. The target_tab and parent_tab tables are huge and have a row-level trigger enabled on them; the trigger is mandatory. The time taken for this block to execute is 5000 seconds, and my requirement is to reduce it to 5 to 10 minutes.
    Can someone please guide me here? It's a bit urgent. Awaiting your response.
    declare
    vmax_Value NUMBER(5);
      vcnt number(10);
      id_val number(20);
      pc_id number(15);
      vtable_nm VARCHAR2(100);
      vstep_no  VARCHAR2(10);
      vsql_code VARCHAR2(10);
      vsql_errm varchar2(200);
      vtarget_starttime timestamp;
      limit_in number :=10000;
      idx           number(10);
              cursor stg_cursor is
             select
                   DESCRIPTION,
                   SORT_CODE,
                   ACCOUNT_NUMBER,
                     to_number(to_char(CORRESPONDENCE_DATE,'DD')) crr_day,
                     to_char(CORRESPONDENCE_DATE,'MONTH') crr_month,
                     to_number(substr(to_char(CORRESPONDENCE_DATE,'DD-MON-YYYY'),8,4)) crr_year,
                   PARTY_ID,
                   GUID,
                   PAPERLESS_REF_IND,
                   PRODUCT_TYPE,
                   PRODUCT_BRAND,
                   PRODUCT_HELD_ID,
                   NOTIFICATION_PREF,
                   UNREAD_CORRES_PERIOD,
                   EMAIL_ID,
                   MOBILE_NUMBER,
                   TITLE,
                   SURNAME,
                   POSTCODE,
                   EVENT_TYPE,
                   PRIORITY_IND,
                   SUBJECT,
                   EXT_PRD_ID_TX,
                   EXT_PRD_HLD_ID_TX,
                   EXT_SYS_ID,
                   EXT_PTY_ID_TX,
                   ACCOUNT_TYPE_CD,
                   COM_PFR_TYP_TX,
                   COM_PFR_OPT_TX,
                   COM_PFR_RSN_CD
             from  table_stg;
    type rec_type is table of stg_rec_type index by pls_integer;
    v_rt_all_cols rec_type;
    BEGIN
      vstep_no   := '0';
      vmax_value := 0;
      vtarget_starttime := systimestamp;
      id_val    := 0;
      pc_id     := 0;
      success_flag := 0;
              vstep_no  := '1';
              vtable_nm := 'before cursor';
        OPEN stg_cursor;
              vstep_no  := '2';
              vtable_nm := 'After cursor';
       LOOP
              vstep_no  := '3';
              vtable_nm := 'before fetch';
    --loop
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
                  vstep_no  := '4';
                  vtable_nm := 'after fetch';
    --EXIT WHEN v_rt_all_cols.COUNT = 0;
        EXIT WHEN stg_cursor%NOTFOUND;
    FOR i IN 1 .. v_rt_all_cols.COUNT
      LOOP
       dbms_output.put_line(upper(v_rt_all_cols(i).event_type));
        if (upper(v_rt_all_cols(i).event_type) = upper('System_enforced')) then
                  vstep_no  := '4.1';
                  vtable_nm := 'before seq sel';
              select PC_SEQ.nextval into pc_id from dual;
                  vstep_no  := '4.2';
                  vtable_nm := 'before insert corres';
              INSERT INTO target1_tab
                           (ID,
                            PARTY_ID,
                            PRODUCT_BRAND,
                            SORT_CODE,
                            ACCOUNT_NUMBER,
                            EXT_PRD_ID_TX,         
                            EXT_PRD_HLD_ID_TX,
                            EXT_SYS_ID,
                            EXT_PTY_ID_TX,
                            ACCOUNT_TYPE_CD,
                            COM_PFR_TYP_TX,
                            COM_PFR_OPT_TX,
                            COM_PFR_RSN_CD,
                            status)
             VALUES
                            (pc_id,
                             v_rt_all_cols(i).party_id,
                             decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                             v_rt_all_cols(i).sort_code,
                             'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4),
                             v_rt_all_cols(i).EXT_PRD_ID_TX,
                             v_rt_all_cols(i).EXT_PRD_HLD_ID_TX,
                             v_rt_all_cols(i).EXT_SYS_ID,
                             v_rt_all_cols(i).EXT_PTY_ID_TX,
                             v_rt_all_cols(i).ACCOUNT_TYPE_CD,
                             v_rt_all_cols(i).COM_PFR_TYP_TX,
                             v_rt_all_cols(i).COM_PFR_OPT_TX,
                             v_rt_all_cols(i).COM_PFR_RSN_CD,
                             NULL);
                  vstep_no  := '4.3';
                  vtable_nm := 'after insert corres';
        else
              select COM_SEQ.nextval into id_val from dual;
                  vstep_no  := '6';
                  vtable_nm := 'before insertcomm';
          if (upper(v_rt_all_cols(i).event_type) = upper('REMINDER')) then
                vstep_no  := '6.01';
                  vtable_nm := 'after if insertcomm';
              insert into parent_tab
                 (ID ,
                 CTEM_CODE,
                 CHA_CODE,            
                 CT_CODE,                           
                 CONTACT_POINT_ID,             
                 SOURCE,
                 RECEIVED_DATE,                             
                 SEND_DATE,
                 RETRY_COUNT)
              values
                 (id_val,
                  lower(v_rt_all_cols(i).event_type), 
                  decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                  'Email',
                  v_rt_all_cols(i).email_id,
                  'IADAREMINDER',
                  systimestamp,
                  systimestamp,
                  0);  
         else
                vstep_no  := '6.02';
                  vtable_nm := 'after else insertcomm';
              insert into parent_tab
                 (ID ,
                 CTEM_CODE,
                 CHA_CODE,            
                 CT_CODE,                           
                 CONTACT_POINT_ID,             
                 SOURCE,
                 RECEIVED_DATE,                             
                 SEND_DATE,
                 RETRY_COUNT)
              values
                 (id_val,
                  lower(v_rt_all_cols(i).event_type), 
                  decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                  'Email',
                  v_rt_all_cols(i).email_id,
                  'CORRESPONDENCE',
                  systimestamp,
                  systimestamp,
                  0); 
            END if; 
                  vstep_no  := '6.11';
                  vtable_nm := 'before chop';
             if (v_rt_all_cols(i).ACCOUNT_NUMBER is not null) then 
                      v_rt_all_cols(i).ACCOUNT_NUMBER := 'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4);
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 'IB.Correspondence.AccountNumberMasked',
                 v_rt_all_cols(i).ACCOUNT_NUMBER);
             end if;
                  vstep_no  := '6.1';
                  vtable_nm := 'before stateday';
             if (v_rt_all_cols(i).crr_day is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Day',
                 'IB.Crsp.Date.Day',
                 v_rt_all_cols(i).crr_day);
             end if;
                  vstep_no  := '6.2';
                  vtable_nm := 'before statemth';
             if (v_rt_all_cols(i).crr_month is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Month',
                 'IB.Crsp.Date.Month',
                 v_rt_all_cols(i).crr_month);
             end if;
                  vstep_no  := '6.3';
                  vtable_nm := 'before stateyear';
             if (v_rt_all_cols(i).crr_year is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Year',
                 'IB.Crsp.Date.Year',
                 v_rt_all_cols(i).crr_year);
             end if;
                  vstep_no  := '7';
                  vtable_nm := 'before type';
               if (v_rt_all_cols(i).product_type is not null) then
                  insert into child_tab
                     (COM_ID,                                            
                     KEY,                                                                                                                                        
                     VALUE)
                  values
                    (id_val,
                     'IB.Product.ProductName',
                   v_rt_all_cols(i).product_type);
                end if;
                  vstep_no  := '9';
                  vtable_nm := 'before title';         
              if (trim(v_rt_all_cols(i).title) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE )
              values
                (id_val,
                 'IB.Customer.Title',
                 trim(v_rt_all_cols(i).title));
              end if;
                  vstep_no  := '10';
                  vtable_nm := 'before surname';
              if (v_rt_all_cols(i).surname is not null) then
                insert into child_tab
                   (COM_ID,                                            
                   KEY,                                                                                                                                          
                   VALUE)
                values
                  (id_val,
                  'IB.Customer.LastName',
                  v_rt_all_cols(i).surname);
              end if;
                            vstep_no  := '12';
                            vtable_nm := 'before postcd';
              if (trim(v_rt_all_cols(i).POSTCODE) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Customer.Addr.PostCodeMasked',
                  substr(replace(v_rt_all_cols(i).POSTCODE,' ',''),length(replace(v_rt_all_cols(i).POSTCODE,' ',''))-2,3));
              end if;
                            vstep_no  := '13';
                            vtable_nm := 'before subject';
              if (trim(v_rt_all_cols(i).SUBJECT) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Correspondence.Subject',
                  v_rt_all_cols(i).subject);
              end if;
                            vstep_no  := '14';
                            vtable_nm := 'before inactivity';
              if (trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) is null or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '3' or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '6' or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '9') then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Correspondence.Inactivity',
                  v_rt_all_cols(i).UNREAD_CORRES_PERIOD);
              end if;
                          vstep_no  := '14.1';
                          vtable_nm := 'after notfound';
        end if;
                          vstep_no  := '15';
                          vtable_nm := 'after notfound';
        END LOOP;
        end loop;
                          vstep_no  := '16';
                          vtable_nm := 'before closecur';
        CLOSE stg_cursor;
                          vstep_no  := '17';
                          vtable_nm := 'before commit';
        DELETE FROM table_stg;
      COMMIT;
                          vstep_no  := '18';
                          vtable_nm := 'after commit';
    EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK;
      success_flag := 1;
      vsql_code := SQLCODE;
      vsql_errm := SUBSTR(sqlerrm,1,200);
      error_logging_pkg.inserterrorlog('samp',vsql_code,vsql_errm, vtable_nm,vstep_no);
      RAISE_APPLICATION_ERROR (-20011, 'samp '||vstep_no||' SQLERRM:'||SQLERRM);
    end;
    Thanks

    It's a bit urgent
    NO - it is NOT urgent. Not to us.
    If you have an urgent problem you need to hire a consultant.
    I have a performance issue in the code below,
    Maybe you do and maybe you don't. How are we to really know? You haven't posted ANYTHING indicating that a performance issue exists. Please read the FAQ for how to post a tuning request and the info you need to provide. First and foremost you have to post SOMETHING that actually shows that a performance issue exists. Troubleshooting requires FACTS not just a subjective opinion.
    where I am trying to insert data from table_stg into the target_tab and parent_tab tables and then into child tables, via a cursor with BULK COLLECT. The target_tab and parent_tab tables are huge and have a row-level trigger enabled on them; the trigger is mandatory. The time taken for this block to execute is 5000 seconds, and my requirement is to reduce it to 5 to 10 minutes.
    Personally I think 5000 seconds (about 1 hr 20 minutes) is very fast for processing 800 trillion rows of data into parent and child tables. Why do you think that is slow?
    Your code has several major flaws that need to be corrected before you can even determine what, if anything, needs to be tuned.
    This code has the EXIT statement at the beginning of the loop instead of at the end
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
                  vstep_no  := '4';
                  vtable_nm := 'after fetch';
    --EXIT WHEN v_rt_all_cols.COUNT = 0;
        EXIT WHEN stg_cursor%NOTFOUND;
    The correct place for the %NOTFOUND test when using BULK COLLECT is at the END of the loop; that is, the last statement in the loop.
    You can use a COUNT test at the start of the loop but ironically you have commented it out and have now done it wrong. Either move the NOTFOUND test to the end of the loop or remove it and uncomment the COUNT test.
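    In outline, the loop structure being described looks like this (a sketch reusing the cursor and variables from the posted code):
    LOOP
      FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
      EXIT WHEN v_rt_all_cols.COUNT = 0;  -- COUNT test at the start: safe, nothing was fetched
      FOR i IN 1 .. v_rt_all_cols.COUNT
      LOOP
        -- process each fetched row
        NULL;
      END LOOP;
      EXIT WHEN stg_cursor%NOTFOUND;      -- or the NOTFOUND test as the LAST statement in the loop
    END LOOP;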
    WHEN OTHERS THEN
      ROLLBACK;
    That basically says you don't even care what problem occurs or whether the problem is for a single record of your 10,000 in the collection. You pretty much just throw away any stack trace and substitute your own message.
    Your code also has NO exception handling for any of the individual steps or blocks of code.
    The code you posted also begs the question of why you are using NAME=VALUE pairs for child data rows? Why aren't you using a standard relational table for this data?
    As others have noted you are using slow-by-slow (row by row processing). Let's assume that PL/SQL, the bulk collect and row-by-row is actually necessary.
    Then you should be constructing the parent and child records into collections and then inserting them in BULK using FORALL.
    1. Create a collection for the new parent rows
    2. Create a collection for the new child rows
    3. For each set of LIMIT source row data
      a. empty the parent and child collections
      b. populate those collections with new parent/child data
      c. bulk insert the parent collection into the parent table
      d. bulk insert the child collection into the child table
    And unless you really want to either load EVERYTHING or abandon everything you should use bulk exception handling so that the clean data gets processed and only the dirty data gets rejected.
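    A minimal sketch of that FORALL pattern (the collection type names are assumptions; it reuses stg_cursor, v_rt_all_cols and limit_in from the posted code, and SAVE EXCEPTIONS stands in for the bulk exception handling mentioned above):
    DECLARE
      TYPE parent_rows_t IS TABLE OF parent_tab%ROWTYPE INDEX BY PLS_INTEGER;
      TYPE child_rows_t  IS TABLE OF child_tab%ROWTYPE  INDEX BY PLS_INTEGER;
      l_parents  parent_rows_t;
      l_children child_rows_t;
    BEGIN
      OPEN stg_cursor;
      LOOP
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
        EXIT WHEN v_rt_all_cols.COUNT = 0;
        l_parents.DELETE;                                   -- a. empty the collections
        l_children.DELETE;
        FOR i IN 1 .. v_rt_all_cols.COUNT LOOP
          -- b. build one parent row (and any child rows) per source row here
          NULL;
        END LOOP;
        FORALL i IN 1 .. l_parents.COUNT SAVE EXCEPTIONS    -- c. bulk insert the parent rows
          INSERT INTO parent_tab VALUES l_parents(i);
        FORALL i IN 1 .. l_children.COUNT SAVE EXCEPTIONS   -- d. bulk insert the child rows
          INSERT INTO child_tab VALUES l_children(i);
      END LOOP;
      CLOSE stg_cursor;
    END;
    /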

  • How to Insert into a temporary table

    Hi Experts,
    I have created a temp table in SQL Server from VFP using the # prefix, but when I issue an insert command SQLEXEC() returns a negative value.
    The code which I have used :
    =sqlexec(oConn, "Create table #smenu (code_ varchar(1), name_ varchar(50))")
    Table created successfully under tempdb database of sql
    m_result=sqlexec(oConn, "insert into #smenu values ('100','Police')")
    m_result returns negative.
    Kindly help.

    Try to create a permanent table:
    =sqlexec(oConn, "Create table temp_smenu (code_ varchar(1), name_ varchar(50))")
    m_result=sqlexec(oConn, "insert into temp_smenu values ('100','Police')")
    Temporary table has limited visibility.
    You need to drop table temp_smenu when finished.
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

  • T-sql 2008 r2 place results from calling a stored procedure with parameters into a temp table

    I would like to know if the following sql can be used to obtain specific columns from calling a stored procedure with parameters:
    /* Create TempTable */
    CREATE TABLE #tempTable
    (MyDate SMALLDATETIME,
    IntValue INT)
    GO
    /* Run SP and Insert Value in TempTable */
    INSERT INTO #tempTable
    (MyDate,
    IntValue)
    EXEC TestSP @parm1, @parm2
    If the above does not work, or there is a better way to accomplish this goal, please let me know how to change the SQL.

    declare @result varchar(100), @dSQL nvarchar(MAX)
    set @dSQL = 'exec @res = TestSP '''+@parm1+''','' '''+@parm2+' '' '
    print @dSQL
      EXECUTE sp_executesql @dSQL, N'@res varchar(100) OUTPUT', @res = @result OUTPUT
    select @result
    A complicated way of saying
    EXEC @ret = TestSP @parm1, @parm2
    SELECT @ret
    And it is not only complicated, it introduces a window for SQL injection.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • RE:ora-28500 when trying to insert into SQL Server table

    Hi all!
    I'm getting this when I attempt to insert into my table test in a SQL Server database. As you can see from the select, the database link seems OK. Any ideas please? Thanks.
    SQL> select "idAges" from test@try2;
    no rows selected
    SQL> insert into test@try2 ("idAges","nARAges","nAPAges") values(10,'','');
    insert into test@try2 ("idAges","nARAges","nAPAges") values(10,'','')
    ERROR at line 1:
    ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
    [Generic Connectivity Using ODBC][Microsoft][ODBC SQL Server Driver][SQL
    Server]Cannot insert explicit value for identity column in table 'test' when
    IDENTITY_INSERT is set to OFF. (SQL State: 23000; SQL Code: 544)
    ORA-02063: preceding 2 lines from TRY2

    IDENTITY_INSERT ON works only for a particular session and for one table at a time; it is not a global setting.
    The only idea I have is to create a link from MS SQL to Oracle (right now you have Oracle -> MS SQL) and then, before the insert, set IDENTITY_INSERT ON and run the insert on the SQL Server side.
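    On the SQL Server side that would look something like the sketch below (assuming idAges is the identity column of a local table dbo.test):
    SET IDENTITY_INSERT dbo.test ON;
    INSERT INTO dbo.test (idAges, nARAges, nAPAges)
    VALUES (10, NULL, NULL);
    SET IDENTITY_INSERT dbo.test OFF;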

  • How can I INSERT INTO from Staging Table to Production Table

    I’ve got a Bulk Load process which works fine, but I’m having major problems downstream.
    Almost everything is Varchar(100), and this works fine. 
    Except for these fields:
    INDEX SHARES, INDEX MARKET CAP, INDEX WEIGHT, DAILY PRICE RETURN, and DAILY TOTAL RETURN
    These fields must be some kind of numeric, because I need to perform sums on them.
    Here’s my SQL:
    CREATE TABLE [dbo].[S&P_Global_BMI_(US_Dollar)] (
        [CHANGE]              VARCHAR(100),
        [EFFECTIVE DATE]      VARCHAR(100),
        [COMPANY]             VARCHAR(100),
        [RIC]                 VARCHAR(100),
        Etc.
        [INDEX SHARES]        NUMERIC(18, 12),
        [INDEX MARKET CAP]    NUMERIC(18, 12),
        [INDEX WEIGHT]        NUMERIC(18, 12),
        [DAILY PRICE RETURN]  NUMERIC(18, 12),
        [DAILY TOTAL RETURN]  NUMERIC(18, 12),
    From the main staging table, I’m writing data to 4 production tables.
    CREATE TABLE [dbo].[S&P_Global_Ex-U.S._LargeMidCap_(US_Dollar)] (
        [CHANGE]              VARCHAR(100),
        [EFFECTIVE DATE]      VARCHAR(100),
        [COMPANY]             VARCHAR(100),
        [RIC]                 VARCHAR(100),
        Etc.
        [INDEX SHARES]        FLOAT(20),
        [INDEX MARKET CAP]    FLOAT(20),
        [INDEX WEIGHT]        FLOAT(20),
        [DAILY PRICE RETURN]  FLOAT(20),
        [DAILY TOTAL RETURN]  FLOAT(20),
    INSERT INTO [dbo].[S&P_Global_Ex-U.S._LargeMidCap_(US_Dollar)]
    SELECT [CHANGE],
           Etc.
           [DAILY TOTAL RETURN]
    FROM [dbo].[S&P_Global_BMI_(US_Dollar)]
    WHERE ISNUMERIC([EFFECTIVE DATE]) = 1
      AND [CHANGE] IS NULL
      AND [COUNTRY] <> 'US'
      AND ([SIZE] = 'L' OR [SIZE] = 'M')
    The Bulk Load is throwing errors like this (unless I make everything Varchar):
    Bulk load data conversion error (truncation) for row 7, column 23 (INDEX SHARES).
    Msg 4863, Level 16, State 1, Line 1
    When I try to load data from the staging table to the production table, I get this.
    Msg 8115, Level 16, State 8, Line 1
    Arithmetic overflow error converting varchar to data type numeric.
    The statement has been terminated.
    There must be an easy way to overcome this, right?
    Please advise!
    Thanks!!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    Nothing is returned. Everything is VARCHAR(100). The problem is this:
    If I use FLOAT(18) or REAL, I get exponential numbers, which is useless to me.
    If I use DECIMAL(18,12) or NUMERIC(18,12), I get errors. 
    Msg 4863, Level 16, State 1, Line 41
    Bulk load data conversion error (truncation) for row 7, column 23 (INDEX SHARES).
    Msg 4863, Level 16, State 1, Line 41
    Bulk load data conversion error (truncation) for row 8, column 23 (INDEX SHARES).
    Msg 4863, Level 16, State 1, Line 41
    Bulk load data conversion error (truncation) for row 9, column 23 (INDEX SHARES).
    There must be some data type that fits this!
    Here's a sample of what I'm dealing with.
    -0.900900901
    9.302325581
    -2.648171501
    -1.402805723
    -2.911830584
    -2.220960866
    2.897762349
    -0.219640074
    -5.458448607
    -0.076626094
    6.710940231
    0.287200186
    0.131682908
    0.124276221
    0.790818723
    0.420505119
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
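    One common pattern, offered only as a sketch and an assumption rather than something established in this thread: keep the staging columns VARCHAR(100) and convert while copying into the production table. TRY_CONVERT requires SQL Server 2012 or later and returns NULL for values that do not convert:
    SELECT TRY_CONVERT(DECIMAL(28, 12), [INDEX SHARES])       AS [INDEX SHARES],
           TRY_CONVERT(DECIMAL(28, 12), [DAILY TOTAL RETURN]) AS [DAILY TOTAL RETURN]
    FROM [dbo].[S&P_Global_BMI_(US_Dollar)];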
