Cursor vs. direct insert

hi all,
which one is better...?
defining a cursor for an insert, or using a select statement inside the insert?
ex1:
declare
cursor c1 is
select a, b from table1;
begin
for i in c1
loop
insert into table2 values (i.a, i.b);
end loop;
end;
ex2:
begin
insert into table2
select a, b from table1;
end;
thanks

...and here are the facts:
SQL> set timing on
SQL> declare
  2  cursor c is
  3  select *
  4  from all_Tables;
  5  begin
  6  for r in c loop
  7  insert into test values(
  8  r.OWNER                    ,
  9  r.TABLE_NAME               ,
10  r.TABLESPACE_NAME          ,
11  r.CLUSTER_NAME             ,
12  r.IOT_NAME                 ,
13  r.PCT_FREE                 ,
14  r.PCT_USED                 ,
15  r.INI_TRANS                ,
16  r.MAX_TRANS                ,
17  r.INITIAL_EXTENT           ,
18  r.NEXT_EXTENT              ,
19  r.MIN_EXTENTS              ,
20  r.MAX_EXTENTS              ,
21  r.PCT_INCREASE             ,
22  r.FREELISTS                ,
23  r.FREELIST_GROUPS          ,
24  r.LOGGING                  ,
25  r.BACKED_UP                ,
26  r.NUM_ROWS                 ,
27  r.BLOCKS                   ,
28  r.EMPTY_BLOCKS             ,
29  r.AVG_SPACE                ,
30  r.CHAIN_CNT                ,
31  r.AVG_ROW_LEN              ,
32  r.AVG_SPACE_FREELIST_BLOCKS,
33  r.NUM_FREELIST_BLOCKS,
34  r.DEGREE             ,
35  r.INSTANCES          ,
36  r.CACHE              ,
37  r.TABLE_LOCK         ,
38  r.SAMPLE_SIZE        ,
39  r.LAST_ANALYZED      ,
40  r.PARTITIONED        ,
41  r.IOT_TYPE           ,
42  r.TEMPORARY          ,
43  r.SECONDARY          ,
44  r.NESTED             ,
45  r.BUFFER_POOL        ,
46  r.ROW_MOVEMENT       ,
47  r.GLOBAL_STATS       ,
48  r.USER_STATS         ,
49  r.DURATION           ,
50  r.SKIP_CORRUPT       ,
51  r.MONITORING         ,
52  r.CLUSTER_OWNER      ,
53  r.DEPENDENCIES       ,
54  r.COMPRESSION        ,
55  r.DROPPED
56  );
57  end loop;
58* end;
SQL> /
PL/SQL procedure successfully completed.
Elapsed: 00:00:04.07
SQL> insert into test
  2  select *
  3  from all_tables;
3718 rows created.
Elapsed: 00:00:01.07
Regards,
Gerd
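A middle ground between the row-by-row cursor loop and the single INSERT ... SELECT is BULK COLLECT with FORALL, which keeps PL/SQL control but batches the context switches. A sketch against the same test table (assuming test has the column layout of all_tables, as in the demo above):

```sql
declare
    cursor c is select * from all_tables;
    type t_tabs is table of c%rowtype;
    l_tabs t_tabs;
begin
    open c;
    loop
        -- fetch in batches of 500 rows to bound memory use
        fetch c bulk collect into l_tabs limit 500;
        exit when l_tabs.count = 0;
        -- one SQL-engine round trip per batch instead of one per row
        forall i in 1 .. l_tabs.count
            insert into test values l_tabs(i);
    end loop;
    close c;
end;
/
```

It typically lands between the two timings above; when no per-row logic is needed, the plain INSERT ... SELECT is still the one to use.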

Similar Messages

  • Direct insert Vs collection

    Greetings,
    In one of my procedures I used collections, with a commit every 1,000 records. It took about 4 hours to process 14,000,000 records. Then, without changing the logic, I modified the same procedure to use direct INSERT and DELETE statements. It took about 1.5 hours.
    I was wrong to think that collections would improve performance because they fetch everything at once. Is this because of the commits? I also experimented with committing every 10,000 records, and even then it didn't improve much.
    Could you explain further why one is slower and the other faster? Thank you,
    Lakshmi

    The rules of thumb (borrowed from Tom Kyte)
    1) If you can do it in SQL, do it in SQL
    2) If you can't do it in SQL, do it in PL/SQL. "Can't do it in SQL" generally means that you can't figure out how to code the logic in SQL and/or the business logic is so complex that it makes the SQL unreadable.
    2a) If you have to use PL/SQL, try to use collections & bulk processing.
    3) If you can't do it in PL/SQL, do it in a Java stored procedure
    4) If you can't do it in a Java stored procedure, do it in an external procedure
    Collections are never preferred over straight SQL from a performance standpoint, but they may be easier to code for more complex business rules.
    Justin

  • DIRECT INSERTS

    Dear forum,
    We can insert into a table directly in bulk, like:
    insert into target select * from source1 a, source_2 b where a.s1 = b.s2
    From the performance alone we can tell the above isn't a record-by-record insert but a bulk insert.
    1) Experts, please tell me how Oracle manages the above query.
    2) How many records does it send to the db at a time?
    Thanks

    hi keith,
    Pls see my query:
    insert into pkg_tmp
    select find_key(VALUE, 8),
           mod(VL_ID, 10),
           VALUE_ST
    from   SOURCE_ATTR d,
           STOCK c,
           CONTENT b,
           PACKAGES a
    where  a.PACKAGE_ID = b.PACKAGE_ID
    and    b.TP_ID = 1
    and    b.VL_ID = c.RRS_ID
    and    c.RRS_ID = VAL_ID
    and    ATTR_ID = 11
    The tables in the from clause contain millions of records (a = 33M, b = 37M, c = 37M, d = 122M).
    I need to run the select as shown and load the result into the pkg_tmp table.
    Thanks..

  • Direct Insert path

    I want to insert a large amount of data over a dblink using direct-path insert; however, as the rollback segment is quite small, I need to commit after every 2000 rows or so.
    insert /*+ append */ into abc_monthly@dblink select * from abc partition(ODFRC_20040201);
    any help guys!
    or is there any other, faster way to do this insert?
    Edited by: Avi on Aug 12, 2009 9:41 AM

    Hi,
    Don't shoot the messenger, but your append hint will be ignored - a direct-path (APPEND) insert is not performed when the target table sits on the remote side of a database link:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:549493700346053658
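    If the undo/rollback limitation really forces intermediate commits, one workaround is to drive the copy from PL/SQL in batches. This is a sketch only; each commit makes the load restartable but not atomic, and the remote insert is conventional, not direct-path:

```sql
declare
    cursor c is select * from abc partition (ODFRC_20040201);
    type t_rows is table of c%rowtype;
    l_rows t_rows;
begin
    open c;
    loop
        fetch c bulk collect into l_rows limit 2000;  -- ~2000 rows per batch
        exit when l_rows.count = 0;
        forall i in 1 .. l_rows.count
            insert into abc_monthly@dblink values l_rows(i);
        commit;  -- keeps each transaction's undo small
    end loop;
    close c;
end;
/
```

    The array binds in the FORALL keep round trips over the dblink down, while the per-batch commit bounds undo usage on the remote side.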

  • How to directly insert pictures inside a wiki page without the need to enter the picture's properties

    I am working on a publishing site collection using the enterprise wiki template. When I want to add a new picture from my PC to the wiki page, I am prompted with the following dialog:-
    So can anyone advise how I can bypass this dialog and directly add the image to the wiki page?
    I tried modifying the "Images" content type by hiding the above fields, but I am still prompted to enter the following:-

    Hi,
    According to your post, my understanding is that you want to directly insert pictures inside a wiki page without the need to enter the picture's properties.
    Per my knowledge, you can use javascript to bypass Edit Properties Page while uploading pictures.
    Here are some similar articles for your reference:
    How to bypass Edit Properties Page while uploading documents in a document library
    SharePoint and .NET: How to bypass Edit Properties Page or Skip EditForm.aspx Page while uploading documents in a document library
    Best Regards,
    Linda Li

  • Direct Insert or API

    I want to create/insert directly into CI_FUNCTION_ENTITY_USAGES (i$sdd_funent) by using appropriate values for all the not-null columns. I can see that ci_function_entity_usages.function_ref comes from ci_functions.irid,
    ci_function_entity_usages.entity_ref comes from ci_entities.irid, etc ...
    My question is : How do I populate ci_function_entity_usages.id,
    ci_function_entity_usages.irid, ci_function_entity_usages.ivid.
    Are they primed from some internal sequence or some pre-insert
    trigger somewhere?
    Is there a better way of doing this than using direct insert eg an api?
    Thanks in advance
    Ian

    At the moment I've no access to any Designer repository.
    The idea of the API is that for each object there is an individual PL/SQL package. For ci_function_entity_usages it should be ciofunction_entity_usage. In this package a type 'data' is defined. It has 2 components: i and v. i is a record of boolean variables; v is a record of values - it matches the ci_function_entity_usages view.
    To insert a new function-entity usage you have to fill in the data.v values and set data.i to true for the elements you filled in in data.v. Then use the ciofunction_entity_usage.ins procedure to insert the record.
    You do not have to set id, ivid or irid - they are set by the API.
    You can follow the example.
    Hope it helps. Paweł

  • TOO many OPEN CURSORS during loop of INSERT's

    Running ODP.NET beta2 (can't move up yet but will do that soon)
    I don't think it is related to ODP itself but probably to how .NET works with cursors. We have a for/next loop that executes INSERT INTO xxx VALUES (:a,:b,:c) statements. Apparently, when monitoring v$sysstat (current open cursors), we see these rising, with 1 INSERT = 1 cursor. If we subsequently try to perform another action, we get max cursors exceeded. We already set open_cursors = 1000, but the number of inserts can be very high. Is there a way to release these cursors? (I already tried oDataAdaptor.Dispose and oCmd.Dispose, but this does not help.)
    Is it normal that each INSERT has its own cursor? They all have the same hash value in v$open_cursor. They seem to be released after a while, especially when moving to another asp.net page, but it's not clear when that happens and whether it is possible to force the release of the (implicit?) cursors faster.
    Below is a snippet of the code. I unrolled a couple of function calls into the code, so this is just an example - not sure it will run without errors like this, but the idea should be clear (the code looks rather complex for what it does, but the unrolled functions make the code more generic and we have a database-independent data layer):
    Try
        ' Set the base INSERT statement
        lBaseSql = _
            "INSERT INTO atable(col1,col2,col3) " & _
            "VALUES(:col1,:col2,:col3)"
        ' Initialize a transaction
        lTransaction = oConnection.BeginTransaction()
        ' Create, for each row in the list, the parameter collection
        For Each lDataRow In aList.Rows
            lOracleParameters = New OracleParameterCollection()
            lOracleParameter = New OracleParameter("col1", OracleDbType.Varchar2, _
                CType(aCol1, Object))
            lOracleParameters.Add(lOracleParameter)
            lOracleParameter = New OracleParameter("col2", OracleDbType.Varchar2, _
                CType(lDataRow.Item("col2"), Object))
            lOracleParameters.Add(lOracleParameter)
            lOracleParameter = New OracleParameter("col3", OracleDbType.Int32, _
                CType(lDataRow.Item("col3"), Object))
            lOracleParameters.Add(lOracleParameter)
            ' Execute the statement;
            ' if the execution fails because the row already exists,
            ' the insert should be considered successful.
            Try
                Dim aCommand As New OracleCommand()
                Dim retval As Integer
                ' associate the connection with the command
                aCommand.Connection = oConnection
                ' set the command text (stored procedure name or SQL statement)
                aCommand.CommandText = lBaseSql
                ' set the command type
                aCommand.CommandType = CommandType.Text
                ' attach the command parameters if they are provided
                If Not (lOracleParameters Is Nothing) Then
                    Dim lParameter As OracleParameter
                    For Each lParameter In lOracleParameters
                        ' check for derived output value with no value assigned
                        If lParameter.Direction = ParameterDirection.InputOutput _
                                And lParameter.Value Is Nothing Then
                            lParameter.Value = Nothing
                        End If
                        aCommand.Parameters.Add(lParameter)
                    Next lParameter
                End If
                ' finally, execute the command
                retval = aCommand.ExecuteNonQuery()
                ' detach the parameters from the command object,
                ' so they can be used again
                aCommand.Parameters.Clear()
            Catch ex As Exception
                Dim lErrorMsg As String
                lErrorMsg = ex.ToString
                If Not lTransaction Is Nothing Then
                    lTransaction.Rollback()
                End If
            End Try
        Next
        lTransaction.Commit()
    Catch ex As Exception
        lTransaction.Rollback()
        Throw New DLDataException(aConnection, ex)
    End Try

    I have run into this problem as well. To my mind Phillip's solution will work but seems completely unnecessary. This is work the provider itself should be managing.
    I've done extensive testing with both ODP and OracleClient. Here is one of the scenarios: in a tight loop of 10,000 records, each of which is either going to be inserted or updated via a stored procedure call, the ODP provider throws the "too many cursors" error at around the 800th iteration, with over 300 cursors open. The exact same code with OracleClient as the provider never throws an error and opens up 40+ cursors during execution.
    The application I have updates an Oracle8i database from a DB2 database. There are over 30 tables being updated in near real time. Reusing the command object is not an option, and adding all the code Phillip did for each call seems highly unnecessary. I say Oracle needs to fix this problem. As much as I hate to say it, the Microsoft provider seems superior at this point.
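    When chasing a leak like this, it helps to watch the session's cursors from the database side. A generic diagnostic query (the :sid bind is a placeholder for the affected session's SID):

```sql
-- Group the session's open cursors by statement text;
-- a leaked statement shows up with a large count.
select count(*) as open_count, sql_text
  from v$open_cursor
 where sid = :sid
 group by sql_text
 order by open_count desc;
```

    If one INSERT text dominates the list, the statement handles are not being released after execution.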

  • My cursor is blocking table Inserts

    I have a function that copies any new rows from Database_A, Table_A to Database_B, Table_A via a database link. This happens periodically, and there can be around 100k new records accumulated in between these periods.
    I gather the records which need to be copied by filtering on a "copied" field on the table. Within the loop, I update the row's copied field so it won't be copied during the next pass. Here's the function:
    FUNCTION copy_records RETURN INTEGER IS
         pCount INTEGER := 0;       
            CURSOR recs IS
                SELECT * FROM TABLE_A
                WHERE FLG_COPIED = 'N' OR FLG_COPIED IS NULL
                ORDER BY MYKEY;    -- do the oldest first (only important if we commit during iterations)             
        BEGIN
            pCount := 0;
            FOR rec IN recs LOOP
                -- first copy to backup db
                INSERT INTO TABLE_A@BACKUP_DB
                    (FIELD_1, FIELD_2, FIELD_3)
                VALUES
                    (rec.FIELD_1, rec.FIELD_2, rec.FIELD_3);
                -- now flag as copied
                UPDATE TABLE_A
                SET FLG_COPIED = 'Y'
                WHERE MYKEY = rec.MYKEY;
                -- counter sent back for logging
                pCount := pCount + 1;              
            END LOOP;
            RETURN pCount;
        EXCEPTION
            WHEN OTHERS THEN
                RETURN SQLCODE;
        END;
    My problem is that the table is being blocked while this process takes place. I would expect some row-level blocking, which is fine (this table is primarily INSERT only). But I'm not sure why it blocks such that it won't allow me to INSERT into the table. Can anyone explain this to me?
    If I do a COMMIT during each iteration, I can at least perform an insert, but this seems to slow things down greatly when used across a dblink, so I'd like to avoid it (plus I wish to make this function an all-or-none transaction). I'm also not sure whether the effect of the COMMIT is to make the blocking row-only, or just to minimize the window of the table locking.
    Edited by: xaeryan on Oct 14, 2011 3:51 PM

    xaeryan wrote:
    Solomon Yakobson wrote:
    xaeryan wrote:
    My problem is that the table is being blocked while this process takes place.
    No it is not. Based on your code, all it is locking are the TABLE_A rows it updates. So if any other session tries to update/delete rows updated by your function, that session will wait till you commit/rollback. If any other session tries to update/delete rows NOT updated by your function, that session will NOT wait.
    SY.
    Based on my observations, it is - when I try to perform an INSERT while the function is running, it will not complete. I can see in Toad that there is a row-level lock for my function which is NOT blocking; however, there is also a transaction-level lock associated with the same session ID which is listed as blocking. There's not much more detail on that one in Toad.
    Is there anything in Oracle that might do something unexpected when your cursor selects the majority of the records? Like an optimization for performance that might involve blocking? It seems silly, I know, but I can't seem to understand why this function stops any use of this table.
    how can we reproduce what you report?
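    To see from the database side who is blocking whom while the function runs, a generic starting point (requires SELECT on the v$ views; column names per recent Oracle versions) is:

```sql
-- List waiting sessions together with the session blocking them.
select s.sid,
       s.blocking_session,
       s.event,            -- 'enq: TM - contention' indicates a table-level lock
       s.seconds_in_wait
  from v$session s
 where s.blocking_session is not null;
```

    If the blocked INSERT is waiting on 'enq: TM - contention', one classic culprit worth checking is an unindexed foreign key involving TABLE_A, since DML on the parent key then takes a share lock on the whole child table.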

  • How to load data to a "clustered table" quickly?  direct insert does not work

    We have a single-table hash cluster whose size is 45G, and we used the command:
    insert /*+ append */ into t_hashed select * from source;
    The source is about 30G, and it takes a very long time (several days, and not yet completed).
    I dumped the redo log and found lots of redo for the insert. I had thought it should not generate much redo, being a "direct path insert", but from the dump info I can say the direct-path insert didn't take effect.

    Your assessment is correct. INSERT /*+ APPEND */ does not work with a hash clustered table.
    If you think about how hash clusters work, and how direct load insert works, it should be clear to you why a direct load insert via the append hint can never work.
    How hash clusters work (well, only the part that's relevant to the current discussion):
    Initially, at creation time, the storage for the entire cluster is preallocated, based on the various cluster parameters you chose. Think about that. All the storage is allocated to the cluster.
    How direct load works (well, only the part that's relevant to the current discussion):
    Space already allocated to and used by the segment is ignored. Space is taken from free, unformatted blocks from above the high water mark.
    So, hopefully it's clear how these features are incompatible.
    Bottom line:
    It doesn't work, there's no way around it, there's nothing you can do about it....
    -Mark

  • PL/SQL dynamic insert using cursor

    Hello all,
    I'm having big performance issues with (as you'll see below) a very simple stored procedure. Any hints/suggestions would be really appreciated. The query below in the loop is used to create a list of all months that fall within a member's start_date and end_date, for insertion into the member_months table. Can anyone suggest a different approach to improve run time? It was timing out using plain SQL, so I thought I'd try PL/SQL. Thanks for anyone's help.
    CREATE OR REPLACE PROCEDURE proc_MEMBER_MONTHS authid current_user as
    /* get all Master member id's for cursor */
    CURSOR c_unique_mmi IS
        select distinct master_member_id
        from member;
    v_mmi                           member.master_member_id%TYPE;
    BEGIN
    dbms_output.enable(null);
       OPEN c_unique_mmi;
        LOOP
            FETCH c_unique_mmi INTO v_mmi;   /* pass mmi in cursor to variable */
            EXIT WHEN c_unique_mmi%NOTFOUND;
      INSERT INTO member_months   
          (mmi,                                        
          member_nbr,
          lob,
          member_month,
          member_year,
          member_month_count)
    (SELECT master_member_id mmi,
            member_nbr,
            lob,
            month member_month,
            year member_year,
            ROW_NUMBER ()
              OVER (PARTITION BY member_nbr ORDER BY lob, year, month ASC)
              member_month_count
      FROM (SELECT DISTINCT
                       master_member_id,
                       member_nbr,
                       lob,
                       TO_CHAR (ADD_MONTHS (eligibility_start_date, LEVEL - 1), 'MM')
                          as MONTH,
                       TO_CHAR (ADD_MONTHS (eligibility_start_date, LEVEL - 1),
                                'YYYY')
                          as YEAR
                  FROM (SELECT *
                          FROM mmi_data
                         WHERE master_member_id = v_mmi)              /* v_mmi is current MMI variable passed from cursor */
            CONNECT BY LEVEL <=
                            MONTHS_BETWEEN (TRUNC (eligibility_end_date, 'MM'),
                                            TRUNC (eligibility_start_date, 'MM'))
                          + 1));
        commit;
        END LOOP;
        CLOSE c_unique_mmi;
    END;
    /
    Edited by: BluShadow on 08-Aug-2012 14:03
    added {noformat}{noformat} tags for readability. Please read {message:id=9360002} and learn to do this yourself.

    All you would need is a direct insert like this
         insert into member_months
                ( mmi
                , member_nbr
                , lob
                , member_month
                , member_year
                , member_month_count
                )
         select master_member_id mmi
              , member_nbr
              , lob
              , member_month
              , member_year
              , row_number () over (partition by member_nbr order by lob, member_year, member_month) member_month_count
           from (
                select distinct master_member_id
                     , member_nbr
                     , lob
                     , to_char (add_months (eligibility_start_date, level - 1), 'mm') as member_month
                     , to_char (add_months (eligibility_start_date, level - 1), 'yyyy') as member_year
                  from (
                       select d.*
                         from mmi_data d
                         join (
                              select distinct master_member_id
                                from member
                              ) m
                           on d.master_member_id = m.master_member_id
                       )
                connect by level <= months_between (trunc (eligibility_end_date, 'mm'), trunc (eligibility_start_date, 'mm')) + 1
                );
    Your procedure should contain only this, nothing else. Drop the cursor and the looping.
    If you want help in tuning the above query please give us the following details.
    1. DB Version.
    2. Execution Plan
    3. Table Details (Number of rows)
    4. Index Details

  • Inserting the Brio Query Data into EIS OLAP Model  tables directly

    Dear Experts,
    Can someone please suggest how we can export data (result set records) from bqy files' queries into an EIS server's OLAP Model's tables?
    Right now I have a cube on Essbase server getting the data from excel sheets which store the data from the execution of .bqy files.
    Use of file system (excel sheets) has not been liked by my business users so now I need to avoid storing the data from brio queries into excel sheets for loading into the Essbase cube. This I am required to achieve using EIS/AIS so that the data from brio queries(.bqy files) can be directly inserted into Essbase cube with the help of (i.e. via) EIS/AIS.
    Any quick help would boost the life of this project of mine.
    Thank you,
    Vikas Jain

    user12072290 wrote:
    Dear Experts,
    Can someone please suggest how we can export data (result set records) from bqy files' queries into an EIS server's OLAP Model's tables?
    Right now I have a cube on Essbase server getting the data from excel sheets which store the data from the execution of .bqy files.
    Use of file system (excel sheets) has not been liked by my business users so now I need to avoid storing the data from brio queries into excel sheets for loading into the Essbase cube. This I am required to achieve using EIS/AIS so that the data from brio queries(.bqy files) can be directly inserted into Essbase cube with the help of (i.e. via) EIS/AIS.
    Any quick help would boost the life of this project of mine.
    Thank you,
    Vikas Jain
    Have you got the answer? Would you please post it here?

  • Cursor within cursor insert problem.

    Hello,
    Am trying to pass a parameter from one cursor to another and making an insert inside the 2nd cursor's loop, but it's not inserting.
    declare
    v_cat_1 varchar2(10);
    v_cat_2 varchar2(10);
    v_cat_3 varchar2(10);
    cursor C1 is
    select cat_1, cat_2, cat_3 from test1 where country = 'INI';
    cursor C2(p_cat_1 varchar2, p_cat_2 varchar2, p_cat_3 varchar2) is
    select txt1, txt2 from test2
    where old_cat_1 = p_cat_1
    and old_cat_2 = p_cat_2
    and old_cat_3 = p_cat_3;
    begin
    for c1rec in c1 loop
    v_cat_1 := 'AA';
    v_cat_2 := 'BB';
    v_cat_3 := 'CC';
    for c2rec in c2(c1rec.cat_1, c1rec.cat_2, c1rec.cat_3) loop
    insert into chnl_ini_test (new_cat_1, new_cat_2, new_cat_3, remarks) values (v_cat_1, v_cat_2, v_cat_3, 'CATEGORY not found');
    end loop;
    end loop;
    end;
    The block gets executed but the insert is not happening, and I'm not getting any error. If I place the insert below the first for loop then the insertion happens.
    Any suggestions plz.

    Looking at the example code, it seems that the values to be INSERTed are not taken from the cursor.
    Instead we just want to insert the same values multiple times, based on the no. of rows returned from cursors c1 and c2.
    If that is the case:
    * The queries for cursors c1 and c2 can be merged.
    * Count the no. of rows returned.
    * Loop that many times and insert the record.
    DECLARE
        v_cat_1 VARCHAR2(10);
        v_cat_2 VARCHAR2(10);
        v_cat_3 VARCHAR2(10);
        w_count PLS_INTEGER;
    BEGIN
        SELECT COUNT(*)
          INTO w_count
          FROM (
               SELECT t2.txt1,
                      t1.txt2
                 FROM test2 t2,
                      test1 t1
                WHERE country = 'INI'
                  AND old_cat_1 = t1.txt1
                  AND old_cat_2 = t2.txt2
               );
        v_cat_1 := 'AA';
        v_cat_2 := 'BB';
        v_cat_3 := 'CC';
        FOR i IN 1 .. w_count LOOP
            INSERT INTO chnl_ini_test
                (new_cat_1,
                 new_cat_2,
                 new_cat_3,
                 remarks)
            VALUES
                (v_cat_1,
                 v_cat_2,
                 v_cat_3,
                 'CATEGORY not found');
        END LOOP;
    END;

  • Can I insert a new text at the cursor's position In Formatted-Text-Edit?

    Hi Everyone,
        I am working on a task to enhance the Formatted-Text-Edit control.
        The task's goal is to simultaneously edit/display text and images.
        I have decided to use Formatted-Text-Edit and Formatted-Text-View for the task.
        In Formatted-Text-Edit, the user can edit the text and input a URL for an image;
        in Formatted-Text-View, the text and image can be displayed simultaneously.
        I have added a button "Insert Image" beside the Formatted-Text-Edit.
        This is the key point:
        when the user clicks the button, I want to insert new text (the URL of the image) at the cursor's position in the Formatted-Text-Edit.
        But how can I get the cursor's position, and insert new text at the cursor's position in the FTE?
        Can anyone give me some suggestions or a solution?
        Many thanks in advance.
    Best Regards,
    Derek

    I don't believe that you can get the cursor position yet with the FormattedTextEdit UI element. The best you could do currently is to place the insert at the end of the current text.
    Note: this is a duplicate of the thread/question
    Set cursor on the table column inherited with text edit UI element
    Please don't double-post your questions in the forum.

  • Check for duplicates inside a cursor and pass an output parameter in SQL Server 2008

    Hi All,
    Hi All,
    I am inserting a value into a table. Before inserting, I check whether the record already exists in the target table. If it exists, I make an entry into the errorlog table and set the output parameter to 'errorlog'. This is inside the cursor, as I'll be passing multiple values. Next, I have a separate query to get the new records which are not in the target table. Using EXCEPT I get the new record and insert it into the main table. After insertion I set the output to 'success'.
    Here, while executing the procedure, I pass a duplicate value and a new value. As it is in a cursor, first it inserts into errorlog and sets the output parameter to 'errorlog'. Next it inserts the new record into the main table and sets the output parameter to 'Success'.
    So on completion of the procedure I get the output 'success', but I should get 'errorlog'. I should get 'success' only when there are no errors in the procedure. How can I achieve this? Please help me.
    Below is my code
    IF NOT EXISTS(SELECT Beginmilepost,BeginTrackName,Endmilepost,EndTrackName
    FROM SSDB_Segment WHERE BeginMilepost>=@BegMP AND EndMilepost<=@EndMP AND SearchID = @SearchID AND Reference = 'Range')
    BEGIN
                 Declare C_Max1 Cursor FOR
    SELECT Beginmilepost,BeginTrackName,Endmilepost,EndTrackName FROM SSDB_Segment WHERE BeginMilepost = @BegMP AND EndMilepost = @EndMP AND  BeginTrackName = @BegtrkName 
    AND EndTrackName = @EndTrkName  AND SearchID = @SearchID
      Open C_Max1
      FETCH FROM C_MAX1 INTO @BeginMilepost,@BTrackName,@EndMilepost,@ETrackName
    WHILE(@@FETCH_STATUS=0)
    BEGIN
    IF OBJECT_ID ('tempdb..#temp') IS NOT NULL
    BEGIN
          DROP TABLE #temp
    END--IF
    Select BeginLatitude,BeginLongitude,BeginTrackName,BeginMilepost,BeginMilepostPrefix,BeginMilepostSuffix,EndLatitude,EndLongitude,EndTrackName,EndMilepost,TrainType into #temp
    FROM
    (SELECT BeginLatitude= case when @BegLat = 0 THEN NULL ELSE @BegLat end ,BeginLongitude= case when @BegLong=0 THEN NULL ELSE @BEgLong end ,@BTrackName AS BeginTrackName,ROUND(@BeginMilepost ,3) AS BeginMilepost,
    BeginMilepostPrefix= CASE WHEN @BegPrefix = 'null' THEN NULL ELSE @BegPrefix END,BeginMilepostSuffix= CASE WHEN @BegSuffix  = 'null' THEN NULL ELSE @BegSuffix  END,
    EndLatitude=case when @EndLat =0 then NULL else @EndLat end,EndLongitude=case when @Endlong = 0 THEN NULL ELSE @Endlong END,@ETrackName AS EndTrackName,ROUND(@EndMilepost ,3) AS EndMilepost,@TrainType AS TrainType 
    UNION ALL
    select BeginLatitude,BeginLongitude,BeginTrackName,ROUND(BeginMilepost,3) AS BeginMilepost,BeginMilepostPrefix,BeginMilepostSuffix, EndLatitude,EndLongitude,EndTrackName,ROUND(EndMilepost,3) AS EndMilepost,TrainType from SSDB_MaximumPermissibleSpeed)data
    group by  BeginLatitude,BeginLongitude,BeginTrackName,BeginMilepost,EndLatitude,EndLongitude,EndTrackName,EndMilepost,BeginMilepostPrefix,BeginMilepostSuffix,TrainType
    having COUNT(*)>1
    SET @COUNT= (select count(*) from #temp )
    Print @COUNT
    IF @COUNT>=1
    BEGIN
     INSERT INTO ErrorLog_Asset (
                                        ErrorCode,
                                        ErrorMessage,
                                        TableName,
                                        MilepostPrefix,
                                        Milepost
                                        )
    SELECT
                                     '1',
                                     'Already exists at BeginMp '+ CAST(@BeginMilepost  as varchar) +',EndMp '+ CAST(@EndMilepost as varchar) +' ,Beginlat
    '+CAST(@BegLat   as varchar)
                                     +' ,Endlat '+CAST(@EndLat   as varchar)+', BeginTrackName '+@BTrackName  +' and EndTrackName '+@ETrackName,
                                     'MaximumPermissibleSpeed',
                                      CASE WHEN @BegPrefix = 'null' THEN NULL
    ELSE @BegPrefix END ,
    @BeginMilepost  
    SET @output = 'Errorlog'
    END
     IF OBJECT_ID ('tempdb..#Max') IS NOT NULL
    BEGIN
     DROP TABLE #Max
    END--IF
     Select BeginLatitude,BeginLongitude,BeginTrackName,BeginMilepost,BeginMilepostPrefix,BeginMilepostSuffix,EndLatitude,EndLongitude,EndTrackName,EndMilepost,TrainType into #Max from
           (SELECT BeginLatitude= case when @BegLat = 0 THEN NULL ELSE @BegLat end ,BeginLongitude= case when @BegLong=0 THEN NULL ELSE @BEgLong end ,@BTrackName AS BeginTrackName,ROUND(@BeginMilepost ,3)
    AS BeginMilepost,
                  BeginMilepostPrefix= CASE WHEN @BegPrefix = 'null' THEN NULL ELSE @BegPrefix END,BeginMilepostSuffix= CASE WHEN @BegSuffix  = 'null' THEN NULL ELSE @BegSuffix  END,
                  EndLatitude=case when @EndLat =0 then NULL else @EndLat end,EndLongitude=case when @Endlong = 0 THEN NULL ELSE @Endlong END,@ETrackName AS EndTrackName,ROUND(@EndMilepost ,3) AS EndMilepost,@TrainType AS TrainType 
    except
                 select BeginLatitude,BeginLongitude,BeginTrackName,ROUND(BeginMilepost,3) AS BeginMilepost,BeginMilepostPrefix,BeginMilepostSuffix, EndLatitude,EndLongitude,EndTrackName,ROUND(EndMilepost,3) AS EndMilepost,TrainType
    from SSDB_MaximumPermissibleSpeed)data
     Declare C_Max2 Cursor FOR
     Select BeginMilepost,BeginTrackName,EndMilepost,EndTrackName from #Max 
      Open C_Max2
      FETCH FROM C_Max2 INTO  @BeginMP,@BeginTrackName,@EnMP,@EnTrackName
    WHILE(@@FETCH_STATUS=0)
    BEGIN
       IF (Select COUNT(*) from tbl_Trackname )>=1
       BEGIN
     IF (@TrainType IN (SELECT TrainType  FROM SSDB_TrainType )AND (@Speed <>0) AND @BeginMP IS NOT NULL AND @BeginTrackName IS NOT NULL  AND @EnMP IS NOT NULL
     AND @Direction IN (SELECT Direction FROM SSDB_Direction) AND @EnTrackName IS NOT NULL )
     BEGIN-------------
     SET @ID = (Select MAX(MaximumpermissibleSpeedID) from SSDB_MaximumPermissibleSpeed)
    IF @COUNT =0
       BEGIN
                       INSERT INTO SSDB_MaximumPermissibleSpeed (
    BeginMilepostPrefix,
    BeginMilepostSuffix,
    BeginMilepost,
    BeginTrackName,
    BeginLatitude,
    BeginLongitude,
    BeginElevation,
    EndMilepostPrefix,
    EndMilepostSuffix,
    EndMilepost,
    EndTrackName,
    EndLatitude,
    EndLongitude,
    EndElevation,
    Direction,
    Speed,
    TrainType,
    Description,
    InsertUser,
    S_ID
    )
                                                  SELECT
      CASE WHEN @BegPrefix = 'null' THEN NULL
      ELSE @BegPrefix END,
                          CASE WHEN @BegSuffix = 'null' THEN NULL
      ELSE @BegSuffix END,
      @BeginMP ,
      @BeginTrackName  ,
      case WHEN @BegLat = 0 THEN NULL
      ELSE @BegLat END,
      CASE WHEN @BegLong=0 THEN NULL
      ELSE @BegLong END ,
      CASE WHEN @BegEle = 0 THEN NULL
      ELSE @BegEle END ,
      CASE WHEN @EndPrefix = 'null' THEN NULL
      ELSE @EndPrefix END,
                          CASE WHEN @EndSuffix = 'null' THEN NULL
      ELSE @EndSuffix END,
      @EnMP ,
      @EnTrackName  ,
      case WHEN @EndLat = 0 THEN NULL
      ELSE @EndLat END,
      CASE WHEN @EndLong=0 THEN NULL
      ELSE @EndLong END ,
      CASE WHEN @EndEle = 0 THEN NULL
      ELSE @EndEle END ,
      @Direction ,
      @Speed ,
      @TrainType ,
      CASE WHEN @Description ='null' THEN NULL
      ELSE @Description END ,
      @InsertUser ,
      @UID     
    INSERT INTO SSDB_MaxSpeed_History (
                       MSID,
    BeginMilepostPrefix,
    BeginMilepostSuffix,
    BeginMilepost,
    BeginTrackName,
    BeginLatitude,
    BeginLongitude,
    BeginElevation,
    EndMilepostPrefix,
    EndMilepostSuffix,
    EndMilepost,
    EndTrackName,
    EndLatitude,
    EndLongitude,
    EndElevation,
    Direction,
    Speed,
    TrainType,
    Description,
    S_ID,
    NOTES ,
    [Action] ,
    InsertUser
    )
                                 SELECT 
                          (Select MaximumPermissibleSpeedID from SSDB_MaximumpermissibleSpeed WHERE MaximumPermissibleSpeedID > @ID),
                          CASE WHEN @BegPrefix = 'null' THEN NULL
      ELSE @BegPrefix END,
                          CASE WHEN @BegSuffix = 'null' THEN NULL
      ELSE @BegSuffix END,
      @BeginMP ,
      @BeginTrackName  ,
      case WHEN @BegLat = 0 THEN NULL
      ELSE @BegLat END,
      CASE WHEN @BegLong=0 THEN NULL
      ELSE @BegLong END ,
      CASE WHEN @BegEle = 0 THEN NULL
      ELSE @BegEle END ,
      CASE WHEN @EndPrefix = 'null' THEN NULL
      ELSE @EndPrefix END,
                          CASE WHEN @EndSuffix = 'null' THEN NULL
      ELSE @EndSuffix END,
      @EnMP ,
      @EnTrackName  ,
      case WHEN @EndLat = 0 THEN NULL
      ELSE @EndLat END,
      CASE WHEN @EndLong=0 THEN NULL
      ELSE @EndLong END ,
      CASE WHEN @EndEle = 0 THEN NULL
      ELSE @EndEle END ,
      @Direction ,
      @Speed ,
      @TrainType ,
      CASE WHEN @Description ='null' THEN NULL
      ELSE @Description END ,
      @UID,
      NULL,
      'INSERT',
      @InsertUser 
    set @output='Success'
    --IF ((@COUNT >=1) AND (@COUNT =0))
    --BEGIN
    --  SET @output = 'ErrorLog'
    --END
    --IF (@COUNT = 0)
    -- BEGIN
    --SET @output ='Success'
    --END
    --END
    END
    END------------------------> 
    Deepa

    Hi Deepa,
    If I understand your question correctly, you would like the @Output parameter to contain the value "Success" only if all rows were successful. As soon as one row was found to be a duplicate, the value of @Output at the end of execution should be "ErrorLog".
    Currently, you modify the value of @Output in each iteration of the cursor, so at the end of execution you're left with the last value.
    In order to change that to work the way you want it, you need to set the value of @Output in the beginning of execution (before entering the cursor) to "Success", and as soon as there is a duplicate row, you should modify the value to "ErrorLog". This way,
    if all rows are successful, the value of @Output will be "Success" at the end of execution. On the other hand, if there is even a single duplicate row, the value of @Output will be "ErrorLog" at the end of execution.
    I hope this helps...
    Guy Glantser
    SQL Server Consultant & Instructor
    Madeira - SQL Server Services
    http://www.madeirasql.com
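
    A minimal sketch of that pattern, using variable and table names assumed from the code above (only the parts relevant to @Output are shown):

    ```sql
    -- Assume success up front, BEFORE opening the cursor
    SET @output = 'Success'

    OPEN C_Max1
    FETCH NEXT FROM C_Max1 INTO @BeginMilepost, @BTrackName, @EndMilepost, @ETrackName
    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        IF @COUNT >= 1
        BEGIN
            -- Duplicate row: log it to ErrorLog_Asset as before, then flip the flag.
            -- Once set to 'Errorlog' it is never set back, so one duplicate
            -- anywhere in the loop makes the final output 'Errorlog'.
            SET @output = 'Errorlog'
        END
        -- New rows are inserted into the main table here;
        -- @output is NOT touched on the success path.

        FETCH NEXT FROM C_Max1 INTO @BeginMilepost, @BTrackName, @EndMilepost, @ETrackName
    END
    CLOSE C_Max1
    DEALLOCATE C_Max1
    ```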

  • SYS_REFCURSOR takes more time than direct query execution

    I have a stored procedure with 4 input parameters and 10 output parameters, all of type SYS_REFCURSOR.
    Among the 10 outputs, one cursor returns 4k+ records; the others return 3 or 4 records each, with about 5 columns per cursor. The procedure takes 8 seconds to complete, while querying directly returns the output in 0.025 seconds.
    I reviewed the code and traced the issue to the cursor that returns the 4k+ rows.
    That cursor opens against a temporary table (which holds the 4k+ records) without any filter. The query that populates the temporary table is a plain insert, and I found nothing to change there.
    Can anyone suggest how to bring the results back in under 3 seconds? This is a real challenge, since the code needs to go live next week.
    Any help appreciated.
    Thanks
    Renjish

    I've just repeated the test in SQL*Plus on my test database.
    Both the ref cursor and direct SQL took 4.75 seconds.
    However, that time is not the time to execute the SQL statement, but the time it took SQL*Plus in my command window to print out the 3999 rows of results.
    SQL> create or replace PROCEDURE TEST_PROC (O_OUTPUT OUT SYS_REFCURSOR) is
      2  BEGIN
      3    OPEN  O_OUTPUT FOR
      4      select 11 plan_num, 22  loc_num, 'aaa' loc_nm from dual connect by level < 4000;
      5  end;
      6  /
    Procedure created.
    SQL> set timing on
    SQL> set linesize 1000
    SQL> set serverout on
    SQL> var o_output refcursor;
    SQL> exec test_proc(:o_output);
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.04
    SQL> print o_output;
      PLAN_NUM    LOC_NUM LOC
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
    3999 rows selected.
    Elapsed: 00:00:04.75
    SQL> select 11 plan_num, 22  loc_num, 'aaa' loc_nm from dual connect by level < 4000;
      PLAN_NUM    LOC_NUM LOC
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
    3999 rows selected.
    Elapsed: 00:00:04.75
    That's the result I expect to see, both taking the same amount of time to do the same thing.
    Please demonstrate how you are running it and getting different results.
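
    If you want to separate pure execute/fetch time from display time in SQL*Plus, one approach (a sketch; AUTOTRACE requires the PLUSTRACE role or DBA privilege) is to fetch the rows without printing them:

    ```sql
    SQL> set timing on
    SQL> set arraysize 500                  -- fetch more rows per round trip
    SQL> set autotrace traceonly statistics -- rows are fetched but not displayed
    SQL> select 11 plan_num, 22 loc_num, 'aaa' loc_nm
      2  from dual connect by level < 4000;
    ```

    With the display suppressed, the elapsed time reflects the query and the fetches rather than terminal scrolling.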
