DIRECT INSERTS

Dear forum,
We can directly insert into a table in bulk, like:
insert into target select * from source1 a, source_2 b where a.s1 = b.s2;
From the performance alone we know that the above isn't a record-by-record insert; it is a bulk insert.
1) Experts, please tell me how Oracle manages the above query.
2) How many records does it send to the DB at a time?
Thanks

Hi Keith,
Please see my query:
insert into pkg_tmp
select find_key(VALUE, 8),
       mod(VL_ID, 10),
       VALUE_ST
from   SOURCE_ATTR d,
       STOCK c,
       CONTENT b,
       PACKAGES a
where  a.PACKAGE_ID = b.PACKAGE_ID
and    b.TP_ID = 1
and    b.VL_ID = c.RRS_ID
and    c.RRS_ID = VAL_ID
and    ATTR_ID = 11;
The tables in the from clause contain millions of records (a = 33M, b = 37M, c = 37M, d = 122M).
I need to run the select as shown and load the final result into the pkg_tmp table.
Thanks..

Similar Messages

  • Direct insert Vs collection

    Greetings,
    In one of my procedures, I used a collection with a commit every 1,000 records. It took about 4 hours to process 14,000,000 records. Then, without changing the logic, I modified the same procedure to use direct insert and delete statements. It took about 1.5 hours.
    I was wrong to think that collections would improve performance because they fetch everything at once. Is this because of the 'commit'? I also experimented with committing every 10,000 records, but even then it didn't improve much.
    Could you discuss further why one is slower and the other faster? Thank you,
    Lakshmi

    The rules of thumb (borrowed from Tom Kyte)
    1) If you can do it in SQL, do it in SQL
    2) If you can't do it in SQL, do it in PL/SQL. "Can't do it in SQL" generally means that you can't figure out how to code the logic in SQL and/or the business logic is so complex that it makes the SQL unreadable.
    2a) If you have to use PL/SQL, try to use collections & bulk processing.
    3) If you can't do it in PL/SQL, do it in a Java stored procedure
    4) If you can't do it in a Java stored procedure, do it in an external procedure
    Collections are never preferred over straight SQL from a performance standpoint, but they may be easier to code for more complex business rules.
    Justin
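    To make rule 2a concrete, here is a minimal sketch (table and column names are made up for illustration) contrasting the single set-based statement with collection-based bulk processing in PL/SQL:

    -- Preferred: one set-based statement (rule 1)
    insert into target_t (id, val)
    select id, val from source_t;

    -- If PL/SQL is unavoidable: BULK COLLECT + FORALL (rule 2a),
    -- processing limited batches instead of one row at a time
    declare
      cursor c is select id, val from source_t;
      type t_rows is table of c%rowtype;
      l_rows t_rows;
    begin
      open c;
      loop
        fetch c bulk collect into l_rows limit 1000;
        exit when l_rows.count = 0;
        forall i in 1 .. l_rows.count
          insert into target_t values l_rows(i);
      end loop;
      close c;
      commit;  -- commit once at the end, not per batch
    end;
    /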

  • Direct Insert path

    I want to insert a large volume of data over a database link using direct-path insert; however, as the rollback segment is quite small, I need to commit after every 2000 rows or so.
    insert /*+ append */ into abc_monthly@dblink select * from abc partition(ODFRC_20040201);
    Any help, guys!
    Or is there any other way to get a faster insert than this?
    Edited by: Avi on Aug 12, 2009 9:41 AM

    Hi,
    Don't shoot the messenger, but your append hint will be ignored: direct-path insert is not supported when the target table sits on the far side of a database link:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:549493700346053658
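    One workaround is to make the target local so the hint can take effect. A sketch, assuming the load can be run on the database that owns abc_monthly and that a link pointing back at the source (here called src_link) exists:

    -- Run on the database that owns abc_monthly:
    insert /*+ append */ into abc_monthly
    select * from abc@src_link
    where  odfrc_date >= date '2004-02-01'    -- hypothetical partition-key predicate;
    and    odfrc_date <  date '2004-03-01';   -- partition(...) cannot name a remote partition
    commit;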

  • How to directly insert pictures inside a wiki page without the need to enter the picture's properties

    I am working on a publishing site collection using the Enterprise Wiki template. When I want to add a new picture from my PC to the wiki page, I am prompted with a properties dialog.
    So can anyone advise how I can bypass this dialog and directly add the image to the wiki page?
    I tried modifying the "Images" content type by hiding the fields in question, but I am still prompted to enter them.

    Hi,
    According to your post, my understanding is that you want to directly insert pictures inside a wiki page without the need to enter the picture's properties.
    Per my knowledge, you can use javascript to bypass Edit Properties Page while uploading pictures.
    Here are some similar articles for your reference:
    How to bypass Edit Properties Page while uploading documents in a document library
    SharePoint and .NET: How to bypass Edit Properties Page or Skip EditForm.aspx Page while uploading documents in a document library
    Best Regards,
    Linda Li
    TechNet Community Support

  • Direct Insert or API

    I want to create/insert directly into CI_FUNCTION_ENTITY_USAGES (i$sdd_funent) by using appropriate values for all the not-null columns. I can see that ci_function_entity_usages.function_ref comes from ci_functions.irid,
    ci_function_entity_usages.entity_ref comes from ci_entities.irid, etc ...
    My question is : How do I populate ci_function_entity_usages.id,
    ci_function_entity_usages.irid, ci_function_entity_usages.ivid.
    Are they primed from some internal sequence or some pre-insert
    trigger somewhere?
    Is there a better way of doing this than using direct insert eg an api?
    Thanks in advance
    Ian

    At the moment I've no access to any Designer repository.
    The idea of the API is that for each object there is an individual PL/SQL package. For ci_function_entity_usages it should be ciofunction_entity_usage. In this package a type 'data' is defined. It has 2 components: i and v. i is a record of boolean flags; v is a record of values - it matches the ci_function_entity_usages view.
    To insert a new function-entity usage you have to fill in the data.v values and set data.i to true for the elements you filled in in data.v. Then use the ciofunction_entity_usage.ins procedure to insert the record.
    You do not have to set ID, ivid or irid - it is set by API.
    You can follow example.
    Hope it helps. Paweł
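    Based on Paweł's description, a minimal sketch might look like this (the exact ins signature and record field names are assumptions - check the package specification in your repository version):

    declare
      d ciofunction_entity_usage.data;
    begin
      d.v.function_ref := 12345;        -- ci_functions.irid (hypothetical value)
      d.i.function_ref := true;
      d.v.entity_ref   := 67890;        -- ci_entities.irid (hypothetical value)
      d.i.entity_ref   := true;
      ciofunction_entity_usage.ins(d);  -- id, irid and ivid are primed by the API
      commit;
    end;
    /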

  • How to load data to a "clustered table" quickly?  direct insert not work

    We have a single hash clustered table whose size is 45G, and we used the command:
    insert /*+ append */ into t_hashed select * from source;
    The source is about 30G; it is taking a very long time (several days, and it has not yet completed).
    I dumped the redo log and found lots of redo generated for the insert. I had thought it should not create much redo, since it is a "direct path insert", but from the dump info I can only conclude that the direct-path insert didn't take effect.

    Your assessment is correct. INSERT /*+ APPEND */ does not work with a hash clustered table.
    If you think about how hash clusters work, and how direct load insert works, it should be clear to you why a direct load insert via the append hint can never work.
    How hash clusters work (well, only the part that's relevant to the current discussion):
    Initially, at creation time, the storage for the entire cluster is preallocated, based on the various cluster parameters you chose. Think about that. All the storage is allocated to the cluster.
    How direct load works (well, only the part that's relevant to the current discussion):
    Space already allocated to and used by the segment is ignored. Space is taken from free, unformatted blocks from above the high water mark.
    So, hopefully it's clear how these features are incompatible.
    Bottom line:
    It doesn't work, there's no way around it, there's nothing you can do about it....
    -Mark
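    As an aside, a quick way to check whether a direct-path insert actually took effect (a sketch; the table names are illustrative): a genuine direct-path insert leaves the table unreadable within the same transaction.

    insert /*+ append */ into t select * from source_t;
    select count(*) from t;
    -- ORA-12838: cannot read/modify an object after modifying it in parallel
    --   => the APPEND hint was honoured (direct path was used)
    -- If the count simply comes back, the insert was conventional.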

  • Cursor v/s direct insert

    hi all,
    which one is better?
    Defining a cursor for the insert, or using the select statement in the insert?
    ex1:
    declare
      cursor c1 is
        select a, b from table1;
    begin
      for i in c1 loop
        insert into table2 values (i.a, i.b);
      end loop;
    end;
    /
    ex2:
    begin
      insert into table2
      select a, b from table1;
    end;
    /
    thanks

    ...and here are the facts:
    SQL> set timing on
    SQL> declare
      2  cursor c is
      3  select *
      4  from all_Tables;
      5  begin
      6  for r in c loop
      7  insert into test values(
      8  r.OWNER                    ,
      9  r.TABLE_NAME               ,
    10  r.TABLESPACE_NAME          ,
    11  r.CLUSTER_NAME             ,
    12  r.IOT_NAME                 ,
    13  r.PCT_FREE                 ,
    14  r.PCT_USED                 ,
    15  r.INI_TRANS                ,
    16  r.MAX_TRANS                ,
    17  r.INITIAL_EXTENT           ,
    18  r.NEXT_EXTENT              ,
    19  r.MIN_EXTENTS              ,
    20  r.MAX_EXTENTS              ,
    21  r.PCT_INCREASE             ,
    22  r.FREELISTS                ,
    23  r.FREELIST_GROUPS          ,
    24  r.LOGGING                  ,
    25  r.BACKED_UP                ,
    26  r.NUM_ROWS                 ,
    27  r.BLOCKS                   ,
    28  r.EMPTY_BLOCKS             ,
    29  r.AVG_SPACE                ,
    30  r.CHAIN_CNT                ,
    31  r.AVG_ROW_LEN              ,
    32  r.AVG_SPACE_FREELIST_BLOCKS,
    33  r.NUM_FREELIST_BLOCKS,
    34  r.DEGREE             ,
    35  r.INSTANCES          ,
    36  r.CACHE              ,
    37  r.TABLE_LOCK         ,
    38  r.SAMPLE_SIZE        ,
    39  r.LAST_ANALYZED      ,
    40  r.PARTITIONED        ,
    41  r.IOT_TYPE           ,
    42  r.TEMPORARY          ,
    43  r.SECONDARY          ,
    44  r.NESTED             ,
    45  r.BUFFER_POOL        ,
    46  r.ROW_MOVEMENT       ,
    47  r.GLOBAL_STATS       ,
    48  r.USER_STATS         ,
    49  r.DURATION           ,
    50  r.SKIP_CORRUPT       ,
    51  r.MONITORING         ,
    52  r.CLUSTER_OWNER      ,
    53  r.DEPENDENCIES       ,
    54  r.COMPRESSION        ,
    55  r.DROPPED
    56  );
    57  end loop;
    58* end;
    SQL> /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:04.07
    SQL> insert into test
      2  select *
      3  from all_tables;
    3718 rows created.
    Elapsed: 00:00:01.07
    SQL>
    Regards,
    Gerd

  • Inserting the Brio Query Data into EIS OLAP Model  tables directly

    Dear Experts,
    Can someone please suggest how we can export data (result set records) from bqy files' queries into an EIS server's OLAP Model's tables?
    Right now I have a cube on the Essbase server getting its data from Excel sheets, which store the output of the .bqy file executions.
    My business users do not like the use of the file system (Excel sheets), so I now need to avoid storing the data from the Brio queries in Excel sheets before loading it into the Essbase cube. I am required to achieve this using EIS/AIS, so that the data from the Brio queries (.bqy files) can be inserted directly into the Essbase cube via EIS/AIS.
    Any quick help would boost the life of this project of mine.
    Thank you,
    Vikas Jain

    user12072290 wrote:
    Can someone please suggest how we can export data (result set records) from bqy files' queries into an EIS server's OLAP Model's tables?
    Have you got the answer? Would you please post it here?

  • Inserting values from a PL SQL table to a database table

    Hi,
    Here is my dilemma.
    I have values in a PL/SQL table, gathered from a web page. Now I need to add these same values to a database table. I see the values as 1='AA', 2='BB', etc.
    I use the following code to insert into a table in the database, and it seems to be crashing:
    for i in app_table.first .. app_table.last loop
      r_convention(i).priority_country_code := app_table(i);
      insert into test_countries
      values (test_seq, r_convention(i).priority_country_code);
      commit;
    end loop;
    When I run this code I see the values, but it processes the first value and then crashes without going through the rest of the values. What am I missing here?
    Thanks!

    Hi,
    Why can't you directly insert into the table from the collection?
    for i in app_table.first .. app_table.last loop
      insert into test_countries
      values (test_seq, app_table(i));
      commit;
    end loop;
    Can you give us your complete code, so that we can get a better picture?
    Edited by: plsql dev on Sep 10, 2010 10:14 PM
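    Going one step further: if app_table's type is a schema-level collection type and test_seq is a sequence (both assumptions here), the loop can be replaced with a single set-based insert:

    insert into test_countries
    select test_seq.nextval, column_value
    from   table(app_table);
    commit;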

  • Insert/update japanese langunage data in a column of datatype varchar2(..)

    Hello,
    I am using ORACLE DATABASE 11g (EE) and RHEL 5.
    I want to insert/update japanese language data in a column which has the datatype as varchar2(256).
    I tried to change the NLS_LANGUAGE and NLS_TERRITORY parameters with the 'ALTER SESSION SET ...' command, but it had no effect.
    I tried bouncing the DB (shutdown and startup), but still no effect.
    I tried setting NLS_LANGUAGE and NLS_TERRITORY in the init.ora file, but still no use.
    If anybody knows the detailed steps for what I have described above, let me know. It might be that my method is wrong.
    Can you please guide me on how to change the language of the DB to Japanese for a particular session?
    Thanks in advance...
    Edited by: VJ4 on May 9, 2011 6:21 PM

    VJ4 wrote:
    Thanks for the info.
    Yes, I tried the UNISTR function and was able to insert the data successfully.
    But the point is that we can't remember the Unicode code point for every letter. Is there any method by which we can insert Japanese characters directly using an insert?
    As you said:
    Note that changing the database character set is complicated and requires many steps.
    Can you please provide me some links or other material on the detailed steps for changing the database character set?
    I have gone through the Oracle online documentation; if you can pinpoint a good link in it, please do, or else provide some other material.
    Thanks.
    You will need to convert your database character set to AL32UTF8. This is not a trivial exercise if your database already has data in it. See these MOS Docs:
    Changing the NLS_CHARACTERSET to AL32UTF8 / UTF8 (Unicode)          (Doc ID 260192.1)
    AL32UTF8 / UTF8 (Unicode) Database Character Set Implications          (Doc ID 788156.1)
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10729/ch11charsetmig.htm#g1011430
    HTH
    Srini
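    For reference, a quick sketch of the two points discussed above (the table t and its column are made up; the UNISTR escapes spell 日本語):

    -- Check the current database character set:
    select value from nls_database_parameters
    where  parameter = 'NLS_CHARACTERSET';

    -- Insert via Unicode escapes (works regardless of client encoding):
    insert into t (col) values (unistr('\65E5\672C\8A9E'));

    -- Inserting literal Japanese characters only round-trips safely once the
    -- database character set is AL32UTF8 and the client's NLS_LANG matches
    -- the client's actual encoding.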

  • JDBC insert with XMLTYPE data type

    Hi,
    SOAP to JDBC scenario. Oracle 11G as a receiver.
    The requirement is to insert the whole XML payload message into one of the Oracle table's fields as an XML string. The target Oracle DB table column is defined with the XMLTYPE data type, which can hold XML data of more than 4GB. I am using graphical mapping with a direct INSERT statement.
    When I try to insert a small XML payload, the transaction goes through. However, when the payload size increases, it gives the following error in the JDBC receiver communication channel monitoring:
    Could not execute statement for table/stored proc. "TABLE_NAME" (structure "StructName") due to java.sql.SQLException: ORA-01704: string literal too long
    Here is the insert statement as seen in communication channel monitoring (note: the XML payload is truncated):
    INSERT INTO  TABLE_NAME (REQ_ID, OUTAGE_OBJ, TIMESTAMP, PROCESSED_FLAG) VALUES (VAL1, <?xml version="1.0" encoding="UTF-8"?>............</>, TO_DATE(2010-11-15 10:21:52,YYYY-MM-DD HH24:MI:SS), N)
    Any suggestions to handle this requirement?
    Thank you in advance.
    Anand More.

    Hi Anand,
    The problem here is definitely the length of the SQL query, i.e. "INSERT INTO ... VALUES ...".
    This is what I got when I searched for this Oracle error code:
    ORA-01704: string literal too long
    Cause: The string literal is longer than 4000 characters.
    Action: Use a string literal of at most 4000 characters. Longer values may only be entered using bind variables.
    Please ask an Oracle DB expert how to handle this. Also, I am not sure how we can handle bind variables in SAP PI.
    I hope this helps.
    Regards, Gaurav.
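    On the Oracle side, the bind-variable workaround looks roughly like this (a sketch reusing the table and columns from the statement above; in PL/SQL every variable is a bind variable, so the 4000-character literal limit does not apply):

    declare
      v_xml clob := '<?xml version="1.0" encoding="UTF-8"?>...';  -- payload may exceed 4000 chars
    begin
      insert into table_name (req_id, outage_obj, timestamp, processed_flag)
      values (1, xmltype(v_xml), sysdate, 'N');
      commit;
    end;
    /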

  • SYS_REFCURSOR takes more time than direct query execution

    I have a stored proc which has 4 inputs and 10 outputs, and all outputs are of SYS_REFCURSOR type.
    Among the 10 outputs, 1 cursor returns 4k+ records; all the other cursors have 3 or 4 records, and each cursor averages 5 columns. The procedure takes 8 seconds to complete. If we query directly, it returns output in 0.025 sec.
    I reviewed the code and traced the issue to the cursor which returns 4k+ records.
    That cursor is opened from a temporary table (which has 4k+ records) without any filter. The query which populates the temporary table uses direct inserts only, and I found nothing to modify there.
    Can anyone suggest how we can bring the results back in less than 3 seconds? This is really a challenge, since the code needs to go live next week.
    Any help appreciated.
    Thanks
    Renjish

    I've just repeated the test in SQL*Plus on my test database.
    Both the ref cursor and direct SQL took 4.75 seconds.
    However, that time is not the time to execute the SQL statement, but the time it took SQL*Plus in my command window to print out the 3999 rows of results.
    SQL> create or replace PROCEDURE TEST_PROC (O_OUTPUT OUT SYS_REFCURSOR) is
      2  BEGIN
      3    OPEN  O_OUTPUT FOR
      4      select 11 plan_num, 22  loc_num, 'aaa' loc_nm from dual connect by level < 4000;
      5  end;
      6  /
    Procedure created.
    SQL> set timing on
    SQL> set linesize 1000
    SQL> set serverout on
    SQL> var o_output refcursor;
    SQL> exec test_proc(:o_output);
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.04
    SQL> print o_output;
      PLAN_NUM    LOC_NUM LOC
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
    3999 rows selected.
    Elapsed: 00:00:04.75
    SQL> select 11 plan_num, 22  loc_num, 'aaa' loc_nm from dual connect by level < 4000;
      PLAN_NUM    LOC_NUM LOC
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
            11         22 aaa
    3999 rows selected.
    Elapsed: 00:00:04.75
    That's the result I expect to see, both taking the same amount of time to do the same thing.
    Please demonstrate how you are running it and getting different results.

  • INSERT Script stored as "SQL Statement Script" not firing on Oracle Alert

    We are using Alert Manager to run Alerts based on absence information in EBS R12.1.3 HR tables.
    Application: Human Resources
    Period/Event: Periodic
    Frequency: Every Day
    The Alert has 2 actions.
    One is to email a supervisor based on the Alert SQL finding staff who have had x number of absences over a set period of time.
    That is working fine.
    The other action is defined as an "SQL Statement Script".
    The SQL script inserts the variables defined in the SQL from the main Alert definition SQL, into a custom table.
    When the Alert fires, we are getting the email generated, but the INSERT statement isn't firing.
    However, when we look at the result of the Alert after it has run, and at the steps which were fired, we can see both the email step and the SQL step. When we look at the SQL generated in the SQL step, we can copy and paste it directly into TOAD and it works fine.
    We have tried a simple SQL INSERT statement which even just does:
    INSERT INTO xx_tmp (VALUE1_DATE, VALUE2) VALUES (SYSDATE, 'this');
    But even that does not work.
    We wonder if the issue is that we cannot do a direct INSERT from an alert type of "SQL Statement Script"?
    We have found another Alert which does a direct UPDATE on a table, but the SQL Statement Script definition has "Purchasing" as the Application, because it's updating a PO table.
    In our case, we're trying to insert into a custom table held in our in-house XX schema.
    Could the problem be that we don't have the "Application" set? If that's the case, we don't have a corresponding application anyway.
    Or can we not do a direct INSERT like this - do we need to use a package instead, and call that?
    Any advice much appreciated.
    Thanks

    Hi Brian,
    Alert issue - Custom SQL fix
    1. Define actions via "Actions" button on Alert Definition
    2. Create an action for e.g. "Send Message to Supervisor"
       Action type: message
       Action - sends email to people
    3. Create another action called e.g. "Insert Record"
       Action type: SQL Statement Script
       Fill in SQL in the "Text" section, no need to add a begin or end bit, just a simple INSERT INTO X etc ending with a semi colon
    4. Back on the main alert definition window, go to "Action Sets" button
       Here you need one SEQ for the Message / Email (e.g. Seq 1); click on "Action Set Details", click on the "Members" tab and select the "Send Message to Supervisor" action
       Then close the "Action Set Details" window and return to the "Action Sets" window
       Click into the 2nd line, add e.g. Seq 2 and call it e.g. "Insert Record"; click on "Action Set Details" > Members tab and select the "Insert Record" action
    The problem was that in the first instance we had a single Action Set, and under "Members" had selected both "Send Message to Supervisor" and "Insert Record" at the same time.
    Short answer: each Action needs to be defined as a separate line in the Action Sets definition.
    I can send screenshots if you need them, and don't mind putting your email address on this forum.
    You can always edit the thread after I've emailed you, to remove your email address.
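    For step 3, the "Text" section of the SQL Statement Script action holds just the bare statement. A sketch (xx_tmp is the custom table from the question; the &-prefixed name is a hypothetical Alert output, which Alert substitutes before running the script):

    INSERT INTO xx_tmp (value1_date, value2)
    VALUES (SYSDATE, '&absence_count');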

  • Multiple table insert using receiver jdbc adapter

    Hi,
    I am trying to insert data into two tables in a single structure using the receiver JDBC adapter. I am not using any stored procedure to insert the data; instead, I am inserting it directly through PI. Please see the structure I am using.
    SOURCE side:
    DT_ABC_SENDER
    --IT_HEADER_TEXT
      -- EBELN
      -- LINENO
      --TDTEXT
    --IT_ITEM_TEXT
      -- EBELN
      -- LINENO
      --TDLINE
    TARGET side:
    DT_ABC_RECEIVER
    --InsertStatement
         --HEADER_TEXT
                -- action                         (insert)
                -- Table                          (Table 1)
                --access
                     -- IDS_ENQ_NO
                     -- IDS_DESC
                     -- IDS_TEXT
       --ITEM_TEXT
                -- action                         (insert)
                -- Table                          (Table 2)
                --access
                     -- IIS_ENQ_NO
                     -- IIS_DESC
                     -- IIS_TEXT
    Using the above structure I am able to successfully insert the data in Table 1 but data is not getting inserted in Table 2.
    In SXMB_MONI it says the message was successfully delivered, but no data insertion took place in Table 2.
    Please help me urgently.
    Thanks in advance.
    Neeeraj

    Hi Neeraj,
    Add --InsertStatement statement for the second table structure in the same level of first InsertStatement.
    Target structure like this:
    DT_ABC_RECEIVER
    --InsertStatement
         --HEADER_TEXT
                -- action                         (insert)
                -- Table                          (Table 1)
                --access
                     -- IDS_ENQ_NO
                     -- IDS_DESC
                     -- IDS_TEXT
    --InsertStatement
       --ITEM_TEXT
                -- action                         (insert)
                -- Table                          (Table 2)
                --access
                     -- IIS_ENQ_NO
                     -- IIS_DESC
                     -- IIS_TEXT

  • Efficient Technique to Insert Data in Table ???

    Hi
    I want to implement the most efficient technique to insert data into a table. I have row counts in the millions in one DB, and I want to insert them into another DB in the least time.
    >>
    The technique I am implementing now is to read data from the first DB, apply some logic to it, write the data to a log file (log4j) in tab- and line-separated format, and after the process completes, insert the log file data into the final table (using MySQL's LOAD DATA LOCAL INFILE).
    >>
    Another technique I use is to insert the data directly into the final table after reading it from the first DB and applying the logic.
    >>
    Another technique I am going to use is JDBC batch updates, i.e. inserting around (say) 1000 rows into the final table at a time after reading and applying the logic.
    Benchmarks:
    The first technique takes 2 minutes for 60,000 rows.
    The second technique takes 4.5 minutes for 60,000 rows.
    The third technique is not implemented yet.
    Does anyone have another technique to implement this in a more efficient way?
    Regards
    Rizwan Ahmed (rizzz86)

    When optimizing JDBC code, there are several things you should look into:
    * use batch statements (reducing the number of roundtrips is really important!)
    * disable autoCommit and commit() on your own (reducing the number of roundtrips and the number of disk accesses on the DB server)
    * use PreparedStatements (avoid parsing the same statement again and again)
    Those three should buy you quite a bit of time.
    Edit: if it's still too slow (and only then!) you can look at the documentation of your JDBC driver. Sometimes there are options that can be enabled to speed things up (MySQL is especially bad here: there are some advanced features that are disabled by default, just because older server releases don't support them. I don't understand why they aren't enabled conditionally.).
