Fastest method of inserting

Hi guys,
I am using Java as my front end and Oracle 10g as the back end. I need to find out the fastest way to insert a bulk of data (say about 100 records) into the database. I've used Statement, PreparedStatement, CallableStatement, update batching and FORALL. Can anyone suggest some other, faster way to insert the data?

Tom Kyte discusses such an issue here.
Regards,
Georger
user10894075 wrote:
Hi guys,
I am using Java as my front end and Oracle 10g as the back end. I need to find out the fastest way to insert a bulk of data (say about 100 records) into the database. I've used Statement, PreparedStatement, CallableStatement, update batching and FORALL. Can anyone suggest some other, faster way to insert the data?
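For the JDBC side, the usual approach is to prepare the INSERT once and send all the rows in a single batch. A minimal sketch; the connection URL, credentials, and the table/column names are placeholders, not from the original post:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertDemo {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger")) {
            con.setAutoCommit(false);
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO emp_stage (empno, ename) VALUES (?, ?)")) {
                for (int i = 1; i <= 100; i++) {
                    ps.setInt(1, i);                 // bind the row values
                    ps.setString(2, "name_" + i);
                    ps.addBatch();                   // queue the row on the client
                }
                ps.executeBatch();                   // send all 100 rows in one round trip
            }
            con.commit();                            // commit once at the end
        }
    }
}

The point is that the statement is parsed once and the rows travel to the server in one round trip instead of 100.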

Similar Messages

  • ORA-22370: incorrect usage of method AnyData Insert

    When I insert inherited SQL types as the payload into an ANYDATA queue, I get the following error:
    ERROR at line 1:
    ORA-00604: error occurred at recursive SQL level 1
    ORA-22370: incorrect usage of method AnyData Insert
    ORA-06512: at "SYS.DBMS_AQ", line 243
    ORA-06512: at line 15
    The above error does not happen if the structure is flat and not inherited from another type.
    My DOCUMENT_ADD_EVENT structure extends EVENT_TYPE. If I enqueue DOCUMENT_ADD_EVENT as the payload I get the above error, but DOCUMENT_ADD_EVENT_FLAT, which does not extend any type, works fine.
    Here is the example code:
    create type event_type as object (
      name VARCHAR2(100 CHAR),
      description VARCHAR2(1000 CHAR),
      actor_id number,
      source_eid number,
      source_scope number,
      type VARCHAR2(1 CHAR)
    ) NOT FINAL NOT INSTANTIABLE;
    /
    CREATE TYPE DOCUMENT_ADD_EVENT UNDER event_type (
      properties DOCUMENT_ADDED
    );
    /
    CREATE TYPE DOCUMENT_ADD_EVENT_FLAT AS OBJECT (
      name VARCHAR2(100 CHAR),
      description VARCHAR2(1000 CHAR),
      actor_id RAW(16),
      source_eid RAW(16),
      source_scope RAW(16),
      type VARCHAR2(1 CHAR),
      properties DOCUMENT_ADDED
    );
    /
    DECLARE
    d document_type := document_type(2,'document1');
    a actor_type := actor_type(2,'actor1','[email protected]');
    w workspace_type := workspace_type(2,'MarketingWorkspace');
    da document_added := document_added(d,a,w);
    --dae document_add_Event := document_add_event(null,null,null,null,null,null,da);
    dae document_add_Event_flat := document_add_Event_flat(null,null,null,null,null,null,da);
    payload SYS.ANYDATA;
    Enq_ct DBMS_AQ.Enqueue_options_t;
    Msg_prop DBMS_AQ.Message_properties_t;
    Enq_msgid RAW(16);
    begin
    payload := ANYDATA.convertObject(dae);
    Msg_prop.Exception_queue := null;
    Msg_prop.Correlation := null;
    DBMS_AQ.ENQUEUE('ocs.beehive_events', Enq_ct, Msg_prop, payload, Enq_msgid);
    end;
    /

    As said above, the example posted works on 10.2 as well as 9.2.0.5. Can you post a similar example (as my example below) that shows the problem and the exact version of the Oracle database?
    SQL> create or replace type xObj as object (
      2          name varchar2(32757),
      3          constructor function xObj return self as result
      4  );
      5  /
    Type created.
    SQL> CREATE OR REPLACE type body xObj as
      2          constructor function xObj return self as result is
      3          begin
      4                  name := 'this is a sample';
      5                  return;
      6          end;
      7  end;
      8  /
    Type body created.
    SQL> CREATE GLOBAL TEMPORARY TABLE DATA_TABLE ( DATA ANYDATA ) ;
    Table created.
    SQL> declare
      2          inv xobj := xobj();
      3          tmp anydata;
      4  begin
      5          tmp := anydata.convertObject(inv);
      6          insert into data_table values(tmp);
      7  end;
      8  /
    PL/SQL procedure successfully completed.
    SQL> disconnect
    Disconnected from Oracle9i Enterprise Edition Release 9.2.0.5.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.5.0 - Production
    SQL>                                            

  • Fastest method of sending XML

    I'm trying to figure out what the fastest method of sending/receiving XML to/from a webservice is. What involves the least amount of work on the client and (more importantly) on the server side?
    Strings actually seem to be fairly quick, org.w3c.xml.Elements are very slow,
    javax.xml.transform.Source are about as fast as Strings, but can be slower.
    One issue is that, whatever the format, it either needs to be serializable or easy to convert to something that is serializable. javax.xml.transform.Source is not, but it can be converted to a string fairly easily.
    I can use rpc or document based, whichever is fastest.
    (As a side question, why can't I cast the weblogic implementation of SOAPElement
    into a org.w3c.xml.Element?)

    Hi Tom,
    I've not tried this, but others have reported success using a servlet filter for the compression; search back for "gzip". As always with performance issues, your mileage may vary...
    Regards,
    Bruce
    Tom Hennen wrote:
    >
    We would almost certainly be interested in something like Fast WS.
    As for compression, what compression methods would you suggest? Zipping the XML
    and then using soap attachments for the binary?
    These are fairly large messages (around 60k).
    I'd say the biggest current bottleneck comes from serializing the Element. So
    I'm not sure how much compression will help.
    Bruce Stephens <[email protected]> wrote:
    Hi Tom,
    Strings will probably be your fastest option as far as least cycles used
    in the transport layers.
    If you have a bunch of small messages to exchange, the fixed overhead could be a limiting factor; if they are large messages, you might investigate compressing them. That can drastically reduce server-side overhead, particularly with text/XML messages, which compress hugely.
    Another possibility for your consideration is the emerging technology of Fast XML, see:
    http://developer.java.sun.com/developer/technicalArticles/WebServices/fastWS/
    We would be interested to hear your feedback if this would be of
    potential use in your situation.
    Thanks,
    Bruce
    Tom Hennen wrote:
    I'm trying to figure out what the fastest method of sending/receiving XML to/from a webservice is. What involves the least amount of work on the client and (more importantly) on the server side?
    Strings actually seem to be fairly quick, org.w3c.xml.Elements are very slow, javax.xml.transform.Source are about as fast as Strings, but can be slower.
    One issue is that, whatever the format, it either needs to be serializable or easy to convert to something that is serializable. javax.xml.transform.Source is not, but it can be converted to a string fairly easily.
    I can use rpc or document based, whichever is fastest.
    (As a side question, why can't I cast the weblogic implementation of SOAPElement into a org.w3c.xml.Element?)
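    As Bruce suggests, gzip can shrink text/XML payloads dramatically before they hit the wire. A minimal sketch of compressing an XML string into bytes suitable for a binary attachment; the payload string here is just a placeholder:

    import java.io.ByteArrayOutputStream;
    import java.util.zip.GZIPOutputStream;

    public class XmlGzip {
        // Compress an XML string into a byte[] (e.g. for a binary SOAP attachment).
        static byte[] gzip(String xml) throws java.io.IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(xml.getBytes("UTF-8"));
            }
            return bos.toByteArray();
        }

        public static void main(String[] args) throws Exception {
            String xml = "<message>...roughly 60k of XML...</message>"; // placeholder payload
            byte[] packed = gzip(xml);
            System.out.println(xml.length() + " chars -> " + packed.length + " compressed bytes");
        }
    }

    The receiving side has to gunzip before parsing, so this only pays off when the messages are large enough that the CPU cost of compression is cheaper than pushing the extra bytes.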

  • Fastest method of shipping?

    Anybody know what the fastest method of shipping is with the iPod Touch? I'm going on vacation for a few weeks on Friday, and I was hoping that if I ordered it today, I might have it by then.
    Is there an overnight or 2-3 day shipping option with them?

    You can get 2-3 day shipping, and Apple even has next day, but you have to call them for that option.
    Also be aware that ordering today doesn't mean you will get it the next day or in 2-3 days just because you pay for that. It may still take several days before the iPod is even shipped; the faster shipping option only covers the actual shipping speed.

  • What is the best/fastest method to create a table (Oracle 11gR2 dB)?

    Assuming there are no statistics for the source tables (the tables are dropped and recreated every day).
    My tables have a few million rows.
    I try to create a table populated with data as fast as possible via:
    1. create table table_name as SELECT * from emp;
    2. create table table_name parallel (degree 4) as SELECT * from emp;
    In case 1 I got a timing of about 34 sec; in case 2 the table was created in 15 sec.
    Are there other possibilities in Oracle to create a table much faster than with this CTAS?
    Or will it be faster if I first create the table via
    create table table_name (
      column1 datatype [NULL | NOT NULL],
      column2 datatype [NULL | NOT NULL],
      column_n datatype [NULL | NOT NULL]
    );
    and then load it with
    INSERT /*+ APPEND */ INTO emp SELECT * FROM all_objects;
    INSERT /*+ APPEND_VALUES */ INTO emp SELECT * FROM all_objects;
    or is it better to simply create the table and then use FORALL with BULK COLLECT INTO a collection, in combination with
    INSERT /*+ APPEND */ INTO emp;

    Assuming there are no statistics for the source tables (the tables are dropped and recreated every day).
    My tables have a few million rows.
    The 'fastest' way is to NOT drop and recreate the tables every day. Unless the table changes structure every day, just create the tables ONE TIME. Then each day you can truncate them with the REUSE STORAGE clause.
    Then populate the tables using DIRECT-PATH loading, such as by using the APPEND hint. That can also be done in PARALLEL if desired.
    FORALL and BULK COLLECT would ONLY be used if the data is ALREADY in collections as part of complex ETL data cleansing/conversion. If the data already exists in external files or other tables there is generally no need to use PL/SQL or collections.
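    For reference, a rough JDBC sketch of that reload pattern (keep the table, truncate with REUSE STORAGE, then a direct-path APPEND load); the connection details and table names are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ReloadTable {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");
                 Statement st = con.createStatement()) {
                con.setAutoCommit(false);
                // empty the existing table but keep its allocated extents
                st.executeUpdate("TRUNCATE TABLE emp REUSE STORAGE");
                // direct-path load; a PARALLEL hint can be added if the instance allows it
                st.executeUpdate("INSERT /*+ APPEND */ INTO emp SELECT * FROM emp_source");
                con.commit();   // the appended rows are only visible after commit
            }
        }
    }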
    What PROBLEM are you trying to solve?
    You need to give us much more info if you really want help. For ETL the task of 'loading a table fast' is generally WAY DOWN the list of concerns.
    1. where is the data now?
    2. How much data is there?
    3. what cleansing/conversion needs to be done on the data?
    4. what are the requirements for restart/recovery needed?
    5. what are the requirements for detecting and reporting on data issues?

  • Fastest way to insert entries from ITAB1 into hashed ITAB2

    Howdy,
    What is the fastest way to move entries from a standard table ITAB1 into an ITAB2 of type hashed?
    Also, what about when ITAB2 has been passed as a method parameter in the form <fs_ITAB2> of type any?
    Important: before the insert, ITAB2 already contains entries which must be preserved (which rules out the ITAB2[] = ITAB1[] whole-table assignment).
    Thanks for any answers.

    If your table ITAB2 already exists and has entries, then you don't have much freedom about the type, do you?
    A hashed table is a questionable choice; do the old entries fulfil the uniqueness of the key fields?
    COLLECT is a special solution: it sums up the numeric values of lines with the same key. Perfect if that is what is required, but do you want that?
    The default choice is a sorted table with a non-unique key.
    And if you must use a standard table, then you must manually optimize it if it must be sorted afterwards.
    Overall, if the result is only used in LOOPs, and no fast reads are required, then the
    APPEND LINES .... is the fastest solution.
    Siegfried

  • Efficient method to insert large number of data into table

    Hi,
    I have a procedure that accepts an input parameter containing comma-separated values.
    Something like G12-UHG,THA-90HJ,NS-98039,........ There can be more than 90,000 values in that comma-separated input parameter.
    What is the most efficient way to do an insert in this case?
    3 methods I have in mind are :
    1) Get individual tokens from CSV and use a plain old loop and do an insert.
    2) Use BULK COLLECT & FORALL. However, I don't know how to do this, since the input is not from a cursor but from a parameter.
    3) Use Table collections. Again this involves plain old looping through the collection. Same as 1st method.
    Please do suggest the most efficient method.
    Thanks

    90,000 values?? What's the data type of the input parameter?
    You can use the string-to-rows conversion trick if you want and do a single insert:
    SQL> with t as (select 'ABC,DEF GHI,JKL' str from dual)
      2  select regexp_substr(str,'[^,]+', 1, level) list
      3  from t connect by level <= NVL( LENGTH( REGEXP_REPLACE( str, '[^,]+', NULL ) ), 0 ) + 1
      4  /
    LIST
    ABC
    DEF GHI
    JKL
    Edited by: Karthick_Arp on Feb 13, 2009 2:18 AM
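    If the comma-separated values actually originate on a Java client, another option (a variation of methods 1 and 2 in the question) is to split the string there and batch the inserts, flushing in chunks so 90,000 rows never sit in client memory at once. A hedged sketch with placeholder table/column names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class CsvBatchInsert {
        public static void main(String[] args) throws Exception {
            String csv = "G12-UHG,THA-90HJ,NS-98039";   // placeholder for the full 90,000-value string
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger")) {
                con.setAutoCommit(false);
                try (PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO code_list (code) VALUES (?)")) {
                    int queued = 0;
                    for (String token : csv.split(",")) {
                        ps.setString(1, token.trim());
                        ps.addBatch();
                        if (++queued % 1000 == 0) {
                            ps.executeBatch();   // flush every 1000 rows
                        }
                    }
                    ps.executeBatch();           // flush the final partial chunk
                }
                con.commit();
            }
        }
    }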

  • How can CallableStatement in JSP use the setDate() method to insert a date value into the DB?

    Dear all,
    I get a strange error message when I insert a date value into the DB via a JSP that calls PL/SQL procedures.
    The error seems to be caused by the setDate(index, date) method of CallableStatement.
    The message is: Cannot find the setDate(int, java.util.Date) method in the CallableStatement interface.
    Any ideas?
    Thanks in advance.

    Thank you!:)
    I solved it using this:
    String name="david";
                stmt = con1.createStatement();
                String prikaz1 = "INSERT INTO table (id,age,surname,name) IN 'C:\\Users\\David\\Desktop\\db.mdb' SELECT id,age,surname,' " + name + " ' FROM table2";
                stmt.executeUpdate(prikaz1);
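    On the original setDate() question: CallableStatement.setDate() expects a java.sql.Date, not a java.util.Date, so the value has to be converted first. A minimal sketch; the procedure name and parameter positions are made up for illustration:

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class DateCallDemo {
        public static void main(String[] args) throws Exception {
            java.util.Date utilDate = new java.util.Date();                 // e.g. value coming from the JSP
            java.sql.Date sqlDate = new java.sql.Date(utilDate.getTime());  // convert for JDBC
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");
                 CallableStatement cs = con.prepareCall("{call add_event(?, ?)}")) {
                cs.setString(1, "meeting");
                cs.setDate(2, sqlDate);   // setDate(int, java.sql.Date)
                cs.execute();
            }
        }
    }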

  • PHP - Method can't insert

    Hi, I am attempting to insert new data,
    but the method doesn't return any value...
    This is my code:
    $crmdomain="https://secure-xxxxxxx.crmondemand.com";
    function wslogin() {
    global $crmdomain;
    $url = $crmdomain."/Services/Integration?command=login";
    $page = "/Services/Integration?command=login";
    $headers = array(
    "GET ".$page." HTTP/1.0",
    "UserName: username",
    "Password: password",
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL,$url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
    curl_setopt($ch, CURLOPT_HEADER, true);
    $data = curl_exec($ch);
    if (!curl_errno($ch)) {
    // Show me the result
    $sessionid = substr($data,(strpos($data,"Set-Cookie:")+23),(strpos($data,";")-strpos($data,"Set-Cookie:")-23));
    curl_close($ch);
    return $sessionid;
    } else {
    return curl_error($ch);
    }
    }
    $sessionid = wslogin();
    $lead = array();
    $lead = "John ";
    $lead = "Smith";
    $listOfLead = array('Lead' => $lead);
    $leadWS = array('ListOfLead' => $listOfLead);
    include_once 'lib/nusoap.php';
    $Clase= new SoapClient("lead.wsdl" , true);
    $Clase->setcookie("JSESSIONID", $sessionid);
    // Error check
    $err = $Clase->getError();
    if ($err) {
    // Show error
    echo 'Error with soapclient creation' . $err . '</pre>';
    // call won't work
    }
    $leadResult=$Clase->call('LeadInsert',array('LeadWS_LeadInsert_Input' => $leadWS));
    print_r($leadResult);
    Edited by: user10786927 on 19-ene-2009 6:38
    Edited by: user10786927 on 19-ene-2009 6:47

    If you check out this other thread (http://forums.oracle.com/forums/thread.jspa?messageID=3417337#3474141) you'll see I posted a utility function that can be used to query, insert, and update any object you have a WSDL for.
    Using that function this is how you create an Opportunity and then re-query for it.
    $createOpportunity = array(
         "OpportunityWS_OpportunityInsert_Input" => array(
              "ListOfOpportunity" => array(
                   "Opportunity" => array(
                        "OpportunityName" => htmlspecialchars("Some Company", ENT_NOQUOTES, "ISO-8859-1"),
                        "AccountId" => "1-ZX429",
                        "Probability" => "10",
                        "SalesStage" => "Lead",
                        "CloseDate" => date("m/d/Y", strtotime("+90 days")),
                        "OpportunityType" => "New Customer-Initial Sale"
    $createOpportunityResult = sodCall('wsdl/opportunity.wsdl','OpportunityInsert',$createOpportunity);
    $findOpportunity = array(
         "OpportunityWS_OpportunityQueryPage_Input" => array(
              "ListOfOpportunity" => array(
                   "Opportunity" => array(
                        "OpportunityName" => "='Some Company'",
                        "AccountId" => "",
                        "Probability" => "",
                        "SalesStage" => "",
                        "CloseDate" => "",
                        "OpportunityType" => ""
    $findOpportunityResult = sodCall('wsdl/opportunity.wsdl','OpportunityQueryPage',$findOpportunity);

  • Is this the fastest way to insert 100k rows into a database?

    I'm looking to insert ~100k rows into a database as fast as possible. I'm connected to the database through a small LAN with static IP addresses. The target database is running MySQL. Here's the fastest I've gotten the code so far. Task manager shows that it's only sending about 2.5 Mbps. If I take the size of the 2D array (64 MB) and divide by the time it currently takes to transfer it, I calculate ~160 kbps. Is that the best I can hope for with the NI DB Toolkit?
    CLD (2014)

    Hi,
    Use the DB Tool Insert Data VI to insert the data into the database.
    You can insert a complete row of data at a time. There is no need to insert single elements into the database one at a time; that will reduce your code's performance.
    Thanks and Regards
    Himanshu Goyal | LabVIEW Engineer- Power System Automation
    Values that steer us ahead: Passion | Innovation | Ambition | Diligence | Teamwork
    It Only gets BETTER!!!

  • Need a method to insert 1000 records into Oracle at once

    Hi All,
    I want to insert more than 1000 records into an Oracle database at once. Please let me know the way to do this. It's urgent..........
    Regards,
    Puneet Pradhan

    More than 1000?
    So, how about 10000?
    Use the CONNECT BY LEVEL clause to generate records:
    insert into table
    select level --or whatever
    from dual
    connect by level <= 10000;
    It's urgent..........
    Since it's your first post, I recommend you not use the 'U-word', or you'll be made fun of...

  • Method to Insert data

    Hi All,
    I need to populate the data into 5 tables from 20+ tables. As there is no direct mapping from table to table, I prepared a sheet of column mappings for the data load and wrote a script to join the tables and copy the rows.
    But my problem is that I am not getting the exact rows when retrieving the data back from the tables.
    The structure is similar as:
    New tables:
    T1, T2, T3.....
    Old tables:
    P1, P2, P3, P4, P5 .......
    Columns in T1 are mapped to P1, P2 and P4 ....
    Columns in T2 are mapped to P1,P3,P4 and P5 .....
    What is the appropriate way to insert the data then, as I am not getting the correct results by joining the P* tables?
    Any suggestions will be appreciated.
    Thanks
    Bhupinder

    Hi Bharath,
    P1(col1,col2,col3,col4,col5) -> T1(c1,c2,c3), T2(c3,c4) ...
    P2(col4,col5,col6,col7) -> T2(c1,c2), T3(c1,c2)
    There is a direct mapping in columns:
    (P1.COL1 -> T1.C1, P1.COL2 -> T2.C2, P1.COL3 -> T1.C3, P1.COL4 -> T2.C3, P1.COL5 -> T2.C4)
    (P2.COL4 -> T2.C1, P2.COL5 -> T2.C2, P2.COL6 -> T3.C1, P2.COL7 -> T3.C2)
    You are talking about a view in place of P2. How is the view created? Any scripts?
    You have issued a select statement on P2... you will get col1, col2 and col3 alone... you cannot get 4 and 5... please check your statements... It's still not clear...
    I am retrieving the data with:
    SQL : select T2.C1,T2.C2,T3.C1,T3.C2 from T1, T2 where T1.C1=T2.C2;
    OR
    SQL : select T2.C1,T2.C2,T3.C1,T3.C2 from T1, T2 where T1.C1=T2.C2(+);
    Is this what you are trying to do...
    Insert into T2 (c1,c2) select col4,col5 from P2
    Insert into T3 (c1,c2) select col4,col5 from P2
    No, I am using a join statement to get all the data and inserting into the tables.
    When you issue select * from P2... you will get attributes from P2 only... T2.C1,T2.C2,T3.C1,T3.C2
    Here the problem is that I am not getting the correct data, as the number of records does not match the original table.
    Thanks for your inputs.

  • Fastest Method of Importing Video

    I am trying to start a project where I will be uploading video to stream online, and in order to save time on editing, I was wondering what the fastest way is to import video into CS3. Is the only way to import in real time, meaning 1 hour of video takes 1 hour to import?
    I know people say that hard drive camcorders are not the same quality but I would think that Adobe would make importing video from them easy to save on import time.
    Thank you in advance. Sorry if I am asking common sense questions.

    The problem with hard drive cameras is the format they record to. If someone would make a DV hard drive camera, you'd be golden. But they all use MPEG compression, which is just not a good idea for source media.
    Currently the fastest way is to hook your camera up to your editing computer and record live to Premiere. Clips get saved on the hard drive, and are already in the bins, labeled and ready to edit. This works great for laptops, but is a bit more troublesome if you edit on a workstation.

  • Fastest method call

    Hi,
    All the method calls (instance, static, newObject, ...) in JNI come in 3 forms that differ in their last argument(s), which carry the actual arguments of the method call:
    - one that takes a variable number of args
    - one that takes a jvalue[]
    - one that takes a va_list.
    Is any one of those known to be faster than the others? Is that known from experiments or by design?
    (Note : I am currently using Sun's JDK 1.3 & 1.4 on W2K)
    Thanks for your advice...
    C.Dore

    - one that takes a variable number of args
    - one that takes a jvalue[]
    - one that takes a va_list.
    I can't speak to the middle one, but the 1st and 3rd are variations of the same thing. One might suppose that the first would be slightly slower than the 3rd because a variable assignment would be required in the 1st:
      void doit(char* format, ...)
      {
        va_list vl;
        va_start( vl, format );
        va_end( vl );
      }
    Of course, the only way that I have ever found of increasing the performance of an application is to actually benchmark/profile it.

  • Nicer method for inserting dynamic content into a JSP

    I have a site layout which has header.jsp, menu.jsp, content.jsp(s), footer.jsp.
    Instead of having 10 different topic pages, all of which include the header, menu, and footer, I have opted for one JSP that loads the content JSP based on a parameter in the URL, e.g.,
    index.jsp?content=news
    index.jsp?content=polls
    index.jsp?content=products
    So far I can do this by :
    <jsp:include page="includes/header.jsp" />
    <jsp:include page="includes/menu.jsp" />
    <%
    String content = request.getParameter("content");
    String action = request.getParameter("action");
    String id = request.getParameter("id");
    if (content.equals("news")) {
        if (action == null) {
    %>
    <jsp:include page="news/content.jsp" />
    <%
        } else if (action != null) {
    %>
    <jsp:include page="news/form.jsp" />
    <%
        }
    }
    if (content.equals("forums")) {
        if (action == null) {
    %>
    <jsp:include page="forums/content.jsp" />
    <%
        } else if (action.equals("edit")) {
    %>
    <jsp:include page="forums/showtopic.jsp" />
    <%
        }
    }
    // other IF blocks here
    %>
    <%@ include file="includes/footer.jsp" %>
    footer.jsp is constant, so it uses '<%@ include'; the header and menu have content which can vary between views, so they use '<jsp:include'.
    Originally I thought I could have one single
    <jsp:include page="<%=content%>/content.jsp" />
    or find any means to use the content variable to feed it into the JSP:INCLUDE, but it doesn't seem to be possible.
    Does anyone have any ideas for me?
    Many thanks

    Originally I thought I could have one single
    <jsp:include page="<%=content%>/content.jsp" />
    or find any means to use the content variable to feed it into the JSP:INCLUDE, but it doesn't seem to be possible.
    Does anyone have any ideas for me?
    Try:
    <% String pageName = content + "/content.jsp";   %>
    <jsp:include page="<%=pageName%>"/>or
    <jsp:include page='<%=content + "/content.jsp"%>' />
