Oracle: 10 million records insert using Pro*C

Hi,
     As I am new to Oracle 10g Pro*C and SQL*Loader, I would like to know a few things about them.
My requirement is to read a 20 GB file (20 million lines) line by line, convert each line into a database record with some data manipulation, and insert it
into an Oracle 10g database table.
     I have read some articles saying that Pro*C is faster than SQL*Loader (faster insertion), and that Pro*C talks to
Oracle directly and puts the data pages straight into Oracle rather than going through the SQL engine or parser.
     Even in the Pro*C samples, I have seen a for loop used to insert multiple records. Will each insertion cost
     extra time?
     Is there any way to bulk insert in Pro*C, say 10 million rows in one shot?
     Or can Pro*C upload a file's data into an Oracle database table?
     If anyone has already posted this question, please point me to the thread number or id.
Thank you,
Ganesh
Edited by: user12165645 on Nov 10, 2009 2:06 AM

Alex Nuijten wrote:
Personally I would go for either an External Table or SQL*Loader, mainly because I've never used Pro*C.. ;)
And so far I never needed to resort to other options because of poor performance.

I fully agree. Also, we are talking about "only" 10 million rows. This will take some time, but probably not more than 30 minutes, max. 2 hours, depending on many factors including storage, network and current database activity. I've seen systems where such a load took only a few minutes.
I guess even if there were a difference between Pro*C and external tables I would still opt for external tables, because they are so much easier to use.
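For reference, an external table makes the flat file itself queryable, so the load plus any data manipulation becomes a single INSERT ... SELECT. A rough sketch, assuming a comma-delimited file and hypothetical names throughout (directory path, file name, columns):

    -- One-time setup: point a directory object at the folder holding the file
    CREATE DIRECTORY data_dir AS '/u01/load';

    CREATE TABLE big_file_ext (
      col1 VARCHAR2(50),
      col2 VARCHAR2(50),
      col3 NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('bigfile.dat')
    );

    -- The load, including any row-by-row manipulation, is then one statement:
    INSERT /*+ APPEND */ INTO target_table
    SELECT col1, UPPER(col2), col3 * 100
    FROM   big_file_ext;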
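That said, to answer the original poster's question: Pro*C does have a bulk mechanism, host arrays, so you do not pay one network round trip per row. A minimal sketch, assuming a hypothetical target table target_table(key_col VARCHAR2(20), val_col NUMBER) and batches of 10,000 rows:

    #include <stdio.h>
    #include <string.h>
    EXEC SQL INCLUDE sqlca;

    #define BATCH 10000

    EXEC SQL BEGIN DECLARE SECTION;
    static char key_arr[BATCH][21];  /* host arrays: one element per row */
    static int  val_arr[BATCH];
    static int  n_rows;              /* rows actually filled this batch  */
    EXEC SQL END DECLARE SECTION;

    /* Fill key_arr/val_arr from the file (with any manipulation),
       set n_rows, then flush the whole batch in one round trip:   */
    void flush_batch(void)
    {
        EXEC SQL FOR :n_rows
            INSERT INTO target_table (key_col, val_col)
            VALUES (:key_arr, :val_arr);
        EXEC SQL COMMIT;             /* commit per batch, not per row */
    }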

Similar Messages

  • Errors not logged when IKM Oracle Multi Record Insert is selected

    Dear All,
    I am new to ODI 11g and I am facing the following problem:
    I created a package with 3 interfaces using Oracle Multi Record Insert.
    In the first interface I load the source data into a temporary target.
    In the second interface, target table #1 is loaded from interface #1.
    In the third interface, target table #2 is loaded from interface #1, and the multiple insert is executed and committed.
    This works correctly, but when a data error occurs (e.g. a mandatory column is null), instead of the error being logged in the error table, execution of the last interface fails with the following error:
    Caused By: java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL into (<schema>.<table>.<column>)
    I noticed that when I use Oracle Incremental Update instead, the errors get logged correctly in the error table.
    Does anyone know what could be causing this?

    Hi Bhabani,
    Thanks for the reply.
    I am afraid this is a major issue for me: I do not want to re-query the source table for each target table, but on the other hand I cannot let the process fail for each invalid record, and I need the logging.
    Can you think of a workaround for my use case?
    Thank you!

  • Partial Records insertion using DB Adapter - SOA 11g

    Hi,
    We have a BPEL process in which we insert records into a table using the DB Adapter. Currently, if the input data has any problem such as a data type mismatch, all the records are rejected. None of the records are inserted into the table; the whole batch is rejected.
    We have a new requirement that when there is a problem in any one record, only that problematic record is rejected and the rest of the records are inserted into the table. Is it possible to do so?
    Thanks,
    Sanjay

    In that case, it is better to move the insert statement into a procedure that performs the insert and returns a status value; if the number of insert statements is large, this will also improve performance.
    How many rows will you insert in the very worst case? Something like 100 lines?
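    If the requirement is strictly "reject only the bad rows", the procedure can also use Oracle's DML error logging, so that a problematic record lands in an error table instead of failing the whole batch. A rough sketch, assuming hypothetical tables MY_TARGET and MY_STAGE:
        -- One-time setup: creates ERR$_MY_TARGET by default
        BEGIN
          DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'MY_TARGET');
        END;
        /
        -- Bad rows are logged in ERR$_MY_TARGET; the rest are inserted
        INSERT INTO my_target
        SELECT * FROM my_stage
        LOG ERRORS INTO err$_my_target ('batch-1') REJECT LIMIT UNLIMITED;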
    Thanks,
    Vijay

  • Limit the Oracle query run time using Pro*C

    All,
    I would like to limit my SQL query to a duration of, say, only 10-15 seconds. I do not want to kill the Oracle process, but rather send a signal to the C program with some message such as "it's taking too long".
    Is there a parameter I can use to impose the limit, without killing or shutting down the Oracle session or process?
    Thanks in advance

  • How to store txt file into Oracle 7 Database using Pro*C

    Hi,
    I want to store a txt file in an Oracle 7 database table using a
    Pro*C application, but I do not know what type of column to use.
    At first glance it appeared to me that LONG could serve the purpose,
    but later I noticed that I cannot do sequential reads/writes
    on a LONG type of column. That is, I have to use chunks of a max of
    2GB (or the file length) to read/write such columns.
    I want something similar to the CLOB of Oracle 8.
    I would appreciate it if you can provide me a solution.
    Thanks,
    Anurag

    You store images in a BLOB column in the database.
    However, inserting an image into that column and displaying the data/image from that column are two very different tasks.
    If you are using Oracle Forms, displaying is easy: a default block sitting on the table with the BLOB column can easily be mapped to an image box (or similar control), which will display the image.
    Inserting images is a different ball game. If your forms are web based (i.e. run from a browser) and you want to insert images from a client machine, some special arrangements are required.
    If the images are on the database server and you want to insert them into the database, the stored procedure given in the earlier thread (posted above) will do the job.

  • Slow record insertion when using millions of queries in one transaction

    For test purposes, we run a table-creation scenario (no indexes) under multiple conditions: we insert records in bulk mode or one by one, with or without transactions, etc.
    In general, the record insertion is fine, but not when we try to insert 1 million records one by one (sending 1 million INSERT commands) in a single transaction: in this case, the insertion is quick enough for approximately the first 100,000 records, but then it becomes extremely slow, so that it would take several days to complete. This does not happen without the transaction.
    We were not able to find the database parameters to change to get better performance: rollback? transactions? undo? what else?
    Does anybody have an idea of the parameters to modify?
    Thank you in advance.

    >
    For test purposes, we run a table-creation scenario (no indexes) under multiple conditions: we insert records in bulk mode or one by one, with or without transactions, etc.
    In general, the record insertion is fine, but not when we try to insert 1 million records one by one (sending 1 million INSERT commands) in a single transaction: in this case, the insertion is quick enough for approximately the first 100,000 records, but then it becomes extremely slow, so that it would take several days to complete. This does not happen without the transaction.
    >
    Hi
    How are you inserting the one million records when you do one at a time? If it's within a loop, you are probably doing a COMMIT as well within the loop. This will cause log file sync waits, as LGWR will be busy writing the redo entries. That slows down the insert process as well as the overall performance of the database. This is expected behaviour: each commit forces the log writer to flush the redo entries to the online redo logs, and the session waits until that write completes. This is a serial process.
    You can use the following methods to insert records in bulk:
    a) INSERT /*+ APPEND */ INTO target SELECT * FROM stage;
    This causes a direct-path load, bypassing the buffer cache and reducing the redo to a great extent. However, if you have foreign keys enabled on the table, Oracle will silently ignore the direct-path directive and do a conventional load.
    b) FORALL ... with BULK COLLECT. This is a good method when you are working from PL/SQL (see the sketch after this list).
    c) When you do it within a loop:
        DECLARE
          v_commit_cnt NUMBER := 0;
        BEGIN
          FOR i IN (SELECT col1, col2, col3 FROM billion_record_table)
          LOOP
            v_commit_cnt := v_commit_cnt + 1;
            INSERT INTO target VALUES (i.col1, i.col2, i.col3);
            IF (v_commit_cnt >= 50000) THEN
              COMMIT;
              v_commit_cnt := 0;
            END IF;
          END LOOP;
          COMMIT;
        END;
        /
    d) If the target table is a staging table, you can do a CTAS (CREATE TABLE ... AS SELECT)
    and many more options if you plan well in advance.
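    A minimal sketch of option b), reusing the same hypothetical source and target tables as in c); the LIMIT clause keeps memory bounded:
        DECLARE
          CURSOR c IS SELECT col1, col2, col3 FROM billion_record_table;
          TYPE t_tab IS TABLE OF c%ROWTYPE;
          l_rows t_tab;
        BEGIN
          OPEN c;
          LOOP
            FETCH c BULK COLLECT INTO l_rows LIMIT 10000;  -- fetch in batches
            EXIT WHEN l_rows.COUNT = 0;
            FORALL i IN 1 .. l_rows.COUNT                  -- one bulk bind per batch
              INSERT INTO target VALUES l_rows(i);
            COMMIT;                                        -- per batch, not per row
          END LOOP;
          CLOSE c;
        END;
        /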

  • Selecting Records from 125 million record table to insert into smaller table

    Oracle 11g
    I have a large table of 125 million records - t3_universe. This table never gets updated or altered once loaded, but holds data that we receive from a lead company.
    I need to select records from this large table that fit certain demographic criteria and insert them into a smaller table - T3_Leads - that will be updated with regard to when the lead is mailed and with other relevant information.
    My question is: what is the best (fastest) approach to select records from this 125 million record table and insert them into the smaller table? I have tried a variety of things - views, materialized views, direct insert into the smaller table... I think I am probably missing other approaches.
    My current attempt has been to create a view using the query that selects the records, as shown below, and then use a second query that inserts into T3_Leads from this view V_Market. This is very slow. Can I just use an INSERT INTO T3_Leads with this query? It did not seem to work with the WITH clause. My index on the large table is t3_universe_composite and includes zip_code, address_key, household_key.
    CREATE VIEW V_Market AS
    WITH got_pairs AS
    (
         SELECT /*+ INDEX_FFS(t3_universe t3_universe_composite) */
                l.zip_code, l.zip_plus_4, l.p1_givenname, l.surname, l.address,
                l.city, l.state, l.household_key, l.hh_type AS l_hh_type,
                l.address_key, l.narrowband_income, l.p1_ms, l.p1_gender,
                l.p1_exact_age, l.p1_personkey, e.hh_type AS filler_data,
                l.p1_seq_no, l.p2_seq_no,
                ROW_NUMBER() OVER (PARTITION BY l.address_key
                                   ORDER BY l.hh_verification_date DESC) AS r_num
         FROM   t3_universe e
         JOIN   t3_universe l
                ON  l.address_key = e.address_key
                AND l.zip_code = e.zip_code
                AND l.p1_gender != e.p1_gender
                AND l.household_key != e.household_key
                AND l.hh_verification_date >= e.hh_verification_date
    )
    SELECT *
    FROM   got_pairs
    WHERE  l_hh_type != 1 AND l_hh_type != 2
    AND    filler_data != 1 AND filler_data != 2
    and zip_code in (select * from M_mansfield_02048)
    AND    p1_exact_age BETWEEN 25 AND 70
    AND    narrowband_income >= '8'
    AND    r_num = 1
    Then
    INSERT INTO T3_leads(zip, zip4, firstname, lastname, address, city, state, household_key, hh_type, address_key, income, relationship_status, gender, age, person_key, filler_data, p1_seq_no, p2_seq_no)
    select zip_code, zip_plus_4, p1_givenname, surname, address, city, state, household_key, l_hh_type, address_key, narrowband_income, p1_ms, p1_gender, p1_exact_age, p1_personkey, filler_data, p1_seq_no, p2_seq_no
    from V_Market;

    I had no trouble creating the view exactly as you posted it. However, be careful here:
    and zip_code in (select * from M_mansfield_02048)
    You should name the column explicitly rather than use select *. (Do you really have separate tables for different zip codes?)
    About the performance, it's hard to tell because you haven't posted anything we can use, like explain plans or traces, but simply encapsulating your query in a view is not likely to make it any faster.
    Depending on the size of the subset of rows you're selecting, the /*+ INDEX_FFS */ hint may be doing you more harm than good.
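    If the insert step itself turns out to be slow, a direct-path insert may help; this is simply the INSERT you already posted with an APPEND hint added (note that direct-path loaded rows are not visible until you commit):
        INSERT /*+ APPEND */ INTO T3_leads
          (zip, zip4, firstname, lastname, address, city, state, household_key,
           hh_type, address_key, income, relationship_status, gender, age,
           person_key, filler_data, p1_seq_no, p2_seq_no)
        SELECT zip_code, zip_plus_4, p1_givenname, surname, address, city, state,
               household_key, l_hh_type, address_key, narrowband_income, p1_ms,
               p1_gender, p1_exact_age, p1_personkey, filler_data, p1_seq_no, p2_seq_no
        FROM   V_Market;
        COMMIT;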

  • Approx how much time should JDBC adapter take to insert 1.5 million record?

    Hi All,
              What is the optimum time for inserting 1.5 million records into an Oracle staging table? My ECC-to-Oracle scenario is taking 3 hours.
    Based on your experience, what do you think about this? Is there scope for improvement?
    We have a simple insert through the JDBC adapter, i.e. Action = INSERT.
    Kindly advise.
    Regards,
    XIer
    Edited by: XIer on Mar 27, 2008 9:20 AM
    Edited by: XIer on Mar 27, 2008 10:02 AM

    Hi,
    >What do you think is the optimum time with your experience...
    We had a similar situation; after adding an application server the time was reduced to 1 hour. How many app servers are available in your XI system?
    Regards
    Sangeetha

  • Using Pro*C on Linux Oracle 8.1.6.0 on RH 6.2

    We have some problems using Pro*C:
    the program runs fine on all Oracle versions, including Oracle 8.1.6.0 on DEC OSF, but various SQL errors are encountered on Linux Red Hat 6.2:
    1) "ORA-01458: invalid length inside variable character string" for this code:
    EXEC SQL BEGIN DECLARE SECTION;
    varchar I_LANGLB [4];
    short O_NOLBLB ;
    varchar O_LIBELB [26];
    EXEC SQL END DECLARE SECTION;
    EXEC SQL INCLUDE SQLCA.H;
    EXEC SQL WHENEVER SQLERROR GO TO MAJ_RESULT;
    EXEC SQL WHENEVER NOT FOUND CONTINUE;
    EXEC SQL DECLARE C0 CURSOR FOR
    SELECT LBL.NOLBLB, LBL.LIBELB
    FROM LBL
    WHERE LBL.EDITLB = 'MON' AND LBL.LANGLB = :I_LANGLB;
    strcpy (I_LANGLB.arr, "fra");
    I_LANGLB.len = 3;
    EXEC SQL OPEN C0;
    for ( ; ; )
    {
        EXEC SQL WHENEVER NOT FOUND DO break;
        EXEC SQL FETCH C0 INTO :O_NOLBLB, :O_LIBELB;
    }
    EXEC SQL CLOSE C0;
    2) With dynamic SQL: "ORA-01007: variable not in select list"
    SELECT
    nvl(MODEME, ''),
    nvl(NBANME, 0),
    nvl(BASEME, '0'),
    nvl(PRORME,'0'),
    nvl(EVALME, '0'),
    nvl(DECOME, '0') ,
    nvl(CDDAME, '0'),
    nvl(CDMIME, '0'),
    nvl(TXMIME, 0),
    nvl(CDMAME, '0'),
    nvl(TXMAME, 0),
    nvl(CDTXME, '0'),
    nvl(CDSUME, '0'),
    nvl(TXSUME, 0),
    nvl(DUMIME, 0),
    nvl(MTVNME, 0),
    nvl(NOANML, 0),
    nvl(TAUXML, 0),
    DESIME
    FROM MET, LME
    WHERE MODEME = MODEML
    AND nvl(INLBME,'1')='1'
    ORDER BY MODEME,NOANML
    [ or
    ORDER BY 1,17 ]
    In both cases,
    we use the following precompiler options:
    include=/oracle/OraHome1/precomp/lib ireclen=132 oreclen=132 sqlcheck=syntax parse=partial select_error=no char_map=VARCHAR2 mode=ORACLE unsafe_null=yes
    dbms=V8
    Could someone help?
    Thanks...

    My answer is only for problem 1 (about ORA-01458).
    I think that you are using a varchar host variable in cursor C0
    before you initialize its length (the .len member of the varchar
    structure).
    It's well known that many errors in Pro*C come from uninitialized
    varchar variables.
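    In other words, make sure both members of each varchar host variable are set before the statement that actually binds it (the OPEN, not the DECLARE). A minimal sketch:
        /* initialize .arr and .len before the OPEN that reads them */
        strcpy((char *) I_LANGLB.arr, "fra");
        I_LANGLB.len = (unsigned short) strlen((char *) I_LANGLB.arr);
        EXEC SQL OPEN C0;   /* host variables are evaluated here */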

  • I am recording with Logic Pro X, using my Yamaha XS8 keyboard as a Midi controller. I also have a Thunderbolt Display. Every time I play and release a note on the keyboard, the Thunderbolt Display speaker emits a doink sound. How do I get it to stop?

    I am recording with Logic Pro X, using my Yamaha XS8 keyboard as a MIDI controller. I also have a Thunderbolt Display. Every time I play and release a note on the keyboard, the Thunderbolt Display speaker emits a doink sound. What is causing it and how do I get it to stop? My sound is running through the Saffire Pro 40 interface (Focusrite) into external speakers with powered amps.
    Also, when I record I am hearing little pops in the system. I have checked all of my meters and they are not clipping. What is causing this and how do I get it to stop?
    Ken

    MUYconfundido wrote:
    Pancenter,
    Thanks for the response, but I do not have a MIDI interface. I am using a MIDI-to-USB connector cable, thus bypassing the need for a MIDI interface.
    The Mac reads the USB cable as a MIDI device, but not the keyboard that I am trying to use as a controller. I have tried it with my Korg SP 300 and with my Nord Electro 2.
    Thoughts?
    Thanks,
    Tristan
    Tristan...
    This is what you have, correct?
    http://www.alesis.com/usbmidicable
    This is from Alesis:
    "The AudioLink Series USB cable receives and outputs MIDI signal thanks to its internal interface. The USB-MIDI Cable connects plug-and-play to your Mac or PC for an all-in-one USB-MIDI solution."
    Notice: -internal interface-. What you have is a simple USB MIDI interface. Most MIDI interfaces are USB.
    My point is (was): MIDI OUT of the Korg goes to the connector marked MIDI IN on the Alesis; those new to MIDI often get this wrong.
    pancenter-

  • Can't record sound using the jack in/out on macbook pro 13 os x

    Hi,
    I bought a new MacBook Pro 13" (Intel Core i5, 4 GB) with the latest OS X version, 10.8.2, mainly for sound and video editing.
    Yesterday I tried to record sound using GarageBand through the mini jack port. Once I connected an MP3 player to the mini jack port, the Sound preferences window recognized it as headphones and not as an input jack, as it did on my friend's same Mac with the Leopard OS on it.
    It seems like no matter what I try (changing the instrument setup in GarageBand, restarting the computer with the jack in or out, etc.) nothing works.
    I bought this Mac just so I wouldn't have to deal with these kinds of problems... it is supposed to be a very user-friendly OS and hardware.
    Very disappointing. Please, does anyone have a clue what can be done?

    I'm aware of my other audio interface options, but I would expect this machine to have the basic options every 20-year-old PC has: a simple plug-and-play default to get audio into my computer. I bought it at B&H in NY, and in their specs it says that it has a "combo jack" - meaning both in and out analog audio ports. After checking Mac's site, it says only headphones.
    I also tried to connect a Digidesign Mini Mbox 2 through the USB port, and I had to unplug it, shut down the MBP, restart it again, etc., and it also doesn't always work properly and makes all kinds of digital error sounds.
    Although in this case I can't be sure that the Mbox is not damaged.
    Overall it's very disappointing. It's my first Mac and I figured it would be a much more idiot-proof, user-friendly work tool.
    If you have any other tips I'd appreciate that.
    Thanks a lot for the Focusrite link.

  • What is the best way of insertion using structured or binary type in oracle

    What is the best way to insert data using structured or binary XML storage in an Oracle XML DB 11g database?

    SQL*Loader.

  • Best way to Insert Millions records in SQL Azure on daily basis?

    I am maintaining millions of records in SQL Server 2008 R2 and now I intend to migrate them to SQL Azure.
    In the existing system with SQL Server 2008 R2, a few SSIS packages and stored procedures first truncate the existing records and then perform the insert operation on the table, which holds approx 26 million records, in 30 minutes on a daily basis (as the system demands).
    When I migrate to SQL Azure, I am unable to perform these operations as quickly as I did in SQL Server 2008; sometimes I get a request timeout error.
    While searching for a faster way, many people suggest a batch process or BCP. But batch processing is NOT suitable in my case because it takes too much time to insert those records. I require some faster and more efficient way on SQL Azure.
    Hoping for some good suggestions.
    Thanks in advance :)
    Ashish Narnoli

    +1 to Frank's advice.
    Also, please upgrade your Azure SQL Database server to V12, as you will receive higher performance on the premium tiers. As you scale up your database for your bulk insert, remember that SQL Database charges by the hour. To minimize costs, scale back down when the inserts have completed.
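    Scaling up and back down can itself be scripted in T-SQL. A rough sketch, assuming a hypothetical database MyDb and the P2 premium tier:
        ALTER DATABASE MyDb MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P2');
        -- ... run the daily truncate/insert here ...
        ALTER DATABASE MyDb MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');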

  • How to update and insert the records without using Table_comparison and Map_operation?

    How to update and insert the records without using Table_comparison and Map_operation?

    Use either a join or MERGE; see Inserting, Updating, and Deleting Data by Using MERGE
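    A minimal sketch of the MERGE approach, assuming hypothetical tables target_t and source_t keyed on id:
        MERGE INTO target_t t
        USING source_t s
        ON (t.id = s.id)
        WHEN MATCHED THEN
          UPDATE SET t.col1 = s.col1
        WHEN NOT MATCHED THEN
          INSERT (id, col1) VALUES (s.id, s.col1);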

  • How to send millions of records using Open Hub to my own file server

    Hi,
    I am using the Open Hub process to send 6 million records to my own network server. Is there any limitation I need to know about? My process fails when I execute the Open Hub DTP, though I am able to send a few records to the SAP application server.
    Has anybody sent more than a million records to an application server outside SAP? How can this be achieved? Please help me out.
    I am trying to send ODS records to a file server.

    I'm glad you solved your problem.
    Generally it is nice to post how you solved your problem so that when others have a similar problem and search the archives, they can see your solution.
    Thanks

Maybe you are looking for

  • ITunes on an external HDD and Airport Extreme BS

    Ok, sorry to be boring, but I can't get this to work as I think it should. I want to have a central iTunes library based on the USB-connected 1TB HDD, so that I can train all my Macs to reach for the files from the same place. So that I can access th

  • Error while using UTL_FILE.FOPEN

    sir, when i write procedure using utl_file.fopen i am getting error as it must be declared PLS-00201. what mistake, i am doing. yours dr.s.r.bhattachar

  • IBook G4 with OSX10.5.8 suddenly won't access Facebook.

    In the past several days, my ability to access Facebook has failed. The response is always to the effect that FB unexpectedly dropped the connection while trying to connect. I normally go to FB from messages received in my Yahoo mailbox. I have even

  • String concatenation, Buffers and a bug

    Good morning to the almighty forum. As I saw in this Thread, http://forum.java.sun.com/thread.jsp?forum=31&thread=269622 a + b + c + d (all strings) is equal to.... new StringBuffer(a).append(b).append(c).append(d).toString(). If this is correct AND,

  • Dynamic text won't display

    I don't know if this is a bug. I have a movie that contains a dynamic text. When I run it, the text displays fine. But when the said movie is run through another movie clip (using loadMovie), the dynamic text won't display. Any ideas why? Thank you.