Using Bulk insert or SQL Loader in VB6

Hi,
I am quite new to the Oracle world and to forums, but I am looking for some direction on the quickest way to get a dataset of 10,000 records into a table. I have the data in an ADO Recordset (or a text file, if that is easier) and I want to insert it into an empty Oracle table. The problem is, I don't know how.
Situation: the Oracle DB is on another computer, and I have nothing special installed on the computer running the VB6 application.
Can anyone please provide a code example or some guidelines?
Regards,
Christian

This may not be "bulk insert" by your definition, but it can transfer the data the way you want.
Here is a simple VB example for demo purposes:
Dim con As New ADODB.Connection
Dim con2 As New ADODB.Connection
Dim rst As New ADODB.Recordset
Dim rst2 As New ADODB.Recordset
Dim rst3 As New ADODB.Recordset

' Source database
con.ConnectionString = "Provider=OraOLEDB.Oracle.1;User ID=scott;Password=tiger;Data Source=db_one;"
con.Open
rst.Open "select * from dept", con, adOpenDynamic, adLockOptimistic

' Save to a file using the ADTG format. You may choose another format.
rst.Save "c:\myfile.txt", adPersistADTG

' dept2 is an empty table with the same definition as dept. You can create it using SQL*Plus.
' Add rows by reading from the saved file.
con2.ConnectionString = "Provider=OraOLEDB.Oracle.1;User ID=xyz;Password=xyz;Data Source=db_two;"
con2.Open

' Open the saved file as a recordset.
rst2.Open "c:\myfile.txt"

' rst3 is an empty recordset because dept2 is empty at this time.
rst3.Open "select * from dept2", con2, adOpenDynamic, adLockOptimistic

' Add rows into dept2, one source row at a time.
Do Until rst2.EOF
    rst3.AddNew Array("deptno", "dname", "loc"), _
                Array(rst2.Fields("deptno").Value, rst2.Fields("dname").Value, rst2.Fields("loc").Value)
    rst2.MoveNext
Loop

rst.Close
rst2.Close
rst3.Close
con.Close
con2.Close
Sinclair
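Since the thread title also mentions SQL*Loader: if an Oracle client with the sqlldr utility is available on a machine that can reach the database, and the data is exported as a plain comma-separated text file (rather than the ADTG file used above), a control file along the lines of this sketch could load it directly. All names here are hypothetical.
-- load_dept.ctl (hypothetical); run e.g. as: sqlldr userid=scott/tiger@db_one control=load_dept.ctl
LOAD DATA
INFILE 'c:\mydata.csv'
APPEND
INTO TABLE dept2
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(deptno, dname, loc)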

Similar Messages

  • If your database in Full Recovery mode, can you use Bulk Insert Task to load data

    If your database in Full Recovery mode, can you use Bulk Insert Task to load data

    Yes, you can of course, but don't expect logging to be minimal: under the FULL recovery model everything is fully logged. If you are going to use the Bulk Insert Task you can consider switching the recovery model to BULK_LOGGED, but then you will not have the option to do point-in-time recovery.
    PS: please don't create duplicate threads.
    If you read the first Note section in the link below, it clearly states that logging will be full and that you can use the task:
    http://technet.microsoft.com/en-us/library/ms191244(v=sql.105).aspx
    Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it.
    My TechNet Wiki Articles
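    For illustration, a minimal T-SQL sketch of the recovery-model switch described above (database, table, and file names are hypothetical):
    -- Switch to BULK_LOGGED around the load; take log backups before and after,
    -- since point-in-time recovery is lost within log backups containing bulk-logged operations.
    ALTER DATABASE SalesDb SET RECOVERY BULK_LOGGED;
    BULK INSERT dbo.SalesStaging
    FROM 'D:\loads\sales.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, TABLOCK);
    ALTER DATABASE SalesDb SET RECOVERY FULL;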

  • Is there a way to dynamically figure out the FIRSTROW & LASTROW using Bulk Insert?

    I noticed most of my files follow a certain convention, so I can use Bulk Insert with this:
    FIRSTROW = 2, LASTROW = 224
    However a few files have many more records.  I'd like SQL Server to dynamically figure out the first and last rows with data, but I don't know if that's possible.  Can someone confirm?
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    Nope, you would need to read the files from some program. It could be a CLR stored procedure if you want to do it from SQL Server. But with BULK INSERT alone, no.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • How to use bulk insert in utl_file

    I am creating a text file of an employee list. I have huge data and I want to use bulk insert with UTL_FILE. Is it possible? If so, how? Please help me.

    If what you are asking, in some unknown version of Oracle, is whether using FORALL will increase the speed of writing a file to your hard disk, the answer is no.
    BULK COLLECT, however, might help.
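    As a hedged sketch of the BULK COLLECT approach (directory object, file name, table, and columns are hypothetical):
    DECLARE
      TYPE t_lines IS TABLE OF VARCHAR2(200);
      l_lines t_lines;
      l_file  UTL_FILE.FILE_TYPE;
    BEGIN
      -- Fetch all lines in one round trip instead of row by row.
      SELECT empno || ',' || ename BULK COLLECT INTO l_lines FROM emp;
      l_file := UTL_FILE.FOPEN('DATA_DIR', 'emp_list.txt', 'w', 32767);
      FOR i IN 1 .. l_lines.COUNT LOOP
        UTL_FILE.PUT_LINE(l_file, l_lines(i));
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /
    For very large tables, fetch from an explicit cursor with BULK COLLECT ... LIMIT inside a loop rather than loading everything into memory at once.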

  • Bulk Insert from SQL Server to Oracle

    I have to load around 20 million rows from a SQL Server table to an Oracle table over a network (database) link. I wrote the following code using BULK COLLECT, which works but takes too long (about 5 hours).
    I also tried setting the table to parallel degree 8, which didn't help (the Oracle table is also set to NOLOGGING mode).
    Is there a better way to do this? I appreciate any help in this regard.
    Script :
    CREATE OR REPLACE PROCEDURE INSERT_SQLSERVER_TO_ORACLE
    IS
      TYPE v_ARRAY IS TABLE OF TARGET_CUST%ROWTYPE INDEX BY BINARY_INTEGER;
      ins_rows v_ARRAY;
    BEGIN
      DECLARE
        CURSOR REC1 IS
          SELECT COL1, COL2, COL3, COL4
          FROM SOURCE_SQLSERVER_CUST;   -- source table accessed over the database link
      BEGIN
        OPEN REC1;
        LOOP
          FETCH REC1 BULK COLLECT INTO ins_rows LIMIT 5000;
          FORALL i IN ins_rows.FIRST .. ins_rows.LAST
            INSERT INTO TARGET_CUST VALUES ins_rows(i);
          EXIT WHEN REC1%NOTFOUND;
        END LOOP;
        COMMIT;
        CLOSE REC1;
      END;
    END;
    Thanks in Advance.

    887204 wrote:
    "I have to load around 20 million rows from SQL Server table to Oracle table using Network Link, wrote following code using Bulk Collect, which is working but taking more time (taking 5 hrs)."
    I would not pull that data via a network link and use standard SQL insert statements. Bulk processing is meaningless in this context; it does nothing to increase the performance, as context switching is not the issue here.
    The biggest factor is pulling 20 million rows' worth of data via a database link across the network. This will be slow by its very nature.
    I would use bcp (Bulk Copy export) on SQL-Server to write the data to a CSV file.
    Zip that file. FTP/scp/sftp it to the Oracle server. Unzip it.
    Then do a parallel direct load of the data using SQL*Loader.
    This will be a lot faster than pulling uncompressed data across the network, a couple of rows at a time (together with the numerous moving parts on the Oracle side that use an HS agent as the interface between SQL Server and the Oracle database).
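    For reference, a minimal sketch of the final SQL*Loader direct-path step (control-file contents; table, column, and file names are hypothetical):
    -- target_cust.ctl, run e.g. as: sqlldr userid=scott/tiger control=target_cust.ctl direct=true parallel=true
    LOAD DATA
    INFILE 'target_cust.csv'
    APPEND
    INTO TABLE TARGET_CUST
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (COL1, COL2, COL3, COL4)
    For a truly parallel direct load, split the exported file and run several sqlldr sessions against the same table with PARALLEL=TRUE.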

  • File access problem while using BULK INSERT

    I'm creating a script to automatically convert a large mess of data.  Here's a test query I was using to bring a file into the database:
    INSERT [ImportTestTable]
    SELECT a.*
    FROM
    OPENROWSET(
    BULK 'D:\TestFile.csv',
    FORMATFILE = 'D:\TestStyle.fmt',
    FIRSTROW = 2
    ) AS a;
    SELECT * FROM ImportTestTable
    I've used queries like this on other networks and machines before, but when I run that query on the particular machine I'm working with now, I get the following error:
    Msg 4861, Level 16, State 1, Line 13
    Cannot bulk load because the file "D:\TestFile.csv" could not be opened. Operating system error code 21
    (The device is not ready.).
    Here are some relevant facts I can think of that might help:
    I am running the query from SQL Server Management Studio on a remote machine running Windows 7 Ultimate.  I am connected to this server using SQL authentication.  I believe we are on the same domain / network.  I am not the DBA.  I do
    have the permission "Administer Bulk Operations" explicitly granted to me by the DBA.  The user I am currently logged in as in windows is capable of opening, editing, and saving the file in windows explorer.  The format file is a NON-XML
    format file.
    Any pointers as to where to look for more detailed information would be greatly appreciated!

    I know this is a little old by now, but this is what I used just this week:
    bulk insert [dbo].[Test_Table]
    from 'C:\Documents and Settings\rshuell\Desktop\Test_File.txt'
    WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    KEEPNULLS,
    FIRSTROW = 2
    )
    That worked fine for me.
    Of course, as Naomi stated, the file has to be ON THE SERVER. Or, if you're using an FTP site, for instance, the path would have to point to the FTP site.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
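    A related sketch (not from the thread): the path in BULK INSERT/OPENROWSET is resolved on the SQL Server machine, so 'D:\TestFile.csv' means the server's D: drive. If the file must stay on another machine, a UNC path that the account doing the file access (the SQL Server service account when you connect with SQL authentication) can read is a common alternative; the share name below is hypothetical.
    BULK INSERT dbo.ImportTestTable
    FROM '\\workstation01\imports\TestFile.csv'
    WITH (FORMATFILE = '\\workstation01\imports\TestStyle.fmt', FIRSTROW = 2);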

  • Using User function in SQL*Loader

    Hi, I have created the following control file. I am using a DIRECT path load, and I am using a function to retrieve the column system_date. However, when I run SQL*Loader, the fields system_date and load_date are loaded with NULL values.
    OPTIONS (ERRORS=999999999, DIRECT=TRUE, ROWS=100000, SKIP=1)
    LOAD DATA
    INFILE 'Z:\xx.csv'
    APPEND
    PRESERVE BLANKS
    INTO TABLE TB_RAW_DATA
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    REFERENCE_NUMBER,
    SYSTEM_DATE "pkg_ken_utility.Fn_ken_get_param_value('SYSTEM_DATE','SYSTEM_DATE')",
    LOAD_DATE "SYSDATE"
    )

    No.
    You cannot call PL/SQL functions in DIRECT path mode; this is documented, so you should not need to ask.
    Sybrand Bakker
    Senior Oracle DBA
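    A hedged workaround sketch: run the same control file as a conventional-path load, where the SQL expressions on SYSTEM_DATE and LOAD_DATE are evaluated by the database. Only the OPTIONS line changes (in conventional path, ROWS controls the commit interval):
    OPTIONS (ERRORS=999999999, DIRECT=FALSE, ROWS=100000, SKIP=1)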

  • Using oracle sequence in SQL Loader

    I'm using oracle sequence in control file of sql loader to load data from .csv file.
    Controlfile:
    LOAD DATA APPEND
    INTO TABLE PHONE_LIST
    FIELDS TERMINATED BY "," TRAILING NULLCOLS
    (
    PHONE_LIST_ID "seqId.NEXTVAL",
    COUNTRY_CODE CHAR,
    CITY_CODE CHAR,
    BEGIN_RANGE CHAR,
    END_RANGE CHAR,
    BLOCKED_FREE_FLAG CHAR
    )
    Datafile:
    1516,8,9,9,B
    1517,1,1,2,B
    1518,8,9,9,B
    1519,8,9,9,B
    1520,8,9,9,B
    1521,8,9,9,B
    1) As the first column uses an Oracle sequence, we have not included it in the data file.
    This gives me the error "Can not insert NULL value for last column".
    Is it mandatory to include the first column in the data file, even though we are using a sequence?
    2) Another table references the PHONE_LIST_ID column (the one for which we are using the sequence) of this table as a foreign key.
    So is it possible to insert this column's values into the other table simultaneously? The sequence number should be the same as in the first table.
    Kindly reply on an urgent basis.

    Use a BEFORE INSERT trigger with:
    select your_seq.nextval into :new.id from dual;
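    A minimal sketch of that trigger for this case (the sequence name is hypothetical); with it in place, PHONE_LIST_ID can be dropped from the control file's column list:
    CREATE OR REPLACE TRIGGER phone_list_bi
    BEFORE INSERT ON phone_list
    FOR EACH ROW
    BEGIN
      SELECT phone_list_seq.NEXTVAL INTO :NEW.phone_list_id FROM dual;
    END;
    /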

  • Use of WHEN in SQL Loader Control File

    Does anyone know of a way, using the CTL file below, to NOT LOAD (i.e. discard) any record where field G is null OR substr(G,1,3)='CON' (the first 3 characters of field G are "CON")?
    The WHEN clause seems very limited in what it can test and all my attempts to use a compound condition, much less the SUBSTR function, have failed. The documentation indicates that only very simple conditions can be specified.
    (Oracle 10g, Unix)
    Thanks
    LOAD DATA
    TRUNCATE
    INTO TABLE TEST_TABLE
    WHEN (G <> '')
    FIELDS TERMINATED BY "^"
    TRAILING NULLCOLS
    (
    A,
    B FILLER,
    C FILLER,
    D FILLER,
    E FILLER,
    F FILLER,
    G "LTRIM(:G,'0')",
    H FILLER,
    I FILLER,
    J FILLER,
    K SYSDATE,
    L CONSTANT "SQLLDR"
    )

    Hello MTM.
    The WHEN clause in SQL*Loader, despite what you might read, doesn't seem to evaluate the value of the column you specify after SQL clauses are processed. Therefore, the only way you can guarantee to NOT LOAD a record is by using a function to raise an application error when the column data meets your criteria. You'll need to do some basic comparisons between bad data and discarded data in the bad data file to isolate these.
    You can still discard the null data using WHEN (G <> ''). Hope this helps,
    Luke
    Please mark the answer as helpful or answered if it is so. If not, provide additional details.
    Always try to provide create table and insert table statements to help the forum members help you better.
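    Not from the thread, but as an alternative sketch: an external table lets the full compound condition be expressed in SQL instead of the limited WHEN clause. The directory object, file, table, and column names below are hypothetical.
    CREATE TABLE test_table_ext (
      a VARCHAR2(100), b VARCHAR2(100), c VARCHAR2(100), d VARCHAR2(100), e VARCHAR2(100),
      f VARCHAR2(100), g VARCHAR2(100), h VARCHAR2(100), i VARCHAR2(100), j VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '^'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('test_data.dat')
    );
    INSERT INTO test_table (a, g, k, l)
    SELECT a, LTRIM(g, '0'), SYSDATE, 'SQLLDR'
    FROM   test_table_ext
    WHERE  g IS NOT NULL
    AND    SUBSTR(g, 1, 3) <> 'CON';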

  • First Row Record is not inserted from CSV file while bulk insert in sql server

    Hi Everyone,
    I have a csv file that needs to be inserted in sql server. The csv file will be format will be like below.
    1,Mr,"x,y",4
    2,Mr,"a,b",5
    3,Ms,"v,b",6
    During the bulk insert it considers the 2nd column as two values (comma-separated) and makes two entries, so I used FieldTerminator.xml.
    Now the fields are entered into the columns correctly. But the problem is that the first row of the CSV file is not read into SQL Server. When I removed the format file, I got all the records, but I must use the format file described above, and when I use it I do not get the first-row record.
    Please suggest a solution.
    Thanks,
    Selvam

    Hi,
    I have a csv file (comma(,) delimited) like this which is to be insert to sql server. The format of the file when open in notepad like below:
    Id,FirstName,LastName,FullName,Gender
    1,xx,yy,"xx,yy",M
    2,zz,cc,"zz,cc",F
    3,aa,vv,"aa,vv",F
    Below is the bulk insert query used to insert the above records:
    EXEC('BULK INSERT EmployeeData FROM ''' + @FilePath + ''' WITH
    (FORMATFILE = ''d:\FieldTerminator.xml'',
    ROWTERMINATOR = ''\n'',
    FIRSTROW = 2)')
    Here, I have used format file for the "Fullname" which has comma(,) within the field. The format file is:
    The problem is, it skips the first record (1,xx,yy,"xx,yy",M) when I use the format file. When I remove the format file from the query, it takes all the records, but the "FullName" field causes the problem because of the comma within the field, so I must use the format file to handle this. Please suggest why the first record is always skipped when I use the above format file.
    If I give FIRSTROW=1 in the bulk insert, it shows the error "String or binary data would be truncated. The statement has been terminated." I have checked the data type lengths.
    Please update me the solution.
    Regards,
    Selvam. M

  • How to use BULK INSERT for a data from a cursor?

    Oracle 10g Enterprise Edition.
    I tried to bulk insert the data returned from a cursor, and it returns an error:
    PLS-00302: component 'LAST' must be declared
    I need some help using bulk insert here. Can anyone tell me what mistake I have made?
    CREATE OR REPLACE PROCEDURE HOT_ADMIN.get_search_keyword_stats_prc
    IS
    CURSOR c_get_scenarios
    IS
    SELECT a.*,ROWNUM rnum
    FROM (
    SELECT TRUNC(r.search_date) sdate,
    r.search_hits hits,
    r.search_type stype,
    r.search_qualification qual,
    r.search_location loc,
    r.search_town stown,
    r.search_postcode pcode,
    r.search_college college,
    r.search_colname colname,
    r.search_text text,
    r.affiliate_id affiliate,
    r.search_study_mode smode,
    r.location_hint hint,
    r.search_posttown ptown,
    COUNT(1) cnt
    FROM w_search_headers r
    WHERE search_text IS NOT NULL
    AND NVL(search_type,' ') <> 'C'
    AND TRUNC(search_date)= TO_DATE(TO_CHAR(SYSDATE-1,'DD-MON-RRRR'))
    GROUP BY TRUNC(r.search_date),
    r.search_hits,
    r.search_type,
    r.search_qualification,
    r.search_location,
    r.search_town,
    r.search_postcode,
    r.search_college,
    r.search_colname,
    r.search_text,
    r.affiliate_id,
    r.search_study_mode,
    r.location_hint,
    r.search_posttown
    ORDER BY cnt desc
    ) a
    WHERE ROWNUM <=1000;
    lc_get_data c_get_scenarios%ROWTYPE;
    BEGIN
    OPEN c_get_scenarios;
    FETCH c_get_scenarios into lc_get_data;
    CLOSE c_get_scenarios;
    FORALL i IN 1..lc_get_data.last
    INSERT INTO W_SEARCH_SCENARIO_STATS VALUES ( i.sdate,
    i.hits,
    i.stype,
    i.qual,
    i.loc,
    i.stown,
    i.pcode,
    i.college,
    i.colname,
    i.text,
    i.affiliate,
    i.smode,
    i.hint,
    i.ptown,
    i.cnt
    COMMIT;
    END;

    This isn't what you asked, but I've generally found it helpful to list the columns in an INSERT statement before the values. It is of course optional, but useful for reference when looking at the statement later.
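    Also not from the thread, but for reference: the PLS-00302 arises because lc_get_data is a single record (c_get_scenarios%ROWTYPE), which has no LAST attribute; BULK COLLECT needs a collection. A minimal sketch of the pattern with simplified tables (names as in the post, but only two columns shown; the real cursor's select list must match the target table's columns):
    DECLARE
      CURSOR c_stats IS
        SELECT TRUNC(search_date) AS sdate, COUNT(*) AS cnt
        FROM   w_search_headers
        WHERE  search_text IS NOT NULL
        GROUP BY TRUNC(search_date);
      -- A collection, not a single record, so COUNT/FIRST/LAST exist.
      TYPE t_rows IS TABLE OF w_search_scenario_stats%ROWTYPE;
      l_rows t_rows;
    BEGIN
      OPEN c_stats;
      LOOP
        FETCH c_stats BULK COLLECT INTO l_rows LIMIT 1000;
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO w_search_scenario_stats VALUES l_rows(i);
      END LOOP;
      CLOSE c_stats;
      COMMIT;
    END;
    /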

  • How to use nvl() function in SQL Loader

    I am trying to use the NVL() function in my SQL*Loader control file and I keep getting errors. Would someone please tell me where I can find the syntax reference for this?
    Thanks a lot!

    I just answered a similar question like this last Thursday.
    SQL*LOADER how to load blanks when data is null
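    For reference, the usual form is a SQL expression in double quotes after the column name; a minimal control-file fragment (column names hypothetical):
    FIELDS TERMINATED BY ',' TRAILING NULLCOLS
    (
      EMPNO,
      -- load 0 when the AMOUNT field is empty
      AMOUNT "NVL(:AMOUNT, 0)"
    )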

  • How to use Conditional statements in SQL Loader control file

    Hi,
    I am using SQL*Loader to load a flat file into a table. I am using a control file for this purpose, as shown below:
    LOAD
    INTO TABLE store_shrink
    TRUNCATE
    FIELDS TERMINATED BY "     "
    TRAILING NULLCOLS
    (
    SITE_ID char,
    ST_SHRINK char,
    ST_REVENUE char,
    SHRINK_PR char ":ST_SHRINK/:ST_REVENUE"
    )
    My question is this. If in the flat file the value of 'ST_REVENUE' is '0', then I want 'SHRINK_PR' to be '0' as well, and skip the calculation (:st_shrink/:st_revenue).
    How to achieve this with the conditional statement or using any Oracle function?
    Any help or suggestion is greatly appreciated.
    Thanks in advance.

    Hi there,
    I tried the following in my query above and somehow it doesn't work. Does anyone have an idea? I have searched the internet but to no avail. Please help:
    LOAD
    INTO TABLE store_shrink
    TRUNCATE
    FIELDS TERMINATED BY "     "
    TRAILING NULLCOLS
    (
    SITE_ID char,
    ST_SHRINK char,
    ST_REVENUE char,
    SHRINK_PR char "case (when :st_revenue<>'0.00' then :SHRINK_PR=:ST_SHRINK/:ST_REVENUE else :SHRINK_PR='0.00') end"
    )
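    Not from the thread, but a sketch of how such a conditional is usually written: the quoted SQL expression should return the value rather than assign it, for example with DECODE (adjust the '0.00' literal to match the file, and assume SHRINK_PR is numeric):
    SHRINK_PR "DECODE(:ST_REVENUE, '0.00', 0, :ST_SHRINK / :ST_REVENUE)"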

  • Using constant values in SQL Loader control file

    Hi,
    I have a requirement to migrate the data of 6 tables from MS Access into an Oracle database. I have written SQL*Loader scripts so that I can create CSV files based on the MS Access data and then migrate it into Oracle. But the Oracle tables have 4 common columns, Create_By, Created_Date, Updated_By and Update_Date, and those fields should be taken care of by the system. So, basically, the system should put the login user ID and sysdate values into the respective columns. My question is: I do not have the above-mentioned common columns in the MS Access tables, so can I hard-code those values in the control file by saying
    Created_By CONSTANT user,
    Create_Date CONSTANT TO_CHAR(SYSDATE,'MM/DD/YYYY HH24:MI:SS'),
    Updated_By CONSTANT user,
    Updated_Date CONSTANT TO_CHAR(SYSDATE,'MM/DD/YYYY HH24:MI:SS')
    Please let me know your valuable suggestions.

    You can do it without CONSTANT:
    --sample data file
    1,2,,
    3,4,,
    LOAD DATA
    INFILE 'D:\d.csv'
    INTO TABLE ABC
    FIELDS TERMINATED BY ","
    ( A,
      B,
      C sysdate,
      D "USER"
    )
    Edited by: J99 on Jul 23, 2009 12:14 PM
    Or, to avoid the extra commas in the data file, use TRAILING NULLCOLS:
    --sample data file
    1,2
    3,4
    LOAD DATA
    INFILE 'D:\d.csv'
    INTO TABLE ABC
    FIELDS TERMINATED BY "," TRAILING NULLCOLS
    ( A,
      B,
      C sysdate,
      D "USER"
    )

  • Index is unusable after using Direct path of SQL Loader

    When I use the direct path of SQL*Loader to load records from a comma-separated file into a table, the index I specify the records are sorted on becomes unusable. The log file created by SQL*Loader shows the error "ORA-01409: NOSORT option may not be used; rows are not in ascending order". I am pre-sorting the data in the order of the index, which is a multi-column index. The index and table are not empty prior to running SQL*Loader and contain existing records.
    Can anybody suggest a reason why this is happening even though I am pre-sorting the data in ascending order as per the index?
    Brad.

    Thanks for reply Gerald!
    After further investigation I more or less answered my own question: the data I was loading was not in the order Oracle was expecting. The problem stems from how the UNIX SORT command orders data compared to how Oracle's ORDER BY does. I was using UNIX's SORT command to pre-sort my data before loading it directly into Oracle.
    UNIX's SORT seems to order nulls before everything else in ascending order, whereas Oracle's ORDER BY orders NULL values last in ascending order!
    I got around this by putting the highest printable ASCII character, tilde (~), into my pre-sorted data whenever a field was null. As UNIX's SORT uses the ASCII collating sequence to determine the sort order, the post-sorted data was then ordered as Oracle expected. And to avoid actually loading "~" into my Oracle table, I just specified NULLIF (field_name = "~") for these fields in the SQL*Loader control file.
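    A minimal control-file sketch illustrating that NULLIF workaround (table, index, file, and field names are hypothetical):
    OPTIONS (DIRECT=TRUE)
    LOAD DATA
    INFILE 'presorted_data.csv'
    APPEND
    INTO TABLE my_table
    SORTED INDEXES (my_multi_col_idx)
    FIELDS TERMINATED BY ','
    (
      -- '~' was substituted for NULL before the UNIX sort and is turned back into NULL here
      col_a CHAR NULLIF (col_a = "~"),
      col_b CHAR NULLIF (col_b = "~")
    )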
