Import and process large data volumes with SQL*Loader and a Java resource

Hello,
I have a project to import data from a text file on a schedule. It is a large volume of data, nearly 20,000 records per hour.
After that, we have to analyze the data and export the results into another database.
I have researched SQL*Loader and Java resources to do these tasks, but I have no experience with either.
I'm afraid that with the huge data volume Oracle could slow down, or the session in the Java resource application could time out.
Please give me some advice about the solution.
Thank you very much.


Similar Messages

  • SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from a command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect OCIServerAttach
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities), but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in the link you were already pointed at) is the following (I assume you are on Windows?):
    Open a command prompt and run:
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This tells Oracle to use the config files it finds there and no others.
    Then try sqlldr user/pass@db (in the same DOS window).
    See if that connects and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com

  • Loading a huge file with SQL*Loader from Java

    Hi,
    I have a CSV file with approximately 3.5 million records.
    I load this data with sqlldr from within Java like this:
        String command = "sqlldr userid=" + user + "/" + pass
                + "@" + service + " control='" + ctlFile + "'";
        System.out.println(command);
        if (System.getProperty("os.name").contains("Windows")) {
            p = Runtime.getRuntime().exec(new String[] {"cmd", "/C", command});
        } else {
            // pass the whole command as a single argument to "sh -c";
            // a concatenated string would be tokenized on every space
            p = Runtime.getRuntime().exec(new String[] {"sh", "-c", command});
        }
    It does what I want (it loads the data into a certain table), BUT it takes too much time. Is there a faster way to load data into an Oracle DB from within Java?
    Thanks, any advice is very welcome

    Have your DBA work on this issue - they can monitor and check the performance of SQL*Loader:
    SQL*Loader performance tips          [Document 28631.1]
    SQL*LOADER SLOW PERFORMANCE          [Document 1026145.6]
    Master Note for SQL*Loader          [Document 1264730.1]
    HTH
    Srini
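    If conventional-path loading is the bottleneck, one option worth testing (a hedged suggestion, not from the notes above) is SQL*Loader's direct path mode, which bypasses much of the SQL engine. A minimal sketch, reusing the placeholder names from the Java snippet:

        sqlldr userid=user/pass@service control=load.ctl log=load.log direct=true

    Direct path loads have restrictions (around triggers, referential constraints, and concurrent access), so check the SQL*Loader documentation before relying on it.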

  • Load data with SQL*Loader: linking fields between the CSV file and the control file

    Hi all,
    in a SQL*Loader control file, how do you link a field in the CSV file with a column in the control file?
    E.g. I want to import the records into table TEST (col1, col2, col3) with data from the CSV file, BUT in different positions. How do I do this?
    CSV FILE (with variable positions):
    test1;prova;pippo;Ferrari;
    xx;yy;hello;by;
    In the table TEST I want col1 = 'prova' (xx),
    col2 = 'Ferrari' (yy),
    col3 = default 'N';
    the other data in the CSV file are ignored.
    so:
    load data
    infile 'TEST.CSV'
    into table TEST
    fields terminated by ';'
    col1 ?????,
    col2 ?????,
    col3 CONSTANT "N"
    Thanks,
    Attilio

    By the '?' marks I mean: how can I link COL1 with a column in the CSV file?
    Attilio
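    No answer was posted in this excerpt, but the standard SQL*Loader device for skipping unwanted delimited fields is FILLER. A sketch against the sample file above, assuming col1 should take the second field and col2 the fourth (the skip1/skip2 names are invented):

        load data
        infile 'TEST.CSV'
        into table TEST
        fields terminated by ';'
        (skip1 FILLER,       -- 1st CSV field, read but not loaded
         col1,               -- 2nd CSV field ('prova')
         skip2 FILLER,       -- 3rd CSV field, read but not loaded
         col2,               -- 4th CSV field ('Ferrari')
         col3 CONSTANT "N")

    Fields are matched to the control file list in file order, so the FILLER entries act as positional placeholders.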

  • How to load data with SQL*Loader to a Form (D2K, VB, Applet, etc.)

    hi,
    With SQL*Loader I am able to transfer data to an Oracle database from a flat file system.
    But now I want to transfer the data from flat files to Oracle Apps/VB/D2K etc.,
    with the help of SQL*Loader.
    Is it possible to do?
    Thanks in Advance,
    With Regards.,
    N.GowriShankar.

    For the Applications you can use file-handling built-ins such as TEXT_IO, UTL_FILE, etc. You can also write a batch program around sqlldr and invoke it from front-end applications.
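    For instance, in Oracle Forms the HOST built-in can shell out to sqlldr. A hedged one-line sketch (the connect string and file paths are placeholders):

        -- e.g. in a WHEN-BUTTON-PRESSED trigger
        HOST('sqlldr userid=scott/tiger@orcl control=c:\load\emp.ctl log=c:\load\emp.log', NO_SCREEN);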

  • Tutorials on loading data with SQL*Loader

    Dear reader,
    Please, I need tutorials on how to use SQL*Loader, as well as software for loading data into Oracle.

    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
    Chapters 6-14 are filled with examples and tutorials

  • Loading data with SQL*Loader

    Hi Experts,
    I have a file with the following format. I have to insert the data from those files into a table, and I can use SQL*Loader to load them.
    My question is: I need to schedule the upload of those files. Can I incorporate SQL*Loader in a procedure?
    Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact  Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
    0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
    1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||
    Edited by: Kevin CK on 02-Feb-2010 03:28

    Yes sorry about that mishap
    Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact  Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
    0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
    1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||1|-2145916800|1|0|1|0||0|0|0|0|0|0|0|0|0|0|-3287|0|0|0|1|1|2|0|0|0|1|0|999|0||5|0|0|||||
    This is my file format, which is a .txt file.
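    On the scheduling question: SQL*Loader is a client utility and cannot be called from inside PL/SQL directly. A common alternative (a sketch, not from this thread; every name and path below is invented) is an external table over the same file, which a scheduled procedure can then read with plain SQL:

        CREATE DIRECTORY agent_dir AS '/data/agent_files';

        CREATE TABLE agents_ext (
          agent_id    NUMBER,
          agent_type  NUMBER,
          create_date NUMBER
          -- ... remaining columns of the pipe-delimited layout
        )
        ORGANIZATION EXTERNAL (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY agent_dir
          ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            SKIP 1                         -- skip the header record
            FIELDS TERMINATED BY '|'
            MISSING FIELD VALUES ARE NULL
          )
          LOCATION ('agents.txt')
        );

    A procedure can then run INSERT INTO target SELECT ... FROM agents_ext, and the procedure itself can be scheduled with DBMS_SCHEDULER.CREATE_JOB.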

  • Error loading a large field with SQL*Loader on 8i

    I want to load a large field (30K) into a database, where I use the "long" datatype for it. But I can't, because it loads hexadecimal characters, or it gets stuck in a loop and does not load anything. I have tried using CLOB and LONG in the database, and CHAR(40000), CHAR, and LONG VARRAW in the control file. Any idea or suggestion?
    Thanks a lot.

    Hi,
    from what I see you are running sqlldr on Windows and your database server is on AIX. Is it a remote server or a local network server?
    The ORA-12170 error can be raised for many reasons: a firewall issue, database down, listener down, a bad sqlnet.ora parameter, network trouble, etc.
    If your network is slow, maybe that is the reason. You may also try to reconfigure the parameters SQLNET.INBOUND_CONNECT_TIMEOUT, SQLNET.SEND_TIMEOUT, and SQLNET.RECV_TIMEOUT in sqlnet.ora to larger values.
    If your problem is not solved, please post your SQLNET.ORA configuration.
    You can check here for details about setting it: Profile Parameters.
    Regards.
    Al

  • How can I load data into a table with SQL*Loader

    How can I load data into a table with SQL*Loader
    when the column data length is more than 255 bytes?
    When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
    CREATE TABLE A (
    A VARCHAR2 ( 10 ) ,
    B VARCHAR2 ( 10 ) ,
    C VARCHAR2 ( 10 ) ,
    E VARCHAR2 ( 2000 ) );
    control file:
    load data
    append into table A
    fields terminated by X'09'
    (A , B , C , E )
    SQL*LOADER command:
    sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
    datafile:
    column E is more than 255 bytes
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)

    Check this out.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
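    The gist of that chapter, for this case: delimited fields default to CHAR with a 255-byte maximum, so declare the length explicitly in the control file. A sketch of the fix, reusing the control file from the question:

        load data
        append into table A
        fields terminated by X'09'
        (A, B, C,
         E CHAR(2000))   -- explicit length; delimited fields default to a 255-byte CHAR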

  • How to export & import data using SQL*Loader

    Hi all,
    How do I export & import data with SQL*Loader? Give me the clear steps.
    Thanks in Advance

    Hi, did you already export the data from SQL Server? If not: you cannot export data using SQL*Loader. SQL*Loader is only meant for importing data from flat files (usually text files) into Oracle tables.
    To import data into Oracle tables using SQL*Loader, use the steps below.
    1) Create a SQL*Loader control file.
    It looks like the following:
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (empno, ename, sal)  -- column list added for completeness; substitute your table's columns
    (or search Google for a sample control file script)
    2) At the command prompt issue the following:
    $ sqlldr test/test
    Enter control file = <give the control file name which you created earlier>
    Debug any errors (if they occur)

  • Loading data with dates using SQL*Loader

    Dear everyone
    I am currently trying to load some data containing dates using SQL*Loader.
    For termination of fields I have been using ^ because I have some book titles which contain " and ' as part of their title. I found that the TO_DATE function did not seem to work using ^ instead of ". Would I be correct? I think the Oracle manual says that " must be used.
    After some Web research I eventually amended my control file to as follows:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename
    file_format_id
    orig_scanning_resolution_dpi
    scanner_id, scanner_operator_id
    scanning_date "TO_DATE (:scanning_date, 'YYYY-MM-DD')"
    original_map_publication_id
    reprint_publication_id)
    A simple line of data is as follows:
    280001, ^1910 - London^, 270001, 400, 250001, 260001, "TO_DATE('2007-06-06', 'YYYY-MM-DD')", 200019,
    The final column being null.
    However when I attempt that I get the following error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0
    If I change the scanning_date part to:
    scanning_date "EXPRESSION TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get the error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-00917: missing comma
    As soon as I do the following:
    scanning_date "EXPRESSION, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get too many values error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO.
    ORA-00913: too many values
    I also tested out scanning_date DATE "YYYY-MM-DD", but that just gave the same ORA-01841 error message as above.
    I must be doing something very simple which is wrong but I cannot figure it out.
    Kind regards
    Tim

    And why do you have scanning date as "TO_DATE('2007-06-06', 'YYYY-MM-DD')" in your infile? All you need is 2007-06-06. If you can not change infile generation code, use:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename
    file_format_id
    orig_scanning_resolution_dpi
    scanner_id, scanner_operator_id
    scanning_date "TO_DATE(REPLACE(REPLACE(:scanning_date,'TO_DATE('),'''YYYY-MM-DD'')'), 'YYYY-MM-DD')"
    original_map_publication_id
    reprint_publication_id)
    SY.

  • How to do it with SQL*Loader

    All,
    I have two tables, HEADER_TABLE and LINE_TABLE. Each header record can have multiple line records. I have to load data from a flat file into these tables. The flat file can have two types of records: H (header) and L (line). It looks as follows; each H record can have multiple corresponding L records.
    H..........
    L.......
    L......
    L......
    H.........
    L.......
    L......
    L......
    I have HEADER_ID column in HEADER_TABLE and HEADER_ID, LINE_ID columns in the LINE_TABLE.
    While loading data using SQL Loader, I need to generate HEADER_ID and LINE_ID values as follows and load them.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 2><LINE_ID = 4>
    L...... <HEADER_ID = 2><LINE_ID = 5>
    L...... <HEADER_ID = 2><LINE_ID = 6>
    Is it possible to do this with SQL*Loader?
    I tried to do this with sequences, but it loaded the tables as follows.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 1><LINE_ID = 4>
    L...... <HEADER_ID = 1><LINE_ID = 5>
    L...... <HEADER_ID = 1><LINE_ID = 6>
    Thanks
    Ketha

    Morgan,
    Examples given in the link are quite generic and I have tried them. But my requirement is focused on generating header_id and line_id values as I have described. It seems that SQLLDR scans all records for a particular WHEN clause and inserts them into the specified table. I think that if SQLLDR were made to read the records in the data file sequentially, this could be done.
    Any idea how to make SQLLDR read the records from the file sequentially?
    Thanks
    Ketha
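    One workaround (a sketch, not from this thread; the staging table and column names are invented): capture the physical record order with SQL*Loader's RECNUM, load everything into a single staging table, and derive HEADER_ID afterwards with a running count of header records:

        load data
        infile 'data.txt'
        into table staging
        (rec_type POSITION(1:1) CHAR,
         rec_seq  RECNUM,                -- physical record number, preserves file order
         payload  POSITION(2:2000) CHAR)

    Then each record's HEADER_ID is the number of 'H' records at or before it:

        SELECT rec_seq, rec_type, payload,
               COUNT(CASE WHEN rec_type = 'H' THEN 1 END)
                 OVER (ORDER BY rec_seq) AS header_id
        FROM   staging;

    LINE_ID can then come from a sequence or ROW_NUMBER() while inserting into LINE_TABLE.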

  • Loading "fixed length" text files in UTF8 with SQL*Loader

    Hi!
    We have a lot of files that we load with SQL*Loader into our database. All data files have fixed-length columns, so we use POSITION(pos1, pos2) in the ctl file. Until now the files were in WE8ISO8859P1 and everything was fine.
    Now the source system generating the files is changing to Unicode, and the files are in UTF8!
    The SQL*Loader documentation says: "The start and end arguments to the POSITION parameter are interpreted in bytes, even if character-length semantics are in use in a datafile..."
    As I see it now, there is no way to say "column A starts at CHARACTER position pos1 and ends at CHARACTER position pos2".
    I tested with
    load data
    CHARACTERSET AL32UTF8
    LENGTH SEMANTICS CHARACTER
    replace ...
    in the .ctl file, but as soon as the record contains a character with a multi-byte encoding (for example ü), all positions of that record are mixed up.
    Is there a way to load these files in UTF8 without changing the file definition to use a column separator?
    Thanks for any hints - charly

    I have not tested this but you should be able to achieve what you want by using LENGTH SEMANTICS CHARACTER and by specifying field lengths (e.g. CHAR(5)) instead of only their positions. You could still use the POSITION(*+n) syntax to skip any separator columns that contain only spaces or tabs.
    If the above does not work, an alternative would be to convert all UTF8 files to UTF16 before loading so that they become fixed-width.
    -- Sergiusz
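    A sketch of what Sergiusz suggests, assuming two fixed-width fields of 5 and 10 characters separated by a single blank (the table and column names are invented, and as he says, untested):

        load data
        characterset AL32UTF8
        length semantics character
        infile 'data.txt'
        replace into table t
        (col_a POSITION(1) CHAR(5),      -- first 5 characters
         col_b POSITION(*+1) CHAR(10))   -- skip the 1-character separator, take the next 10

    With character-length semantics the CHAR(n) lengths are counted in characters, so multi-byte characters should no longer shift the following fields.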

  • Problem loading a file with SQL*Loader

    I am getting a problem loading a file with SQL*Loader. The load is getting
    terminated after around 2,000 rows, whereas there are around 2,700,000 rows in the file.
    The file looks like:
    919879086475,11/17/2004,11/20/2004
    919879698625,11/17/2004,11/17/2004
    919879698628,11/17/2004,11/17/2004
    The control file I am using is:
    load data
    infile 'c:\ran\temp\pps_fc.txt'
    into table bm_05oct06
    fields terminated by ","
    (mobile_no, fcal, frdate)
    I hope my question is clear. Please help in resolving the doubt.
    regards.

    So which thread is telling the truth?
    Doubt with SQL loader file with spaces
    Are the fields delimited with spaces or with commas?
    Perhaps they are a mixture of delimiters and that is where the error is coming in?

  • Ignoring constraints with SQL*Loader & triggers

    When I load a table with SQL*Loader that has foreign key constraints and a BEFORE INSERT trigger, my foreign key constraints are ignored. The end result is a detail table that contains invalid rows (where some column values do not exist in the master table). If I drop the trigger and just load with SQL*Loader, the foreign key constraints are enforced. Any ideas why this is happening? The trigger simply populates the last-updated date and user columns.

    I found this from AskTom; it's very nice. It will help you a lot.
    http://asktom.oracle.com/pls/ask/f?p=4950:9:1988009758486146475::NO:9:F4950_P9_DISPLAYID:8806498660292
