Problem in Data Migration via SQL*Loader

Hi,
I am trying to load data from a generated text file.
When I run the sqlldr command with the connection URL and the ".ctl", ".log", ".bad" and ".dat" parameters, all the records go to the ".bad" file. Why does this happen, even though the control file is parsed successfully?
Can anybody please help me with this?
I am including the ".log" file generated during the run:
========================================================================
Control File: CUSTOMER.CTL
Data File: CUSTOMER.DAT
Bad File: CUSTOMER.BAD
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CUSTOMER_BACKUP, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
CS_CODE FIRST * WHT O(") CHARACTER
CS_NAME NEXT * WHT O(") CHARACTER
CS_ADD1 NEXT * WHT O(") CHARACTER
CS_ADD2 NEXT * WHT O(") CHARACTER
CS_ADD3 NEXT * WHT O(") CHARACTER
CS_ADD4 NEXT * WHT O(") CHARACTER
CS_PIN NEXT * WHT O(") CHARACTER
CS_PHONE NEXT * WHT O(") CHARACTER
CS_TELEX NEXT * WHT O(") CHARACTER
CS_CR_DAYS NEXT * WHT O(") CHARACTER
CS_CR_LIM NEXT * WHT O(") CHARACTER
CS_TRD_DIS NEXT * WHT O(") CHARACTER
CS_CSH_DIS NEXT * WHT O(") CHARACTER
CS_STX_FRM NEXT * WHT O(") CHARACTER
CS_LSTX_NO NEXT * WHT O(") CHARACTER
CS_MOB_BAL NEXT * WHT O(") CHARACTER
CS_STX_PER NEXT * WHT O(") CHARACTER
CS_IND NEXT * WHT O(") CHARACTER
CS_CSTX_NO NEXT * WHT O(") CHARACTER
CS_SLMN_CD NEXT * WHT O(") CHARACTER
CS_BANK_1 NEXT * WHT O(") CHARACTER
CS_BANK_2 NEXT * WHT O(") CHARACTER
CS_BANK_3 NEXT * WHT O(") CHARACTER
CS_YOB_BAL NEXT * WHT O(") CHARACTER
CS_CURR NEXT * WHT O(") CHARACTER
CS_ZONE NEXT * WHT O(") CHARACTER
CS_CAT NEXT * WHT O(") CHARACTER
F_EDT NEXT * WHT O(") CHARACTER
F_UID NEXT * WHT O(") CHARACTER
F_ACTV NEXT * WHT O(") CHARACTER
CS_RANGE NEXT * WHT O(") CHARACTER
CS_ITNO NEXT * WHT O(") CHARACTER
CS_INT NEXT * WHT O(") CHARACTER
CS_CIRCLE NEXT * WHT O(") CHARACTER
CS_ECCCODE NEXT * WHT O(") CHARACTER
CS_MFRCODE NEXT * WHT O(") CHARACTER
CS_QTY_BUG NEXT * WHT O(") CHARACTER
CS_VAL_BUG NEXT * WHT O(") CHARACTER
Record 1: Rejected - Error on table CUSTOMER_BACKUP, column CS_YOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 2: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 3: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 50: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 51: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table CUSTOMER_BACKUP:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 254904 bytes(26 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 51
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Wed Aug 28 13:30:29 2002
Run ended on Wed Aug 28 13:30:29 2002
Elapsed time was: 00:00:00.18
CPU time was: 00:00:00.08
=================================================================
Regards,
Bimal

Hi,
I am a bit late (more than a bit, actually). If you use TRAILING NULLCOLS in the control file, your problem should be fixed.
TRAILING NULLCOLS tells SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.
For example, if the following data
10 Accounting
is read with the following control file
INTO TABLE dept
TRAILING NULLCOLS
( deptno CHAR TERMINATED BY " ",
dname CHAR TERMINATED BY WHITESPACE,
loc CHAR TERMINATED BY WHITESPACE
)
and the record ends after DNAME, then the remaining LOC field is set to null. Without the TRAILING NULLCOLS clause, an error would be generated because of the missing data.
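For the original CUSTOMER load above, a minimal sketch of the corrected control file might look like the following (the table and column names are taken from the log; the whitespace terminator and optional '"' enclosure are assumptions based on the log's field listing):
LOAD DATA
INFILE 'CUSTOMER.DAT'
BADFILE 'CUSTOMER.BAD'
INSERT
INTO TABLE CUSTOMER_BACKUP
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( CS_CODE, CS_NAME, CS_ADD1, CS_ADD2, CS_ADD3, CS_ADD4, CS_PIN,
  CS_PHONE, CS_TELEX, CS_CR_DAYS, CS_CR_LIM, CS_TRD_DIS, CS_CSH_DIS,
  CS_STX_FRM, CS_LSTX_NO, CS_MOB_BAL, CS_STX_PER, CS_IND, CS_CSTX_NO,
  CS_SLMN_CD, CS_BANK_1, CS_BANK_2, CS_BANK_3, CS_YOB_BAL, CS_CURR,
  CS_ZONE, CS_CAT, F_EDT, F_UID, F_ACTV, CS_RANGE, CS_ITNO, CS_INT,
  CS_CIRCLE, CS_ECCCODE, CS_MFRCODE, CS_QTY_BUG, CS_VAL_BUG
)
With TRAILING NULLCOLS in place, records that end before the last columns (as the rejected records in the log do) are loaded with the missing columns set to null instead of being rejected.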

Similar Messages

  • Problem with field-length in sql-loader

    Hello,
    (Sorry, I see it's the wrong forum -> SQL Developer; I was searching for SQL*Loader. Is there a possibility to move the thread?)
    I can't find an answer to my question on Google, so I hope there is someone in this forum who can help me.
    I have a .dat file that contains 12 500 000 records and want to load it via SQL*Loader. The first field contains the ID, with numbers from 1 to 12 500 000 stored.
    When I run SQL*Loader, the ID runs up to 9 999 999 and then, for the last 2 500 001 records, it starts at 1 again.
    I noticed a few things:
    1. Numbers < 10 000 000 cause no problems.
    2. Numbers >= 10 000 000 cause problems: the first digit (in this example "1") is cut, so the number "10 000 001" is stored as "1". This leads to duplicate entries (IDs 1 to 2 500 000).
    3. I have the same field definition for the third field of the record -> there is no problem there; I can store any number.
    4. I tried to store a number > 100 000 000 -> the first digit was cut too, but ONLY the first digit.
    5. I am able to store any number manually in the database.
    So, I have a problem with the first field. If the number is 10 000 000 or greater, the first digit is cut. It makes no difference whether the number is 10 000 000 or 999 999 999; just the first digit, in the first field, is cut.
    Any ideas?
    Here some infos :
    Database :
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    SQL-PLUS :
    SQL*Plus: Release 10.2.0.4.0 - Production
    Script sqlldr :
    sqlplus ${schema}/$2 <<EOF >>LOAD.LOG
    set timing on
    set echo on
    set heading off
    set heading on
    !sqlldr userid=${schema}/$2 control=surface_geometry.ctl log=surface_geometry.log
    exit
    EOF
    ctl-File :
    LOAD DATA
    INFILE imp_surface_geometry_test2
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE SURFACE_GEOMETRY
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    ID INTEGER EXTERNAL ,
    GMLID,
    GMLID_CODESPACE,
    PARENT_ID NULLIF PARENT_ID = BLANKS,
    ROOT_ID NULLIF ROOT_ID = BLANKS,
    IS_SOLID,
    IS_COMPOSITE,
    IS_TRIANGULATED,
    IS_XLINK,
    IS_REVERSE,
    GEB_ID,
    GEOMETRY COLUMN OBJECT
    (
    SDO_GTYPE INTEGER EXTERNAL,
    SDO_SRID CONSTANT 31468,
    SDO_ELEM_INFO VARRAY TERMINATED BY '|/'
    (X FLOAT EXTERNAL),
    SDO_ORDINATES VARRAY TERMINATED BY '|/'
    (X FLOAT EXTERNAL)
    )
    )
    Table-Definition (sql-File) :
    CREATE TABLE SURFACE_GEOMETRY (
    ID NUMBER,
    GMLID VARCHAR2(256),
    GMLID_CODESPACE VARCHAR2(1000),
    PARENT_ID NUMBER,
    ROOT_ID NUMBER,
    IS_SOLID NUMBER(1,0),
    IS_COMPOSITE NUMBER(1,0),
    IS_TRIANGULATED NUMBER(1,0),
    IS_XLINK NUMBER(1,0),
    IS_REVERSE NUMBER(1,0),
    GEB_ID CHAR(7),
    GEOMETRY MDSYS.SDO_GEOMETRY,
    CONSTRAINT c_unique_id UNIQUE (ID))
    storage (initial 1M next 1M maxextents 1024) ;
    Some Entries in the dat-File :
    12556067| |XXX|12556066|12556066|0|0|0|0|0| |
    #3003|1|1003|1|/
    #4479400.000000|5333360.000000| 526.870000|4479380.000000|5333360.000000| 526.720000|4479400.000000|5333340.000000| 526.980000|4479400.000000|5333360.000000| 526.870000|/
    12556068| |XXX| |12556068|0|0|1|0|0| |
    #||/
    #|/
    12556069| |XXX|12556068|12556068|0|0|0|0|0| |
    #3003|1|1003|1|/
    #4479380.000000|5333380.000000| 526.600000|4479380.000000|5333360.000000| 526.720000|4479400.000000|5333360.000000| 526.870000|4479380.000000|5333380.000000| 526.600000|/
    log-File : (100 records for the test)
    SQL*Loader: Release 10.2.0.4.0 - Production on Fr Mai 28 15:16:43 2010
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    Control File: surface_geometry.ctl
    Data File: imp_surface_geometry_test.dat
    Bad File: imp_surface_geometry_test.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: 1:1 = 0X23(character '#'), in next physical record
    Path used: Conventional
    Table SURFACE_GEOMETRY, loaded from every logical record.
    Insert option in effect for this table: TRUNCATE
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    ID FIRST * | CHARACTER
    GMLID NEXT * | CHARACTER
    GMLID_CODESPACE NEXT * | CHARACTER
    PARENT_ID NEXT * | CHARACTER
    NULL if PARENT_ID = BLANKS
    ROOT_ID NEXT * | CHARACTER
    NULL if ROOT_ID = BLANKS
    IS_SOLID NEXT * | CHARACTER
    IS_COMPOSITE NEXT * | CHARACTER
    IS_TRIANGULATED NEXT * | CHARACTER
    IS_XLINK NEXT * | CHARACTER
    IS_REVERSE NEXT * | CHARACTER
    GEB_ID NEXT * | CHARACTER
    GEOMETRY DERIVED * COLUMN OBJECT
    *** Fields in GEOMETRY
    SDO_GTYPE NEXT * | CHARACTER
    SDO_SRID CONSTANT
    Value is '31468'
    SDO_ELEM_INFO DERIVED * VARRAY
    Terminator string : '|/'
    *** Fields in GEOMETRY.SDO_ELEM_INFO
    X FIRST * | CHARACTER
    *** End of fields in GEOMETRY.SDO_ELEM_INFO
    SDO_ORDINATES DERIVED * VARRAY
    Terminator string : '|/'
    *** Fields in GEOMETRY.SDO_ORDINATES
    X FIRST * | CHARACTER
    *** End of fields in GEOMETRY.SDO_ORDINATES
    *** End of fields in GEOMETRY
    Table SURFACE_GEOMETRY:
    100 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 232576 bytes (64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 100
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Fri May 28 15:16:43 2010
    Run ended on Fri May 28 15:16:43 2010
    Elapsed time was: 00:00:00.19
    CPU time was: 00:00:00.01

    sorry, wrong forum. I opened the thread in forum "Export/Import/SQL-Loader & External Tables"

  • Load image file into a LONG RAW column via SQL*Loader

    Does anyone know how to load an image file into a LONG RAW column via SQL*Loader?
    Thanks
    -Hsing-Hua-

    Are you trying to import the image on the client into Oracle Lite, or from the server end into the main Oracle database for download to the client?
    On our system, images are loaded into the Oracle Lite database (10g R1 at the moment) on the client (Dell X50 PDAs) into a BLOB (not LONG RAW) column using Java (relatively standard functionality in Java), as this is what our application software is written in.
    From the server end, we do not at the moment load images as such, but we do load the application software into the database for deployment of new versions to the clients (the DMC/DMS updates we have found very unreliable), and the technique should be the same for images. Again the file is imported into a BLOB column.
    NOTE a column defined as BLOB on the main Oracle database appears as a LONG VARBINARY on the client Oracle lite database, but the synchronisation process handles the conversion with no problem.
    To import into a BLOB column on the server try the following
    1) you will need to create a DIRECTORY (CREATE DIRECTORY command) within the database, pointing at a directory on the database server (or accessible from it). This is needed for the file location.
    CREATE OR REPLACE DIRECTORY PDA_FILE_UPLOAD AS '/pdaapps/jar/'
    NOTE create directory needs to be run as SYSTEM
    2) define your table to hold the image and other data. Our table is:
    SQL> desc pda_applications
    Name Null? Type
    ID NOT NULL NUMBER(12)
    PDAAT_CODE NOT NULL VARCHAR2(10)
    VERSION NOT NULL NUMBER(5,2)
    PART_NO NOT NULL NUMBER(3)
    FILE_OBJECT BLOB
    DEPLOY_APPLICATION_YN NOT NULL VARCHAR2(1)
    3) copy the image (or in our case a .jar file) to the database server directory specified in step 1
    4) the actual import is done using a DBMS_LOB procedure:
    PROCEDURE pr_load_file
    /*
    Module Name     : pr_load_file
    Description     : Main call. Create a new pda_applications record, and import the specified file into it
    Version History :
    Vers. Author      Date       Reason
    1.0   G Wilkinson 03/02/2006 Initial Version.
    */
    (PA_VERSION IN NUMBER
    ,PA_FILENAME IN VARCHAR2
    ,PA_PDAAT_CODE IN VARCHAR2
    ,PA_PART_NO IN NUMBER DEFAULT 1
    ,PA_DEPLOY IN VARCHAR2 DEFAULT 'Y')
    IS
    l_FileLocator BFILE;
    l_blobLocator BLOB;
    l_seq NUMBER;
    l_location VARCHAR2(20);
    no_params EXCEPTION;
    call_fail EXCEPTION;
    BEGIN
    -- Throw error if required details not present
    IF pa_version IS NULL
    OR pa_filename IS NULL
    OR pa_pdaat_code IS NULL THEN
    RAISE no_params;
    END IF;
    -- Initialize the BLOB locator for writing. Note that we have
    -- to import a blob into a table as part of a SELECT FOR UPDATE. This locks the row,
    -- and is a requirement for LOADFROMFILE.
    SELECT pdaa_id_seq.nextval
    INTO l_seq
    FROM dual;
    -- First create the application record (the file is imported by update, not insert)
    INSERT INTO pda_applications
    (ID,PDAAT_CODE,VERSION,PART_NO,FILE_OBJECT,DEPLOY_APPLICATION_YN)
    VALUES (l_seq,pa_pdaat_code,pa_version,pa_part_no,EMPTY_BLOB(),pa_deploy);
    -- Lock record for update to import the file
    SELECT file_object INTO l_blobLocator
    FROM pda_applications
    WHERE id=l_seq
    FOR UPDATE;
    -- Initialize the BFILE locator for reading.
    l_FileLocator := BFILENAME('PDA_FILE_UPLOAD', pa_filename);
    DBMS_LOB.FILEOPEN(l_FileLocator, DBMS_LOB.FILE_READONLY);
    -- Load the entire file into the character LOB.
    -- This is necessary so that we have the data in
    -- character rather than RAW variables.
    DBMS_LOB.LOADFROMFILE(l_blobLocator, l_FileLocator
    ,DBMS_LOB.GETLENGTH(l_FileLocator)
    ,src_offset => 1);
    -- Clean up.
    DBMS_LOB.FILECLOSE(l_FileLocator);
    -- Create download records for each user associated with the application for sending to the PDA's
    -- to install the software
    IF pa_deploy = 'Y' then
    IF fn_deploy (pa_pdaa_id => l_seq
    ,pa_pdaat_code => pa_pdaat_code) != 'SUCCESS' THEN
    RAISE call_fail;
    END IF;
    END IF;
    EXCEPTION
    WHEN no_params THEN
    pkg_appm.pr_log_message( pa_mdl_name => g_module_name
    , pa_mdl_version => fn_get_body_version
    , pa_error_code => SQLCODE
    , pa_location => l_location
    , pa_text => 'Missing parameters'
    , pa_severity => 'E');
    WHEN OTHERS THEN
    DBMS_LOB.FILECLOSE(l_FileLocator);
    pkg_appm.pr_log_message( pa_mdl_name => g_module_name
    , pa_mdl_version => fn_get_body_version
    , pa_error_code => SQLCODE
    , pa_location => l_location
    , pa_text => SQLERRM
    , pa_severity => 'E');
    END pr_load_file;
    I hope this may be of some help
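    If you do want to drive the load with SQL*Loader itself rather than DBMS_LOB, the usual approach is to load each file into a BLOB column (not LONG RAW) via a LOBFILE clause. A minimal sketch, with made-up table, column and file names:
    LOAD DATA
    INFILE 'images.dat'
    INTO TABLE image_store
    FIELDS TERMINATED BY ','
    ( image_id   CHAR(10),
      file_name  FILLER CHAR(200),
      image_data LOBFILE(file_name) TERMINATED BY EOF
    )
    Each line of images.dat would contain an id and the path of one image file; SQL*Loader then reads the named file into the BLOB column image_data.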

  • Loading XML files in Oracle via SQL*Loader

    Hello,
    I need to load the data in an XML file provided by a third party into my Oracle DB. This file is not formatted as per the requirements of XSU, so I need to find an alternative. The online documentation for Oracle8i states that one way to load XML files into Oracle is via SQL*Loader, but the SQL*Loader documentation does not mention this. Also, as far as I know, SQL*Loader can be used with delimited fields or fixed-length fields, so I do not see how this can be done (but it would be very cool).
    Can you give me any advice on this?
    Thanks
    Ciao
    Ferruccio

    No, SQL*Loader cannot process DBF files directly
    Tom Kyte provides a package to read in DBF files through utl_file
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:711825134415
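    If the goal is simply to get each whole XML document into the database (and parse it inside the database afterwards), SQL*Loader can load the file as a LOB; a sketch, assuming a table xml_docs with a CLOB column doc (all names here are hypothetical):
    LOAD DATA
    INFILE 'filelist.dat'
    INTO TABLE xml_docs
    FIELDS TERMINATED BY ','
    ( doc_name CHAR(100),
      ext_file FILLER CHAR(200),
      doc      LOBFILE(ext_file) TERMINATED BY EOF
    )
    Each line of filelist.dat names a document and the path of its XML file; extracting individual elements would then be done with the database's XML functions.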

  • How to load an Excel sheet into Oracle via SQL* Loader

    Hello! I am trying to load an Excel spreadsheet into an Oracle 10g database via SQL*Loader.
    How do I achieve this via Oracle Enterprise Manager?
    Please give a detailed explanation, especially covering the locations of the control file, infile, bad file etc. for Oracle 10g running on both Windows and Linux platforms.
    Thanks.

    This has nothing to do with Oracle Enterprise Manager.
    SQL*Loader is a command-line tool and is installed along with SQL*Plus (the wrong install type will not have it).
    Consider reviewing this:
    http://www.orafaq.com/wiki/SQL*Loader_FAQ
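    As a rough idea of what this looks like once the sheet has been saved as a CSV (file, table and column names here are invented for illustration), the control file would be something like:
    LOAD DATA
    INFILE 'c:\data\employees.csv'
    BADFILE 'c:\data\employees.bad'
    APPEND
    INTO TABLE employees
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( emp_id, emp_name, hire_date DATE 'DD/MM/YYYY', salary )
    and it would be run from the command line with something like: sqlldr scott/tiger control=emp_load.ctl log=emp_load.log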

  • How to Import data via SQL Loader with characterset  UTF16 little endian?

    Hello,
    I'm importing data from a text file into one of my tables, which contains a BLOB column.
    I've specified the following in my control file.
    -----Control file-------
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER LITTLE
    INFILE './DataFiles/Data.txt'
    BADFILE './Logs/Data.bad'
    INTO TABLE temp_blob truncate
    FIELDS TERMINATED BY "     "
    TRAILING NULLCOLS
    (GROUP_BLOB,CODE)
    Problem:
    SQL*Loader always imports the data as big endian. Is there any method available by which we can convert this data to little endian?
    Thanks

    A new preference has been added to customize the import delimiter in the main code line. This should be available as part of a future release.

  • Need suggestions for improving data load performance via SQL Loader

    Hi,
    Our requirement is to load 512 files (1 GB each) into an Oracle database.
    We are using SQL*Loader to load the files into the DB (a partitioned table) and have tried almost all the options that come with SQL*Loader (direct path load, parallel=true, multithreading=true, unrecoverable).
    As the table grows bigger in size, each file's load time is increasing (it started at 5 minutes per file and has now reached 2 hours per 3 files, and is increasing with every batch. Note: we are loading 3 files concurrently into the target table using the parallel=true option of SQL*Loader.)
    Questions 1:
    My problem is that somehow multithreading is not working for us (we have a multi-CPU server and have enabled multithreading=true). Could it be something to do with a DB setting that might be preventing the data load from running in multiple threads?
    Question 2:
    Would gathering stats on the target table and its partitions help improve load performance? I'm not sure if stats improve DML; they would definitely improve SQL queries. Any thoughts?
    Question 3:
    What would be the best strategy to gather stats on this table (which would end up having 512 GB data) ?
    Question 4:
    Do you think insertions in a partitioned table (with growing sizes) would have poor performance as compared to a non-partitioned table ?
    Any other suggestions to improve performance are most welcome!
    Thanks,
    Sachin

    2 hours to load just 3 GB of data seems unreasonable regardless of the SQL Loader settings. It seems likely to me that the problem is not with SQL Loader but somewhere else.
    Have you generated a Statspack/ AWR/ ASH report to see where all that time is being spent? Are there triggers on the table? Are there bitmap indexes?
    Is your table partitioned in a way that is designed to improve the efficiency of loads so that all the data from one file goes into one partition? Or is data from each file getting inserted into many different partitions.
    Justin
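    If the table can be partitioned so that each input file maps to exactly one partition, the concurrent loads stop competing with each other; a sketch of a per-partition direct path load (partition, file and column names are invented):
    LOAD DATA
    INFILE 'file_03.dat'
    APPEND
    INTO TABLE sales PARTITION (p_2013_03)
    FIELDS TERMINATED BY '|'
    ( sale_id, sale_date DATE 'YYYY-MM-DD', amount )
    run with something like: sqlldr userid=scott/tiger control=load_p03.ctl direct=true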

  • Problem import csv file with SQL*loader and control file

    I have a *.csv file looking like this:
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    I want to import this csv file to this table:
    create table artikel (artnr varchar2(10), namn varchar2(25), fp_storlek number, datum date, mtrlid varchar2(5), pris number);
    My controlfile looks like this:
    LOAD DATA
    INFILE 'e:\test.csv'
    INSERT
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    I can't get SQL*Loader to import the last column (pris) as I want. It ignores my decimal separator, which in this case is "," and not "."; maybe this is the problem. If the decimal separator is the problem, how can I get Oracle to recognize "," as a decimal point?
    The result of the import at the moment is that a decimal number (37,2) becomes 372 in the table.

    Set the NLS_NUMERIC_CHARACTERS environment variable at OS level before running SQL*Loader:
    $ cat test.csv
    E0100070;EKKJ 1X10/10 1 KV;1;2003-06-16;01C;75
    E0100075;EKKJ 1X10/10 1 KV;500;2003-06-16;01C;67
    E0100440;EKKJ 2X2,5/2,5 1 KV;1;2003-06-16;01C;37,2
    E0100445;EKKJ 2X2,5/2,5 1 KV;500;2003-06-16;01C;33,2
    E0100450;EKKJ 2X4/4 1 KV;1;2003-06-16;01C;53
    E0100455;EKKJ 2X4/4 1 KV;500;2003-06-16;01C;47,1
    $ cat artikel.ctl
    LOAD DATA
    INFILE 'test.csv'
    replace
    INTO TABLE ARTIKEL
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (ARTNR, NAMN, FP_STORLEK char "to_number(:fp_storlek,'99999')", DATUM date 'yyyy-mm-dd', MTRLID, pris char "to_number(:pris,'999999D99')")
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:01 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:11 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C          372
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C          332
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C          471
    6 rows selected.
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    $ export NLS_NUMERIC_CHARACTERS=',.'
    $ sqlldr scott/tiger control=artikel
    SQL*Loader: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:41 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 6
    $ sqlplus scott/tiger
    SQL*Plus: Release 10.1.0.3.0 - Production on Sat Nov 12 15:10:45 2005
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> select * from artikel;
    ARTNR      NAMN                      FP_STORLEK DATUM      MTRLI       PRIS
    E0100070   EKKJ 1X10/10 1 KV                  1 16/06/2003 01C           75
    E0100075   EKKJ 1X10/10 1 KV                500 16/06/2003 01C           67
    E0100440   EKKJ 2X2,5/2,5 1 KV                1 16/06/2003 01C         37,2
    E0100445   EKKJ 2X2,5/2,5 1 KV              500 16/06/2003 01C         33,2
    E0100450   EKKJ 2X4/4 1 KV                    1 16/06/2003 01C           53
    E0100455   EKKJ 2X4/4 1 KV                  500 16/06/2003 01C         47,1
    6 rows selected.
    SQL>
    The control file is exactly the same as yours; I just put REPLACE instead of INSERT.
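    An alternative that does not rely on the OS environment is to pass the NLS setting straight to TO_NUMBER in the control file; a sketch of just the last field (the three-argument TO_NUMBER form is standard SQL, but treat this as untested here):
    pris CHAR "to_number(:pris, '999999D99', 'nls_numeric_characters='',.''')"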

  • Data migration from SQL Server 6.5

    Hi,
    I would like to migrate data from SQL Server 6.5 to Oracle
    8.0.5. I can create flat files using the BCP utility in SQL
    Server, but how do I get my Oracle db to recognize the same?
    Please help.
    Regards
    Urmi

    sohail rana (guest) wrote:
    : Hi Urmi,
    : Could you please let me know the first step how from where you
    : can use BCP utility in SQL Server so that you can get the flat
    : file. I have one assignment to do migration.
    : Please help
    : regards
    : Sohail rana
    : Urmi (guest) wrote:
    : : Hi,
    : : I would like to migrate data from SQL Server 6.5 to Oracle
    : : 8.0.5. I can create flat files using the BCP utility in SQL
    : : Server, but how do I get my Oracle db to recognize the same?
    : : Please help.
    : : Regards
    : : Urmi
    There is a new feature in version 1.2.1 of the migration
    workbench that will generate the BCP and SQL Loader scripts for
    you.
    Right mouse click on the 'Tables' component group in the Oracle
    Model UI tree and click 'Generate Data Migrate Script...'.
    This will place the scripts in the Log directory you have
    specified in the Tools -> Options menu. Go to the Logging tab and
    enter a valid value for 'Log file directory'.
    The data move scripts will be located in the 'Log file
    directory'\SQLServer6 directory.
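    For the BCP side of the question, exporting a table to a delimited flat file from SQL Server is typically a single command of the following shape (server, database, table, login and delimiter are placeholders):
    bcp mydb..customer out customer.dat -c -t "|" -S myserver -U sa -P mypassword
    The resulting customer.dat can then be described in a SQL*Loader control file with FIELDS TERMINATED BY '|'.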

  • Automating Data Migration from SQL Server 2005

    Hi All,
    I need to migrate data from SQL Server 2005 to an Oracle DB (data warehouse). The data migration involves updating dimension and fact tables. I have the painful job of switching the topology to point to over 100 different weekly SQL Server DBs (in sequence) and pulling the data into the DW. How can I automate my ODI process to switch from one to the next?
    Any Ideas?
    KS

    ODI Variables are there to help you with this. Please go through this post at [http://blogs.oracle.com/dataintegration/2009/05/using_odi_variables_in_topolog.html] .
    This will show you how to use a variable Oracle data server.
    On similar lines you should be able to switch to any of your 100 weekly SQL server DBs.
    And loop in a sequence to load data from all of them.
    Hope that helps

  • .csv data import via SQL developer

    I am trying to load an Access 2003 table into an Oracle 10g database via SQL developer. The Access file is located on another system and there is no ODBC connection between it and the SQL server.
    What I have done so far is the following:
    i) Export Access table as comma-delimited text file;
    ii) Open text file and re-saved as .CSV;
    iii) In SQL Developer - created a table that matches the column structure of the original Access table, then right-clicked > Import Data > on the Data Import Wizard I click Next...
    after this, the wizard hangs on step 1 and does not allow me to move to the second step. It doesn't matter whether or not the SQL*Loader option is checked.
    If anyone could offer some advice on the procedure I have applied above and possible sources of weakness, I'd be grateful to hear your thoughts. I'm relatively new to Oracle SQL and not very experienced with databases in general.
    Thanks

    Since this is not native export/import, a better place could be this forum:
    SQL Developer

  • Export data ready for sql loader

    Hi,
    I need to export some data from my database ready to be used by SQL*Loader ... what tools can I use? Is SQL*Plus (i.e. spool file.csv; select c1||";"||c2||";" from table; spool off) the only tool I can use?

    Hello,
    You have multiple choices depending upon your oracle version on both source and target.
    1. Conventional export/import (no need to generate csv)
    2. Datapump (10g)
    3. Extracting .csv file using sql and loading using sqlldr
    4. External table unloading data using datapump.
    5. External table using .csv file generated via sql script.
    Regards
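    If you go the SQL*Plus route (option 3), a minimal spool script looks like this (table and column names are placeholders):
    set heading off feedback off pagesize 0 linesize 32767 trimspool on
    spool emp.csv
    select empno || ';' || ename || ';' || sal from emp;
    spool off
    The resulting emp.csv can then be loaded with sqlldr using FIELDS TERMINATED BY ';'.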

  • Filter data on GUI SQL Loader of 10g

    I have a 10g database and am using SQL Developer to import data from an Excel file into a table.
    After right-clicking on the table and specifying the file, I have to map the columns, since there is no header info in the spreadsheet.
    I then ran into 2 problems:
    1) on a column with a date datatype, it says the data is null or in an invalid format.
    2) only a subset of the data in the sheet is needed, so I would like to apply a WHERE-clause-type criterion during the import.
    How do I handle these two situations? Are there step-by-step instructions to follow for each?
    Thanks.

    Your problem is most likely in DECODE - the return type of your expression will be character, based on the first search value ('null'), so the value will be implicitly converted to character and then implicitly converted back to a date when loaded into the date column. At one of these conversions you are probably losing the time part. You can instead try using CAST:
    SQL> desc t
    Name                                      Null?    Type
    LASTWRITTEN                                        DATE
    CREATEDON                                          DATE
    LASTUPDATEDON                                      DATE
    SQL> select * from t;
    no rows selected
    SQL> !cat t.ctl
    LOAD DATA
    INFILE *
    INTO TABLE T
    TRUNCATE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS (
    LASTWRITTEN
    "decode(:LASTWRITTEN,'null',cast(Null as date),
      to_date(:LASTWRITTEN,'YYYY-MM-DD HH24:MI:SS'))",
    CREATEDON
    "decode(:CREATEDON,'null',cast(Null as date),
      to_date(:CREATEDON,'YYYY-MM-DD HH24:MI:SS'))",
    LASTUPDATEDON
    "decode(:LASTUPDATEDON,'null',cast(Null as date),
      to_date(:LASTUPDATEDON,'DD/MM/YYYY HH24:MI:SS'))"
    )
    BEGINDATA
    2007-02-15 15:10:20,null,null
    null,2007-02-15 15:10:20,null
    null,null,15/02/2007 15:10:20
    SQL> !sqlldr userid=scott/tiger control=t.ctl log=t.log
    SQL*Loader: Release 10.2.0.3.0 - Production on Fri Feb 29 00:20:07 2008
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 3
    SQL> select * from t;
    LASTWRITTEN         CREATEDON           LASTUPDATEDON
    15.02.2007 15:10:20
                        15.02.2007 15:10:20
                                            15.02.2007 15:10:20
    Best regards
    Maxim

  • How do I map one field to another in the control file via SQL Loader

    Can someone please reply back to this question
    Hi,
    I have a flat file (student.dat, delimiter %~|) that I load using a control file (student.ctl) through SQL*Loader. Here are the details.
    student.dat
    student_id, student_firstname, gender, student_lastName, student_newId
    101%~|abc%~|F %~|xyz%~|110%~|
    Corresponding table
    Student (
    Student_ID,
    Student_FN,
    Gender,
    Student_LN
    Question:
    How do I map the student_newId field to the student_id field in the STUDENT DB table so that the new ID is inserted into the student_id column? How do I specify the mapping in the control file? I don't want to create a new column in the student table. Please let me know the best way to do this.
    Can someone please reply back to this question.
    My approach:
    In the control file I will specify the below. Is this the best approach? Do we have any other way?
    STUDENT_ID "(:STUDENT_NEWID)",
    STUDENT_FN,
    GENDER,
    STUDENT_LNAME,
    STUDENT_NEWID BOUNDFILLER
    Thanks
    Sunil

    OK, ok...
    Here is the sample data:
    101%~|abc%~|F %~|xyz%~|110%~|
    102%~|def%~|M %~|pqr%~|120%~|
    103%~|ghi%~|M %~|stu%~|130%~|
    104%~|jkl%~|F %~|vwx%~|140%~|
    105%~|mno%~|F %~|yza%~|150%~|
    Here is the control file:
    LOAD DATA
    INFILE student.dat
    TRUNCATE INTO TABLE STUDENT
    FIELDS TERMINATED BY '%~|' TRAILING NULLCOLS (
      student_old  FILLER
    , student_fn
    , gender
    , student_ln
    , student_id
    )
    And here is the execution:
    SQL> CREATE TABLE student
      2  (
      3    student_id   NUMBER
      4  , student_fn   VARCHAR2 (10)
      5  , gender       VARCHAR2 (2)
      6  , student_ln   VARCHAR2 (10)
      7  );
    Table created.
    SQL>
    SQL> !sqlldr / control=student.ctl
    SQL*Loader: Release 11.2.0.3.0 - Production on Tue Mar 19 14:37:31 2013
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 5
    SQL> select * from student;
    STUDENT_ID STUDENT_FN                     GENDER STUDENT_LN
           110 abc                            F      xyz
           120 def                            M      pqr
           130 ghi                            M      stu
           140 jkl                            F      vwx
           150 mno                            F      yza
    SQL>:p

  • How to handle Multiple date formats for the same date field in SQL*Loader

    Dear All,
    I have a requirement where I need to get data from a text file and insert it into an Oracle table.
    I am using SQL*Loader to populate the data from the text file into my table.
    The file has one field where I am expecting date data in multiple formats, like dd/mon/yyyy, yyyy/dd/mon, yyyy/mon/dd, mm/dd/yyyy, mon/dd/yyyy.
    While using SQL*Loader, I can see that loading fails for records where we have formats like yyyy/dd/mon, yyyy/mon/dd, mon/dd/yyyy.
    Is there any way in SQL*Loader to mention all these date formats so that this date data goes smoothly into the underlying date column in the table?
    Appreciate your response on this.
    Thanks,
    Madhu K.

    The point being made was: are you sure that you can uniquely identify a date format from the value you receive? Are you sure that the data stored is only in a particular limited set of formats?
    e.g. if you had a value of '07/08/03' how do you know what format that is?
    It could be...
    7th August 2003 (thus assuming it's DD/MM/RR format)
    or
    8th July 2003 (thus assuming it's MM/DD/RR format)
    or
    3rd August 2007 (thus assuming it's RR/MM/DD format)
    or
    8th March 2007 (thus assuming it's RR/DD/MM format)
    or even more obscurely...
    3rd July 2008 (MM/RR/DD)
    or
    7th March 2008 (DD/RR/MM)
    Do you have any information to tell you what formats are valid that would allow you to be specific and know what date format is meant?
    This is a classic example of why dates should be stored on the database using DATE datatype and not VARCHAR2. It can lead to corruption of data, especially if the date can be entered in any format a user wishes.
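    Where the incoming formats genuinely are distinguishable (for example the alphabetic month and four-digit year in the formats listed in the question), one workable sketch is a CASE expression on the field in the control file; the field name is hypothetical and the patterns only cover the unambiguous cases:
    hire_date "to_date(:hire_date,
                case
                  when regexp_like(:hire_date, '^[0-9]{4}/[0-9]{2}/[A-Za-z]{3}$') then 'YYYY/DD/MON'
                  when regexp_like(:hire_date, '^[0-9]{4}/[A-Za-z]{3}/[0-9]{2}$') then 'YYYY/MON/DD'
                  when regexp_like(:hire_date, '^[A-Za-z]{3}/')                   then 'MON/DD/YYYY'
                  when regexp_like(:hire_date, '^[0-9]{2}/[A-Za-z]{3}/')          then 'DD/MON/YYYY'
                  else 'MM/DD/YYYY'
                end)"
    A purely numeric value such as 07/08/03 still cannot be classified this way, which is exactly the caveat raised above.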
