SQL*Loader-510: Physical record in data file (clob_table.ldr) is longer than the maximum

If I generate a loader / insert script from Raptor, it does not work for CLOB columns.
I am getting this error:
SQL*Loader-510: Physical record in data file (clob_table.ldr) is longer than the maximum (1048576)
What's the solution?
Regards,

Hi,
Has the file somehow been changed by copying it between Windows and Unix? Was the file transfer done as binary or ASCII? The most common cause of this problem is that the end-of-line carriage return characters have been changed so they are no longer \r\n. Could this have happened? Can you open the file in a good editor, or do an od command on Unix to see what is actually present?
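For example, a quick way to inspect the raw bytes (using the file name from the error message):
od -c clob_table.ldr | head    # \r \n pairs indicate Windows line endings; a bare \n indicates Unix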
Regards,
Harry
http://dbaharrison.blogspot.co.uk/

Similar Messages

  • Sqlldr error 510: Physical record in data file is longer than the max 1048576

    SQL*Loader: Release 10.2.0.2.0 - Production on Fri Sep 21 10:15:31 2007
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Control File: /apps/towin_p/bin/BestNetwork.CTL
    Data File: /work/towin_p/MyData.dat
    Bad File: /apps/towin_p/bin/BestNetwork.BAD
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Continuation: none specified
    Path used: Direct
    Load is UNRECOVERABLE; invalidation redo is produced.
    Table "BN_ADM"."DWI_USAGE_DETAIL", loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    USAGE_DETAIL_DT FIRST * , DATE MM/DD/YYYY HH24:MI:SS
    UNIQUE_KEY SEQUENCE (MAX, 1)
    LOAD_DT SYSDATE
    USAGE_DETAIL_KEY NEXT * , CHARACTER
    RATE_AREA_KEY NEXT * , CHARACTER
    UNIT_OF_MEASURE_KEY NEXT * , CHARACTER
    CALL_TERMINATION_REASON_KEY NEXT * , CHARACTER
    RATE_PLAN_KEY NEXT * , CHARACTER
    CHANNEL_KEY NEXT * , CHARACTER
    SERIALIZED_ITEM_KEY NEXT * , CHARACTER
    HOME_CARRIER_KEY NEXT * , CHARACTER
    SERVING_CARRIER_KEY NEXT * , CHARACTER
    ORIGINATING_CELL_SITE_KEY NEXT * , CHARACTER
    TERMINATING_CELL_SITE_KEY NEXT * , CHARACTER
    CALL_DIRECTION_KEY NEXT * , CHARACTER
    SUBSCRIBER_LOCATION_KEY NEXT * , CHARACTER
    OTHER_PARTY_LOCATION_KEY NEXT * , CHARACTER
    USAGE_PEAK_TYPE_KEY NEXT * , CHARACTER
    DAY_OF_WEEK_KEY NEXT * , CHARACTER
    FEATURE_KEY NEXT * , CHARACTER
    WIS_PROVIDER_KEY NEXT * , CHARACTER
    SUBSCRIBER_KEY NEXT * , CHARACTER
    SUBSCRIBER_ID NEXT * , CHARACTER
    SPECIAL_NUMBER_KEY NEXT * , CHARACTER
    TOLL_TYPE_KEY NEXT * , CHARACTER
    BILL_DT NEXT * , DATE MM/DD/YYYY HH24:MI:SS
    BILLING_CYCLE_KEY NEXT * , CHARACTER
    MESSAGE_SWITCH_ID NEXT * , CHARACTER
    MESSAGE_TYPE NEXT * , CHARACTER
    ORIGINATING_CELL_SITE_CD NEXT * , CHARACTER
    TERMINATING_CELL_SITE_CD NEXT * , CHARACTER
    CALL_ACTION_CODE NEXT * , CHARACTER
    USAGE_SECONDS NEXT * , CHARACTER
    SUBSCRIBER_PHONE_NO NEXT * , CHARACTER
    OTHER_PARTY_PHONE_NO NEXT * , CHARACTER
    BILLED_IND NEXT * , CHARACTER
    NO_USERS_IN_CALL NEXT * , CHARACTER
    DAP_NO_OF_DSAS_USED NEXT * , CHARACTER
    USAGE_SOURCE NEXT * , CHARACTER
    SOURCE_LOAD_DT NEXT * , DATE MM/DD/YYYY HH24:MI:SS
    SOURCE_UPDATE_DT NEXT * , DATE MM/DD/YYYY HH24:MI:SS
    RATE_PLAN_ID NEXT * , CHARACTER
    NETWORK_ELEMENT_KEY NEXT * , CHARACTER
    SQL string for column : "-2"
    SQL*Loader-510: Physical record in data file (/work/towin_p/MyData.dat) is longer than the maximum(1048576)
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    Table "BN_ADM"."DWI_USAGE_DETAIL":
    0 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Date conversion cache disabled due to overflow (default size: 1000)
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 7000382
    Total logical records rejected: 0
    Total logical records discarded: 0
    Total stream buffers loaded by SQL*Loader main thread: 1666
    Total stream buffers loaded by SQL*Loader load thread: 4996
    Run began on Fri Sep 21 10:15:31 2007
    Run ended on Fri Sep 21 10:27:14 2007
    Elapsed time was: 00:11:43.56
    CPU time was: 00:05:36.81

    What options are you using in the CTL file? What does your data file look like (e.g. one line per record, or one line only)?
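    (For reference: SQL*Loader-510 means a single physical record exceeded the read buffer, whose default is 1048576 bytes. If the records really are that long, raising the buffer in the control file is one option; the sizes here are illustrative:)
    OPTIONS (READSIZE=20971520, BINDSIZE=20971520)
    LOAD DATA
    ...
    If the records should be short, a missing or mangled record terminator is the more likely culprit.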

  • SQL Loader: handling difference datatypes in data file and table column

    Hi,
    I am not sure if my question is valid but I am having this doubt.
    I am trying to load data from my data file into a table with just a single column of FLOAT datatype, using SQL*Loader. But very few insertions take place, leaving a large number of records rejected for the same reason:
    Record 7: Rejected - Error on table T1, column MSISDN.
    ORA-01722: invalid number
    The data in my datafile goes like this: (with a single space before every field)
       233207332711<EOFD><EORD>    233208660745<EOFD><EORD>    233200767380<EOFD><EORD>
    Here I want to know if there is any way to type cast the data read from the data file to suit my table column's datatype.
    How do I handle this? I want to load all the data from my datafile into my table.

    Please continue the discussion in your original post - Pls help: SQL Loader loads only one record
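    (For what it's worth: since each field carries a leading space, one hedged fix is a control-file SQL string that trims the field before conversion; the column name is taken from the error message above:)
    MSISDN "TO_NUMBER(TRIM(:MSISDN))"
    SQL*Loader applies the quoted expression to each field after reading it, so the stray space no longer triggers ORA-01722.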

  • Encounter SQL*Loader-510 error

    When trying to load data from a flat file into Oracle (around 6 million rows), the error below was thrown:
    SQL*Loader-510: Physical record in data file (yadayadayada) is longer than the maximum(1048576)
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    Does anyone have an idea about this? Thanks.

    Hi,
    I think the error occurs if the target table column length is smaller than the one that you are sending from the control file.
    There is also an option, 'reject unlimited', that you can use in the control file, so that even if errors occur while uploading the file through SQL*Loader it will skip that record and carry on with the next.
    That way, I think, the load shouldn't get aborted.
    You can also have a log file that logs this bad data.
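    (As an aside: with sqlldr itself the per-record error tolerance is normally set with the ERRORS command-line keyword; REJECT LIMIT UNLIMITED is external-table syntax. An illustrative invocation:)
    sqlldr userid=scott/tiger control=load.ctl log=load.log errors=100000
    Note that SQL*Loader-510 is raised while reading a physical record, before per-row error handling applies, so raising the error limit alone will not get past it.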
    Thanks

  • Sql loader 510

    I am loading a file with some BLOBs. Most of the data seems to have loaded OK, but I am now getting this error:
    SQL*Loader-510: Physical record in data file
    (c:\Sheets_2005.dat) is longer than the maximum(20971520)
    The ctl file was auto-generated by Migration Workbench... I have added the options in:
    options (BINDSIZE=20971520, READSIZE=20971520)
    load data
    infile 'c:\sheets_2005.dat' "str '<EORD>'"
    append
    into table SHEETS
    fields terminated by '<EOFD>'
    trailing nullcols
    (REFNO,
    SHEETNO,
    DETAIL CHAR(100000000),
    MESSAGE,
    SIZE_)
    Any ways around this error?
    Thanks

    Hello,
    Can you tell me which plugin you are using?
    Option #1
    Cause: From the error message, it appears that the data file has a physical record that is too long.
    If that is the case, try changing the length of the column [the problem is most likely at the BLOB/CLOB column]. Also try using CONCATENATE or CONTINUEIF, or break up the physical records; see the sketch below.
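    (A minimal sketch of those two directives, with names taken from the control file above; the two are alternatives, so one is commented out:)
    load data
    infile 'c:\sheets_2005.dat'
    -- join every 2 physical records into one logical record:
    concatenate 2
    -- or: keep joining while position 1 holds a continuation marker:
    -- continueif this (1:1) = '*'
    append
    into table SHEETS
    ...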
    Option #2
    If you are using the SQL Server or Sybase plugin, this workaround may work:
    Cause: The export of binary data may be a bit too big, so it needs to be converted to HEX format. The HEX data thus produced can be saved into a CLOB column.
    The task is split into 4 sub tasks
    1. CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
         --log into your system schema and create a tablespace
         --Create a new tablespace for the CLOB and BLOB column
         --You may resize this to fit your data ,
         --Remember that we save the data once as CLOB and then as BLOB   
         --create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    2. LOG INTO YOUR TABLE SCHEMA IN ORACLE
         --Modify this script to fit your requirements
         --START.SQL (this script will do the following tasks)
              ~~Modify your current schema so that it can accept HEX data
              ~~Modify your current schema so that it can hold that huge amount of data.
          ~~Modify the new tablespace to suit your requirements [can be estimated based on the size of the blobs/clobs and the number of rows]
              ~~Disable triggers, indexes & primary keys on tblfiles
    3. DATA MOVE: The data move now involves moving the HEX data in the .dat files to a CLOB.
         --The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.  This is where the HEX values will be stored.
         --MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
              ~~load data
              ~~infile '<tablename>.dat' "str '<er>'"
              ~~into table <tablename>
              ~~fields terminated by '<ec>'
              ~~trailing nullcols
              ~~(
          ~~ <blob_column>_CLOB CHAR(200000000)
          ~~)
    The important parts are "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000).
         --RUN sql_loader_script.bat
         --log into your schema to check if the data was loaded successfully
         --now you can see that the hex values were sent to the CLOB column
         --SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    4. LOG INTO YOUR SCHEMA
         --Run FINISH.SQL.  This script will do the following tasks:
              ~~Creates the procedure needed to perform the CLOB to BLOB transformation
          ~~Executes the procedure (this may take some time, e.g. when 500 MB has to be converted to BLOB)
              ~~Alters the table back to its original form (removes the <blob_column>_clob)
              ~~Enables the triggers, indexes and primary keys
    Good luck
    Srinivas Nandavanam

  • How do I skip footer records in data file through control file of SQL*Loader

    Hi,
    I am using SQL*Loader to load data from a data file, and I have written a control file for it. How do I skip the last 5 records of the data file, i.e. the footer records?
    For the first 5 records we can use "SKIP" to achieve it, but how do I achieve it for the last 5 records?
    2) Can I mention two data files in one control file? If so, what is the syntax? (Like we give INFILE, where we mention the path of the data file; can I mention two data files in the same control file?)
    3) If I have a data file with variable-length records (i.e. the 1st record with 200 characters, the 2nd with 150 characters and the 3rd with 180 characters), then how do I load the data into the table? I mean, what will be the syntax for it in the control file?
    4) If I want to insert SYSDATE into the table through the control file, how do I do it?
    5) If I have variable-length records in the data file, with a first name, then white space, then a last name, how do I insert this value, which includes first name and last name, into a single column of the table? (I mean, how do you handle the white space between the first name and last name in the data file?)
    Thanks in advance
    ram

    You should read the documentation about SQL*Loader.
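    (For questions 2 and 4 in particular, the control-file syntax is simple. A hedged sketch with made-up names; multiple INFILE clauses and a SYSDATE column are both standard syntax:)
    LOAD DATA
    INFILE 'file1.dat'
    INFILE 'file2.dat'
    APPEND INTO TABLE mytable
    FIELDS TERMINATED BY ','
    (col1,
     col2,
     load_dt SYSDATE)
    For the footer question there is no "skip from the end" option; filtering footer records with a WHEN clause, or preprocessing the file, are the usual workarounds.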

  • Loading multiple physical records into a logical record

    Hello,
    I'm not sure if this is the right place to post this thread.
    I have to import data from a fixed-length positioned text file into an Oracle table using SQL*Loader.
    My sample input file (which has 3 columns) looks like:
    Col1 Col2 Col3
    1 A abcdefgh
    1 A ijklmnop
    1 A pqrstuv
    1 B abcdefgh
    1 B ijklmn
    2 A abcdefgh
    3 C hello
    3 C world
    The above text file should be loaded into the table as:
    Col1 Col2 Col3
    1 A abcdefghijklmnpqrstuv
    1 B abcdefghijklmn
    2 A abcdefgh
    3 C helloworld
    My question: Is there a way that I can use the logic of loading multiple physical records into one logical record for my Oracle table? Please suggest.
    Thanks in advance.

    Hi,
    user1049091 wrote:
    Kulash,
    Thanks for your reply.
    The order of the concatenated strings is important, as the whole text is split into several physical records in the flat file and has to be combined into one record in the Oracle table.
    My scenario is that we get these fixed-length input files from mainframes on a daily basis, and this data needs to be loaded into an Oracle table for reporting purposes. It needs to be automated.
    I am still confused about whether to use an external table or a staging table with SQL*Loader. Please advise with more clarity, as I am a beginner with SQL*Loader. Thanks.
    I still think an external table would be better.
    You can create the external table like this:
    CREATE TABLE fubar_external
    (      col1     NUMBER (2)
    ,      col2     VARCHAR2 (2)
    ,      col3     VARCHAR2 (50)
    )
    ORGANIZATION EXTERNAL
    (      TYPE               ORACLE_LOADER
           DEFAULT DIRECTORY  XYZ_DIR
           ACCESS PARAMETERS  (
                                RECORDS DELIMITED BY NEWLINE
                                FIELDS (   col1   POSITION (1:2)
                                       ,   col2   POSITION (3:4)
                                       ,   col3   POSITION (5:54)
                                       )
                              )
           LOCATION ('fubar.txt')
    );
    where XYZ_DIR is the Oracle Directory on the database server's file system, and fubar.txt is the name of the file in that directory. Every day, when you get new data, just overwrite fubar.txt. Whenever you query the table, Oracle will read the file that's currently in that directory. You don't have to drop and re-create the table every day.
    Note that the way you specify the columns is similar to how you do it in SQL*Loader, but the SEQUENCE generator doesn't work in external files; use ROWNUM instead.
    Do you need to populate a table with the concatenated col3's, or do you just need to display them in a query?
    Either way, you can reference the external table the same way you would reference a regular, internal table.
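    (If the concatenated col3 values need to be materialized as in the example above, a query along these lines over the external table is one option; LISTAGG exists from 11.2 onward, and relying on ROWNUM to preserve file order within each group is an assumption:)
    SELECT   col1, col2,
             LISTAGG(col3) WITHIN GROUP (ORDER BY ROWNUM) AS col3
    FROM     fubar_external
    GROUP BY col1, col2;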

  • (8I) Using | (PIPE) as the RECORD SEPARATOR in SQL*Loader

    Product: ORACLE SERVER
    Date written: 2003-10-21
    ===============================================================
    (8I) Using | (pipe) as the record separator in SQL*Loader
    ===============================================================
    PURPOSE
    Starting with Oracle8i, you can specify the record terminator when using SQL*Loader.
    Explanation
    Before Oracle8i, the record separator was by default a linefeed (carriage return, newline, and so on). You had to supply options such as VAR or FIX to handle such files, which was somewhat complicated and inflexible.
    Starting with Oracle8i, you can specify the record terminator when using SQL*Loader. When loading data that contains newline or carriage return characters, or data that contains special characters, you can specify the record terminator in hexadecimal.
    Example
    The following example uses '|' (pipe) as the record separator.
    To use a record separator, specify the appropriate value in the 'infile' clause of the SQL*Loader control file.
    In the example below, "str X'7c0a'" (a pipe followed by a linefeed) is specified in the 'infile' clause in order to use '|' (pipe).
    --controlfile : test.ctl
    load data
    infile 'test.dat' "str X'7c0a'"
    into table test
    fields terminated by ',' optionally enclosed by '"'
    (col1, col2)
    --datafile: test.dat
    1,this is the first line of the first record
    this is the second|
    2,this is the first line of the second record
    this is the second|
    SQL> desc test
    Name Null? Type
    COL1 VARCHAR2(4)
    COL2 VARCHAR2(100)
    $ sqlldr scott/tiger control=test.ctl log=test.log
    Looking at the loaded data below, you can see that data containing carriage returns went into a single column correctly.
    SQL> select * from test;
    COL1
    COL2
    1
    this is the first line of the first record
    this is the second
    2
    this is the first line of the second record
    this is the second
    RELATED DOCUMENT
    <Note:74719.1>

  • SQL LOADER , EXTERNAL  TABLE and ODBS DATA SOURCE

    hello
    Can anybody help with loading data from a dBase file (.dbt) into an Oracle 10g table?
    I tried yesterday with SQL*Loader, an external table, and an ODBC data source.
    Why do all of these utilities fail to solve my problem?
    Is there an efficient way to reach this goal?
    Thanks in advance

    Export the dbase data file to a text file;
    then you have the choice of using either SQL*Loader or the external table option.
    regards

  • SQL Loader-How to insert -ve & date values from flat text file into coloumn

    Question: How to insert -ve & date values from flat text file into coloumns in a table.
    Explanation: In the text file, the negative values are like -10201.30 or 15317.10-, and the date values are in DDMMYYYY format (like 10052001 for 10th May, 2001).
    How to load such values in columns of database using SQL Loader?
    Please guide.

    Try something like:
    someDate       DATE 'DDMMYYYY'
    someNumber1    "TO_NUMBER(:someNumber1, 'S99999999.00')"
    someNumber2    "TO_NUMBER(:someNumber2, '99999999.00S')"
    Good luck,
    Eric Kamradt

  • On load, getting error:  Field in data file exceeds maximum length

    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE    11.2.0.3.0    Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I'm trying to load a table, small in size (110 rows, 6 columns).  One of the columns, called NOTES, errors when I run the load: it says the column size exceeds the max limit.  As you can see here, the table column is set to 4000 bytes:
    CREATE TABLE NRIS.NRN_REPORT_NOTES
    (
      NOTES_CN      VARCHAR2(40 BYTE)               DEFAULT sys_guid()            NOT NULL,
      REPORT_GROUP  VARCHAR2(100 BYTE)              NOT NULL,
      AREACODE      VARCHAR2(50 BYTE)               NOT NULL,
      ROUND         NUMBER(3)                       NOT NULL,
      NOTES         VARCHAR2(4000 BYTE),
      LAST_UPDATE   TIMESTAMP(6) WITH TIME ZONE     DEFAULT systimestamp          NOT NULL
    )
    TABLESPACE USERS
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    I did a little investigating, and it doesn't add up.
    When I run
    select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES
    I get a return of
    643
    That tells me that the largest size instance of that column is only 643 bytes.  But EVERY insert is failing.
    Here is the loader file header, and the first couple of inserts:
    LOAD DATA
    INFILE *
    BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
    DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
    APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
    Fields terminated by ";" Optionally enclosed by '|'
    (
      NOTES_CN,
      REPORT_GROUP,
      AREACODE,
      ROUND NULLIF (ROUND="NULL"),
      NOTES,
      LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
    )
    BEGINDATA
    |E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females.  Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%).  The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population.  People over the age of 60 account for about 22% of visits.   Most of the visitation is from the local area.  More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short.  Over half of the visits last less than 3 hours.  The median length of visit to overnight sites is about 43 hours, or about 2 days.  The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long.   Most visits come from people who are fairly frequent visitors.  Over thirty percent are made by people who visit between 40 and 100 times per year.  Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
    |E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%).  Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
    Here is the beginning of the loader log, ending after the first row's error.  (They ALL show the same error.)
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Control File:   NRIS.NRN_REPORT_NOTES.ctl
    Data File:      NRIS.NRN_REPORT_NOTES.ctl
      Bad File:     ./NRIS.NRN_REPORT_NOTES.BAD
      Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
    Insert option in effect for this table: APPEND
       Column Name                  Position   Len  Term Encl Datatype
    NOTES_CN                            FIRST     *   ;  O(|) CHARACTER
    REPORT_GROUP                         NEXT     *   ;  O(|) CHARACTER
    AREACODE                             NEXT     *   ;  O(|) CHARACTER
    ROUND                                NEXT     *   ;  O(|) CHARACTER
        NULL if ROUND = 0X4e554c4c(character 'NULL')
    NOTES                                NEXT     *   ;  O(|) CHARACTER
    LAST_UPDATE                          NEXT     *   ;  O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
        NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
    Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
    Field in data file exceeds maximum length...
    I am not seeing why this would be failing.

    Hi,
    the problem is that delimited data defaults to CHAR(255)..... Very helpful, I know.....
    What you need to do is tell sqlldr that the data is longer than this.
    So change NOTES to NOTES CHAR(4000) in your control file and it should work; see below.
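    (Applied to the control file above, only the NOTES entry in the field list changes:)
      NOTES CHAR(4000),
    CHAR(4000) raises the maximum input field length for that field; it does not alter the table column itself.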
    cheers,
    harry

  • Using SQL LOADER in Oracle to import CSV file

    I'm pretty new to databases and programming. I'm not very good with the computer lingo, so stick with me. I have a CSV file that I'm trying to load into my Oracle database. It contains account information such as name, telephone number, service dates, etc. I've installed Oracle 11g Release 2. This is what I've done so far, step by step:
    1) Ran SQL Loader
    I created a new table with the columns that I needed. For example:
    create table Billing ( TAP_ID char(10), ACCT_NUM char(10), MR_ID char(10), HOUSE_NUM char(10), STREET char(30), NAME char(50) );
    2) It prompted me that the table was created. Next, I created a control file for the data in Notepad, saved in the same directory with a .ctl extension. GIS.csv is the file I'm getting the data from; it is also in the same directory. I named the control file Billing.ctl, and it looked like so:
    load data
    infile GIS.csv
    into table Billing
    fields terminated by ','
    (TAP_ID, ACCT_NUM, MR_ID, HOUSE_NUM, STREET, NAME)
    3) Run sqlldr from command line to use the control file
    sqlldr myusername/mypassword Billing.ctl
    This is where I am stuck. I've seen video tutorials of exactly what I'm doing, but I get this error:
    SQL*Loader-350: Syntax error at line 1.
    Expecting keyword LOAD, found "SERV TAP ID". "SERV TAP ID","ACCT NUMBER","MTR ID","SERV HOUSE","SERV STREET","SERV ^'
    I don't understand why it's coming up with that error. My Billing.ctl does have LOAD:
    load data
    infile GIS.csv
    into table Billing
    fields terminated by ','
    (TAP_ID, ACCT_NUM, MTR_ID, SERV_HOUSE, SERV_STREET, SERV_TOWN, BIL_NAME, MTR_DATE_SET, BIL_PHONE, MTR_SIZE, BILL_CYCLE, MTR_RMT_ID)
    Any thoughts?

    938115 wrote:
    I also got this log output after the command was executed, along with the GIS.bad file:
    SQL*Loader: Release 11.2.0.1.0 - Production on Fri Jun 1 09:56:52 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: bill.ctl
    Data File: GIS.csv
    Bad File: GIS.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    I have thousands of records in this file and only 64 of them updated.
    How many records were in the table before and after? I doubt the difference is 64, unless you have exactly 64 rows, but you said thousands. I believe you are probably misinterpreting the log file. Can you share the full log file? As a test, create an empty table, use the same script, and load the same data file into this empty table. Once loading is complete, check whether it has 64 rows or more. I believe what the log file is saying is that it is 'committing after every 64 rows', not 'stopping after loading 64 rows'.
    So, unless you show us the log file, there is no way to be certain. Feel free to mask confidential info; at least the top 15 and bottom 15 lines?
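    (Regarding the original SQL*Loader-350 error: the message quotes the CSV header row, which suggests sqlldr parsed the CSV itself, or a file starting with that header, as the control file. Naming the control file explicitly and skipping the header row might look like this; names are taken from the post:)
    sqlldr userid=myusername/mypassword control=Billing.ctl log=Billing.log skip=1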

  • ERROR LOADING bcp -w option generated data files

    Hi,
    I have to migrate data from SQL Server to Oracle.
    The unload_script is this:
    bcp "CLEPSIDRA.dbo.InversionesAceptadas" out "[CLEPSIDRA].[dbo].[InversionesAceptadas].dat" -q -c -t "<EOFD>" -r "<EORD>" -Usa -Pas -STIEPO
    When I execute the load script, the data is loaded ok.
    But I have a field that contains characters like á, é, ñ, ... so I changed my unload script to:
    bcp "CLEPSIDRA.dbo.InversionesAceptadas" out "[CLEPSIDRA].[dbo].[InversionesAceptadas].dat" -q -w -t "<EOFD>" -r "<EORD>" -Usa -Pas -STIEPO
    If I execute the new unload_script, the data file generated is OK, with the characters like á, é, ñ...
    But when I try to execute the load script, I get errors like 'field is not a valid number'...
    But if I open the first (-c option generated) data file, replace its content with the content of the second (-w option generated) data file and save it, the load script works OK!
    How can I solve this?
    Thanks

    Hi user616069,
    There is a preference in SQLDeveloper for encoding.
    Tools->Preferences->Environment->Encoding
    Do you have a reproducible test case you can give us a URL to, including for example a single fictional record?
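    (One assumption worth checking: bcp's -w switch writes Unicode (UTF-16) output, while -c writes the client code page, so SQL*Loader may be reading the -w file with the wrong character set. Declaring the character set in the control file would look roughly like this:)
    LOAD DATA
    CHARACTERSET UTF16
    INFILE '[CLEPSIDRA].[dbo].[InversionesAceptadas].dat' "str '<EORD>'"
    ...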
    -Turloch

  • SQL* Loader - Multi-line record problem

    Hi all,
    I've been given a very dirty source file, one that has records spanning more than one line. The only record terminator available is that the first column should always be a 7-digit number (0000000-9999999). Is it possible to create a control file that checks whether the current record is actually a new record or just a continuation of the previous record?
    I've been checking on some parameters like the CONTINUEIF and the "str terminator_string".
    I tried both but neither fully satisfies the requirements.
    I tried using CONTINUEIF by adding CONTINUEIF THIS (1) = '0' to check that it only creates records that have a column value starting with '0'. Unfortunately, it still adds those that don't start with '0'.
    I also tried INFILE 'mydata.dat' "str '\r\n0'". This works for those that start with 0, but unfortunately it trims the first zero (0) from that column.
    Can someone suggest any other approach?
    Thanks.
    Allen

    Hi,
    thanks for the suggestion. I did try it. It works OK, but I also found a way to use SQL*Loader, since this performs better. I used
    CONTINUEIF NEXT PRESERVE (8) <> X'09'
    Basically, what this does is check whether the character in the 8th position is a tab; if it is, then a new logical record starts there. I used this since the first column in my text file uses a 7-digit number. It's not perfect, but it loads fast, although I'm still looking for other ways and means.
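    (In context, the directive sits near the top of the control file; a sketch using the file name from the earlier post and a made-up table name:)
    LOAD DATA
    INFILE 'mydata.dat'
    CONTINUEIF NEXT PRESERVE (8) <> X'09'
    INTO TABLE mytable
    ...
    PRESERVE keeps the examined character in the data instead of stripping it, which avoids the lost leading zero seen with the "str" approach.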
    thanks again.

  • SQL Loader: specify position for pipe separated file

    Hi,
    I have a problem with SQL*Loader. I need to load 5 columns into a table, and my file only contains 3 fields (a pipe-separated file); I need to add a sequenced id and a loading date. When I try to run the control file below, the sequence and the sysdate are populated correctly, but column3 is inserted into column1. If I add two more null fields to my file, then it works properly:
    ||column1|column2|column3
    Also, if I put my variables at the end it works too, but I cannot restructure the table, so this solution doesn't work. Do you have any idea how to specify positions in pipe-separated files? I should load the file in the format below:
    column1|column2|column3
    It only works properly if I add the two pipes at the beginning.
    My control file:
    LOAD DATA
    INFILE 'test201001.csv'
    APPEND
    INTO TABLE test_load
    FIELDS TERMINATED BY '|'
    id "seq.nextval",
    sys_creation_date "sysdate",
    column_1,
    column_2,
    column_3
    Thanks for your help in advance

    user9013593 wrote: ... the sequence and the sysdate are populated correctly, but column3 is inserted into column1 ...
    I hope someone provides a better solution below, but since no one has yet...
    Can you load the data "as is" into a work table, then use a PL/SQL program to process the work table correctly according to the data you have?
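    (One hedged alternative, since the table cannot be restructured: fields declared with SQL*Loader's EXPRESSION keyword are not matched against input fields at all, so the three file fields then line up with column_1 through column_3. A sketch based on the control file above:)
    LOAD DATA
    INFILE 'test201001.csv'
    APPEND
    INTO TABLE test_load
    FIELDS TERMINATED BY '|'
    (id                EXPRESSION "seq.nextval",
     sys_creation_date EXPRESSION "sysdate",
     column_1,
     column_2,
     column_3)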
