Error while loading data using SQL*Loader

Hi All,
I am in the process of loading data from MS SQL Server into an Oracle database.
I receive the data in Excel format and convert it to CSV.
Up to the conversion step, everything works fine.
In MS SQL Server, the table columns are case sensitive,
so I created the tables in the Oracle DB the same way.
There is one column, "MaxNumber", which is of type float(49).
The column name is case sensitive.
In the control file I first specified:
"MaxNumber" "TO_NUMBER(:MaxNumber,'99,999.99')"
After executing SQL*Loader I got the error:
SQL*Loader-466: Column MAXNUMBER does not exist in table TABLEONE.
I then changed the control file entry to:
"MaxNumber" "TO_NUMBER(:"MaxNumber",'99,999.99')"
After execution I got the error:
SQL*Loader-350: Syntax error at line 13.
Expecting "," or ")", found "MaxDiscount".
"MaxNumber" "TO_NUMBER(:"MaxNumber",'99,999.99')"
Please guide me on this issue.
Regards
Salih KM

What I'm saying is: verify the actual column name; don't just assume it.
Example follows, with one table intentionally "hidden".
SQL> create table "tEsT" ("MaxNumber" float, "MaxnumbeR" number);
SQL> select table_name,column_name from user_tab_columns where table_name like 't%';
TABLE_NAME                     COLUMN_NAME
tEsT                           MaxNumber
tEsT                           MaxnumbeR
teST                           iD
teST                           MaxNumber
Hth,
Fredrik
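For reference, when the column really was created case sensitive, the usual fix is to escape the inner quotation marks inside the SQL string with backslashes, so the parser does not treat the string as ending early. A sketch, assuming the column exists exactly as "MaxNumber":
"MaxNumber" "TO_NUMBER(:\"MaxNumber\",'99,999.99')"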

Similar Messages

  • Error in loading data using SQL loader

    I am getting an error like 'SQL*Loader-350: syntax error ... illegal combination of non-alphanumeric characters' while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
    sqlldr userid=<username>/<password> control=data.ctl
    The control file, data.ctl, is:
    LOAD data
    infile '/home/oraprod/data.txt'
    append into table test
    {empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace}
    The data.txt file is:
    1,Kaushal,halani,5000
    2,Chetan,halani,1000
    I hope my question is clear.
    Please reply with a solution to my query.
    Regards

    Replace "{" with "(" in your control file:
    LOAD data
    infile 'c:\data.txt'
    append into table emp_t
    (empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace)
    C:\>sqlldr user/pwd@database control=c.ctl
    SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 1
    Commit point reached - logical record count 2
    SQL> select * from emp_t;
         EMPID FNAME                LNAME                    SALARY
             1 Kaushal              halani                     5000
             2 Chetan               halani                     1000
    Best regards
    Mohamed Houri

  • Problem with loading data using SQL LOADER

    I have the following files with me. When I run the following command at the command prompt:
    sqlldr scott/tiger@genuat control=c:\emp.ctl
    I get the errors:
    SQL*Loader-500: unable to open file
    SQL*Loader-553: file not found
    The emp.dat file data:
    1111,sneha,CLERK     7902,17-Dec-80,800,20
    2222,manoj,SALESMAN,7698,20-Feb-72     ,1600,6500,30
    3333,sheela,MANAGER,7839,8-Apr-81,2975,20     
    The emp.ctl file:
    LOAD DATA
    INFILE 'c:\emp.dat'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (EMPNO,
    ENAME ,
    JOB,
    MGR,
    HIREDATE,
    SAL ,
    COMM,
    DEPTNO)
    Can anyone tell me what the problem in the above file is, and why the data is not loaded into the table?

    I don't see any problem if you invoke SQL*Loader using the command below (and if you are certain that the control file resides on the C: drive).
    sqlldr scott/tiger@genuat control=c:\emp.ctl
    If this doesn't work, then invoke SQL*Loader from the C: prompt itself:
    sqlldr scott/tiger@genuat control=emp.ctl
    It should locate the control file; check whether sqlldr then completes successfully.

  • Help needed to load data using sql loader.

    Hi,
    I am trying to load data from an .xls file into an Oracle table (on Solaris), and the load is failing.
    Control file:
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (HOST_NM,
    SERVICE_9071_DOLLAR DOUBLE,
    SERVICE_9310_DOLLAR DOUBLE,
    SERVICE_9700_DOLLAR DOUBLE,
    SERVICE_9701_DOLLAR DOUBLE,
    SERVICE_9710_DOLLAR DOUBLE,
    SERVICE_9711_DOLLAR DOUBLE,
    SERVICE_9712_DOLLAR DOUBLE,
    SERVICE_9713_DOLLAR DOUBLE,
    SERVICE_9720_DOLLAR DOUBLE,
    SERVICE_9721_DOLLAR DOUBLE,
    SERVICE_9730_DOLLAR DOUBLE,
    SERVICE_9731_DOLLAR DOUBLE,
    SERVICE_9750_DOLLAR DOUBLE,
    SERVICE_9751_DOLLAR DOUBLE,
    GRAND_TOTAL DOUBLE)
    Log file:
    Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    Column Name                    Position   Len  Term Encl Datatype
    HOST_NM                        FIRST      *    ,         CHARACTER
    SERVICE_9071_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9310_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9700_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9701_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9710_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9711_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9712_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9713_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9720_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9721_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9730_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9731_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9750_DOLLAR            NEXT       8              DOUBLE
    SERVICE_9751_DOLLAR            NEXT       8              DOUBLE
    GRAND_TOTAL                    NEXT       8              DOUBLE
    Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column HOST_NM.
    Field in data file exceeds maximum length
    Table FIT_UNIX_NT_SERVER_COSTS:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Please help me as soon as possible.
    Awaiting your reply.

    Hi,
    I verified, and everything looks fine to me.
    Table structure:
    HOST_NM VARCHAR2(30)
    SERVICE_9071_DOLLAR NUMBER(8,2)
    SERVICE_9310_DOLLAR NUMBER(8,2)
    SERVICE_9700_DOLLAR NUMBER(8,2)
    SERVICE_9701_DOLLAR NUMBER(8,2)
    SERVICE_9710_DOLLAR NUMBER(8,2)
    SERVICE_9711_DOLLAR NUMBER(8,2)
    SERVICE_9712_DOLLAR NUMBER(8,2)
    SERVICE_9713_DOLLAR NUMBER(8,2)
    SERVICE_9720_DOLLAR NUMBER(8,2)
    SERVICE_9721_DOLLAR NUMBER(8,2)
    SERVICE_9730_DOLLAR NUMBER(8,2)
    SERVICE_9731_DOLLAR NUMBER(8,2)
    SERVICE_9750_DOLLAR NUMBER(8,2)
    SERVICE_9751_DOLLAR NUMBER(8,2)
    GRAND_TOTAL NUMBER(8,2)
    Control file:
    LOAD DATA
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (HOST_NM,
    SERVICE_9071_DOLLAR NUMBER(8,2),
    SERVICE_9310_DOLLAR NUMBER(8,2),
    SERVICE_9700_DOLLAR NUMBER(8,2),
    SERVICE_9701_DOLLAR NUMBER(8,2),
    SERVICE_9710_DOLLAR NUMBER(8,2),
    SERVICE_9711_DOLLAR NUMBER(8,2),
    SERVICE_9712_DOLLAR NUMBER(8,2),
    SERVICE_9713_DOLLAR NUMBER(8,2),
    SERVICE_9720_DOLLAR NUMBER(8,2),
    SERVICE_9721_DOLLAR NUMBER(8,2),
    SERVICE_9730_DOLLAR NUMBER(8,2),
    SERVICE_9731_DOLLAR NUMBER(8,2),
    SERVICE_9750_DOLLAR NUMBER(8,2),
    SERVICE_9751_DOLLAR NUMBER(8,2),
    GRAND_TOTAL NUMBER(8,2))
    Sample data file:
    ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
    ABOS39,6391.16,,1315.00,,1400.00,,,,,,,,,4081.88,13188.04
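    Note that in SQL*Loader, DOUBLE denotes an 8-byte binary field, while the incoming cost.csv is comma-separated text; with CHARACTERSET UTF16 declared for a file that is not actually UTF16, the delimiters are not recognized and the first field can run past its limit, which matches the "Field in data file exceeds maximum length" rejection. A character-based sketch of the field list (the datatype choice is my assumption, not from the thread):
    LOAD DATA
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (HOST_NM CHAR(30),
    SERVICE_9071_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9310_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9700_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9701_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9710_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9711_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9712_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9713_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9720_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9721_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9730_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9731_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9750_DOLLAR DECIMAL EXTERNAL,
    SERVICE_9751_DOLLAR DECIMAL EXTERNAL,
    GRAND_TOTAL DECIMAL EXTERNAL)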

  • Loading data with dates using SQL*Loader

    Dear everyone
    I am currently trying to load some data containing dates using SQL*Loader.
    For enclosing fields I have been using ^ because some book titles contain " and ' as part of their title. I found that the TO_DATE function did not seem to work using ^ instead of ". Would I be correct? I think the Oracle manual says that " must be used.
    After some Web research I eventually amended my control file as follows:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename
    file_format_id
    orig_scanning_resolution_dpi
    scanner_id, scanner_operator_id
    scanning_date "TO_DATE (:scanning_date, 'YYYY-MM-DD')"
    original_map_publication_id
    reprint_publication_id)
    A simple line of data is as follows:
    280001, ^1910 - London^, 270001, 400, 250001, 260001, "TO_DATE('2007-06-06', 'YYYY-MM-DD')", 200019,
    The final column is null.
    However when I attempt that I get the following error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-01841: (full) year must be between -4713 and +9999, and not be 0
    If I change the scanning_date part to:
    scanning_date "EXPRESSION TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get the error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO, column SCANNING_DATE.
    ORA-00917: missing comma
    As soon as I do the following:
    scanning_date "EXPRESSION, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    or
    scanning_date "CONSTANT, TO_DATE (:scanning_date, 'YYYY-MM-DD')",
    I get too many values error message:
    Record 1: Rejected - Error on table SCAN_FILE_INFO.
    ORA-00913: too many values
    I also tested out scanning_date DATE "YYYY-MM-DD", but that just gave the same ORA-01841 error message as above.
    I must be doing something very simple wrong, but I cannot figure it out.
    Kind regards
    Tim

    And why do you have scanning date as "TO_DATE('2007-06-06', 'YYYY-MM-DD')" in your infile? All you need is 2007-06-06. If you cannot change the infile generation code, use:
    load data
    infile 'h:\insert_statements\22_insert_into_SCAN_FILE_INFO.txt'
    REPLACE
    into table SCAN_FILE_INFO
    fields terminated by "," optionally enclosed by '^'
    TRAILING NULLCOLS
    (scan_id, scan_filename
    file_format_id
    orig_scanning_resolution_dpi
    scanner_id, scanner_operator_id
    scanning_date "TO_DATE(REPLACE(REPLACE(:scanning_date,'TO_DATE('),'''YYYY-MM-DD'')'), 'YYYY-MM-DD')"
    original_map_publication_id
    reprint_publication_id)
    SY.
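    If the infile cannot be regenerated, a simpler alternative to the nested REPLACE (my suggestion, not from the thread; requires 10g or later) is to pull the date out with a regular expression:
    scanning_date "TO_DATE(REGEXP_SUBSTR(:scanning_date, '[0-9]{4}-[0-9]{2}-[0-9]{2}'), 'YYYY-MM-DD')"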

  • Error while insert data using execute immediate in dynamic table in oracle

    I get an error while inserting data using EXECUTE IMMEDIATE into a dynamic table created in Oracle 11g.
    First, the dynamic nested table (op_sample) was created using EXECUTE IMMEDIATE.
    The object type is:
    CREATE OR REPLACE TYPE ASI.sub_mark AS OBJECT (
    mark1 number,
    mark2 number
    );
    t_sub_mark is a table type of sub_mark:
    CREATE OR REPLACE TYPE ASI.t_sub_mark is table of sub_mark;
    create table sam1(id number,name varchar2(30));
    The nested table is created below:
    begin
    EXECUTE IMMEDIATE ' create table '||op_sample||'
    (id number,name varchar2(30),subject_obj t_sub_mark) nested table subject_obj store as nest_tab return as value';
    end;
    Now data from the sam1 table and the object (subject_obj) are inserted into the dynamic table:
    declare
    subject_obj t_sub_mark;
    begin
    subject_obj:= t_sub_mark();
    EXECUTE IMMEDIATE 'insert into op_sample (select id,name,subject_obj from sam1) ';
    end;
    and got the error below:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    Then, when we tried to insert the data into the dynamic table with the subject_marks object as null, we received the following error:
    execute immediate 'insert into '||dynam_table ||'
    (SELECT

    887684 wrote:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    The problem is that your variable subject_obj is not in scope inside the dynamic SQL you are building. The SQL engine does not know your PL/SQL variable, so it tries to find a column named SUBJECT_OBJ in your SAM1 table.
    If you need to use dynamic SQL for this, then you must bind the variable. Something like this:
    EXECUTE IMMEDIATE 'insert into op_sample (select id,name,:bind_subject_obj from sam1) ' USING subject_obj;
    Alternatively, you might use static SQL rather than dynamic SQL (if possible for your project). In static SQL the PL/SQL engine binds the variables for you automatically.
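    A minimal runnable sketch of the bound version (types and tables as defined earlier in this post):
    declare
      subject_obj t_sub_mark := t_sub_mark();  -- empty collection instance
    begin
      -- the PL/SQL variable is supplied as a bind, so the SQL engine no
      -- longer looks for a SAM1 column named SUBJECT_OBJ
      execute immediate
        'insert into op_sample (select id, name, :b from sam1)'
        using subject_obj;
    end;
    /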

  • Need Help..!! Loading Data Using Outline Load Utility - No Data

    Hi All,
    I've tried to load data using the Outline Load Utility, but the data is not loaded into my Essbase.
    Here are the steps I've done:
    1. In planning, Administration > Data Load Settings
    - Entity as the data load dimension,
    - Account as the driver dimension, with Rental and test_desc selected as members
    2. The .csv file
    Entity, Rental, test_desc, Point-of-View,Data Load Cube Name
    TLT, 100, 100, "FY11,Plan,Working,No Product,No Condition Type,No Vehicle Type,No Term Period,No Term,No Movement, Local, Apr", Retail
    3. Run the Outline Load through cmd
    E:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1>OutlineLoad /A:PLNDEV /U:xyz /M /N /I:E:/testload.csv /D:Entity /L:E:/OutlineLogs /X:E:/outlineLoad.exc
    and below it the OutlineLogs :
    [Mon Jul 25 04:01:42 PDT 2011]Successfully located and opened input file "E:\testload.csv".
    [Mon Jul 25 04:01:42 PDT 2011]Header record fields: Entity, Rental, test_desc, Point-of-View, Data Load Cube Name
    [Mon Jul 25 04:01:42 PDT 2011]Located and using "Entity" dimension for loading data in "PLNDEV" application.
    [Mon Jul 25 04:01:42 PDT 2011]Load dimension "Entity" has been unlocked successfully.
    [Mon Jul 25 04:01:42 PDT 2011]A cube refresh operation will not be performed.
    [Mon Jul 25 04:01:42 PDT 2011]Create security filters operation will not be performed.
    [Mon Jul 25 04:01:42 PDT 2011]Examine the Essbase log files for status if Essbase data was loaded.
    [Mon Jul 25 04:01:42 PDT 2011]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    No data was rejected there, but when I try to retrieve the data from my Planning form or the Essbase Add-In, there is no data.
    Am I missing something in my steps above?
    Please advise.
    Thanks.
    Regards,
    VieN

    Hi John,
    Thanks for the reply.
    I haven't checked the Essbase log yet.
    From the OutlineLoad log ("Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected."), it doesn't mean the data was successfully loaded, does it?
    Thanks.
    Regards,
    VieN
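    For reference, the log line quoted above says the run finished "with no data load as specified (/N)", so the /N switch suppressed the actual data load. A sketch of the same command without /N (paths exactly as in the original post):
    OutlineLoad /A:PLNDEV /U:xyz /M /I:E:/testload.csv /D:Entity /L:E:/OutlineLogs /X:E:/outlineLoad.exc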

  • ORA-01841 Error when value for date col is NULL in .dat (using SQL Loader)

    Hello Gurus,
    I have some data in a .dat file which needs to be loaded into an Oracle table. I am using SQL*Loader to do the job. Although "NULLIF col_name=BLANKS" works for character datatypes, when the value for the date column is NULL I get an ORA-01841 error. I need NULL for all rows without a value for the date column.
    An early reply will be highly appreciated.
    Farooq

    Hi,
    Maybe the problem is not with NULLIF; perhaps the value for the date column is not in the proper date format.
    create table:
    create table kk (empno number, ename varchar2(20), deptno number, hiredate date)
    Control file:
    LOAD DATA
    INFILE 'd:\kk\empdata.dat'
    insert into TABLE kk ( empno position (1:2) integer external,
    ename position(4:5) char NULLIF ename=BLANKS,
    deptno position (7:8) integer external NULLIF deptno=BLANKS,
    hiredate position (10:20) date NULLIF hiredate=BLANKS)
    data file:
    10 KK    01-jan-2005
    20    10
    SELECT * FROM KK;
    EMPNO ENAME  DEPTNO HIREDATE
       10 KK            01-JAN-05
       20            10
    Verify your data file.
    Hope it helps.
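    If the fields are delimited rather than positional, the same approach applies with an explicit date mask; a one-line sketch (the mask and field name are assumptions):
    hiredate DATE "DD-MON-YYYY" NULLIF hiredate=BLANKS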

  • SQL*Loader-704 and ORA-12154: error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from a command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    Thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in ed's link that you were already pointed at) is the following (I assume you are on Windows?):
    Open a command prompt and run:
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This tells Oracle to use the config files found there and no others.
    Then try sqlldr user/pass@db (in the same DOS window).
    See if that connects and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com
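    If it still fails, tnsping (shipped with the full client) reports which parameter files are actually being used; the alias db below is a placeholder:
    C:\> set TNS_ADMIN=c:\oracle\network\admin
    C:\> tnsping db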

  • Want to use sequence object of oracle when loading data in sql loader

    Hi,
    I want to use a sequence when loading data with SQL*Loader, but the problem is that I could not use an Oracle sequence object to load the data with SQL*Loader; I can only use SQL*Loader's own SEQUENCE.
    I want to use the sequence object because later entries will use the same sequence object. If I use SQL*Loader's SEQUENCE, how can I use the Oracle sequence object?
    Is there any other option?

    I have a similar problem; I also want to use a sequence when loading data with SQL*Loader.
    My control file is:
    load data
    infile '0testdata.txt'
    into table robertl.tbltest
    fields terminated by X'09'
    trailing nullcols
    (redbrojunos,
    broj,
    dolazak,
    odlazak nullif odlazak=blanks,
    komentar nullif komentar=blanks)
    And the datafile is:
    robertl.brojilo.nextval     1368     17.06.2003 08:02:46     17.06.2003 16:17:18     
    robertl.brojilo.nextval     2363     17.06.2003 08:18:18     17.06.2003 16:21:52     
    robertl.brojilo.nextval     7821     17.06.2003 08:29:22     17.06.2003 16:21:59     
    robertl.brojilo.nextval     0408     17.06.2003 11:20:27     17.06.2003 18:33:00     ispit
    robertl.brojilo.nextval     1111     17.06.2003 11:30:58     17.06.2003 16:09:34     Odlazak na ispit
    robertl.brojilo.nextval     6129     17.06.2003 14:02:42     17.06.2003 16:23:23     seminar
    But all records were rejected by the loader; for every record I get the error:
    Record 1: Rejected - Error on table ROBERTL.TBLTEST, column REDBROJUNOS.
    ORA-01722: invalid number
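    For the original question: SQL*Loader can evaluate a SQL expression for a column, so an Oracle sequence can be referenced directly in the control file instead of being written into the data file. A sketch against the control file above (assuming the sequence robertl.brojilo exists; the first tab-delimited token is still read, but its text is ignored):
    load data
    infile '0testdata.txt'
    into table robertl.tbltest
    fields terminated by X'09'
    trailing nullcols
    (redbrojunos "robertl.brojilo.nextval",
    broj,
    dolazak,
    odlazak nullif odlazak=blanks,
    komentar nullif komentar=blanks)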

  • How to export & import data using SQL*Loader

    Hi all,
    How do I export and import data using SQL*Loader? Give me the clear steps.
    Thanks in Advance

    Hi, have you already exported the data from SQL Server? You cannot export data using SQL*Loader; SQL*Loader is only meant for importing data from flat files (usually text files) into Oracle tables.
    For importing data into Oracle tables using SQL*Loader, use the steps below.
    1) Create a SQL*Loader control file.
    It looks like the following:
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    TRAILING NULLCOLS
    (For a full sample control file, search Google.)
    2) At the command prompt, issue the following:
    $ sqlldr test/test
    Enter the control file name when prompted (the control file you created earlier).
    Debug any errors (if they occur).
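    Put together, a complete minimal control file and invocation might look like this (the table emp and its three columns are illustrative assumptions):
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (empno, ename, sal)
    $ sqlldr test/test control=sample.ctl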

  • Loading leap year date using SQL*Loader

    Hello,
    I have a problem loading the date '29/02/2000' using SQL*Loader. This date is in a leap year. I'm getting an error message from SQL*Loader: 'ORA-01839: date not valid for month specified'. My colleague and I have tried various date functions to convert the data into a date, but no luck.
    I would appreciate any help,
    Bruce

    Thanks for your help, I found the bug in my control file. I was using the RTRIM function to remove bad timestamps such as '29/02/2000 0:00:00'. Instead of using this statement:
    LOG_DATE DATE "DD/MM/RRRR" "RTRIM(:LOG_DATE,'0:00:00')"
    I was using the statement below, with a space before the '0:00:00' string literal, with the intention of removing a space as well:
    LOG_DATE DATE "DD/MM/RRRR" "RTRIM(:LOG_DATE,' 0:00:00')"
    Well, it turns out that RTRIM's second argument is a set of characters, not a literal string: with the space included in the set, trimming continues past the space and also strips the '000' that belongs to '2000'. Thus the error.
    Thanks again,
    Bruce
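    The set semantics are easy to confirm in SQL*Plus; the two calls below differ only in whether a space is in the trim set:
    SQL> select rtrim('29/02/2000 0:00:00', '0:00:00') from dual;
    29/02/2000
    SQL> select rtrim('29/02/2000 0:00:00', ' 0:00:00') from dual;
    29/02/2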

  • Loading Data Using Outline Load Utility - Error

    Trying to load some data using the Outline Load Utility. I followed the Oracle documentation on this, and under the Administration -> Manage Data Load area I defined Accounts as the data load dimension and Period as the driver dimension. I then added Jan - Feb as driver members. I wanted to load data for Jan. My data header looks like this:
    Accounts,Jan,Point-of-View,Data Load Cube Name
    acct1, 768, "2010, Ver1, Actuals, entity1, etc" Plan1
    The command I typed is this:
    C:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1>outlineload /A:Plan1 /U:Admin /M /I:C:\MockData2.txt /D:Accounts /L:C:\dataload.log /X:c:\dataload.exc
    I get the following errors:
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Jan".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Point-of-View".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Data Load Cube Name".
    [Mon Nov 22 11:03:02 CST 2010]Unable to obtain dimension information and/or perform a data load: Unrecognized column header value(s), refer to previous messages. (Note: column header values are case sensitive.)
    [Mon Nov 22 11:03:02 CST 2010]Planning Outline data store load process finished with exceptions: not all input records were read due to errors (or an empty input file). 0 data records were read, 0 data records were processed, 0 were successfully loaded, 0 were rejected.
    This is version 11.1.2. What am I doing wrong here? I also find it interesting that, per the Oracle docs, the command for a data load and a metadata load is the same. I guess Planning knows whether we're trying to load data or metadata based on the CSV file header?

    I don't usually bother with loading data using the outline load utility, but as a test on 11.1.2 using the planning sample application I gave it a quick go.
    In Planning, I went to Administration > Data Load Settings, picked Account as the data load dimension and Period as the driver dimension, and selected Jan as a member.
    I created a file :-
    Account,Jan,Point-of-View,Data Load Cube Name
    TestMember,100,"Local,E05,Actual,NoSegment,Working,FY11",Consol
    And used the following command line from the directory with the planning utilities
    OutlineLoad /A:plansamp /U:admin /M /N /I:F:/temp/dload.csv /D:Account /L:F:/temp/outlineLoad.log /X:F:/temp/outlineLoad.exc
    The log produced :-
    [Tue Nov 23 10:02:01 GMT 2010]Successfully located and opened input file "F:\temp\dload.csv".
    [Tue Nov 23 10:02:01 GMT 2010]Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
    [Tue Nov 23 10:02:01 GMT 2010]Located and using "Account" dimension for loading data in "plansamp" application.
    [Tue Nov 23 10:02:01 GMT 2010]Load dimension "Account" has been unlocked successfully.
    [Tue Nov 23 10:02:01 GMT 2010]A cube refresh operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Create security filters operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Examine the Essbase log files for status if Essbase data was loaded.
    [Tue Nov 23 10:02:01 GMT 2010]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    There you go, no problems.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Loading data by sql loader in oracle 10g on linux

    I am trying to load data into Oracle 10g on Linux using SQL*Loader, but I am getting an error.
    The log shows that the length of the SURNAME field is more than the table's field size.
    Following is the error in the SQL*Loader log file:
    Record 21: Rejected - Error on table TABLE1, column SURNAME.
    ORA-12899: value too large for column SURNAME (actual: 65, maximum: 64)
    As is evident from the following control file, I am using TRIM to discard any spaces, so why is it giving an error?
    LOAD DATA
    TRUNCATE
    INTO TABLE TABLE1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (ID INTEGER EXTERNAL,
    OPTION1 CHAR,
    REF1 CHAR,
    OTHER_REF CHAR,
    TITLE "TRIM(:TITLE)",
    FORENAME "TRIM(:FORENAME)",
    SURNAME "TRIM(:SURNAME)",
    JOINT_TITLE "TRIM(:JOINT_TITLE)",
    JOINT_FORENAME "TRIM(:JOINT_FORENAME)",
    JOINT_SURNAME "TRIM(:JOINT_SURNAME)",
    I checked the bad file and counted the number of characters; there are 64 characters.
    When I insert an individual record from the bad file with SQL*Loader, it loads.

    Probably your database character set is multi-byte, that is, something like %UTF8 or AL16UTF16%.
    Post your NLS database parameter values:
    select * from nls_database_parameters;
    In general, VARCHAR2(65) by default means 65 BYTES, unless you have changed the default NLS_LENGTH_SEMANTICS parameter from BYTE to CHAR.
    With best regards
    Shan
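    If the parameters confirm a multi-byte character set, one common remedy (my suggestion, not from the thread) is to declare the column with character-length semantics, so 64 means 64 characters rather than 64 bytes:
    ALTER TABLE TABLE1 MODIFY (SURNAME VARCHAR2(64 CHAR));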

  • Loading data with sql loader

    Hi Experts,
    I have a file with the following format, and I have to insert the data from those files into a table. I can use SQL*Loader to load those files.
    My question is that I need to schedule the upload of those files. Can I incorporate SQL*Loader in a procedure? (See the sketch at the end of this message.)
    Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact  Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
    0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
    1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||

    Yes, sorry about that mishap.
    Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact  Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
    0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
    1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||1|-2145916800|1|0|1|0||0|0|0|0|0|0|0|0|0|0|-3287|0|0|0|1|1|2|0|0|0|1|0|999|0||5|0|0|||||
    This is my file format, which is a .txt file.
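    SQL*Loader is a client-side utility, so it cannot be called from PL/SQL directly. A common alternative (my suggestion, not from the thread) is an external table over the pipe-delimited file, which a stored procedure can read and which DBMS_SCHEDULER can run on a schedule. All names and paths below are illustrative:
    -- directory object pointing at the folder holding the file
    CREATE DIRECTORY agent_dir AS '/data/agent_files';
    -- external table over the first few pipe-delimited fields;
    -- the remaining fields would each get a column in the same way
    CREATE TABLE agent_ext (
      agent_id    NUMBER,
      agent_type  NUMBER,
      create_date NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY agent_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE SKIP 1
        FIELDS TERMINATED BY '|'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('agents.txt')
    );
    -- schedule the load; the target table agents is assumed to exist
    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'LOAD_AGENTS',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN INSERT INTO agents SELECT * FROM agent_ext; COMMIT; END;',
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /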
