SQL*Loader and binary data

i am trying to load binary data using SQL*Loader. given a table created a la:
create table oofa ( data raw(500) );
and a control file (running on x86 linux):
LOAD DATA
INFILE *
APPEND INTO TABLE OOFA
( DATA VARRAW(500) )
BEGINDATA
X'05003132333435'
i get an error "variable length field exceeds maximum length" and no record loaded. i tried using X'00053132333435' just in case i'd muffed the big/little-endian stuff, but the same thing happens.
anyone successfully loaded binary data as above? if so, how'd you do it?
also, is there a way to make SQL*Loader more verbose?
thanks.
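
A commonly suggested workaround (not confirmed in this thread) is to put the raw bytes into the datafile as plain hex text and convert them server-side with HEXTORAW, since VARRAW expects a real 2-byte binary length prefix in front of the raw bytes rather than a text literal like X'...'. A minimal sketch of that approach:
LOAD DATA
INFILE *
APPEND INTO TABLE OOFA
-- read the hex digits as text (up to 1000 chars for a RAW(500) column) and convert on insert
( DATA CHAR(1000) "HEXTORAW(:DATA)" )
BEGINDATA
3132333435
As for verbosity, there is no extra "verbose" switch that I know of; the log file (plus the bad and discard files) is where SQL*Loader reports what it did with each record.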

Similar Messages

  • SQL*Loader and binary data

    i have a C routine that builds SQL*Loader input files. the input files contain multiple records, with a couple of integer columns and a raw(1400) field. the control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but also is somewhat co-operative (see below).
    so i basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field. then the '|' separator.
    i've noticed that if the size of raw field is 400 bytes or less, everything works fine, i get the correct number of records in the database.
    unfortunately, with a size of 401 or more, SQL*Loader parses the thing into twice as many records as it should. so if i've written 3 records to my input data file, with each record's raw field at 400 bytes or less i get 3 records loaded. but any with a raw field of 401+, i get two records for each.
    any ideas why? and how to correct this? also, any ideas on a better way to do this? all the examples of large data in the online doc and the o'reilly book deal with large character data, which doesn't do me much good.
    tia. john

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader), but I don't know by how much. Can anybody tell me the performance difference in % (e.g. a 20% decrease) for 10,000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    "Do not confuse the topic. As I said, I'm not going to use external tables. This post is about the performance difference between SQL*Loader and a simple insert statement."
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file, as they don't require any command-line calls or external control files to be set up. All that is needed is a single external table definition, created in a similar way to any other table (just with the additional external table information). It also eliminates the need for a 'staging' table on the database, as the data can be queried as needed directly from the file; and if the file changes, so does the data seen through the external table, automatically, without re-running any SQL*Loader process.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.
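    As a rough illustration of the kind of definition described above (the directory, file name and columns here are made up for the example, not taken from the thread):
    CREATE DIRECTORY data_dir AS '/data/incoming';
    CREATE TABLE staging_ext (
      id   NUMBER,
      name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('staging.csv')
    );
    -- the file can then be queried like any other table, e.g. SELECT * FROM staging_ext;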

  • SQL*Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and an external table?
    Under what conditions would we use SQL*Loader versus an external table?
    Thanx

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server, since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server, or if it is easier to call an executable rather than PL/SQL (e.g. if you have a batch file that runs on a server other than the database server, FTPs a data file from an FTP server, and then loads the data into Oracle).
    Justin

  • Help in calling sql loader and an oracle procedure in a script

    Hi Guru's
    please help me in writing a unix script which will call sql loader and also an oracle procedure.
    i wrote a script which is as follows.
    #!/bin/sh
    clear
    #export ORACLE_SID='HOBS2'
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=`echo $?`
    case "$retcode" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with EX_FAIL, see logfile" ;;
    2) echo "SQL*Loader execution exited with EX_WARN, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code";;
    esac
    sqlplus USERID=load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT
    it is loading the data into an oracle table
    but the procedure is not executed.
    any valuable suggestion is highly appreciated.
    Cheers

    multiple duplicate threads:
    to call an oracle procedure and sql loader in an unix script
    Re: Can some one help he sql loader issue.
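    One thing worth checking, beyond the duplicate-thread note above: the here-document in the posted script is never terminated, so the EXEC line may never reach SQL*Plus cleanly. A minimal sketch of the tail end of the script (untested here; note that sqlplus takes the credentials directly rather than sqlldr's USERID= keyword):
    sqlplus -s load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT;
    EXIT
    EOF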

  • HELP: SQL*LOADER AND Ref Column

    Hallo,
    I have already posted about this and I really need help; I'm not getting any further with it.
    I have the following problem. I have 2 tables which I created the following way:
    CREATE TYPE gemark_schluessel_t AS OBJECT(
    gemark_id NUMBER(8),
    gemark_schl NUMBER(4),
    gemark_name VARCHAR2(45)
    );
    CREATE TABLE gemark_schluessel_tab OF gemark_schluessel_t(
    constraint pk_gemark PRIMARY KEY(gemark_id)
    );
    CREATE TYPE flurstueck_t AS OBJECT(
    flst_id NUMBER(8),
    flst_nr_zaehler NUMBER(4),
    flst_nr_nenner NUMBER(4),
    zusatz VARCHAR2(2),
    flur_nr NUMBER(2),
    gemark_schluessel REF gemark_schluessel_t,
    flaeche SDO_GEOMETRY
    );
    CREATE TABLE flurstuecke_tab OF flurstueck_t(
    constraint pk_flst PRIMARY KEY(flst_id),
    constraint uq_flst UNIQUE(flst_nr_zaehler,flst_nr_nenner,zusatz,flur_nr),
    flst_nr_zaehler NOT NULL,
    flur_nr NOT NULL,
    gemark_schluessel REFERENCES gemark_schluessel_tab
    );
    Now I have data in the gemark_schluessel_tab which looks like this (a sample):
    1 101 Borna
    2 102 Draisdorf
    Now I wanna load data in my flurstuecke_tab with SQL*Loader and there I have problems with my ref column gemark_schluessel.
    One data record looks like this in my file (it is without geometry)
    1|97|7||1|1|
    If I wanna load my data record, it does not work. The reference (the system generated OID) should be taken from gemark_schluessel_tab.
    LOAD DATA
    INFILE *
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE FLURSTUECKE_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    flst_id,
    flst_nr_zaehler,
    flst_nr_nenner,
    zusatz,
    flur_nr,
    gemark_schluessel REF(CONSTANT 'GEMARK_SCHLUESSEL_TAB',GEMARK_ID),
    gemark_id FILLER
    )
    BEGINDATA
    1|97|7||1|1|
    Is there an error I made?
    Thanks in advance
    Tig

  • Using SQL*Loader and UTL_FILE to load and unload large files (e.g. PDFs, DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files back from the Oracle table to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    " sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt "
    Control file is written as :
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO table r_sqlldr
    Fields terminated by ','
    (
    id sequence (max,1),
    fname char(20),
    data LOBFILE(fname) terminated by EOF )
    It loads files ( Pdf, Image and more...) that are mentioned in file r_sqlldr.txt into oracle table r_sqlldr
    Text file ( used as source ) is written as :
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    after this load ....i used UTL_FILE to unload data and write procedure like ...
    CREATE OR REPLACE PROCEDURE R_UTL AS
    l_file UTL_FILE.FILE_TYPE;
    l_buffer RAW(32767);
    l_amount BINARY_INTEGER ;
    l_pos INTEGER := 1;
    l_blob BLOB;
    l_blob_len INTEGER;
    BEGIN
    SELECT data
    INTO l_blob
    FROM r_sqlldr
    where id= 1;
    l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
    DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
    IF (l_blob_len < 32767) THEN
    l_amount :=l_blob_len;
    ELSE
    l_amount := 32767;
    END IF;
    DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
    l_file := UTL_FILE.FOPEN('DBDIR1','Kalam_out.pdf','w', 32767);
    DBMS_OUTPUT.PUT_LINE('File opened');
    WHILE l_pos < l_blob_len LOOP
    DBMS_LOB.READ (l_blob, l_amount, l_pos, l_buffer);
    DBMS_OUTPUT.PUT_LINE('Blob read');
    l_pos := l_pos + l_amount;
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    DBMS_OUTPUT.PUT_LINE('writing to file');
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.NEW_LINE(l_file);
    END LOOP;
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.FCLOSE(l_file);
    DBMS_OUTPUT.PUT_LINE('File closed');
    DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
    WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
    UTL_FILE.FCLOSE(l_file);
    END IF;
    DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads data from the r_sqlldr table (BLOBs) to files on the operating system.
    -> The same procedure, with minor changes, is used to unload other similar files like images and text files.
    In the above example: Loading: 3 files 1) Kalam.pdf 2) CTSlogo1.bmp 3) any1.txt are loaded into 3 rows of the oracle table r_sqlldr respectively,
    file names into the fname column and the corresponding data into the data (BLOB) column.
    Unload: and then these files are unloaded back to their previous format on the operating system using the UTL_FILE feature of Oracle.
    So the PROBLEM IS: the full size of these files gets unloaded back, but with the quality degraded. The PDF file doesn't even display its data; the size is almost equal to the source file, but the data is lost when I open it.
    And for images: images are getting loaded and unloaded, but with the colors changed.
    Also, Oracle features (like FFLUSH) have been used, but it never worked.
    Any suggestions or an alternate solution to load and unload PDFs through Oracle are requested.

    Thanks Justin for the quick response.
    Well, I am loading data into a BLOB only, using SQL*Loader; I've never used dbms_lob.loadFromFile to do the loads.
    I've opened a file on the network and then used dbms_lob.read and UTL_FILE.PUT_RAW to read and write data into the target file.
    Actually, my process is working fine with text files, but not with PDFs and images.
    As for your question "Is the data the proper length after reading it in?", I'm not sure what you're asking, but I think there is no problem regarding the data length, except that the source PDF is 90.4 KB and the target is 90.8 KB.
    That's it. So I request you to add some more help, or should I provide some more details?
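    For what it's worth, the classic causes of exactly this symptom (output slightly larger than the source, PDFs and images corrupted, plain text fine) are opening the output file in text mode and writing extra newlines between chunks. A sketch of the write loop using a binary file handle and no NEW_LINE calls, reusing the variable names from the procedure above (not tested against this schema):
    l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767); -- 'wb' = write binary
    WHILE l_pos <= l_blob_len LOOP
      l_amount := LEAST(32767, l_blob_len - l_pos + 1); -- read only what is left
      DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
      UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);          -- no NEW_LINE between raw chunks
      l_pos := l_pos + l_amount;
    END LOOP;
    UTL_FILE.FCLOSE(l_file);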

  • Is there any difference in Oracle 9i SQL Loader and Oracle 10g SQL Loader

    Hi
    Can anyone tell me whether there is any difference between Oracle 9i SQL*Loader and Oracle 10g SQL*Loader?
    I am upgrading the 9i db to 10g and want to run the 9i SQL*Loader control files on the upgraded 10g db. So please let me know whether there is any difference for which I need to consider modifications to the control files.
    Thank you in advance
    Adi

    answered

  • Import and process larger data with SQL*Loader and Java resource

    Hello,
    I have a project to import data from a text file on a schedule. It is a large amount of data, nearly 20,000 records per hour.
    After that, we have to analyse the data and export the results into another database.
    I have been researching SQL*Loader and a Java resource to do these tasks, but I have no experience with either.
    I'm afraid that with this much data Oracle could slow down, or the session in the Java resource application could time out.
    Please give me some advice about the solution.
    Thank you very much.

  • SQL Loader and foreign characters in the data file problem

    Hello,
    I have run into an issue which I can't find an answer for. When I run SQL Loader, one of my control files is used to get file content (LOBFILE) and one of the fields in the data file has a path to that file. The control file looks like:
    LOAD DATA
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    (
    doc_core_id "iceberg.seq_rpt_document_core.nextval",
    -- created_date POSITION(1) date "yyyy-mm-dd:hh24:mi:ss",
    created_date date "yyyy-mm-dd:hh24:mi:ss",
    document_size,
    hash,
    body_format,
    is_generic_doc,
    is_legacy_doc,
    external_filename FILLER char(275) ENCLOSED by '"',
    body LOBFILE(external_filename) terminated by EOF
    )
    A sample data file looks like:
    0,2012-10-22:10:09:35,21,BB51344DD2127002118E286A197ECD4A,text,N,N,"E:\tmp\misc_files\index_testers\foreign\شیمیایی.txt"
    0,2012-10-22:10:09:35,17,CF85BE76B1E20704180534E19D363CF8,text,N,N,"E:\tmp\misc_files\index_testers\foreign\ลอบวางระเบิด.txt"
    0,2012-10-22:10:09:35,23552,47DB382558D69F170227AA18179FD0F0,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\leesburgis_á_ñ_é_í_ó_ú_¿_¡_ü_99.doc"
    0,2012-10-22:10:09:35,17,83FCA0377445B60CE422DE8994900A79,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\làm thế nào bạn làm ngày hôm nay"
    The problem is that when I run this, SQL*Loader throws an error that it can't find the file. It appears that it can't interpret the foreign characters in a way that allows it to find that path. I have tried adding a CHARACTERSET (using AL32UTF8 or UTF8) value in the control file, but that only has some success with Western languages, not the ones listed above. Also, there is no fixed set of languages that could appear in the data file; it essentially could be any language.
    Does anyone know if there is a way to somehow get SQL*Loader to "understand" the file system paths when a folder and/or file name could be in some other language?
    Thanks for any thoughts - Peter

    Thanks for the reply Harry. If I try to open the file in various text editors like Wordpad, Notepad, GVIM, and Textpad, they all display the foreign characters differently. Only Notepad comes close to displaying the characters properly. I have a C# app that will read the file and display the contents, and it renders them fine. If you look at the directory of files in Windows Explorer, they are all displayed properly. So it seems things like .Net and Windows have some mechanism to understand the characters in order to render them properly. Other applications, again like Wordpad, do not know how to render them properly.
    It would seem that whatever SQL*Loader is using to "read" the data files also is not rendering the characters properly, which prevents it from finding the directory path to the file. If I add "CHARACTERSET AL32UTF8" in the control file, all is fine when dealing with Western languages (e.g. German, Spanish) but not with Eastern languages (e.g. Thai, Chinese). So telling SQL*Loader to use a character set seems to work, but not in all cases. AL32UTF8 is the character set that the Oracle database was created with. I have not had any luck if I try to set the CHARACTERSET to whatever the Thai character set is, for example. The problem there, though, is that even if that did work, I can't target specific languages because the data could come from anywhere. It's like I need some sort of global "super set" character set to use. It seems like CHARACTERSET is the right track to follow, but I am not sure, and even if it is, is there a way to handle all languages?
    Thanks - Peter
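    For reference, the datafile character set is declared right after LOAD DATA, while sqlldr's own client-side character set comes from the NLS_LANG environment variable, so both usually need to agree with how the .dat file (including the path field) is actually encoded; on Windows the OS code page used when opening the LOBFILE path can still get in the way. A rough sketch, assuming the .dat file is saved as UTF-8: set something like NLS_LANG=AMERICAN_AMERICA.AL32UTF8 in the environment before running sqlldr (the exact value is just an example), and in the control file:
    LOAD DATA
    CHARACTERSET AL32UTF8
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    (... field list as above ...)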

  • SQL*Loader and "Variable length field was truncated"

    Hi,
    I'm experiencing this problem using SQL*Loader: Release 8.1.7.0.0
    Here is my control file (it's actually split into separate control and data files, but the result is the same)
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (
    first_id,
    second_id,
    third_id,
    language_code,
    display_text VARCHAR(2000)
    )
    begindata
    2,1,1,"eng","Type of Investment Account"
    The TEST table is defined as:
    Name Null? Type
    FIRST_ID NOT NULL NUMBER(4)
    SECOND_ID NOT NULL NUMBER(4)
    THIRD_ID NOT NULL NUMBER(4)
    LANGUAGE_CODE NOT NULL CHAR(3)
    DISPLAY_TEXT VARCHAR2(2000)
    QUESTION_BLOB BLOB
    The log file displays:
    Record 1: Warning on table "USER"."TEST", column DISPLAY_TEXT
    Variable length field was truncated.
    And the results of the insert are:
    FIRST_ID SECOND_ID THIRD_ID LANGUAGE_CODE DISPLAY_TEXT
    2 1 1 eng ype of Investment Account"
    The language_code field is imported correctly, but display_text keeps the closing delimiter, and loses the first character of the string. In other words, it is interpreting the enclosing double quote and/or the delimiter, and truncating the first two characters.
    I've also tried the following:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY '|'
    (
    first_id,
    second_id,
    third_id,
    language_code,
    display_text VARCHAR(2000)
    )
    begindata
    2|1|1|eng|Type of Investment Account
    In this case, display_text is imported as:
    pe of Investment Account
    In the log file, I get this table which seems odd as well - why is the display_text column shown as having length 2002 when I explicitly set it to 2000?
    Column Name Position Len Term Encl Datatype
    FIRST_ID FIRST * | O(") CHARACTER
    SECOND_ID NEXT * | O(") CHARACTER
    THIRD_ID NEXT * | O(") CHARACTER
    LANGUAGE_CODE NEXT 3 | O(") CHARACTER
    DISPLAY_TEXT NEXT 2002 VARCHAR
    Am I missing something totally obvious in my control and data files? I've played with various combinations of delimiters (commas vs '|'), trailing nullcols, optional enclosed etc.
    Any help would be greatly appreciated!

    Use CHAR instead of VARCHAR:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (
      first_id,
      second_id,
      third_id,
      language_code,
      display_text    CHAR(2000)
    )
    From the documentation:
    A VARCHAR field is a length-value datatype.
    It consists of a binary length subfield followed by a character string of the specified length.
    (That binary length subfield is why the log shows DISPLAY_TEXT with a length of 2002 rather than 2000, and why the first two characters of each value were being eaten: they were consumed as the length subfield.)
    http://download-west.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch05.htm#20324

  • SQL Loader and ^M character

    Hi Experts
    Please I need help with this problem:
    I am migrating data from a Solid Database to Oracle, I am using Flat Files to do that.
    1.- I download the data to flat files from Solid
    2.- I move the files to Oracle server
    3.- I upload the data to Oracle
    Now, I have done 90% of the database, but I have found some tables that have description columns, and in these descriptions the users have typed Enter, so when I try to upload the data to Oracle, SQL*Loader cannot recognize these characters.
    Example:
    '25','0.','5.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '26','0.','2.','0.','0.','0.','0.','3.','0.','0.','0.','0.','0.','',''
    '27','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '28','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '29','0.','38.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '30','0.','13.','0.','0.','0.','0.','0.','6.','0.','6.','0.','0.','|SE RECHAZA B20CS50SNW ^M
    ^M
    SE RECHAZAN CINCO PZAS ^M
    DOS MOD. HSC15I41EH,DOS MOD. HSK15I41EH |Agregó: 06/06/2009 12:22:50
    |','DEV. A PROV.'
    '31','0.','50.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '32','0.','9.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    '33','0.','2.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
    How can I solve this ?
    Kind regards
    J.A

    "2.- I move the files to Oracle server"
    Open and look at the file at this stage; you may find junk characters appended to every line. This occurs due to the file transfer. (If so, which mode did you use while transferring the file?)
    Workaround:
    Try moving files through Binary mode
    Rgd
    KSG
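    If a binary-mode transfer alone doesn't do it (the ^M characters here sit inside a description field, and any real line feeds typed by the users will still split one logical record across several physical lines), two things that may help, as a sketch: strip the carriage returns before loading, and declare what really terminates a record via the "str" clause on INFILE, assuming the export can guarantee a distinctive end-of-record marker.
    # remove DOS carriage returns (^M) before loading; the file name is just an example
    tr -d '\r' < solid_export.dat > solid_export_clean.dat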

  • SQL*Loader and DECODE function

    Hi All,
    I am loading data from data files into oracle tables and while loading the data using SQL*Loader, the following requirement needs to be fulfilled.
    1) If OQPR < 300, RB = $ 0-299, SC = "SC1"
    2) If 300 < OQPR < 1200, RB = $ 300-1199, SC = "SC2"
    3) If 1200 < OQPR < 3000, RB = $ 1200-2999, SC = "SC3"
    4) If OQPR > 3000 USD, RB = > $3000, SC = "SC4"
    Here OQPR is a field in the data file.
    Can anyone suggest how we can handle this using the DECODE function? Triggers and PL/SQL functions are not to be used.
    TIA.
    Regards,
    Ravi.

    The following expression gives you different values for your different intervals and boundaries:
    SIGN(:OQPR - 300) + SIGN(:OQPR - 1200) + SIGN(:OQPR - 3000)
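    To spell that out (my own sketch, not from the reply above): the sum is a different constant in each band, so it can drive a DECODE in the control file, e.g. for the SC column, assuming OQPR is defined as a field earlier in the column list (values at the exact boundaries, such as OQPR = 300, come out as -2, 0 or 2 and would need to be mapped to whichever bucket you prefer):
    SC EXPRESSION "DECODE(SIGN(:OQPR-300) + SIGN(:OQPR-1200) + SIGN(:OQPR-3000), -3, 'SC1', -1, 'SC2', 1, 'SC3', 3, 'SC4', NULL)"
    The RB column can be handled the same way, with the range labels in place of SC1..SC4.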

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are in 4 digits. If I change NLS_DATE_FORMAT to DD/MM/YYYY then the second date value will be loaded correctly, but the first value will be loaded as 31 Dec *2020*!!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange: after running a few tests I realized that if the year is 19XX then it will get loaded as 2019, and if it is 20XX then it will be 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
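    One commonly suggested way around mixed formats (not confirmed in this thread) is to read both fields as plain CHAR, so SQL*Loader never applies an NLS date mask of its own, and let the TO_DATE expressions do all of the conversion, roughly:
    EXECUTIONDATE1  CHAR  NULLIF EXECUTIONDATE1=BLANKS  "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          CHAR  NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",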

  • SQL*LOADER and decode

    I'm setting up a sql*loader script and trying to use the decode function as referred to in 'Applying SQL Operators to Fields'. I'm getting an error message 'Token longer than max allowable length of 258 chars'. Is there a limit to the size of the decode statement within sql*loader, or is it better to use a table trigger to handle this on insert? I ran the decode statement as a select through SQL*Plus and it works okay there. Oracle 8.0 Utilities shows an example of decode in Ch. 5, but Oracle 9i Utilities Ch. 6 does not. Has anyone done this, and what's the impact on performance of the load if I can get it to work? See my example below:
    LOAD DATA
    INFILE 'e2e_prod_cust_profile.csv'
    APPEND
    INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    (Insert_update_flag CHAR(1),
    Orig_system_customer_ref CHAR(240),
    customer_profile_class_name CHAR(30) NULLIF customer_profile_class=BLANKS
    "decode(customer_profile_class_name,
    'NORTHLAND Default','(MIA) Default',
    'NORTHLAND Non Consolidated','(MIA) Non Cons',
    'NORTHLAND Consolidated A','(MIA) Cons A',
    'NORTHLAND Consolidated B','(MIA) Cons B',
    'NORTHLAND Consolidated C','(MIA) Cons C',
    'NORTHLAND Consolidated D','(MIA) Cons D',
    'NORTHLAND Cons A NonZS','(MIA) Cons A NonZS',
    'NORTHLAND Cons B NonZS','(MIA) Cons B NonZS',
    'NORTHLAND Cons C NonZS','(MIA) Cons C NonZS',
    'NORTHLAND Cons D NonZS','(MIA) Cons D NonZS',
    'NORTHLAND International Billing','(MIA) International Billing',
    customer_profile_class_name)",
    credit_hold CHAR(1),
    overall_credit_limit INTERGER EXTERNAL,
    "e2e_cust_profile.ctl" 49 lines, 1855 characters
    SQL*Loader-350: Syntax error at line 15.
    Token longer than max allowable length of 258 chars
    'NORTHLAND Consolidated D','(MIA) Cons D',
    ^

    Your controlfile is incomplete and has some typos, but you could try something like:
    create or replace function decode_profile_class_name (p_longname IN VARCHAR2)
    return VARCHAR2
    is
    begin
      CASE p_longname
        WHEN 'NORTHLAND Default'               THEN RETURN '(MIA) Default';
        WHEN 'NORTHLAND Non Consolidated'      THEN RETURN '(MIA) Non Cons';
        WHEN 'NORTHLAND Consolidated A'        THEN RETURN '(MIA) Cons A';
        WHEN 'NORTHLAND Consolidated B'        THEN RETURN '(MIA) Cons B';
        WHEN 'NORTHLAND Consolidated C'        THEN RETURN '(MIA) Cons C';
        WHEN 'NORTHLAND Consolidated D'        THEN RETURN '(MIA) Cons D';
        WHEN 'NORTHLAND Cons A NonZS'          THEN RETURN '(MIA) Cons A NonZS';
        WHEN 'NORTHLAND Cons B NonZS'          THEN RETURN '(MIA) Cons B NonZS';
        WHEN 'NORTHLAND Cons C NonZS'          THEN RETURN '(MIA) Cons C NonZS';
        WHEN 'NORTHLAND Cons D NonZS'          THEN RETURN '(MIA) Cons D NonZS';
        WHEN 'NORTHLAND International Billing' THEN RETURN '(MIA) International Billing';
        ELSE RETURN p_longname;
      END CASE;
    end;
    LOAD DATA
    INFILE 'e2e_prod_cust_profile.csv'
    APPEND
    INTO TABLE APPS.RA_CUSTOMER_PROFILES_INTERFACE
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    (
    Insert_update_flag          CHAR(1),
    Orig_system_customer_ref    CHAR(240),
    customer_profile_class_name CHAR(30) NULLIF customer_profile_class_name=BLANKS "decode_profile_class_name(:customer_profile_class_name)",
    credit_hold                 CHAR(1),
    overall_credit_limit        INTEGER EXTERNAL
    )
