SQL*LOADER and date formatted data

Hi there, I don't want to load complex things in my table - just a simple date....
The control file is:
load data
infile 'c:\data\mydata.csv'
into table test_table
fields terminated by "," optionally enclosed by '"'
( sampledate, name )
The mydata.csv file is:
30-12-2003, Test1
31-12-2003, Test2
However, this format is not being accepted by the database... I tried other date combinations (e.g. 30122003, 20-DEC-2003, 30/12/2004, etc.) but always got a 'data error'.
Thanks for any suggestions.

Example:
Control File for Case Study 3
This control file loads the same table as in case 2, but it loads three additional columns (hiredate, projno, and loadseq). The demonstration table emp does not have columns projno and loadseq. To test this control file, add these columns to the emp table with the command:
ALTER TABLE emp ADD (projno NUMBER, loadseq NUMBER);
The data is in a different format than in case 2. Some data is enclosed in quotation marks, some is set off by commas, and the values for deptno and projno are separated by a colon.
-- Variable-length, delimited, and enclosed data format
LOAD DATA
INFILE *
APPEND
INTO TABLE emp
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(empno, ename, job, mgr,
hiredate DATE(20) "DD-Month-YYYY",
sal, comm, deptno CHAR TERMINATED BY ':',
projno,
loadseq SEQUENCE(MAX,1))
BEGINDATA
7782, "Clark", "Manager", 7839, 09-June-1981, 2572.50,, 10:101
7839, "King", "President", , 17-November-1981,5500.00,,10:102
7934, "Miller", "Clerk", 7782, 23-January-1982, 920.00,, 10:102
7566, "Jones", "Manager", 7839, 02-April-1981, 3123.75,, 20:101
7499, "Allen", "Salesman", 7698, 20-February-1981, 1600.00, 300.00, 30:103
7654, "Martin", "Salesman", 7698, 28-September-1981, 1312.50, 1400.00, 3:103
7658, "Chan", "Analyst", 7566, 03-May-1982, 3450,, 20:101
Joel Pèrez
http://otn.oracle.com/experts

Similar Messages

  • Import and process larger data with SQL*Loader and Java resource

    Hello,
    I have a project to import data from a text file on a schedule. It is a large amount of data, nearly 20,000 records per hour.
    After that, we have to analyze the data and export the results into another database.
    I have researched SQL*Loader and Java resources to do these tasks, but I have no experience with them.
    I'm afraid that with the huge amount of data, Oracle could slow down or the session in the Java resource application could time out.
    Please give me some advice about a solution.
    Thank you very much.

    By the '?' mark I mean: "How can I link this COL1 with a column in the csv file?"
    Attilio

  • SQL Loader and weird source file structure

    I am familiar with using SQL Loader to process comma or tab delimited text files into proper table data. But how do you process "positionally delimited" files?!
    I mean, I have a text file which looks like this:
    Item A
    ---attribute1
    ---attribute2
    ---attribute3
    ---attribute4
    Item B
    ---attribute1
    ---attribute2
    ---attribute3
    ---attribute4
    Item C
    ---attribute1
    ---attribute2
    ---attribute3
    ---attribute4
    And so on. How do I tell SQL Loader 'rows 1, 6 and 11 are the first column in the table; rows 2, 7 and 12 are the second column; rows 3, 8 and 13 are the third', and so on? I want to end up with data in the form
    Item A Attribute1 Attribute2 Attribute3 Attribute4
    Item B Attribute1 Attribute2 Attribute3 Attribute4
    Item C Attribute1 Attribute2 Attribute3 Attribute4
    Is such a thing even possible with SQL Loader?

    If your data is exactly in that format, like in the sample provided by Enrique, you can try to play with that particular data pattern. Again, based on the previous example:
    ~ >cat data.dat
    map1
        615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)
        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
        225834: (154,107, 84) #9A6B54 rgb(154,107,84)
        176305: (218,187,155) #DABB9B rgb(218,187,155)
        141083: (188,142,111) #BC8E6F rgb(188,142,111)
    map2
        615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)
        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
        225834: (154,107, 84) #9A6B54 rgb(154,107,84)
        176305: (218,187,155) #DABB9B rgb(218,187,155)
        141083: (188,142,111) #BC8E6F rgb(188,142,111)
    map3
        615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)
        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
        225834: (154,107, 84) #9A6B54 rgb(154,107,84)
        176305: (218,187,155) #DABB9B rgb(218,187,155)
        141083: (188,142,111) #BC8E6F rgb(188,142,111)
    ~ >cat data.ctl
    LOAD DATA
    INFILE "data.dat" "str '\nm'"
    DISCARDFILE "data.dsc"
    DISCARDMAX 999
    REPLACE
    -- CONCATENATE 6
    INTO TABLE data
    fields terminated by X'0a' trailing nullcols
    ( item char "replace('m'||:item,'mm','m')",
      attrib1 char,
      attrib2 char,
      attrib3 char,
      attrib4 char,
      attrib5 char
    )
    ~ >sqlplus scott/tiger
    SQL*Plus: Release 10.2.0.4.0 - Production on Wed Oct 15 16:27:25 2008
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> select count(*) from data;
      COUNT(*)
             0
    SQL> !sqlldr userid=scott/tiger control=data.ctl
    SQL*Loader: Release 10.2.0.4.0 - Production on Wed Oct 15 16:28:29 2008
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Commit point reached - logical record count 2
    Commit point reached - logical record count 3
    SQL> col item for a4
    SQL> col attrib1 for a50
    SQL> col attrib2 for a50
    SQL> select item,attrib1,attrib2 from data;
    ITEM ATTRIB1                                            ATTRIB2
    map1     615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
    map2     615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
    map3     615050: ( 95, 83, 68) #5F5344 rgb(95,83,68)        605728: ( 36, 27, 22) #241B16 rgb(36,27,22)
    SQL> As you can see, I loaded this on Linux; on Windows you may need to slightly adjust the control file to reflect the line endings...
    Best regards
    Maxim

  • Using SQL*Loader and UTL_FILE to load and unload large files(i.e PDF,DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files back from the Oracle table to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    " sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt "
    Control file is written as :
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO table r_sqlldr
    Fields terminated by ','
    ( id sequence (max,1),
    fname char(20),
    data LOBFILE(fname) terminated by EOF )
    It loads the files (PDF, image and more...) that are listed in the file r_sqlldr.txt into the Oracle table r_sqlldr.
    The text file (used as source) is written as:
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    After this load, I used UTL_FILE to unload the data and wrote a procedure like this:
    CREATE OR REPLACE PROCEDURE R_UTL AS
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data
        INTO l_blob
        FROM r_sqlldr
       WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
      DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
      IF (l_blob_len < 32767) THEN
        l_amount := l_blob_len;
      ELSE
        l_amount := 32767;
      END IF;
      DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'w', 32767);
      DBMS_OUTPUT.PUT_LINE('File opened');
      WHILE l_pos < l_blob_len LOOP
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        DBMS_OUTPUT.PUT_LINE('Blob read');
        l_pos := l_pos + l_amount;
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        DBMS_OUTPUT.PUT_LINE('writing to file');
        UTL_FILE.FFLUSH(l_file);
        UTL_FILE.NEW_LINE(l_file);
      END LOOP;
      UTL_FILE.FFLUSH(l_file);
      UTL_FILE.FCLOSE(l_file);
      DBMS_OUTPUT.PUT_LINE('File closed');
      DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads the data from the r_sqlldr table (BLOBs) to files on the operating system.
    -> The same procedure, with minor changes, is used to unload other similar files such as images and text files.
    In the above example - Loading: the 3 files 1) Kalam.pdf 2) CTSlogo1.bmp 3) any1.txt are loaded into 3 rows of the Oracle table r_sqlldr,
    with the file names going into the fname column and the corresponding data into the data (BLOB) column.
    Unload: these files are then written back to the operating system in their previous format using Oracle's UTL_FILE feature.
    So the PROBLEM IS: the files come back at roughly their actual size, but with the quality degraded. The PDF file doesn't even display its data; the size is almost equal to the source file, but the content is lost when I open it.
    And for images: the images are getting loaded and unloaded, but with the colors changed.
    Oracle features (like FFLUSH) have also been used, but it never worked.
    Any suggestions or alternate solutions to load and unload PDFs through Oracle are requested.
    ------------------------------------------------------------------------------------------------------------------------

    Thanks Justin, for the quick response.
    Well, I am loading the data into a BLOB only, using SQL*Loader.
    I've never used dbms_lob.loadFromFile to do the loads.
    I opened a file on the network and then used dbms_lob.read and
    UTL_FILE.PUT_RAW to read and write the data into the target file.
    Actually, my process is working fine with text files but not with PDFs and images.
    As for your question, "Is the data the proper length after reading it in?" - I'm not sure what you are asking, but I think there is no problem regarding data length, except that the source PDF length is 90.4 KB and the target is 90.8 KB.
    That's it...
    So I request you to add some more help, or should I provide some more details?
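    (Not from the thread's replies, but one thing worth checking in the procedure above: the file is opened in text mode ('w') and NEW_LINE is called after every chunk, and both of these alter binary content. A minimal sketch of the write loop for binary output, assuming the same table and directory names as above:)
    DECLARE
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER := 32767;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data INTO l_blob FROM r_sqlldr WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
      -- open in byte mode ('wb') so no character-set conversion or line terminators are added
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767);
      WHILE l_pos <= l_blob_len LOOP
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        -- write the raw chunk as-is; no NEW_LINE between chunks
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        l_pos := l_pos + l_amount;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /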

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader), but I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) for 10,000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic; as I said, I'm not going to use external tables. This post is about the performance difference between SQL*Loader and a simple INSERT statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file, as they don't require any command-line calls or external control files to be set up. All that is needed is a single external table definition, created in a similar way to creating any other table (just with the additional external table information, obviously). It also eliminates the need for a 'staging' table on the database to load the data into, as the data can just be queried as needed directly from the file; and if the file changes, so does the data seen through the external table, automatically, without the need to re-run any SQL*Loader process.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and an external table?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanx

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server, or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server and wants to FTP a data file from an FTP server and then load the data into Oracle).
    Justin

  • Help in calling sql loader and an oracle procedure in a script

    Hi Gurus,
    Please help me with writing a Unix script which will call SQL*Loader and also an Oracle procedure.
    I wrote a script which is as follows:
    #!/bin/sh
    clear
    #export ORACLE_SID='HOBS2'
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=`echo $?`
    case "$retcode" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with EX_FAIL, see logfile" ;;
    2) echo "SQL*Loader execution exited with EX_WARN, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code";;
    esac
    sqlplus USERID=load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT
    It is loading the data into an Oracle table,
    but the procedure is not executed.
    Any valuable suggestion is highly appreciated.
    Cheers
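    (A hedged sketch of how the SQL*Plus call is usually written in such a script - assuming the same credentials, control file and procedure name. Note that sqlplus takes the connect string directly rather than a USERID= keyword, and that the here-document needs its closing EOF marker, which must start at the beginning of the line in the real script:)
    #!/bin/sh
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=$?
    if [ "$retcode" -le 2 ]; then
        # run the procedure only when the load did not hit a fatal error
        sqlplus -s load/ps94mfo16 << EOF
    WHENEVER SQLERROR EXIT FAILURE
    EXEC DO_TEST_SHELL_SCRIPT;
    EXIT
    EOF
    fi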

    multiple duplicate threads:
    to call an oracle procedure and sql loader in an unix script
    Re: Can some one help he sql loader issue.

  • HELP: SQL*LOADER AND Ref Column

    Hello,
    I have already posted this before and I really need help - I'm not getting any further with it.
    I have the following problem. I have 2 tables which I created the following way:
    CREATE TYPE gemark_schluessel_t AS OBJECT(
    gemark_id NUMBER(8),
    gemark_schl NUMBER(4),
    gemark_name VARCHAR2(45)
    );
    CREATE TABLE gemark_schluessel_tab OF gemark_schluessel_t(
    constraint pk_gemark PRIMARY KEY(gemark_id)
    );
    CREATE TYPE flurstueck_t AS OBJECT(
    flst_id NUMBER(8),
    flst_nr_zaehler NUMBER(4),
    flst_nr_nenner NUMBER(4),
    zusatz VARCHAR2(2),
    flur_nr NUMBER(2),
    gemark_schluessel REF gemark_schluessel_t,
    flaeche SDO_GEOMETRY
    );
    CREATE TABLE flurstuecke_tab OF flurstueck_t(
    constraint pk_flst PRIMARY KEY(flst_id),
    constraint uq_flst UNIQUE(flst_nr_zaehler,flst_nr_nenner,zusatz,flur_nr),
    flst_nr_zaehler NOT NULL,
    flur_nr NOT NULL,
    gemark_schluessel REFERENCES gemark_schluessel_tab
    );
    Now I have data in the gemark_schluessel_tab which looks like this (a sample):
    1 101 Borna
    2 102 Draisdorf
    Now I want to load data into my flurstuecke_tab with SQL*Loader, and there I have problems with my ref column gemark_schluessel.
    One data record looks like this in my file (it is without geometry):
    1|97|7||1|1|
    When I try to load my data record, it does not work. The reference (the system-generated OID) should be taken from gemark_schluessel_tab.
    LOAD DATA
    INFILE *
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE FLURSTUECKE_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    flst_id,
    flst_nr_zaehler,
    flst_nr_nenner,
    zusatz,
    flur_nr,
    gemark_schluessel REF(CONSTANT 'GEMARK_SCHLUESSEL_TAB',GEMARK_ID),
    gemark_id FILLER
    )
    BEGINDATA
    1|97|7||1|1|
    Is there an error in what I did?
    Thanks in advance
    Tig

  • Is there any difference in Oracle 9i SQL Loader and Oracle 10g SQL Loader

    Hi
    Can anyone tell me whether there is any difference between the Oracle 9i SQL*Loader and the Oracle 10g SQL*Loader?
    I am upgrading the 9i db to 10g and want to run the 9i SQL*Loader control files on the upgraded 10g db. So please let me know if there is any difference for which I need to consider modifications to the control files.
    Thank you in advance
    Adi

    answered

  • Problem with SQL*Loader and different date formats in the same file

    DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    System: AIX 5.3.0.0
    Hello,
    I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
    ...;2010-12-31;22/11/1932;...
    I load this data using the following lines in the control file:
    EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
    The relevant NLS parameters:
    NLS_LANGUAGE=FRENCH
    NLS_DATE_FORMAT=DD/MM/RR
    NLS_DATE_LANGUAGE=FRENCH
    If I load this file as is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are given as 4 digits. If I change the NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 Dec *2020*!!
    How can I get both date values to load correctly?
    Thanks!
    Sylvain

    This is very strange: after running a few tests I realized that if the year is 19XX it gets loaded as 2019, and if it is 20XX it becomes 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
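    (Not an answer from this thread, but a commonly suggested workaround is to declare both fields as CHAR instead of TIMESTAMP, so that only the explicit TO_DATE masks are applied and no NLS-dependent implicit conversion takes place, e.g.:)
    EXECUTIONDATE1   CHAR   NULLIF EXECUTIONDATE1=BLANKS   "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
    DELDOB           CHAR   NULLIF DELDOB=BLANKS           "TO_DATE(:DELDOB, 'DD/MM/YYYY')",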

  • SQL Loader and formatting dates

    I've got dates formatted like so: 2012-05-10T17:04:51-08:00
    How can I get SQL Loader to load these into a Date column??
    Thanks

    Thanks for the reply-
    SQL Loader is running on Win, Oracle 11 on Linux
    The format is the standard UTC (ISO 8601) one:
    YYYY-MM-DDTHH24:MI:SS±HH:MM, where the ±HH:MM refers to the time zone offset from GMT. The "T" is just a separator between date and time (and is always "T").
    I looked at your references, and tried a few dozen variants without success.
    Most of my attempts have been using something similar to this
    "YYYY-MM-DDTHH24:MI:SSTZH:TZM"
    or this:
    "YYYY-MM-DD'T'HH24:MI:SSTZH:TZM"
    or this:
    'YYYY-MM-DD"T"HH24:MI:SSTZH:TZM'
    Thanks
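    (Not from the thread itself, but since the value carries a zone offset it maps naturally to TIMESTAMP WITH TIME ZONE, and one way to avoid quoting the literal 'T' inside the control file's SQL string is to strip it with REPLACE. A sketch, assuming a hypothetical field/column named event_ts:)
    event_ts CHAR "TO_TIMESTAMP_TZ(REPLACE(:event_ts, 'T', ' '), 'YYYY-MM-DD HH24:MI:SSTZH:TZM')",
    If the target column is a plain DATE, the value is implicitly converted on insert and the offset and fractional seconds are discarded.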

  • SQL*Loader and binary data

    I have a C routine that builds SQL*Loader input files. The input files contain multiple records, with a couple of integer columns and a RAW(1400) field. The control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but is also somewhat co-operative (see below).
    So I basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field, then the '|' separator.
    I've noticed that if the size of the raw field is 400 bytes or less, everything works fine; I get the correct number of records in the database.
    Unfortunately, with a size of 401 or more, SQL*Loader parses the thing into twice as many records as it should. So if I've written 3 records to my input data file, with each record's raw field at 400 bytes or less I get 3 records loaded, but with any raw field of 401+ bytes, I get two records for each.
    Any ideas why, and how to correct this? Also, any ideas on a better way to do this? All the examples of large data in the online doc and the O'Reilly book favor large character data, which doesn't do me much good.
    TIA. John

  • SQL Loader and foreign characters in the data file problem

    Hello,
    I have run into an issue which I can't find an answer for. When I run SQL Loader, one of my control files is used to get file content (LOBFILE) and one of the fields in the data file has a path to that file. The control file looks like:
    LOAD DATA
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    ( doc_core_id "iceberg.seq_rpt_document_core.nextval",
    -- created_date POSITION(1) date "yyyy-mm-dd:hh24:mi:ss",
    created_date date "yyyy-mm-dd:hh24:mi:ss",
    document_size,
    hash,
    body_format,
    is_generic_doc,
    is_legacy_doc,
    external_filename FILLER char(275) ENCLOSED by '"',
    body LOBFILE(external_filename) terminated by EOF
    )
    A sample data file looks like:
    0,2012-10-22:10:09:35,21,BB51344DD2127002118E286A197ECD4A,text,N,N,"E:\tmp\misc_files\index_testers\foreign\شیمیایی.txt"
    0,2012-10-22:10:09:35,17,CF85BE76B1E20704180534E19D363CF8,text,N,N,"E:\tmp\misc_files\index_testers\foreign\ลอบวางระเบิด.txt"
    0,2012-10-22:10:09:35,23552,47DB382558D69F170227AA18179FD0F0,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\leesburgis_á_ñ_é_í_ó_ú_¿_¡_ü_99.doc"
    0,2012-10-22:10:09:35,17,83FCA0377445B60CE422DE8994900A79,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\làm thế nào bạn làm ngày hôm nay"
    The problem is that when I run this, SQL Loader throws an error that it can't find the file. It appears that it can't interpret the foreign characters in a way that allows it to find that path. I have tried adding a CHARACTERSET (using AL32UTF8 or UTF8) value in the control file, but that only has some success with Western languages, not the ones listed above. Also, there is no fixed set of languages that could appear in the data file; it essentially could be any language.
    Does anyone know if there is a way to somehow get SQL Loader to "understand" the file system paths when a folder and/or file name could be in some other language?
    Thanks for any thoughts - Peter

    Thanks for the reply Harry. If I try to open the file in various text editors like Wordpad, Notepad, GVIM, and Textpad, they all display the foreign characters differently. Only Notepad comes close to displaying the characters properly. I have a C# app that will read the file and display the contents, and it renders it fine. If you look at the directory of files in Windows Explorer, they are all displayed properly. So it seems things like .Net and Windows have some mechanism to understand the characters in order to render them properly. Other applications, again like Wordpad, do not know how to render them properly. It would seem that whatever SQL Loader is using to "read" the data files also is not rendering the characters properly, which prevents it from finding the directory path to the file. If I add "CHARACTERSET AL32UTF8" in the control file, all is fine when dealing with Western languages (e.g. German, Spanish) but not for the Eastern languages (e.g. Thai, Chinese). So... telling SQL Loader to use a characterset seems to work, but not in all cases. AL32UTF8 is the characterset that the Oracle database was created with. I have not had any luck if I try to set the CHARACTERSET to whatever the Thai character set is, for example. The problem there, though, is that even if that did work, I can't target specific languages because the data could come from anywhere. It's like I need some sort of global "super set" characterset to use. It seems like CHARACTERSET is the right track to follow, but I am not sure, and even if it is, is there a way to handle all languages?
    Thanks - Peter
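    (For reference, and not from the thread: the CHARACTERSET clause mentioned above sits directly after LOAD DATA in the control file, before INFILE; a sketch based on the control file earlier in this post:)
    LOAD DATA
    CHARACTERSET AL32UTF8
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    ...
    Note that this only governs how SQL*Loader interprets the contents of the data file (including the quoted file paths); the operating system still has to be able to resolve those paths in whatever encoding it uses for file names.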

  • Sql loader and bulk data

    hi,
    I want to insert 100,000 records daily into a table for the first month, and then in the next month these records are going to be replaced by new, updated records.
    There might also be a few additions and deletions among the previous records.
    Actually it's consumer data, so there might be a few consumers who have withdrawn from the utility, and there will be some more consumers added to the database.
    But almost 99% of the previous month's data has to be updated/replaced with the fresh month's data.
    What I have in mind is that I will use SQL*Loader to load the data for the first month, and then I will delete the previous data using SQL*Plus and load the fresh month's data using SQL*Loader again.
    1. Is this OK, or is there some better solution?
    2. I have heard of external tables - are they feasible in my scenario?
    3. I plan to make scripts for SQL*Plus and SQL*Loader and use them in batch files (OS: Windows 2003 Server, Oracle 9i database). Is there some better way to make the whole procedure automatic?
    Looking forward to your suggestions,
    nadeem ameer

    I would suggest you use external tables, since they are more flexible than
    SQL*Loader and are a better option.
    For using external tables:
    1) You will have to create a directory first.
    2) Generally, creation of the directory is done by SYS; hence, after creating the directory, read and write privileges have to be granted to the user.
    3) Create the external table.
    4) Now use the external table like a normal table to insert into, update or delete from your own table (see the short usage sketch after the example below).
    You can get more information from
    http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
    Create Directory <directory_name> as '<directory path where the file is present>';
    Grant read, write on directory <directory_name> to <username>;
    CREATE TABLE <table_name>
    (<column names>)
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY <directory_name>
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('<filename>')
    )
    PARALLEL 5
    REJECT LIMIT 200;
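    A short usage sketch (not from the original reply; table names are placeholders) - once the external table exists, the monthly refresh described above can be a truncate of the real staging table followed by an insert-select:
    TRUNCATE TABLE consumer_data;          -- the regular (heap) table being refreshed
    INSERT INTO consumer_data
    SELECT * FROM consumer_data_ext;       -- the external table over the monthly file
    COMMIT;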
    Hope this helps.

  • SQL*Loader and data enclosed by single quotes

    Hello,
    I need to migrate some data from Sybase to Oracle. Can't use the Migration Workbench because this is Sybase's SQL Anywhere product. We are therefore trying to load via SQL*Loader.
    Sybase outputs its data into text files and encloses it with single quotes, like:
    'ABC123',
    'UR94LL',
    '7YUHII'
    We are running into a problem because we cannot figure out how to use the OPTIONALLY ENCLOSED BY parameter with single quotes. If we just use the FIELDS TERMINATED BY parameter, then the entire string, including the quotation marks, gets loaded into the Oracle table. This causes bad records to be created when the data is exactly the length of the column width - the two quotation marks make the field longer than is allowed in the table. Plus, the data shouldn't have quotes around it in normal situations.
    We can do 'Find and Replace' to replace the single quotation marks with double quotations, but some of the Sybase tables are huge, and we'd like to avoid having to open and edit them. However, if this is the only way to go, then we'll have to use it. I just wondered whether anyone had run into this before and been able to solve it.
    Thanks.
    -melissa

    We are running into a problem because we cannot figure out how to use the OPTIONALLY ENCLOSED BY parameter with single quotes.
    Try
    OPTIONALLY ENCLOSED BY X'27'
    http://download-uk.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1015083
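    A minimal control-file sketch using that hex notation for the single quote (file, table and column names here are just placeholders):
    LOAD DATA
    INFILE 'sybase_export.txt'
    INTO TABLE target_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY X'27'
    (col1, col2, col3)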
