Problem with SQL*Loader and different date formats in the same file

DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
System: AIX 5.3.0.0
Hello,
I'm using SQL*Loader to import semi-colon separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources and this results in us having different date formats within the same file. For example:
...;2010-12-31;22/11/1932;...
I load this data using the following lines in the control file:
EXECUTIONDATE1     TIMESTAMP     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
DELDOB          TIMESTAMP     NULLIF DELDOB=BLANKS          "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
The relevant NLS parameters:
NLS_LANGUAGE=FRENCH
NLS_DATE_FORMAT=DD/MM/RR
NLS_DATE_LANGUAGE=FRENCH
If I load this file as is, the values loaded into the table are 31 dec 2010 and 22 nov *2032*, even though the years are written with 4 digits. If I change NLS_DATE_FORMAT to DD/MM/YYYY then the second date value is loaded correctly, but the first value is loaded as 31 dec *2020* !!
How can I get both date values to load correctly?
Thanks!
Sylvain

This is very strange. After running a few tests I realized that if the year is 19XX it gets loaded as 2019, and if it is 20XX it gets loaded as 2020. I'm guessing it may have something to do with certain environment variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
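For reference, the next thing I plan to try is describing both fields as CHAR in the control file, so that the raw text reaches the TO_DATE expressions untouched and the explicit masks, not the NLS settings, drive the conversion. A rough sketch of the relevant control-file lines (same field names as above, not yet verified against our files):
-- Sketch only: CHAR hands the field to the SQL expression as plain text,
-- and each TO_DATE applies its own explicit mask.
EXECUTIONDATE1     CHAR     NULLIF EXECUTIONDATE1=BLANKS     "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
DELDOB             CHAR     NULLIF DELDOB=BLANKS             "TO_DATE(:DELDOB, 'DD/MM/YYYY')",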

Similar Messages

  • URGENT: XML-SQL: Error parsing different date formats in the same XML file.

    Hi Everyone,
    I have a big problem while using the XML-SQL utility. I have an XML file with different date formats, and I get an exception like the one below.
    "oracle.xml.sql.OracleXMLSQLException: 'java.text.ParseException: Unparseable date:"
    And the XML file is :
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Info Key="ID">
         <Insert>
              <ID>1</ID>
              <BookingDate>10.12.2001</BookingDate>
              <OpeningTime>19:16</OpeningTime>
         </Insert>
    </Info>
    I have tried using different setDateFormat calls, but the two formats conflict with each other:
    sav.setDateFormat("d.M.y");
    sav.setDateFormat("H:m:s");
    Can someone help me with this?
    Thanks in advance.
    Subramanian.

    You might have to describe the columns as VARCHAR first, then convert them to the right format afterwards.
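    In other words (a rough sketch with made-up table and column names, since the actual schema isn't shown here): stage the date and time as plain text, then convert with explicit masks once the rows are loaded.
    -- Sketch only: table and column names are invented for illustration.
    CREATE TABLE info_stage (
      id           NUMBER,
      booking_date VARCHAR2(20),   -- holds '10.12.2001' as plain text
      opening_time VARCHAR2(10)    -- holds '19:16' as plain text
    );
    -- After the XML load, convert with explicit masks into the real table:
    INSERT INTO info (id, booking_ts)
    SELECT id,
           TO_DATE(booking_date || ' ' || opening_time, 'DD.MM.YYYY HH24:MI')
    FROM   info_stage;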

  • Different Date Formats in the same "From"

    Hello
    We use different date formats depending on the user's language.
    In a report this is simple to implement ... read the date field twice (in different formats, dd/mm/yyyy and mm/dd/yyyy) and display only one based on the language.
    I want to be able to do the same in a "form" which updates a date. In that case the above approach is not allowed.
    Does anyone have a solution for this using APEX forms, or do I need to write a process to achieve this?
    Thanks
    Pete

    Pete,
    You don't specify what version of Application Express you're using, so I'll assume APEX 3.1 or later.
    Are you setting the date format yourself, based upon the user's language?
    1) You could have the application's language derived from Browser, and then APEX would automatically set the NLS_LANGUAGE and NLS_TERRITORY settings accordingly. Then, I would simply use 'DS' as the Application Date Format. For Date Pickers on a form, I would use item type 'Date Picker (use Application Date Format)'.
    2) If you're already determining the date format yourself and you don't want APEX to do it for you, ensure that you are setting this value in an application item (e.g., PETE_DATE_FORMAT). Then, you could still use Application Date Format with a value of &PETE_DATE_FORMAT. (inclusive of the trailing period).
    When you use Application Date Format, the conversion should happen automatically for you so that you don't have to code this yourself on the Form.
    I hope this helps.
    Joel

  • Problems with SQL*Loader and java Runtime

    Hi. I'm trying to start SQL*Loader on Oracle 8 by using the Runtime class in this way:
    try {
        Process p = Runtime.getRuntime().exec( "c:\\oracle\\ora81\\bin\\sqlldr.exe parfile=c:\\parfile\\carica.par" );
        /* If I insert this line my application never stops */
        p.waitFor();
    } catch( Exception e ) {
        e.printStackTrace();
    }
    I have seen that if there are fewer than 400 lines to insert everything works fine, but if the number of lines is greater than 400, all the data goes into my tables yet my log file is always left open for writing. Can anyone tell me why?
    Thanks

    Just a note: if the executable "sqlldr.exe" does not stop (quit running) by itself, p.waitFor() will wait forever.

  • I have many photos with the file extension .PDD, and Photo Deluxe 4 no longer runs in Windows 7. How can I open them?  Next, in Elements 11, how do I load and print different pictures with different size options on the same page?

    I have many photos with the file extension .PDD, and Photo Deluxe 4 no longer runs in Windows 7. How can I open them?  Next, in Elements 11, how do I load and print different pictures with different size options on the same page?
    Thanks,
    Shir

    sbmgrams wrote:
    I have many photos with the file extension .PDD, and Photo Deluxe 4 no longer runs in Windows 7. How can I open them?
    See here:
    Reading PhotoDeluxe PDD Files

  • Error with the data format in the TXT file, sending as an Email attachment

    Hi all,
    I have a problem with the data formatting in the TXT file when sending it as an attachment via email using the FM "SO_DOCUMENT_SEND_API1".
    For eg:
    The data in the TXT file is looking like as follows:
    0 0 0 0 2        L O U D S P E A K R   O T H E R   3 8 W h i t e                  0 0
    0031  L O U D S P E A K R   O T H E R   3 8 Black                               0 000
    38    L O U D S P E A K R   O T H E R   3 8 Brown                                  0 00040
    L O U D S P E A K R   O T H E R   3 8 Brown                                         0 00042
    and so on
    But it should come as :
    0 0 0 0 2  L O U D S P E A K R   O T H E R   3 8 W h i t e
    0 0 0031   L O U D S P E A K R   O T H E R   3 8 Black
    0 00038    L O U D S P E A K R   O T H E R   3 8 Brown
    0 00040    L O U D S P E A K R   O T H E R   3 8 Brown
    All the internal tables are correctly filled.
    The code is as follows:
    gwa_objtxt = 'Please find attached DATA EXTRACT Sheet'.
      append gwa_objtxt to git_objtxt.
    describe table git_objtxt lines gv_cnt.
      clear git_doc_data.
      read table git_objtxt index gv_cnt.
      git_doc_data-doc_size = ( gv_cnt - 1 ) * 255 + strlen( gwa_objtxt ).
      git_doc_data-obj_langu = sy-langu.
      git_doc_data-obj_descr = lv_mtitle.
      append git_doc_data.
      clear git_packing_list.
      refresh git_packing_list.
    git_packing_list-transf_bin = space.
      git_packing_list-head_start = 1.
      git_packing_list-head_num = 0.
      git_packing_list-body_start = 1.
      git_packing_list-body_num   = gv_cnt.
      git_packing_list-doc_type = 'RAW'.
      append git_packing_list.
      Clear : gv_cnt.
      Describe table git_objbin lines gv_cnt.
      git_packing_list-transf_bin = 'X'.
      git_packing_list-head_start = 1.
      git_packing_list-head_num = 1.
      git_packing_list-body_start = 1.
      git_packing_list-body_num = gv_cnt.
      git_packing_list-doc_type = 'TXT'.
      git_packing_list-obj_descr = 'ATTACH.TXT'.
      git_packing_list-obj_name = 'book'.
      git_packing_list-doc_size = gv_cnt * 255.
      APPEND git_packing_list.
      clear git_receivers.
      refresh git_receivers.
      git_receivers-receiver = gv_eid.
      git_receivers-rec_type = 'U'.
      append git_receivers.
    CALL FUNCTION 'SO_DOCUMENT_SEND_API1'
        EXPORTING
          DOCUMENT_DATA              = git_doc_data
          PUT_IN_OUTBOX              = 'X'
          COMMIT_WORK                = 'X'
        TABLES
          PACKING_LIST               = git_packing_list
          CONTENTS_BIN               = git_objbin
          CONTENTS_TXT               = git_objtxt
          RECEIVERS                  = git_receivers
        EXCEPTIONS
          TOO_MANY_RECEIVERS         = 1
          DOCUMENT_NOT_SENT          = 2
          DOCUMENT_TYPE_NOT_EXIST    = 3
          OPERATION_NO_AUTHORIZATION = 4
          PARAMETER_ERROR            = 5
          X_ERROR                    = 6
          ENQUEUE_ERROR              = 7
          OTHERS                     = 8.

    Please post the code that populates CONTENTS_BIN = git_objbin, i.e. how it is getting filled.
    0 0 0 0 2 L O U D S P E A K R O T H E R 3 8 W h i t e 0 0
    0031 L O U D S P E A K R O T H E R 3 8 Black
    0 000
    38
    From this I am not able to tell whether it is an over-population problem or a concatenation problem.
    Why don't you append to a final table, like:
    data : begin of itxt occurs 0,
    s1(132) type c,
    end of itxt.
    loop at itab.
    itxt-s1+0(4) = itab-f1.
    itxt-s1+4(6) = itab-f2.
    itxt-s1+10(8) = itab-f3.
    itxt-s1+18(4) = itab-f4.
    append itxt.
    clear itxt.
    endloop.
    and pass this to the CONTENTS_BIN of the FM.
    regards,
    vijay.
    Can you please mail the text file and the expected output to my mail id [email protected] so that I can check it? From the data provided I am not able to verify the result properly.

  • I have plenty of 3G signal, but when I make a call the 3G signal disappears! I read the manuals and I cannot find the problem anywhere! I have tried the same with other brands and other phones and it does not happen; the same goes when I receive a call!

    I have plenty of 3G signal, but when I make a call the 3G signal disappears! I read the manuals and I cannot find the problem anywhere! I have tried the same with other brands and other phones and it does not happen; the same goes when I receive a call!

    Who is your carrier?  Are you actually on 3G?
    Do you have 3G turned on or off?  Settings > General > Network > Enable 3G

  • I'm having problems with iPad 2 and my smart cover.  The attachment seems to be depolarized and it moves around, causing the device to toggle and flash like it's coming on or shorting.  Anyone having problems?

    I'm having problems with iPad 2 and my smart cover.  The attachment seems to be depolarized and tries to connect using the back (curved) edge of the magnetized bracket.  When I try to attach it, it moves around, causing the device to toggle in a way that looks like the camera flash.  Anyone having problems? Help.

    Update. My carrier says the antenna of my iPhone 4s is not as good as the one in the iPhone 3G. They can't solve the problem. Their signal is too weak for the 4s at the places where I spend most of my time. Not the answer I expected, but I'm free to choose another carrier. Hope this helps.
    This was not the correct answer; I accidentally clicked the button.
    Message was edited by: vasch

  • Can't I use an XML schema and an OLEDB data connection at the same time?

    Hello to all and thanks in advance. I use an XML schema and an OLEDB data connection at the same time, and the problem is that when I try to export the XML, the outcome is not what I expect. Without the OLEDB connection everything is OK (just the schema) and the XML complies with the schema.
    Can't I have both the schema and OLEDB, with the exported XML coming out as I want it?

    You can use both at the same time, but not for Internet access, if that's what you're asking.
    Now there is a thing called Link Aggregation, which combines a number of interfaces for speed/redundancy, but it really only works locally, and then only with ALL special equipment in the route, and most likely OSX Server involved.
    Sorry.

  • Loading and Playing 2 videos at the same time

    Hello everyone
    I'm having a problem loading and playing two videos at the same time in XletView.
    The background video I'm loading by editing channels.xml, which is located in the config folder inside the xletview folder. I just have to add the file path, and when I start the video it starts playing.
    But now I have a problem: I have to load a file that is on my HD and play it at the same time as the main (background) video.
    Do anyone know how to do this?

    anyone?
    PowerBook G4 1.67 GHz Mac OS X (10.4.7) 1.5 GB ram

  • Import and process larger data with SQL*Loader and Java resource

    Hello,
    I have a project to import data from a text file on a schedule. It is a large amount of data, nearly 20,000 records per hour.
    After that, we have to analyze the data and export the results into another database.
    I have researched SQL*Loader and Java resources for these tasks, but I have no experience with them.
    I'm afraid that with such a volume of data Oracle could slow down, or the session in the Java resource application could time out.
    Please give me some advice about a solution.
    Thank you very much.

    With the '?' mark I mean "How can I link this COL1 with a column in the CSV file?"
    Attilio

  • Problem with SQL loader - "maximum length"

    using SQL*Loader: Release 8.1.7.0.0
    ===================================
    (full CTL enclosed below)
    I have a problem with several rows, in which I'm getting the "Field in data file exceeds maximum length" error.
    The DB field (referer) is a VARCHAR2(4000), and the field in the error rows never exceeds a few hundred characters. According to the Oracle docs I should be able to load fields that are no bigger than the DB field, so what gives?
    I tried the variation
    referer CHAR "SUBSTR(:referer,1,100)"
    for this field, which causes all the "referer" columns in the "good" rows to load no more than 100 characters, but the same error repeats for the same rows!
    The input file is an IIS log, and the field is the REFERER field. It's pure ASCII encoded; is there some character that causes Oracle to behave this way? Is this a bug?
    Here is one "bad" row: the "bad" field starts with "ht
    tp://web and is enclosed in quotes. I have replaced the client IP and other fields with xxx for privacy reasons.
    After that, I have enclosed my CTL as well.
    Any help?
    Yoram Ayalon
    BTW - I verified in the LOG file that the loader is reading my options for the columns as described in the CTL. No problem there.
    "2003-06-30 11:11:12" xxx.xxx.xxx.xxx WEBSRVXX 80 GET /xxx.xxx 200 0 778 1359 "ht
    tp://web.ask.com/redir?bpg=http%3a%2f%2fweb.ask.com%2fweb%3fq%3dWhat%2bis%2bsign
    al%2bcommunication%253f%26o%3d0%26page%3d1&q=What+is+signal+communication%3f&u=h
    ttp%3a%2f%2ftm.wc.ask.com%2fr%3ft%3dan%26s%3da%26uid%3d032EBF1A318A100F3%26sid%3
    d3d2bbe4f8d2bbe4f8%26qid%3d4B2346DA8A56C6418CB4DCB9091EEBA7%26io%3d0%26sv%3dza5c
    b0db2%"
    LOAD DATA
    INFILE '/tmp/mod_websrvxx.txt'
    APPEND INTO TABLE tmpLogs
    FIELDS TERMINATED BY WHITESPACE optionally enclosed by '"'
    (LogDate DATE "YYYY-MM-DD HH24:MI:SS", ClientIP, ServerName, ServerPort,
    ClientMethod, UriStem, Status, BytesSent, BytesReceived, TimeTaken,
    Referer CHAR "SUBSTR(:Referer, 1, 100)" )

    Use:
    readsize=3000000
    or some large number to raise this limit.
    Check the link below for a detailed explanation:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:380010202423963671::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:2167288643374,
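    Not from the reply above, but another common cause of this exact message: SQL*Loader's default maximum length for a character field is 255 bytes regardless of how wide the table column is, so a long referer can still trip the error against a VARCHAR2(4000) column. Declaring the length explicitly in the control file usually addresses it; a sketch against the CTL shown above:
    -- Sketch only: give the field an explicit length so the loader's field buffer
    -- matches the VARCHAR2(4000) column instead of the 255-byte default.
    Referer CHAR(4000) "SUBSTR(:Referer, 1, 4000)"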

  • How to handle Multiple date formats for the same date field in SQL*Loader

    Dear All,
    I got a requirement where I need to get data from a text file and insert it into an Oracle table.
    I am using SQL*Loader to populate the data from the text file into my table.
    The file has one field where I am expecting date data in multiple formats, like dd/mon/yyyy, yyyy/dd/mon, yyyy/mon/dd, mm/dd/yyyy, mon/dd/yyyy.
    While using SQL*Loader, I can see loading is failing for records where we have formats like yyyy/dd/mon, yyyy/mon/dd, mon/dd/yyyy.
    Is there any way in SQL*Loader to mention all these date formats so that the date data goes smoothly into the underlying DATE column in the table?
    Appreciate your response on this.
    Thanks,
    Madhu K.

    The point being made was, are you sure that you can uniquely identify a date format from the value you receive? Are you sure that the data stored is only of a particular limited set of formats?
    e.g. if you had a value of '07/08/03' how do you know what format that is?
    It could be...
    7th August 2003 (thus assuming it's DD/MM/RR format)
    or
    8th July 2003 (thus assuming it's MM/DD/RR format)
    or
    3rd August 2007 (thus assuming it's RR/MM/DD format)
    or
    8th March 2007 (thus assuming it's RR/DD/MM format)
    or even more obscurely...
    3rd July 2008 (MM/RR/DD)
    or
    7th March 2008 (DD/RR/MM)
    Do you have any information to tell you what formats are valid that would allow you to be specific and know what date format is meant?
    This is a classic example of why dates should be stored on the database using DATE datatype and not VARCHAR2. It can lead to corruption of data, especially if the date can be entered in any format a user wishes.
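    If (and only if) each incoming format can be recognised unambiguously, for example because a 4-digit year always appears in a fixed position, one option is to branch inside the SQL expression in the control file. A sketch with a made-up field name, covering just two distinguishable cases:
    -- Sketch only: MYDATE is a made-up field name; extend the CASE with one branch
    -- per format that can be told apart reliably.
    MYDATE CHAR(30) "CASE WHEN SUBSTR(:MYDATE, 5, 1) = '/' THEN TO_DATE(:MYDATE, 'YYYY/MM/DD') ELSE TO_DATE(:MYDATE, 'DD/MM/YYYY') END"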

  • SQL*Loader and binary data

    I have a C routine that builds SQL*Loader input files. The input files contain multiple records, with a couple of integer columns and a raw(1400) field. The control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but also is somewhat co-operative (see below).
    So I basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field, then the '|' separator.
    I've noticed that if the size of the raw field is 400 bytes or less, everything works fine and I get the correct number of records in the database.
    Unfortunately, with a size of 401 or more, SQL*Loader parses the thing into twice as many records as it should. So if I've written 3 records to my input data file, with each record's raw field at 400 bytes or less I get 3 records loaded, but any with a raw field of 401+ bytes, I get two records for each.
    Any ideas why, and how to correct this? Also, any ideas on a better way to do this? All the examples of large data in the online doc and the O'Reilly book favor large character data, which doesn't do me much good.
    tia. john

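    Not from the replies here, but for what it's worth: a 2-byte binary length followed by the raw bytes is the layout SQL*Loader's VARRAW datatype expects, so describing the field that way, and dropping the '|' text separator in favour of a fixed or length-described record format, may avoid the record splitting. A sketch of the field list with made-up column names:
    -- Sketch only: column names and integer widths are invented; assumes the '|'
    -- separator is no longer embedded between binary records.
    (col1    INTEGER,
     col2    INTEGER,
     payload VARRAW(1400))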

  • Problem with SQL Authenticator and SOA EM console.

    Hi,
    In my project, I have a need to authenticate users from database tables. To achieve this I defined a SQL Authenticator inside the WebLogic admin console. But the problem is that after this, when I log in to the EM console of SOA, the home page shows all deployments in a DOWN state, including SOA-INFRA. Yet I am able to deploy a BPEL project and execute it normally. Why is the EM console showing all deployments and soa-infra as down?
    Thanks,
    Naga.
    Edited by: 984573 on Feb 5, 2013 9:56 PM

    Hi Anuj & Nicolas,
    Thanks for your reply.
    Surprisingly, I found the solution. The problem is with the order of the Authentication Providers. If I put the "SQL Authenticator" at the top of the order, as the first item in the list, then I face the above error in the EM console. I re-ordered the "SQL Authenticator" to be last in the list, and now the SOA EM console is working fine. I really do not understand the significance of the order of the Authentication Providers.
    Thanks for your help.
    Regards,
    Naga.
