SQL*Loader data load is very slow

Hi,
On my production server I have an insert performance problem. A regular SQL*Loader job loads files into the database, and the inserts keep getting slower.
For the first 2 to 3 hours each file takes 8 to 10 seconds to load; after that, a file takes 5 minutes.
My understanding is that OS I/O is very slow. For the first 3 hours the DB buffer cache has free buffers and data is inserted into the cache normally,
but once the cache fills, sessions stall on free buffer waits and the inserts slow down. If that is right, please tell me how to improve the I/O.
Sharing some analysis from my server:
[root@myserver ~]# iostat
Linux 2.6.18-194.el5 (myserver) 06/01/2012
avg-cpu: %user %nice %system %iowait %steal %idle
3.34 0.00 0.83 6.66 0.00 89.17
Device: tps Blk_read/s Blk_wrtn/s Blk_read Blk_wrtn
sda 107.56 2544.64 3140.34 8084953177 9977627424
sda1 0.00 0.65 0.00 2074066 16
sda2 21.57 220.59 1833.98 700856482 5827014296
sda3 0.00 0.00 0.00 12787 5960
sda4 0.00 0.00 0.00 8 0
sda5 0.69 2.75 15.07 8739194 47874000
sda6 0.05 0.00 0.55 5322 1736264
sda7 0.00 0.00 0.00 2915 16
sda8 0.50 9.03 5.24 28695700 16642584
sda9 0.51 0.36 24.81 1128290 78829224
sda10 0.52 0.00 5.98 9965 19004088
sda11 83.71 2311.26 1254.71 7343426336 3986520976
[root@myserver ~]# hdparm -tT /dev/sda11
/dev/sda11:
Timing cached reads: 10708 MB in 2.00 seconds = 5359.23 MB/sec
Timing buffered disk reads: 540 MB in 3.00 seconds = 179.89 MB/sec
[root@myserver ~]# sar -u -o datafile 1 6
Linux 2.6.18-194.el5 (mca-webreporting2) 06/01/2012
09:57:19 AM CPU %user %nice %system %iowait %steal %idle
09:57:20 AM all 6.97 0.00 1.87 16.31 0.00 74.84
09:57:21 AM all 6.74 0.00 1.25 17.48 0.00 74.53
09:57:22 AM all 7.01 0.00 1.75 16.27 0.00 74.97
09:57:23 AM all 6.75 0.00 1.12 13.88 0.00 78.25
09:57:24 AM all 6.98 0.00 1.37 16.83 0.00 74.81
09:57:25 AM all 6.49 0.00 1.25 14.61 0.00 77.65
Average: all 6.82 0.00 1.44 15.90 0.00 75.84
Historical sar data (semicolon-delimited):
mca-webreporting2;601;2012-05-27 16:30:01 UTC;2.54;1510.94;3581.85;0.00
mca-webreporting2;600;2012-05-27 16:40:01 UTC;2.45;1442.78;3883.47;0.04
mca-webreporting2;599;2012-05-27 16:50:01 UTC;2.44;1466.72;3893.10;0.04
mca-webreporting2;600;2012-05-27 17:00:01 UTC;2.30;1394.43;3546.26;0.00
mca-webreporting2;600;2012-05-27 17:10:01 UTC;3.15;1529.72;3978.27;0.04
mca-webreporting2;601;2012-05-27 17:20:01 UTC;9.83;1268.76;3823.63;0.04
mca-webreporting2;600;2012-05-27 17:30:01 UTC;32.71;1277.93;3495.32;0.00
mca-webreporting2;600;2012-05-27 17:40:01 UTC;1.96;1213.10;3845.75;0.04
mca-webreporting2;600;2012-05-27 17:50:01 UTC;1.89;1247.98;3834.94;0.04
mca-webreporting2;600;2012-05-27 18:00:01 UTC;2.24;1184.72;3486.10;0.00
mca-webreporting2;600;2012-05-27 18:10:01 UTC;18.68;1320.73;4088.14;0.18
mca-webreporting2;600;2012-05-27 18:20:01 UTC;1.82;1137.28;3784.99;0.04
[root@myserver ~]# vmstat
procs -----------memory---------- ---swap-- -----io---- --system-- -----cpu------
r b swpd free buff cache si so bi bo in cs us sy id wa st
0 1 182356 499444 135348 13801492 0 0 3488 247 0 0 5 2 89 4 0
[root@myserver ~]# dstat -D sda
----total-cpu-usage---- --dsk/sda-- -net/total- ---paging-- ---system--
usr sys idl wai hiq siq| read writ| recv send| in out | int csw
3 1 89 7 0 0|1240k 1544k| 0 0 | 1.9B 1B|2905 6646
8 1 77 14 0 1|4096B 3616k| 433k 2828B| 0 0 |3347 16k
10 2 77 12 0 0| 0 1520k| 466k 1332B| 0 0 |3064 15k
8 2 77 12 0 0| 0 2060k| 395k 1458B| 0 0 |3093 14k
8 1 78 12 0 0| 0 1688k| 428k 1460B| 0 0 |3260 15k
8 1 78 12 0 0| 0 1712k| 461k 1822B| 0 0 |3390 15k
7 1 78 13 0 0|4096B 6372k| 449k 1950B| 0 0 |3322 15k
AWR report output
Wait Events (ordered by wait time desc, waits desc; idle events last)
Event                        Waits      %Time -outs  Total Wait Time (s)  Avg wait (ms)  Waits /txn
free buffer waits            1,591,125        99.95               19,814             12      129.53
log file parallel write         31,668         0.00                1,413             45        2.58
buffer busy waits                  846        77.07                  653            772        0.07
control file parallel write     10,166         0.00                  636             63        0.83
log file sync                   11,301         0.00                  565             50        0.92
write complete waits               218        94.95                  208            955        0.02
SQL> select 'free in buffer (NOT_DIRTY)',
  2         round(((select count(DIRTY) from v$bh where DIRTY = 'N') * 100) / (select count(*) from v$bh), 2) || '%' DIRTY_PERCENT
  3  from dual
  4  union
  5  select 'keep in buffer (YES_DIRTY)',
  6         round(((select count(DIRTY) from v$bh where DIRTY = 'Y') * 100) / (select count(*) from v$bh), 2) || '%'
  7  from dual;

'FREEINBUFFER(NOT_DIRTY)'  DIRTY_PERCENT
free in buffer (NOT_DIRTY) 10.71%
keep in buffer (YES_DIRTY) 89.29%
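
With 89% of the buffer cache dirty and free buffer waits dominating the AWR report, the evidence points at DBWR not writing dirty buffers out fast enough, rather than at the load itself. A minimal sketch of how one might verify and mitigate this; the parameter values are illustrative assumptions, not recommendations, so test before touching production:

  -- Watch whether 'free buffer waits' keeps climbing while the load runs:
  select event, total_waits, round(time_waited_micro / 1e6) seconds_waited
  from v$system_event
  where event in ('free buffer waits', 'write complete waits', 'buffer busy waits')
  order by time_waited_micro desc;

  -- More DBWn processes can drain dirty buffers faster when the write side
  -- is the bottleneck (static parameter; needs an instance restart):
  alter system set db_writer_processes = 4 scope = spfile;

  -- A larger cache only postpones the stall; it does not cure slow writes:
  alter system set db_cache_size = 4G scope = spfile;

If the waits persist after that, the ~180 MB/s that hdparm reports for sequential reads may simply not be available to the datafiles under concurrent load, and the fix belongs at the storage layer.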
Rag....

1)
Yes, this is a partitioned table with a local partitioned index on it.
SQL> desc GR_CORE_LOGGING
Name Null? Type
APPLICATIONID VARCHAR2(20)
SERVICEID VARCHAR2(25)
ENTERPRISENAME VARCHAR2(25)
MSISDN VARCHAR2(15)
STATE VARCHAR2(15)
FROMTIME VARCHAR2(25)
TOTIME VARCHAR2(25)
CAMP_ID VARCHAR2(50)
TRANSID VARCHAR2(25)
MSI_INDEX NUMBER
SQL> select index_name,column_name from user_ind_columns where table_name='GR_CORE_LOGGING';
INDEX_NAME           COLUMN_NAME
GR_CORE_LOGGING_IND  MSISDN
2) I tried a direct path load. After that I also dropped the table, recreated the partitioned table, and built a fresh index, but the issue is still the same.
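
Since the only index is the local GR_CORE_LOGGING_IND on MSISDN, one experiment worth trying is to take index maintenance out of the load entirely and rebuild afterwards. A sketch only; the partition name below is a hypothetical placeholder:

  -- stop row-by-row index maintenance during the load:
  alter index GR_CORE_LOGGING_IND unusable;

  -- run the sqlldr job with direct=true skip_unusable_indexes=true,
  -- then rebuild each affected local index partition:
  alter index GR_CORE_LOGGING_IND rebuild partition P_20120601;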

Similar Messages

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL*Loader data file.
    The program will take a table name as input and build a SELECT statement
    with the column list derived from USER_TAB_COLUMNS in the data dictionary,
    assuming multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader .dat file.
    The sample code below for writing a CLOB column gives a file write error.
    How can I write multiple CLOBs to the .dat file so that the control file handles them correctly?
    -- Note: the file must be opened with an explicit maximum line size, e.g.
    --   l_file_handle := utl_file.fopen(dir, name, 'w', 32767);
    -- otherwise utl_file.put raises ORA-29285 (file write error) as soon as
    -- a chunk exceeds the default 1024-byte line buffer.
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;
    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);
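
    For reference, a self-contained variant of the same idea. The directory object, file name, and source query are hypothetical, and a single-byte character set is assumed; opening the file in byte mode ('wb') and writing with put_raw sidesteps the text-mode line-length limits entirely:

    DECLARE
      l_file   utl_file.file_type;
      l_lob    CLOB;
      l_offset PLS_INTEGER := 1;
      l_amount PLS_INTEGER := 16000; -- keep each chunk under the 32767-byte put_raw limit
    BEGIN
      l_file := utl_file.fopen('DATA_PUMP_DIR', 'export.dat', 'wb', 32767);
      SELECT narrative INTO l_lob FROM my_table WHERE id = 1; -- hypothetical source row
      WHILE l_offset <= dbms_lob.getlength(l_lob) LOOP
        utl_file.put_raw(l_file,
                         utl_raw.cast_to_raw(dbms_lob.substr(l_lob, l_amount, l_offset)),
                         TRUE); -- autoflush each chunk
        l_offset := l_offset + l_amount;
      END LOOP;
      utl_file.put_raw(l_file, utl_raw.cast_to_raw(chr(10)), TRUE); -- record terminator
      utl_file.fclose(l_file);
    END;
    /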

  • Facebook loading is very slow with nearly no images , i'm using mac

    Facebook loading is very slow with nearly no images; I'm using Mac OS X 10.8.

    I have tried many times and the same problem persists.

  • Youtube app loads videos very slow On my ipad mini

    The YouTube app loads videos very slowly on my iPad mini. I can let a video buffer for 10 seconds; it plays and then stops because it has to buffer again, and it converts the video to very poor quality. I have a 20 Mb internet connection. My software version is iOS 6.1.3.

    Check your internet connection speed. You probably have a very slow connection.
    https://itunes.apple.com/sg/app/speedtest.net-mobile-speed/id300704847?mt=8

  • SQL server is Running Very Slow

    hi,
    I used the following query to find out how many connections are connected; it shows 70 connections. My RAM is 16 GB, but SQL Server is still running very slowly. How can I speed it up, or how can I analyse the cause? I checked even when no job was running; still the same speed. I'm using stored procedures and SQL jobs, no inline queries.
    SELECT
    DB_NAME(dbid) as DBName,
    COUNT(dbid) as NumberOfConnections,
    loginame as LoginName
    FROM
    sys.sysprocesses
    WHERE
    dbid > 0
    GROUP BY
    dbid, loginame

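    Connection count alone rarely explains slowness. A possible next step is to see what the active requests are actually waiting on; a sketch using the standard DMVs (the session_id > 50 filter is a rough way to exclude system sessions):

    SELECT r.session_id, r.status, r.wait_type, r.wait_time,
           r.blocking_session_id, r.cpu_time, r.total_elapsed_time
    FROM sys.dm_exec_requests r
    WHERE r.session_id > 50
    ORDER BY r.total_elapsed_time DESC;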

  • LKM step Load huge data very slow

    Hi All,
    When I try to load 4 million rows from a flat file into my Oracle database,
    it takes around 2.5 hours to finish step 3, "LOAD DATA", which I understand is where ODI loads data from the flat file into the C$ table.
    Is there any way to speed it up?
    I just chose the simple LKM File to SQL option.
    Thanks
    I Z

    Step 3 shown here only creates the temp table; it is not the SQL load itself yet.
    create table <%=odiRef.getObjectName("T"+odiRef.getTableName("COLL_SHORT_NAME"),"W")%>
    (
    <%=odiRef.getSrcColList("","[COL_NAME] [DEST_CRE_DT]","[COL_NAME] [DEST_CRE_DT]",",\n","")%>
    )
    <%=odiRef.getUserExit("WORK_TABLE_OPTIONS")%>
    This auto-generates SQL like:
    create table ESBUSER.TC$_0TEST
    (
    ROW CODE VARCHAR2(1), -- this column name is auto-generated by the code above; I have no way to change ROW CODE to ROW_CODE
    B VARCHAR2(20),
    C VARCHAR2(20),
    D VARCHAR2(40)
    )
    I don't know why it generates ROW CODE without the "_"...
    Please help.
    Thanks
    I Z

  • Yahoo Finance Portfolios - add/remove page and update data very slow to load

    The Add/Remove symbols page only partially loads then stops dead. Can take a minute or more to complete loading. And when I try to update quantities or price I get a whirling color gif which only sometimes reaches the point of letting me load data. Firefox, no problem.
    Thanks!
    Fresnel1
    New Jersey


  • Data load becomes very slow

    Hi, after migrating from version 5 to 6.5 the data load has become very slow. With V5 the data load took 1 hour; with 6.5 it takes about 3 hours. The calculation takes the same time. Any idea?

    Too many sub-VIs could not be found, so I cannot give you more than some advice. But I see that you run all your loops at full speed. I do not think that is very wise. Insert "Wait (ms)" functions in all while loops, except the loops handling the DAQ functions, since those are controlled by an occurrence. In loops that only handle user input you may set the wait time as high as 500 ms; in more important loops use a shorter time.
    Besides which, my opinion is that Express VIs ("Carthage must be destroyed") should be deleted.
    (Sorry, no LabVIEW "brag list" so far)

  • Pre parse terrible SQL Loader data file

    Long story, but I'm on Oracle EBS 11.5.10.2, which means I'm stuck with a SQL*Loader from the dark ages (8.0.6 - yes, I know, and yes, it's still supported under EBS).
    Anyhow, I have a vendor sending very annoying data. For example:
    |"Fan-Hou" dessert|
    |"Donald is cool | yes"|
    |"Fred "is" cool | yes"|
    Due to the limitations of such an old SQL*Loader, I need to end up with:
    |"Fan-Hou dessert"|
    |"Donald is cool | yes"|
    |"Fred "is" cool | yes"|
    So you can see it's really only the first row that causes the issue: I need to shift the quote out to the end of the string (just before the closing |). The other two lines are just for reference, to show that I get data like that too and need to keep it as is.
    I have a rather inelegant solution using perl, but I'm sure someone out there knows a cleverer way to achieve the same result. Before you ask: no, the vendor won't change the data.
    Edited by: user13007502 on 07-May-2013 01:35

    There are probably various ways you can do that. I would try the sed command.
    The idea is the following:
    1. Replace the double quotes of words that start and end with a space with a special character.
    2. Remove all double quotes.
    3. Add double quotes to the beginning and end of each line.
    4. Restore the double quotes protected in step 1.
    For instance:
    $ cat testfile
    |"Fan-Hou" dessert|
    |"Donald is cool | yes"|
    |"Fred "is" cool | yes"|
    Replace double quotes with {} for words that start and end with a space:
    sed -i 's: "\(.*\)" : {\1} :g' testfile
    $ cat testfile
    |"Fan-Hou" dessert|
    |"Donald is cool | yes"|
    |"Fred {is} cool | yes"|
    Remove all double quotes
    $ sed -i 's:"::g' testfile
    $ cat testfile
    |Fan-Hou dessert|
    |Donald is cool | yes|
    |Fred {is} cool | yes|
    Add a double quote after each leading | and before each trailing |:
    $ sed -i 's:^|:|":g' testfile
    $ sed -i 's:\(.*\)|:\1"|:g' testfile
    $ cat testfile
    |"Fan-Hou dessert"|
    |"Donald is cool | yes"|
    |"Fred {is} cool | yes"|
    Finally replace { and } with double quotes:
    sed -i 's:{:":g' testfile
    sed -i 's:}:":g' testfile
    Here is the result:
    # cat testfile
    |"Fan-Hou dessert"|
    |"Donald is cool | yes"|
    |"Fred "is" cool | yes"|

  • Page loading is very slow

    Hi,
    We are using a BAPI to fetch data from R/3. When displaying the data on the screen, the page load takes time, and scrolling the page is also very slow.
    Can you help me improve the performance of the page?
    Thanks in advance.

    Hi
    First identify whether the issue is on the BAPI side or not. Test the BAPI on the R/3 side. If it takes too long, tune the ABAP code. If it is fine, try using cache settings at the iView level.
    Regards,
    Murali.

  • SQL Loader - data exceeds maximum length

    I am having an issue with SQL*Loader falsely reporting that a column is too long in a CSV upload file. The offending column, Notes, is defined in the staging table as VARCHAR2(1000). The text in the Notes column in the upload file for the record being rejected is only 237 characters long. I examined the raw data file with a hex editor and there are no special characters embedded in the column. The CSV upload was recreated but the false error remains.
    Any ideas what to check? Any suggestion appreciated.
    Here are the pertinent files.
    Control File:
    LOAD DATA
       INFILE 'Mfield_Upl.dat'
       BADFILE 'Mfield_Upl.bad'
       TRUNCATE
       INTO TABLE Mfield_UPL_Staging
       FIELDS TERMINATED BY ',' optionally enclosed by '"'
       (
         ControlNo CHAR,
         PatientID CHAR,
         CollectDate DATE "MM/DD/YYYY",
         TestDate DATE "MM/DD/YYYY",
         AnalyteDesc CHAR,
         Results CHAR,
         HiLoFlag CHAR,
         LoRange CHAR,
         HiRange CHAR,
         UnitOfMeas CHAR,
         Comments CHAR,
         Notes CHAR,
         ClinicalEvent CHAR,
         OwnerLName CHAR,
         OwnerFName CHAR,
         PetName CHAR,
         AssecNo CHAR,
         SpecimenID CHAR
       )

    Staging Table:
    CREATE TABLE Mfield_UPL_Staging
        (
        ControlNo      VARCHAR2(20),
        PatientID      VARCHAR2(9),
        CollectDate    DATE,
        TestDate       DATE,
        AnalyteDesc    VARCHAR2(100),
        Results        VARCHAR2(100),
        HiLoFlag       CHAR(10),
        LoRange        VARCHAR2(15),
        HIRange        VARCHAR2(15),
        UnitOfMeas     VARCHAR2(25),
        Comments       VARCHAR2(100),
        Notes          VARCHAR2(1000),
        ClinicalEvent  VARCHAR2(20),
        OwnerLName     VARCHAR(50),
        OwnerFName     VARCHAR(50),
        PetName        VARCHAR(50),
        AssecNo        NUMBER(10),
        SpecimenID     NUMBER(10)
        )

    Error Log File:
    SQL*Loader: Release 9.2.0.1.0 - Production on Wed Aug 11 08:22:58 2010
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Control File:   Mfield_UPL_CSV.ctl
    Data File:      Mfield_UPL.dat
      Bad File:     Mfield_Upl.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table MFIELD_UPL_STAGING, loaded from every logical record.
    Insert option in effect for this table: TRUNCATE
       Column Name                  Position   Len  Term Encl Datatype
    CONTROLNO                           FIRST     *   ,  O(") CHARACTER           
    PATIENTID                            NEXT     *   ,  O(") CHARACTER           
    COLLECTDATE                          NEXT     *   ,  O(") DATE MM/DD/YYYY     
    TESTDATE                             NEXT     *   ,  O(") DATE MM/DD/YYYY     
    ANALYTEDESC                          NEXT     *   ,  O(") CHARACTER           
    RESULTS                              NEXT     *   ,  O(") CHARACTER           
    HILOFLAG                             NEXT     *   ,  O(") CHARACTER           
    LORANGE                              NEXT     *   ,  O(") CHARACTER           
    HIRANGE                              NEXT     *   ,  O(") CHARACTER           
    UNITOFMEAS                           NEXT     *   ,  O(") CHARACTER           
    COMMENTS                             NEXT     *   ,  O(") CHARACTER           
    NOTES                                NEXT     *   ,  O(") CHARACTER           
    CLINICALEVENT                        NEXT     *   ,  O(") CHARACTER           
    OWNERLNAME                           NEXT     *   ,  O(") CHARACTER           
    OWNERFNAME                           NEXT     *   ,  O(") CHARACTER           
    PETNAME                              NEXT     *   ,  O(") CHARACTER           
    ASSECNO                              NEXT     *   ,  O(") CHARACTER           
    SPECIMENID                           NEXT     *   ,  O(") CHARACTER           
    Record 1042: Rejected - Error on table MFIELD_UPL_STAGING, column NOTES.
    Field in data file exceeds maximum length
    Table MFIELD_UPL_STAGING:
      3777 Rows successfully loaded.
      1 Row not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.

    Try:
    -- Etc ...
      Notes CHAR(1000),
    -- Etc ...
    SQL*Loader limits character fields to 255 bytes unless you specify a length.
    :p

  • SQL*Loader Date issue

    How do I get SQL*Loader to recognize AM/PM? My control file is below. The error I get is: syntax error at line 14. Expecting field-name, found ")".
    OPTIONS ( SKIP=0)
    LOAD DATA
    INFILE '/ftp_data/labor_scheduling/kronos_punch_temp/kronos_punches.csv'
    BADFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.bad'
    DISCARDFILE '/ftp_data/labor_scheduling/kronos_punches/kronos_punches.dsc'
    INTO TABLE "STAFFING"."KRONOS_TEMP_PUNCHES"
    TRUNCATE
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (EMPLOYEE_ID,
    SCHEDULE_DATE DATE "MM/DD/YY",
    PUNCH DATE "MM/DD/YY HH:MI AM"
    )

    Hi Gary,
    Could your problem be with this?
    (EMPLOYEE_ID,
    You have no data type for that field.
    Regards
    Peter
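
    If the DATE mask keeps tripping the parser, a commonly used workaround is to declare the field as CHAR and apply TO_DATE as a SQL string expression in the control file. A sketch, untested against this exact file:

    PUNCH CHAR "TO_DATE(:PUNCH, 'MM/DD/YY HH:MI AM')"

    SQL*Loader binds the raw field as :PUNCH and lets the database do the conversion, so the format mask only has to satisfy TO_DATE, not the control file grammar.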

  • SQL Loader Date Problems

    Hi Guys,
    We have a SQL*Loader script that used to load data in the 'mm/dd/yyyy' format. After the script was migrated to a separate server, the dates are being stored in 'dd/mm/yyyy' format in a VARCHAR2 column for DOB.
    Please find the loader script below:
    "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',"
                                        if ttsa_list_typ <> "AuthRep" then
                                        inscmd.CommandText= "insert into tsa_lists values ('" & ttsa_list_typ & "','" & ttsa_list_num & "',?,?,?,?,?,?,?,?,?,?,sysdate,?,?)"
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_sid",200,1,30,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstletter",200,1,1,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_middlename",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_dob",200,1,100,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_pob",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_citizenship",200,1,200,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_cleared",200,1,3,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_misc",200,1,2000,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_firstname_orig",200,1,500,"")
                                        inscmd.Parameters.Append inscmd.CreateParameter("ttsa_lastname_orig",200,1,500,"")Data coming in the below format:
    SID      CLEARED  LASTNAME                     DOB
    3799509  A        ABD AL SALAM MARUF ABDALLAH  01-Jul-55
    3799512  A        ABD AL SALAM MARUF ABDALLAH  01-Jan-80
    3727959  A        KHALID KHALIL IBRAHIM MAHDI  11-Nov-50
    3458238  A        KHALID KHALIL IBRAHIM MAHDI  08-Jan-81
    3458242  A        KHALID KHALIL IBRAHIM MAHDI  31-Jul-81
    3458231  A        KHALID KHALIL IBRAHIM MAHDI  01-Aug-81
    2407275  A        MUSA BARARUDDIN Y DAGAM      19-Aug-62
    (FIRSTNAME, MIDDLENAME, TYPE, POB, CITIZENSHIP, PASSPORT/IDNUMBER and MISC are empty in this extract)
    Please can you guys suggest a way.
    Cheers,
    Shazin

    1) That does not look like anything recognized by the SQL*Loader utility.
    2) If you just mean a SQL statement to load data, then:
    a) If the table column is a DATE type, use the TO_DATE() function.
    b) If the column is VARCHAR2: you should not save dates in a VARCHAR2 column at all.
    In any case, check the NLS_DATE_FORMAT parameter on the database and/or the same parameter on the client side.
    :p
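
    To illustrate the NLS point: an explicit format mask makes the conversion independent of the server's or client's NLS_DATE_FORMAT. A minimal, hypothetical example:

    -- explicit masks on both conversions; no NLS_DATE_FORMAT dependence
    SELECT TO_CHAR(TO_DATE('01-Jul-55', 'DD-Mon-RR'), 'MM/DD/YYYY') AS dob_display
    FROM dual;

    With the RR element, '55' resolves to 1955, which is presumably what date-of-birth data from the 1950s needs.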

  • Sql loader - Data loading issue with no fixed record length

    Hi All,
    I am trying to load the following data through SQL*Loader. However, only records 1, 3 and 4 load successfully into the table; the rest of the records end up in the BAD file. What is missing in my syntax?
    .ctl file:
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE icap_gcims
    TRAILING NULLCOLS
    (
        CUST_NBR_MAIN           POSITION(1:9)     CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
        CONTACT_TYPE            POSITION(10:11)   CHAR NULLIF (CONTACT_TYPE=BLANKS),
        INQUIRY_TYPE            POSITION(12:13)   CHAR NULLIF (INQUIRY_TYPE=BLANKS),
        INQUIRY_MODEL           POSITION(14:20)   CHAR NULLIF (INQUIRY_MODEL=BLANKS),
        INQUIRY_COMMENTS        POSITION(21:60)   CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
        OTHER_COLOUR            POSITION(61:75)   CHAR NULLIF (OTHER_COLOUR=BLANKS),
        OTHER_MAKE              POSITION(76:89)   CHAR NULLIF (OTHER_MAKE=BLANKS),
        OTHER_MODEL_DESCRIPTION POSITION(90:109)  CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
        OTHER_MODEL_YEAR        POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    )
    data.txt file:
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    Thanks,

    Hi,
    It would be better if you had given the table structure; I checked your script and it was fine:
    11:39:01 pavan_Real>create table test1(
    11:39:02   2  CUST_NBR_MAIN  varchar2(50),
    11:39:02   3  CONTACT_TYPE varchar2(50),
    11:39:02   4  INQUIRY_TYPE varchar2(50),
    11:39:02   5  INQUIRY_MODEL varchar2(50),
    11:39:02   6  INQUIRY_COMMENTS varchar2(50),
    11:39:02   7  OTHER_COLOUR varchar2(50),
    11:39:02   8  OTHER_MAKE varchar2(50),
    11:39:02   9  OTHER_MODEL_DESCRIPTION varchar2(50),
    11:39:02  10  OTHER_MODEL_YEAR varchar2(50)
    11:39:02  11  );
    Table created.
    11:39:13 pavan_Real>select  * from test1;
    no rows selected
    C:\Documents and Settings\ivy3905>sqlldr ara/ara@pavan_real
    control = C:\control.ctl
    SQL*Loader: Release 9.2.0.1.0 - Production on Sat Sep 12 11:41:27 2009
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 5
    11:42:20 pavan_Real>select count(*) from test1;
      COUNT(*)
             5

    control.ctl
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE test1
    TRAILING NULLCOLS
    (
    CUST_NBR_MAIN POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
    CONTACT_TYPE POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
    INQUIRY_TYPE POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
    INQUIRY_MODEL POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
    INQUIRY_COMMENTS POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
    OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
    OTHER_MAKE POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
    OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
    OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    )
    data.txt
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    CUST_NBR_MAIN     CONTACT_TYPE     INQUIRY_TYPE     INQUIRY_MODEL     INQUIRY_COMMENTS     OTHER_COLOUR     OTHER_MAKE     OTHER_MODEL_DESCRIPTION     OTHER_MODEL_YEAR
    000000831     KH     AN     NULL     NULL     NULL     NULL     NULL     NULL
    000000900     UH     FA      WANTS     NEW WARRANTY ID 000001017OHAL     NULL     NULL     NULL     NULL
    000001110     KH     AP     NULL     NULL     NULL     NULL     NULL     NULL
    000001812     NH     DE     231291C     OST OF SERVICE INSPECTIONS TOO HIGH MAXI     MA 92 MK     NULL     NULL     NULL
    000002015     TP     FA     910115C     UST UPSET WITH AIRPORT DLR. $200 FOR PLU     GS,OIL,FILTER C     HANGE. FW     NULL     NULL
    - Pavan Kumar N
    Edited by: Pavan Kumar on Sep 12, 2009 11:46 AM

  • Sql Loader Data Import Error!!!

    Hi,
    I have to import a huge volume of records from a CSV file (150 MB) into a table using SQL*Loader. The input file contains 3,286,909 records. I am using the following control file and command to run SQL*Loader.
    Control File:
    load data
    infile Test.csv
    into table TEST_LOAD
    fields terminated by ',' optionally enclosed by '"'
    (
    ID integer external,
    PATH char
    )
    Command:
    C:\CSVFiles\CSV>sqlldr system/tiger control=test.ctl log=test.log readsize=200000000 bindsize=200000000
    After running the above command I am only able to import 1,215,717 records. Once SQL*Loader reaches this point it stops without any error. Sometimes I get 'SQL*Loader-510: Physical record in data file (test.csv) is longer than the maximum (1048576)'.
    Please help me to perform this import operation.
    Thanks

    http://www.morganslibrary.org/reference/externaltab.html
    Example:
    I have a file on my server in a folder c:\mydata called test.csv, which is a comma separated file...
    1,"Fred",200
    2,"Bob",300
    3,"Jim",50As sys user:
    CREATE OR REPLACE DIRECTORY TEST_DIR AS "c:\mydata";
    GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser;Note: creates a directory object, pointing to a directory on the server and must exist on the server (it doesn't create the physical directory).
    As myuser:
    SQL> CREATE TABLE ext_test
      2    (id      NUMBER,
      3     empname VARCHAR2(20),
      4     rate    NUMBER)
      5  ORGANIZATION EXTERNAL
      6    (TYPE ORACLE_LOADER
      7     DEFAULT DIRECTORY TEST_DIR
      8     ACCESS PARAMETERS
      9       (RECORDS DELIMITED BY NEWLINE
    10        FIELDS TERMINATED BY ","
    11        OPTIONALLY ENCLOSED BY '"'
    12        (id,
    13         empname,
    14         rate
    15        )
    16       )
    17     LOCATION ('test.csv')
    18    );
    Table created.
    SQL> select * from ext_test;
            ID EMPNAME                    RATE
             1 Fred                        200
             2 Bob                         300
             3 Jim                          50
    SQL>
