How to use header information in line records while inserting into table?

Hi,
Requirement: data in a flat file needs to be loaded into Oracle tables.
The file has header information (basically file-level information and some account info) and line information (the actual data that needs to be loaded into the Oracle tables).
The file has position-based data: the first 2 characters are '00' for a header record and '50' for a line record.
Now, the header has some extra info which needs to be loaded along with the detail lines (this has to be done using a SQL*Loader control file).
What should the control file logic be to capture the header information in the detail lines?
Please suggest.
I am calling SQL*Loader from Unix, so I could change the control file definition dynamically by reading the first line of the data file and modifying the control file before executing the sqlldr command. Is there any option to achieve this using SQL*Loader itself?
Thanks,
Venkat.

Please post details of your OS and database versions.
Use the WHEN clause to load header and detail lines into separate tables, then write PL/SQL code to perform the needed post-processing.
See the examples in the section titled "Distinguishing Different Input Record Formats" here: http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm
HTH
Srini
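
For illustration, a minimal sketch of that WHEN-clause approach. The staging table names, column names, positions and date formats below are assumptions, not taken from the original post; only the '00'/'50' record indicators come from the question.

LOAD DATA
INFILE 'accounts.dat'
APPEND
INTO TABLE stage_header
  WHEN (1:2) = '00'
  (rec_type    POSITION(1:2)   CHAR,
   account_no  POSITION(3:12)  CHAR,
   file_date   POSITION(13:20) DATE "YYYYMMDD")
INTO TABLE stage_lines
  WHEN (1:2) = '50'
  (rec_type    POSITION(1:2)   CHAR,
   line_data   POSITION(3:80)  CHAR)

Afterwards, assuming exactly one header record per file, the header values can be copied onto the detail rows in plain SQL or PL/SQL, for example:

INSERT INTO target_lines (account_no, file_date, line_data)
SELECT h.account_no, h.file_date, l.line_data
  FROM stage_header h CROSS JOIN stage_lines l;

This keeps the control file static, so there is no need to rewrite it from the Unix script for every run.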

Similar Messages

  • Avoiding duplicate records while inserting into the table

    Hi
    I tried the following insert statement, where I want to avoid duplicate records during the insert itself,
    but it gives me an error like "invalid identifier", even though the column exists in the table.
    Please let me know where I am making the mistake.
    INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate,
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm1
                           WHERE tm1.o_id = tm.o_id
                             AND tm1.sn_id = tm.sn_id
                             AND tm1.txt = tm.txt
                             AND tm1.typ = tm.typ
                             AND tm1.sn_time = tm.sn_time)

    Then:
    you are referencing the alias tm inside the subquery, but where is that defined there? Do you want something like this?
    INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm
                           WHERE sk.obj_id = tm.o_id
                             AND 100 = tm.sn_id
                             AND sk.key_txt = tm.txt
                             AND sk.obj_typ = tm.typ
                             AND sysdate = tm.sn_time)
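    As an aside, a hedged sketch of the same de-duplicated insert written as a single MERGE; the column names are taken from the statements above, but (as there) matching sn_time against SYSDATE will rarely find an existing row, so treat this only as a pattern:
    MERGE INTO t_map tm
    USING (SELECT 100        AS sn_id,
                  sk.obj_id  AS o_id,
                  sk.key_txt AS txt,
                  sk.obj_typ AS typ,
                  SYSDATE    AS sn_time
             FROM s_key sk
            WHERE sk.obj_typ = 'AY'
              AND SYSDATE BETWEEN sk.start_date AND sk.end_date
              AND sk.obj_id IN (100170, 1001054)) src
       ON (    tm.o_id    = src.o_id
           AND tm.sn_id   = src.sn_id
           AND tm.txt     = src.txt
           AND tm.typ     = src.typ
           AND tm.sn_time = src.sn_time)
     WHEN NOT MATCHED THEN
       INSERT (sn_id, o_id, txt, typ, sn_time)
       VALUES (src.sn_id, src.o_id, src.txt, src.typ, src.sn_time);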

  • Could anyone tell me how to read 4 lines together and insert them into a table

    Hi ,
    I want to load the data below into a table using SQL*Loader.
    The first 4 lines should be inserted as one single row in the table, and then, starting again from the next '01', it should load the following 4 lines into another row.
    01suresh
    02works
    03in
    04bankok
    01kumar
    02works
    03in
    04abudabi
    01Raju
    02works
    03in
    04france
    Could you please give me some suggestions on how I can accomplish this?
    Thanks in advance.
    Regards,
    Vino

    user1142030 wrote:
    The first 4 lines should be inserted as one single row in the table, and then, starting again from the next '01', it should load the following 4 lines into another row.
    Not a problem, use CONTINUEIF. Control file:
    LOAD DATA
    INFILE *
    REPLACE
    CONTINUEIF NEXT PRESERVE(1:2) != "01"
    INTO TABLE continueif
    TRAILING NULLCOLS
    (
     DUMMY FILLER POSITION(1:2),
     COL1 TERMINATED BY '02',
     COL2 TERMINATED BY '03',
     COL3 TERMINATED BY '04',
     COL4 TERMINATED BY '01'
    )
    BEGINDATA
    01suresh
    02works
    03in
    04bankok
    01kumar
    02works
    03in
    04abudabi
    01Raju
    02works
    03in
    04france
    Now:
    SQL> create table continueif(
      2                          col1 varchar2(10),
      3                          col2 varchar2(10),
      4                          col3 varchar2(10),
      5                          col4 varchar2(10)
      6                         )
      7  /
    Table created.
    SQL> host sqlldr scott@orcl/tiger control=c:\temp\continueif.ctl log=c:\temp\continueif.log
    SQL> select * from continueif
      2  /
    COL1       COL2       COL3       COL4
    suresh     works      in         bankok
    kumar      works      in         abudabi
    Raju       works      in         france
    3 rows selected.
    SQL>
    SY.

  • Use of sequence numbers while inserting into tables

    Hi,
    Consider that I have a stored procedure where I insert records into 30 to 40 tables.
    For example, I have a table A and a sequence SEQ_A for that table.
    Is there any difference in performance between the following two ways of
    inserting?
    1. SELECT SEQ_A.NEXTVAL INTO variable_a FROM DUAL;
       INSERT INTO A VALUES (variable_a);
    2. INSERT INTO A VALUES (SEQ_A.NEXTVAL);
    I have 30 to 40 insert statements like that using method 1.
    If I replace them with method 2, will there be any improvement in performance?
    Thanks & Regards
    Sundar

    This could be interesting. I agree that triggers should be used for data integrity, though even better is to only allow write access to tables via stored procedures.
    Anyway while you are waiting for the test case that shows triggers are faster, which could take some time, it is quick and easy to come up with a test case that shows the opposite.
    SQL> create table t (id number, dummy varchar2(1));
    Table created.
    real: 40
    SQL> insert into t select s.nextval, 'a' from all_objects;
    29625 rows created.
    real: 5038
    SQL> rollback;
    Rollback complete.
    real: 40
    SQL> create trigger tt before insert on t
      2  for each row
      3  begin
      4     select s.nextval into :new.id from dual;
      5  end;
      6  /
    Trigger created.
    real: 291
    SQL> insert into t select null, 'a' from all_objects;
    29626 rows created.
    real: 13350
    SQL> rollback;
    Rollback complete.
    real: 1082
    SQL>
    Note with triggers even the rollback took longer.
    Martin
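    Back to the original method 1 vs. method 2 question: the separate SELECT ... FROM DUAL in method 1 is an extra statement per row, so referencing NEXTVAL directly in the INSERT (method 2) can only be the same or faster. A minimal hedged sketch (the column name id is made up, since the post does not show the table definition):
    -- Method 2: reference the sequence directly, no SELECT ... FROM DUAL needed
    INSERT INTO a (id) VALUES (seq_a.NEXTVAL);
    -- If the generated key is needed afterwards, RETURNING avoids a second query
    DECLARE
      v_id a.id%TYPE;
    BEGIN
      INSERT INTO a (id) VALUES (seq_a.NEXTVAL)
      RETURNING id INTO v_id;
    END;
    /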

  • How do you keep display only fields from being inserted into tables?

    Morning. I have a problem where I am displaying a couple of fields on a screen and they are trying to insert into a table.
    I am running Apex 4.1 on 11gR2. I am building a set of screens: on the first one the user inputs a subject ID, which branches to a second screen that uses the subject ID to look up a couple of fields for display. There are a couple of fields the user fills in, and a row is inserted into a different table. But Apex is trying to insert the lookup fields into the second table when I really only need the subject ID.
    How can I tell Apex which fields to use in the insert and which not to use? I see in the debug output that it is putting all the screen fields into the insert statement.
    Thanks

    In the insert process, take out the columns you don't want to insert, e.g.
    insert into tbl (col1, col2, ....) values (p1, p2, ...)
    =>
    insert into tbl (col1, col5, ....) values (p1, p5, ...)

  • SSIS - Script Task creates a DTS var and using Foreach loop, Execute SQL needs to INSERT into table from this DTS var.

    I have a script task written in C# that creates an array of strings "arrayFields" after parsing a text file. It saves the array of strings in a DTS variable.
    Each element of the array is a comma-separated row that must be inserted into a table. For example, with X and Z as fields in the table:
    X1, X2,....Xn
    Z1,Z2,...Zn
    I am using a Foreach Loop to grab each row and then an Execute SQL Task to take each row from the array and insert each field of that row into a table.
    The SQL is something like:
    INSERT dbo.table values(field1, field2,...fieldn) arrayFields?
    What should this INSERT look like?

    I guess you implemented
    "Shredding a Recordset".
    Based on what I understood (correct me if I am wrong), you have difficulties mapping the input parameters; if so, here is the guide:
    http://www.sqlis.com/sqlis/post/The-Execute-SQL-Task.aspx
    In short it might look like:
    INSERT dbo.table  (ColumnA, ColumnB,...) VALUES (?,?...)
    The syntax for the T-SQL INSERT is http://technet.microsoft.com/en-us/library/dd776381%28v=sql.105%29.aspx
    Arthur
    MyBlog
    Twitter

  • Validate the record while uploading into table

    Hi,
    I am uploading records into a table, and it so happened that an incorrect record full of text data was in the input file, and it got uploaded along with the other records.
    Hence I need to modify the code to check for such data - it is text data.
    I have an 'empid' field (numbers only). I used that field to check for numeric data, something like this:
    if itab-empid+0(1) between '0' and '1'.
    move-corresponding itab to DTABLE.
    APPEND DTABLE.
    else.
    continue.
    endif.
    It is not working; it throws a short dump.
    Is there a better way to validate this? All I am doing is: if the empid field has any text data, do not read it into the internal table.
    Thanks,
    SK

    Hi,
    Check this, It will work.
    if itab-empid CO '12345 67890'.
    move-corresponding itab to DTABLE.
    APPEND DTABLE.
    else.
    endif.
    Regards
    L Appana

  • How to use strip lines in SSRS line charts

    Hi,
    I want to know how to use strip lines in SSRS line charts.

    Hi ,
    You can use the links below:
    https://msdn.microsoft.com/en-us/library/dd239316.aspx?f=255&MSPPError=-2147217396
    https://sqlserverbiblog.wordpress.com/tag/strip-line/
    http://blogs.adatis.co.uk/blogs/jeoc/archive/2013/07/23/adding-strip-lines-to-reports.aspx
    Thanks
    Please Mark This As Answer or vote for Helpful Post if this helps you to solve your question/problem. http://techequation.com

  • How to use Heading 1 (h1) while posting an answer (formatting the answer) on SDN

    Hi,
    Please let me know
    how to use Heading 1 (h1) while posting an answer (formatting the answer) on SDN.
    Regards,
    RS

    >
    raj.mohans wrote:
    Hi,
    I made a couple of mistakes while posting answers...
    like spoon-feeding users, giving some basic links for their queries...
    posted 3-4 links for the same answer, for better understanding of the topics...
    took help from another site,
    which I admit, and I will not do it in future.
    Please let me know how likely it is that they will block my account.
    Regards
    Rajeev
    Why do you say "hi Rajeev" and then sign off as Rajeev and then edit your post to remove it but forget that you copy&pasted (again..) into the subject title field...?
    No, I am not really puzzled much.
    Are you dizzy? 
    Cheers,
    Julius

  • How to use complex function as condition in Oracle Rule Decision Table?

    How to use complex function as condition in Oracle Rule Decision Table?
    We want to compare an incoming date range with the date defined in the rules. This date comparison is based on the input date in the fact & the date as defined for each rule. Can this be done in a decision table?

    I see a couple of problems here.
    First, what you posted below is not a syntactically valid query. It seems to be part of a larger query, specifically, this looks to be only the GROUP BY clause of a query.
    Prabu ammaiappan wrote:
    Hi,
    I have a group function in the query. Below is the query I have used:
    GROUP BY S.FREIGHTCLASS,
    R.CONTAINERKEY,
    S.SKU,
    S.DESCR ||S.DESCRIPTION2,
    S.PVTYPE,
    RD.LOTTABLE06,
    R.WAREHOUSEREFERENCE,
    RD.TOLOC,
    R.ADDWHO,
    R.TYPE,
    S.CWFLAG,
    S.STDNETWGT,
    S.ORDERUOM,
    R.ADDDATE,
    C.DESCRIPTION,
    (CASE WHEN P.POKEY LIKE '%PUR%' THEN 'NULL' ELSE to_char(P.PODATE,'dd/mm/yyyy') END),
    NVL((CASE WHEN R.ADDWHO='BOOMI' THEN RDD.SUPPLIERNAME END),SS.COMPANY),
    RDD.BRAND,
    S.NAPA,
    RD.RECEIPTKEY,
    R.SUSR4,
    P.POKEY,
    RDD.SUSR1,
    r.STATUS, DECODE(RDD.SUSR2,' ',0,'',0,RDD.SUSR2),
    rd.SUSR3
    Second, the answer to your primary question, "How do I add a predicate with a MAX() function to my where clause?" is that you don't. As you discovered, if you attempt to do so, you'll find it doesn't work. If you stop and think about how SQL is processed, it should make sense to you why the SQL is not valid.
    If you want to apply a filter condition such as:
    trunc(max(RD.DATERECEIVED)) BETWEEN TO_DATE('01/08/2011','DD/MM/YYYY') AND TO_DATE('01/08/2011','DD/MM/YYYY')
    you should do it in a HAVING clause, not a WHERE clause:
    select ....
      from ....
    where ....
    group by ....
    having max(some_date) between this_date and that_date;
    Hope that helps,
    -Mark

  • How does insert into table select from table work in the JDBC driver?

    Hi. Suppose a table has two LOB fields, one BLOB and one CLOB. I use the following SQL statement to copy a row into a new row:
    insert into table (id, file_body, file_content) select new_id, file_body, file_content from table where id = '111';
    After committing on the connection, I can see the copied record in the table and both LOB fields are not null; this is the expected behavior.
    However, some days later the copy function started to be a problem: the BLOB field named file_body can be null after the copy job is done, while the other fields copy successfully.
    The issue cannot be reproduced every time.
    I suppose the JDBC driver may allocate a byte buffer on the heap to perform the copy operation for BLOB fields; if there is not enough memory available on the heap, the copy operation may fail while the commit on the connection still succeeds. I can see a lot of OOM errors in the log files and I believe this contributes to the issue.
    Hope someone can give me a hint or comments.
    Thanks,
    SuoNayi

    I want to figure out where the memory leak is, and I have tried the following approaches, but none worked:
    1. I have tried to dump the memory of the JVM but failed; I can see the following errors:
    [root@localhost xxx]# jmap -heap:format=b 3027
    Attaching to process ID 3027, please wait...
    Debugger attached successfully.
    Server compiler detected.
    JVM version is 1.5.0_16-b02
    Unknown oop at 0x00002b21a24cd550
    Oop's klass is null
    Finding object size using Printezis bits and skipping over...
    Unknown oop at 0x00002b21a3634380
    Oop's klass is null
    Finding object size using Printezis bits and skipping over...
    2. The thread stack cannot be dumped successfully either:
    Thread 3046: (state = BLOCKED)
    - java.lang.Object.wait(long) @bci=0 (Compiled frame; information may be imprecise)
    Error occurred during stack walking:
    the version of java is:
    java version "1.5.0_16"
    Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_16-b02)
    Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_16-b02, mixed mode)
    I have added the -XX:+HeapDumpOnOutOfMemoryError option to the JVM arguments.
    If there are other solutions please let me know, thanks.

  • Trigger a process when a record is inserted into database

    Hi, could someone tell me how to do the following (if it's possible):
    when a record is inserted into a table in a database,
    I want it to somehow trigger an action which will put an item into user's To-Do inBox of the WorkSpace,
    the user can then open the form in the To-Do and use it to view the data in the newly inserted record.
    this "user"'s login ID is part of the inserted record.
    thanks

    Mmm, a trigger? I don't know of a way.
    But I suppose you could have a process with a timer that periodically queries the database for changes and then does the same thing.
    You could also perhaps write a service and then use the Java API to start a process.
    Maybe take a look here:
    http://blogs.adobe.com/livecycle/2010/12/how-to-invoke-a-livecycle-process-periodically.html

  • How do I insert into a table through Forms?

    Hi
    I have developed a custom form based on a custom table.
    The only way to insert data into the database table is through the form.
    There are two tables: one table stores all contract details and a second table maintains history for this.
    One condition: (col1, col2, col3, col4) must be a unique combination, but we are not creating any PK or FK at the database level; everything is captured at the form level.
    If the combination of all 4 columns already exists, then we should not insert that record.
    If the combination of the 4 columns does not exist, then insert into the table.
    I have used just PRE-INSERT and PRE-UPDATE triggers.
    I think this is basic form functionality; by itself the form inserts and updates records, and right now it is doing just that.
    But I have to add the above condition. How can I do that?
    Please provide me some example code.
    Thank you.
    Hope someone can help me.

    SQL> create table t
      2  (object_id    number
      3  ,object_name  varchar2(30));
    Table created.
    SQL>
    SQL> create sequence t_seq;
    Sequence created.
    SQL>
    SQL> insert into t (object_id, object_name)
      2  select t_seq.nextval
      3        ,object_name
      4  from   all_objects
      5  ;
    52637 rows created.
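    For the duplicate check the poster asked about, a hedged sketch of the kind of code that could go in a Forms PRE-INSERT trigger; the block name :contracts and the table name contract_details are made up, only the col1..col4 uniqueness rule comes from the question:
    DECLARE
      v_cnt PLS_INTEGER;
    BEGIN
      SELECT COUNT(*)
        INTO v_cnt
        FROM contract_details
       WHERE col1 = :contracts.col1
         AND col2 = :contracts.col2
         AND col3 = :contracts.col3
         AND col4 = :contracts.col4;
      IF v_cnt > 0 THEN
        MESSAGE('This combination already exists; record not inserted.');
        RAISE FORM_TRIGGER_FAILURE;  -- stops the insert of this record
      END IF;
    END;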

  • Commit for every 1000 records in an INSERT INTO ... SELECT statement

    Hi, I have the following INSERT INTO ... SELECT statement.
    The SELECT statement (which has joins) returns around 6 crores (60 million) rows of data. I need to insert that data into another table.
    Please suggest the best way to do that.
    I'm using an INSERT INTO ... SELECT statement, but I want to commit every 1000 records.
    How can I achieve this?
    insert into emp_dept_master
    select e.ename ,d.dname ,e.empno ,e.empno ,e.sal
       from emp e , dept d
      where e.deptno = d.deptno       ------ how to use commit for every 1000 records
    Thanks

    Smile wrote:
    > Hi, I have the following INSERT INTO ... SELECT statement.
    > The SELECT statement (which has joins) returns around 6 crores (60 million) rows of data. I need to insert that data into another table.
    Does the other table already have records, or is it empty?
    If it is empty then you can drop it and create it as:
    create table your_another_table
    as
    <your select statement that returns 60000000 records>
    > Please suggest the best way to do that.
    > I'm using an INSERT INTO ... SELECT statement, but I want to commit every 1000 records.
    That is not the best way. Frequent commits may lead to ORA-01555 errors.
    A nice article from AskTom on this: http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:275215756923
    > How can I achieve this?
    > insert into emp_dept_master
    > select e.ename ,d.dname ,e.empno ,e.empno ,e.sal
    >    from emp e , dept d
    >   where e.deptno = d.deptno       ------ how to use commit for every 1000 records
    It depends on the reason behind wanting to split your transaction into small chunks. Most of the time there is no good reason for that.
    If you are trying to improve performance by doing so then you are wrong; it will only degrade performance.
    To improve performance you can use the APPEND hint in the insert, you can try PARALLEL DML, and if you are on 11g and above you can use DBMS_PARALLEL_EXECUTE (http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_parallel_ex.htm#CHDIJACH) to break your insert into chunks and run them in parallel.
    So if you can tell us the actual objective, we can offer more help.
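    A hedged sketch of the single-transaction, direct-path approach described above, reusing the poster's own query (whether APPEND and parallel DML actually help depends on the environment, so test first):
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND */ INTO emp_dept_master
    SELECT e.ename, d.dname, e.empno, e.empno, e.sal
      FROM emp e, dept d
     WHERE e.deptno = d.deptno;
    COMMIT;  -- a single commit at the end, not one per 1000 rows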

  • Read a text file and insert into a table using UTL_FILE

    Hi
    I have a script that reads the file and inserts into a table, but I want the error records to be loaded into an error table, so I am sending you my script; please help me fix the error-log part.
    script
    DECLARE
    v_line VARCHAR2(2000);
    v_file utl_file.file_type;
    v_dir VARCHAR2(250);
    v_filename VARCHAR2(50);
    BEGIN
    v_dir :='MID5010_DOC1TP';
    v_filename := 'OPT_CM_BASE.txt';
    v_file := utl_file.fopen(v_dir, v_filename, 'r');
    LOOP
    BEGIN
    utl_file.get_line(v_file, v_line);
    EXCEPTION
    WHEN no_data_found THEN
    EXIT;
    END ;
    v_line := REPLACE(v_line,'|','|~');
    INSERT
    INTO optum_icd10cm_base VALUES
    ( REPLACE(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,1),'a~','a'),'.'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,2),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,3),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,4),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,5),'a~','a'),
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,6)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,6),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,6),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,7)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,7),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,7),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,8)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,8),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,8),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,9)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,9),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,9),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,10)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,10),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,10),'a~','a'),'mm-dd-yyyy')
    END,
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,11),'a~','a')
    );
    -- commit;
    END LOOP;
    utl_file.fclose(v_file);
    END;
    text file
    A50.0||Short|Long|Full|01-01-2009|01-2009||01-01-2013|09-18-2012|C|
    A50.1||Short|Long|Full|01-01-2009|01-01-2009||001-2013|09-18-2012|C|
    A50.2||Short|Long|Full|01-01-2009|01-01-2009|67|01-01-2013|09-18-2012|C|
    A50.3||Short|Long|Full|011-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.4||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|5|
    A50.5|R|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.6||Short|Long||01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.7||Short||Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    2345||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    B222|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.2|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.3|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    D642|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.5|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.6|D|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.7|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A001|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A009|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.10|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A0109|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A30|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A316|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A317|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    To be clear: read the text file, insert into the table, and load the error records into an error table.
    Please help me.

    Hi,
    I prepared a script using UTL_FILE, but I got an error: ORA-01861: literal does not match format string
    Script:
    DECLARE
    f utl_file.file_type;
    s VARCHAR2(32000);
    f1 VARCHAR2(100);
    f2 varchar2(100);
    F3 VARCHAR2(100);
    F4 VARCHAR2(100);
    F5 VARCHAR2(100);
    F6 DATE;
    F7 DATE;
    F8 DATE;
    F9 DATE;
    F10 DATE;
    f11 CHAR(1);
    BEGIN
    --DBMS_OUTPUT.ENABLE(100000);
    f := utl_file.fopen('MID5010_DOC1TP', 'OPT_CM_BASE.txt', 'R');
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(f, s);
    f1 := REGEXP_SUBSTR (s,'[^|]+',1,1);
    f2 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,2);
    F3 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,3);
    F4 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,4);
    F5 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,5);
    F6 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,6),'mm-dd-yyyy');
    F8 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,8),'mm-dd-yyyy');
    F7 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,7),'mm-dd-yyyy');
    F9 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,9),'mm-dd-yyyy');
    F10 :=to_date(REGEXP_SUBSTR (REPLACE(s,'||','||') ,'[^|]+',1,10),'mm-dd-yyyy');
    f11 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,11);
    INSERT
    INTO OPTUM_ICD10CM_BASE
    ( CODE,
    STATUS,
    SHORT_DESCRIPTION,
    LONG_DESCRIPTION,
    FULL_DESCRIPTION,
    CODE_EFFECTIVE_DATE,
    CHANGE_EFFECTIVE_DATE,
    TERMINATION_DATE,
    RELEASE_DATE,
    CREATION_DATE,
    VALIDITY
    )
    VALUES
    (
    F1,
    F2,
    F3,
    F4,
    F5,
    F6,
    F7,
    F8,
    F9,
    F10,
    f11
    );
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    EXIT;
    END;
    END LOOP;
    UTL_FILE.FCLOSE(F);
    END;
    Please help me (in my organization, UTL_FILE is the only approach allowed by the standards).
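    Since the original ask was to route bad lines to an error table, here is a hedged sketch of one way to do it: parse and insert each line inside its own nested block, and on any failure write the raw line plus the Oracle error text to an error table. ERR_OPTUM_ICD10CM_BASE and its columns (raw_line, error_message, load_date) are assumptions, not an existing table; the UTL_FILE.GET_LINE call and its NO_DATA_FOUND exit stay outside this block exactly as in the scripts above.
    BEGIN
      f1 := REGEXP_SUBSTR(s, '[^|]+', 1, 1);
      -- ... parse f2 .. f11 exactly as in the script above ...
      INSERT INTO optum_icd10cm_base
        (code, status, short_description, long_description, full_description,
         code_effective_date, change_effective_date, termination_date,
         release_date, creation_date, validity)
      VALUES (f1, f2, f3, f4, f5, f6, f7, f8, f9, f10, f11);
    EXCEPTION
      WHEN OTHERS THEN
        -- assumed error table; captures the offending raw line and the error text
        INSERT INTO err_optum_icd10cm_base (raw_line, error_message, load_date)
        VALUES (s, SQLERRM, SYSDATE);
    END;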
