Exporting data from text file to a table using utl_file

Dear all,
I have a text file as shown below and a table with 12 columns. I need to insert the data from this text file into the table story_books.
CREATE TABLE story_books (
book_id NUMBER,
Category VARCHAR2(100 BYTE),
Book_type VARCHAR2(100 BYTE),
Name VARCHAR2(700 BYTE),
Location VARCHAR2(700 BYTE),
Ownership_code VARCHAR2(700 BYTE),
Author VARCHAR2(700 BYTE),
Less_Sel_fact VARCHAR2(700 BYTE),
Reason VARCHAR2(700 BYTE),
Buying VARCHAR2(700 BYTE),
Suspected_Book VARCHAR2(700 BYTE),
Conditions VARCHAR2(700 BYTE)
);
-------------------------text file---------------
Books Out Table: Books
Book. Type          Name          Location               Ownership Code
Story               SL          hyd               SS-HYD
Known Author:     Unknown               
Less Selling Factors: Thunderstorms     
Reason:     Unknown               
Buying (if applicable):
Not Applicable
Suspected Book:
Unknown
Conditions to increace sales:
Advertisement in all areas
I was able to read and store the data when it is all on one line, but I don't know how to read data laid out like the following:
Book. Type          Name          Location               Ownership Code
Story               SL          hyd               SS-HYD
In this data I have to search for 'Book. Type' and then save the word 'Story' into the column Book_type.
Then I need to search for 'Name' and save 'SL' into the column Name.
Then I need to search for 'Location' and save 'hyd' into the column Location.
I was able to extract the data using utl_file.get_line when it is in the format below:
Known Author:     Unknown               
Less Selling Factors: Thunderstorms     
Reason:     Unknown     
Can anyone explain how to handle the layout described above?
Thanks in advance.
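One possible approach (a minimal sketch, not a tested solution): when a line containing the header 'Book. Type' is found, read the next line and split it on whitespace, since in the sample the individual values contain no spaces. The directory object BOOKS_DIR, the file name books.txt and the single INSERT at the end are placeholders of mine, not from the original post.
DECLARE
v_file utl_file.file_type;
v_line VARCHAR2(4000);
v_data VARCHAR2(4000);
v_booktype story_books.book_type%TYPE;
v_name story_books.name%TYPE;
v_location story_books.location%TYPE;
v_owner story_books.ownership_code%TYPE;
BEGIN
v_file := utl_file.fopen('BOOKS_DIR', 'books.txt', 'R');
LOOP
BEGIN
utl_file.get_line(v_file, v_line);
EXCEPTION
WHEN no_data_found THEN EXIT; -- end of file
END;
IF INSTR(v_line, 'Book. Type') > 0 THEN
-- the values sit on the line that follows the header line
utl_file.get_line(v_file, v_data);
-- take the 1st..4th whitespace-separated tokens
v_booktype := regexp_substr(v_data, '[^[:space:]]+', 1, 1);
v_name     := regexp_substr(v_data, '[^[:space:]]+', 1, 2);
v_location := regexp_substr(v_data, '[^[:space:]]+', 1, 3);
v_owner    := regexp_substr(v_data, '[^[:space:]]+', 1, 4);
INSERT INTO story_books (book_type, name, location, ownership_code)
VALUES (v_booktype, v_name, v_location, v_owner);
END IF;
END LOOP;
utl_file.fclose(v_file);
COMMIT;
END;
/
In practice you would keep collecting the remaining fields ('Known Author:', 'Reason:' and so on) the same way and insert one complete row per report rather than a partial row per header.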

Similar Messages

  • Import data from text file to a table using t-sql

    Hi,
    I am trying to import data from a text file to a table using the below query but it is returning an error.
    SELECT *, 2 FROM OPENROWSET(BULK 'W:\file.txt',
    FORMATFILE = 'W:\format_file.xml', FIRSTROW = 1, LASTROW = 1) as f
    The error is:
    Bulk load data conversion error (truncation) for row 1, column 1 (COPYRIGHT_DETAIL_CODE).

    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
     <RECORD>
      <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="2" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="3" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="4" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="5" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="6" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="7" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="8" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="9" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="10" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="11" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="12" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="13" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="14" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="2" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="15" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="3" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="16" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="4" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="17" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="43" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
     </RECORD>
     <ROW>
      <COLUMN SOURCE="1" NAME="Name1" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="2" NAME="Name2" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="3" NAME="Name3" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="4" NAME="Name4" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="5" NAME="Name5" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="6" NAME="Name6" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="7" NAME="Name7" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="8" NAME="Name8" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="9" NAME="Name9" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="10" NAME="Name10" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="11" NAME="Name11" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="12" NAME="Name12" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="13" NAME="Name13" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="14" NAME="Name14" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="15" NAME="Name15" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="16" NAME="Name16" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="17" NAME="Name17" xsi:type="SQLVARYCHAR"/>
     </ROW>
    </BCPFORMAT>
    The format file that I am using is just like the one above.

  • How to extract data from  text file to database table

    Hi ,
    I am trying to upload data from a text file to a database table using the GUI_UPLOAD function. What would the program for that look like?
    thanks in advance.

    Hi,
    I don't think there is a standard SAP program to upload data from a file to a database table...
    Instead you can create a custom program like this:
    DATA: T_FILEDATA(1000) OCCURS 0 WITH HEADER LINE.
    DATA: T_ZTABLE LIKE ZTABLE OCCURS 0 WITH HEADER LINE.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename                      = 'C:\TEST.TXT'
      tables
        data_tab                      = T_FILEDATA
    EXCEPTIONS
       FILE_OPEN_ERROR               = 1
       FILE_READ_ERROR               = 2
       NO_BATCH                      = 3
       GUI_REFUSE_FILETRANSFER       = 4
       INVALID_TYPE                  = 5
       NO_AUTHORITY                  = 6
       UNKNOWN_ERROR                 = 7
       BAD_DATA_FORMAT               = 8
       HEADER_NOT_ALLOWED            = 9
       SEPARATOR_NOT_ALLOWED         = 10
       HEADER_TOO_LONG               = 11
       UNKNOWN_DP_ERROR              = 12
       ACCESS_DENIED                 = 13
       DP_OUT_OF_MEMORY              = 14
       DISK_FULL                     = 15
       DP_TIMEOUT                    = 16
       OTHERS                        = 17.
    IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    LOOP AT T_FILEDATA.
      T_ZTABLE = T_FILEDATA.
      APPEND T_ZTABLE.
    ENDLOOP.
    MODIFY ZTABLE FROM TABLE T_ZTABLE.
    COMMIT WORK.
    Thanks,
    Naren

  • Read data from text file one by one using for loop

    Dear Forum
    I want to read data from a single column of a text file inside a for loop, one value per iteration (on each iteration the value is replaced by the next value from the text file). This value is used by a formula inside the for loop. After the iterations complete, the values must be plotted on an XY graph. How do I do this? Please help me.
    profravi

    It's not very efficient to read a file line by line. Much simpler to read it all at once and then auto-index the results inside a for loop. The image below shows a generic solution to your problem. This assumes that the data in the file is in a single column.
    I would also recommend checking out the Learning LabVIEW resources here.
    I would also suggest that, since this type of question is not specific to either NI-ELVIS or LabVIEW SE, you post similar questions to the LabVIEW board. If you have problems, attaching a sample of the text file would help.
    Message Edited by Dennis Knutson on 10-18-2007 09:11 AM
    Attachments:
    XY Graph From File.PNG ‏4 KB

  • Read text file insert into table using utl_file

    Hi
    I have a script that reads the file and inserts into a table, but I want error records to be loaded into an error table. I am sending you my script below; please help me add the error log table handling.
    script
    DECLARE
    v_line VARCHAR2(2000);
    v_file utl_file.file_type;
    v_dir VARCHAR2(250);
    v_filename VARCHAR2(50);
    BEGIN
    v_dir :='MID5010_DOC1TP';
    v_filename := 'OPT_CM_BASE.txt';
    v_file := utl_file.fopen(v_dir, v_filename, 'r');
    LOOP
    BEGIN
    utl_file.get_line(v_file, v_line);
    EXCEPTION
    WHEN no_data_found THEN
    EXIT;
    END ;
    v_line := REPLACE(v_line,'|','|~');
    INSERT
    INTO optum_icd10cm_base VALUES
    ( REPLACE(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,1),'a~','a'),'.'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,2),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,3),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,4),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,5),'a~','a'),
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,6)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,6),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,6),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,7)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,7),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,7),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,8)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,8),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,8),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,9)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,9),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,9),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,10)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,10),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,10),'a~','a'),'mm-dd-yyyy')
    END,
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,11),'a~','a')
    );
    -----commit;
    END LOOP;
    utl_file.fclose(v_file);
    END;
    text file
    A50.0||Short|Long|Full|01-01-2009|01-2009||01-01-2013|09-18-2012|C|
    A50.1||Short|Long|Full|01-01-2009|01-01-2009||001-2013|09-18-2012|C|
    A50.2||Short|Long|Full|01-01-2009|01-01-2009|67|01-01-2013|09-18-2012|C|
    A50.3||Short|Long|Full|011-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.4||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|5|
    A50.5|R|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.6||Short|Long||01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.7||Short||Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    2345||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    B222|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.2|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.3|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    D642|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.5|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.6|D|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.7|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A001|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A009|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.10|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A0109|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A30|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A316|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A317|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    ---- In short: read the text file, insert good records into the table, and load error records into an error table.
    Please help me.

    Hi,
    I prepared a script using utl_file, but I got an error: ORA-01861 "literal does not match format string".
    script:
    DECLARE
    f utl_file.file_type;
    s VARCHAR2(32000);
    f1 VARCHAR2(100);
    f2 varchar2(100);
    F3 VARCHAR2(100);
    F4 VARCHAR2(100);
    F5 VARCHAR2(100);
    F6 DATE;
    F7 DATE;
    F8 DATE;
    F9 DATE;
    F10 DATE;
    f11 CHAR(1);
    BEGIN
    --DBMS_OUTPUT.ENABLE(100000);
    f := utl_file.fopen('MID5010_DOC1TP', 'OPT_CM_BASE.txt', 'R');
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(f, s);
    f1 := REGEXP_SUBSTR (s,'[^|]+',1,1);
    f2 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,2);
    F3 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,3);
    F4 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,4);
    F5 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,5);
    F6 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,6),'mm-dd-yyyy');
    F8 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,8),'mm-dd-yyyy');
    F7 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,7),'mm-dd-yyyy');
    F9 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,9),'mm-dd-yyyy');
    F10 :=to_date(REGEXP_SUBSTR (REPLACE(s,'||','||') ,'[^|]+',1,10),'mm-dd-yyyy');
    f11 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,11);
    INSERT
    INTO OPTUM_ICD10CM_BASE
    ( CODE,
    STATUS,
    SHORT_DESCRIPTION,
    LONG_DESCRIPTION,
    FULL_DESCRIPTION,
    CODE_EFFECTIVE_DATE,
    CHANGE_EFFECTIVE_DATE,
    TERMINATION_DATE,
    RELEASE_DATE,
    CREATION_DATE,
    VALIDITY
    )
    VALUES
    (
    F1,
    F2,
    F3,
    F4,
    F5,
    F6,
    F7,
    F8,
    F9,
    F10,
    f11
    );
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    EXIT;
    END;
    END LOOP;
    UTL_FILE.FCLOSE(F);
    END;
    Please help me (in my organization, utl_file is the standard approach we have to use).
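    A minimal sketch of one way to get the error-table behaviour (not a finished implementation): give each line its own inner BEGIN/EXCEPTION block around the parse-and-insert, and when anything fails, write the raw line plus SQLERRM into an error table instead of failing the whole run. The error table optum_icd10cm_base_err and its two columns are assumptions of mine, not existing objects.
    -- Assumed error table:
    -- CREATE TABLE optum_icd10cm_base_err (raw_line VARCHAR2(4000), error_msg VARCHAR2(4000));
    DECLARE
    f utl_file.file_type;
    s VARCHAR2(32000);
    BEGIN
    f := utl_file.fopen('MID5010_DOC1TP', 'OPT_CM_BASE.txt', 'R');
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(f, s);
    EXCEPTION
    WHEN NO_DATA_FOUND THEN EXIT; -- end of file
    END;
    BEGIN
    -- parse the fields and INSERT INTO optum_icd10cm_base here,
    -- exactly as in the script above
    NULL;
    EXCEPTION
    WHEN OTHERS THEN
    -- any bad date or conversion error for this line lands here
    INSERT INTO optum_icd10cm_base_err (raw_line, error_msg) VALUES (s, SQLERRM);
    END;
    END LOOP;
    UTL_FILE.FCLOSE(f);
    COMMIT;
    END;
    /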

  • How to load date and time from text file to oracle table through sqlloader

    hi friends
    I need you to show me what I am missing in order to load date and time from a text file into an Oracle table through SQL*Loader.
    this is my data in this path (c:\external\my_data.txt)
    7369,SMITH,17-NOV-81,09:14:04,CLERK,20
    7499,ALLEN,01-MAY-81,17:06:08,SALESMAN,30
    7521,WARD,09-JUN-81,17:06:30,SALESMAN,30
    7566,JONES,02-APR-81,09:24:10,MANAGER,20
    7654,MARTIN,28-SEP-81,17:24:10,SALESMAN,30
    my table in database emp2:
    create table emp2 (empno number,
                      ename varchar2(20),
                      hiredate date,
                      etime date,
                      ejob varchar2(20),
                      deptno number);
    the control file code in this path (c:\external\ctrl.ctl):
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    this is the error:
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    Any help is greatly appreciated.
    thanks
    Edited by: user10947262 on May 31, 2010 9:47 AM

    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime, ejob, deptno)
    Try:
    load data
    infile 'C:\external\my_data.txt'
    into table emp2
    fields terminated by ','
    (empno, ename, hiredate, etime "to_date(:etime,'hh24:mi:ss')", ejob, deptno)
    this is the error :
    C:\>sqlldr scott/tiger control=C:\external\ctrl.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Mon May 31 09:45:10 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 5
    C:\>
    That's not an error, you can see errors within log and bad files.

  • How to transfer data from excel files into z-tables

    Please help me with code that transfers data from an Excel file to a Z-table.
    Thanks in advance
    Shuvir

    Hi Daniel,
    Export Data
    Purpose
    Use this procedure to export SAP data to a local file such as Microsoft Excel.
    Menu Path
    Use the following menu path to begin this process:
    ·         System → List → Save → Local File
    Helpful Hints
    When reviewing fields, R = Required,  O = Optional and C = Conditional.
    Procedure
    1.       Start the transaction using the menu path or transaction code.
          Whatever Data You Want to Export
    2.       Select System → List → Save → Local File.  This works well for any data in SAP.  This is the only option for the top-level (first page) of a report.
    In a drill-down view within a report the Local File button  on the toolbar may be used and has the same options.
          Choose File Format
    3.       Click  .
    4.       Click  to continue.  If prompted for a format after choosing Spreadsheet, select Excel Table to get an Excel file that can be modified more easily.
          Choose File Save Location Step 1
    5.       Click  to the right of the Directory field to choose a different location.
          Choose File Save Location Step 2
    6.       Click  or browse your computer to locate the directory where you want to save your file.
    7.       Complete the following field:
    ·         File name:
        You must add the proper file extension to the name of your file (.xls for Excel, .rtf for Rich Text, .html for HTML).  The file extension tells your computer what program to open the file with.  If you do not have the file extension at the end, you may not be able to open it.
    8.       Click  to continue.
          Generate File in Location and Format Selected
    9.       Click  to create the file in the location and format selected.  In this example the file was named "example.xls" and saved on the desktop.
    Result
    You have completed the export process.
    thanks
    karthik

  • Failed to read data from report file Reason: The table could not be found.

    BO Enterprise XI R2, cannot publish crystal reports using the publishing wizard.
    Failed to read data from report file Reason: The table could not be found.
    Any ideas to get around this would really help out.
    Regards

    The connection uses Views; the ODBC System DSN is set up properly.
    I tried importing from both Business View Manager and the Import Wizard; both methods failed to import the Business View and the underlying reports.
    I figure I may have imported the Business View wrong. From Business View Manager I exported from my dev server and then imported to the prod server.
    I learned that exporting my Business View also includes the Data Connections that the Business Views depend upon.
    Whichever folder you specify, it copies them there. Originally all the Data Connections resided in the root folder. To return them to their original location, I deleted what I had exported, exported again to the root folder, then deleted only the business views, foundations and elements. Then I exported again to the folder I intended and deleted only the Data Connection.
    Does that make sense? I then had to repoint the business views and all the dependent objects to the data connection that resides in the root folder.
    I tested the connection and it works fine. I updated my Crystal Reports to point to the business view in production and did a sample extract; it works as expected.
    However, when I try to publish, either from Crystal or from the Publish Wizard, I get the same error.
    As a workaround I am thinking: after updating the business view in the Crystal Reports, shall I remap the fields, or re-export the business views again?
    Any help will be surely appreciated.

  • Upload data from excel file to Oracle table

    Dear All,
    I have to upload data from an Excel file to an Oracle table without using third-party tools and without converting it into a CSV file.
    Could you please tell me how I can do this using PL/SQL or SQL*Loader?
    Thanks in advance.

    Dear All,
    I have to upload data from an Excel file to an Oracle table without using third-party tools and without converting it into a CSV file.
    Could you please tell me how I can do this using PL/SQL or SQL*Loader?
    Thanks in advance.
    As Billy mentioned, use the ODBC interface: the Heterogeneous Services (HS) layer sits over traditional ODBC to access a non-Oracle data source. Here is a link you can try; come back here if you have any problem.
    http://www.oracle-base.com/articles/9i/HSGenericConnectivity9i.php
    Khurram

  • Cant get data from text file to print into Jtable

    Instead of using JDBC I am using a text file as the database. I can't get the data from the text file to print into the JTable when I click the Find button. The goal is to find a record and print only that record, but for now I am trying to print all the records; once I get that working I will change my code to search for the desired record and print it. When I click the Find button nothing happens. Can you please take a look at my code, the dbTest() method? Thanks.
    void dbTest() {
    DataInputStream dis = null;
    String dbRecord = null;
    String hold;
    try {
    File f = new File("customer.txt");
    FileInputStream fis = new FileInputStream(f);
    BufferedInputStream bis = new BufferedInputStream(fis);
    dis = new DataInputStream(bis);
    Vector dataVector = new Vector();
    Vector headVector = new Vector(2);
    Vector row = new Vector();
    // read the record of the text database
    while ((dbRecord = dis.readLine()) != null) {
    StringTokenizer st = new StringTokenizer(dbRecord, ",");
    while (st.hasMoreTokens()) {
    row.addElement(st.nextToken());
    System.out.println("Inside nested loop: " + row);
    } // end inner while
    System.out.println("inside loop: " + row);
    dataVector.addElement(row);
    System.out.println("outside loop: " + row);
    } // end outer while
    headVector.addElement("Title");
    headVector.addElement("Type");
    dataTable = new JTable(dataVector, headVector);
    dataTableScrollPane.setViewportView(dataTable);
    } catch (IOException e) {
    // catch io errors from FileInputStream or readLine()
    System.out.println("Uh oh, got an IOException error!" + e.getMessage());
    } finally {
    // if the file opened okay, make sure we close it
    if (dis != null) {
    try {
    dis.close();
    } catch (IOException ioe) {
    }
    } // end if
    } // end finally
    } // end dbTest

    Here's a thread that loads a text file into a JTable:
    http://forum.java.sun.com/thread.jsp?forum=57&thread=315172
    And my reply in this thread shows how you can use a text file as a simple database:
    http://forum.java.sun.com/thread.jsp?forum=31&thread=342380

  • Loading data from .csv file into Oracle Table

    Hi,
    I have a requirement where I need to populate data from a .csv file into an Oracle table.
    Is there any mechanism I can follow to do this?
    Any help will be fruitful.
    Thanks and regards

    You can use SQL*Loader or external tables for your requirement.
    Missed Karthick's post ... already there :)
    Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM
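    A minimal external-table sketch of the second option, assuming a directory object DATA_DIR pointing at the folder that holds the file; the names emp_ext, emp, the columns and the path are placeholders for your real structure.
    CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/load_files';
    CREATE TABLE emp_ext (
    empno NUMBER,
    ename VARCHAR2(20)
    )
    ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('emp.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- then copy the rows into the real table
    INSERT INTO emp SELECT * FROM emp_ext;
    COMMIT;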

  • How to load the data from .csv file to oracle table???

    Hi,
    I am using Oracle 10g and PL/SQL Developer. Can anyone help me with how to load the data from a .csv file into an Oracle table? The table is already created with the required columns. The .csv file has about 10 lakh records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed.
    Thanks in advance

    981145 wrote:
    Can you tell me more about SQL*Loader? How do I know whether that utility is available to me or not? I am using an Oracle 10g database and PL/SQL Developer.
    SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
    The command is sqlldr. Type it and see if you have it installed.
    Have a look also at the FAQ link posted by Marwin.
    There are plenty of examples also on the web.
    Regards.
    Al
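    For completeness, a minimal SQL*Loader sketch, assuming a hypothetical two-column target table my_table and a comma-separated file; adjust the column list, delimiter and paths to your own data.
    -- my_load.ctl
    LOAD DATA
    INFILE 'c:\external\my_data.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (col1, col2)
    Run it from the operating system prompt, for example: sqlldr userid=scott/tiger control=c:\external\my_load.ctl log=c:\external\my_load.log
    Ten lakh rows is well within what SQL*Loader handles; the log file will show rejected rows and timings.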

  • Loading data from multiple files to multiple tables

    How should I approach creating an SSIS package to load data from multiple files into multiple tables? Also, the files will have data which might overlap, so I might have to create a stored procedure for it. For example, the 1st day's file has data from Aug 1 to Aug 10, and the 2nd day's
    file might have data from Aug 5 to Aug 15. So I might have to look for the max and min date and truncate the table within that date range.

    That's OK. A ForEach Loop would be able to iterate through the files. You can declare a variable inside the loop to capture the filenames. Choose "Fully qualified" as the retrieval option in the loop.
    Then inside the loop:
    1. Add an Execute SQL Task to delete overlapping data from the table. One question here: where will you get the date from? Does it come inside the filename?
    2. Add a Data Flow Task with a file source pointing to the file. For this, add a suitable connection manager (Excel/Flat File etc.) and map its connection string property to the filename variable using expressions.
    3. Add an OLE DB Destination pointing to the table. You can use the "table or view name variable - fast load" option and map it to a variable to make the table name dynamic; just set the corresponding value for the variable to get the correct table name.
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Issue in loading specific columns from a file to teradata table using IKM

    Hi,
    Can anyone help resolve an issue with loading specific columns from a text file into a Teradata table?
    I tried using IKM File to Teradata and the columns are getting displaced.
    My requirement: suppose I have 5 columns in the file and I have to load only 3 columns into the table using the IKM.
    The same thing can be achieved using LKM File to Teradata, but I want to use the IKM.
    Please suggest a way to do this.
    Regards
    Vinod

    Hi,
    I believe the problem you are having is that you have a delimited file, of which you want to pick the columns in positions 2, 3 and 5. In this case, ODI will pick the first 3 columns of a delimited file regardless of position.
    For example, if you have a tab-delimited file with columns c1,c2,c3,c4,c5 and you want only columns c2,c3,c5, then when mapping these in an ODI interface and executing, you will actually pick up the data from c1,c2,c3, as these are the first three columns in the file (reading from left to right). You can ignore "columns" on the right-hand side of a file, but not on the left. E.g. for a delimited file with c1,c2,c3,c4,c5, picking only columns c1,c2 will give you the data for the first 2 columns.
    Create a temporary table to load all the data from the file, and use your temp table to extract the data you require. Or you could get the file created with the first three columns being the columns you require.
    Cheers
    Bos
    Edited by: Bos on Jan 18, 2011 1:06 PM
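    In SQL terms the staging-table suggestion boils down to something like the following sketch (staging_table, target_table and the column names are placeholders; the same idea applies whether the copy is done by ODI or by hand):
    -- staging_table holds all five file columns (c1..c5)
    INSERT INTO target_table (col2, col3, col5)
    SELECT c2, c3, c5
    FROM staging_table;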

  • How to insert data from *.dmp file to oracle 11g using Oracle SQL Developer

    Hi,
    I backed up my database using PL/SQL Developer and made a *.dmp file.
    How do I insert data from a *.dmp file into Oracle 11g using Oracle SQL Developer 2.1.1.64?
    And how do I make a *.dmp file from SQL*Plus?
    thanks in advance

    PL/SQL Developer has a config window where you choose the executable used to do the import/export.
    Find it and its Oracle home version; it may be exp or expdp, and the home version is the version of the client where the exp executable lives.
    Then use the same version of imp or impdp to execute the import; you do not need to use Oracle SQL Developer 2.1.1.64. If you want to use it, you must have the same version in the Oracle home that SQL Developer's exp/imp uses.
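    For reference, a minimal sketch of the classic export/import commands; they are run from the operating system prompt, not from inside SQL*Plus, and scott/tiger, the file names and the schema are placeholders.
    C:\> exp scott/tiger file=backup.dmp log=backup_exp.log owner=scott
    C:\> imp system/manager file=backup.dmp log=backup_imp.log fromuser=scott touser=scott
    The Data Pump equivalents are expdp/impdp, which read and write dump files through a database directory object. A dump file created by exp must be imported with imp, and one created by expdp must be imported with impdp.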
