Loading a CSV file into a table as in Data Workshop

Data Workshop has a feature to load a CSV file and create a table based on it; I want to build the same thing in my application.
I have gone through the forum thread http://forums.oracle.com/forums/thread.jspa?threadID=334988&start=60&tstart=0
but could not download all of the files (the application, the HTMLDB_TOOLS package and the PAGE_SENTRY function); in particular, I couldn't find the PAGE_SENTRY function.
Also, when I open this link http://apex.oracle.com/pls/apex/f?p=27746
I couldn't run the application. I provided a CSV file and when I clicked SUBMIT, I got the error:
ORA-06550: line 1, column 7: PLS-00201: identifier 'HTMLDB_TOOLS.PARSE_FILE' must be declared
I tried it on the apex.oracle.com host, as described in the previous post.
Any help please?
Is there any other method to load data into tables (similar to Data Workshop)?

Hi Jari,
I'm using HTMLDB_TOOLS to parse my CSV files. It works well for creating a report, but if I try to create a new table and upload the data, it fails with the error *"missing right parenthesis"*.
I've been looking through the package body and I think there is a problem in this code:
        IF (p_table_name is not null)
        THEN
          BEGIN
            execute immediate 'drop table '||p_table_name;
          EXCEPTION
            WHEN OTHERS THEN NULL;
          END;
          l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
          htmldb_util.set_session_state('P149_DEBUG',l_ddl);
          execute immediate l_ddl;
          l_ddl := 'insert into '||p_table_name||' '||
                   'select '||v(p_columns_item)||' '||
                   'from htmldb_collections '||
                   'where seq_id > 1 and collection_name='''||p_collection_name||'''';
          htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
          execute immediate l_ddl;
          RETURN;
    END IF;
It is dropping the table but not creating the new one.
P149_DEBUG contains:
create table EMP_D (Emp ID number(10),Name varchar2(20),Type varchar2(20),Join Date varchar2(20),Location varchar2(20))
and if I comment out the
          BEGIN
            execute immediate 'drop table '||p_table_name;
          EXCEPTION
            WHEN OTHERS THEN NULL;
          END;
          l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
          htmldb_util.set_session_state('P149_DEBUG',l_ddl);
          execute immediate l_ddl;
then it works well, i.e. if I enter an existing table name the rows are inserted fine,
but it is unable to create a new table.
Can you please help me fix it?
Regards,
Little Foot
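For what it's worth, the DDL captured in P149_DEBUG contains column names with embedded spaces (Emp ID, Join Date). Oracle will not accept those unquoted, and malformed DDL like that can surface exactly as "missing right parenthesis". Below is a minimal, hypothetical sketch (not part of HTMLDB_TOOLS) of cleaning up such a generated column list before the CREATE TABLE is executed; it assumes the datatypes themselves contain no commas, as in the list above.

        declare
          -- sample generated column list, taken from P149_DEBUG above
          l_in   varchar2(4000) := 'Emp ID number(10),Name varchar2(20),Type varchar2(20),Join Date varchar2(20),Location varchar2(20)';
          l_out  varchar2(4000);
          l_item varchar2(200);
          l_type varchar2(200);
          l_name varchar2(200);
          i      pls_integer := 1;
        begin
          loop
            l_item := trim(regexp_substr(l_in, '[^,]+', 1, i));   -- naive split: assumes no commas inside the datatypes
            exit when l_item is null;
            l_type := regexp_substr(l_item, '[^ ]+$');            -- last token is the datatype
            l_name := trim(substr(l_item, 1, length(l_item) - length(l_type)));
            l_out  := l_out || case when i > 1 then ',' end
                            || replace(l_name, ' ', '_') || ' ' || l_type;  -- "Emp ID" becomes "Emp_ID"
            i := i + 1;
          end loop;
          dbms_output.put_line('create table EMP_D (' || l_out || ')');
        end;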

Similar Messages

  • Loading a CSV file into a table

    HTML DB Data Workshop has a feature to load a CSV file and create a table based on it.
    I use a simplified version of this feature in my applications.
    See a quick demo at
    http://htmldb.oracle.com/pls/otn/f?p=38131:1
    Create a small csv file like
    col1,col2,col3
    varchar2(10),number,"number(10,2)"
    cat,2,3.2
    dog,99,10.4
    The first line in the file contains the column names.
    The second line contains their datatypes.
    Save the file as c:\test.csv
    Click the Browse button, search for this file.
    Click the Submit button to upload and parse the file.
    You can browse the file contents.
    If you like what you see, type in a table name and click the Create Table button to create a table.
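    For the sample file above, and a table name of, say, TEST, the created table would presumably look like this (column names from the first line, datatypes from the second):
    create table test (
      col1 varchar2(10),
      col2 number,
      col3 number(10,2)
    );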
    Let me know if anyone is interested and I can post the code.
    Hope this helps.
    Message was edited by:
    Vikas

    Hi Vikas,
    I tried to import your COOL application into html db at http://htmldb.oracle.com and got the error:
    "Expecting p_company or wwv_flow_company cookie to contain security group id of application owner.
    Error ERR-7621 Could not determine workspace for application (:) on application accept.
    OK "
    I get the SAME error when trying to import Scott's Photo Catalog at http://htmldb.oracle.com/pls/otn/f?p=18326:7:6703988766288646534::::P7_ID:2242.
    What should I change in the exported script, in addition to changing your workspace schema VIKASA (or Scott's STUDIO_SHOWCASE) to my NICED68?
    Thanks!
    DC

  • Error loading local CSV file into external table

    Hello,
    I am trying to load a CSV file located on my C:\ drive on a Windows XP system into an 'external table'. Everything used to work correctly when using Oracle XE (also installed locally on my Windows system).
    However, when I try to load the same file into an Oracle 11g R2 database on UNIX, I get the following error:
    ORA-29913: Error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    error opening file ...\...\...nnnnn.log
    Please let me know how I can achieve the same functionality I had with Oracle XE.
    (Note: I cannot use SQL*Loader approach, I am invoking Oracle stored procedures from VB.Net that are attempting to load data into external tables).
    Regards,
    M.R.

    user7047382 wrote:
    I am trying to load a CSV file located on my C:\ drive on a Windows XP system into an 'external table' ... when I try to load the same file into an Oracle 11g R2 database on UNIX, I get ORA-29913 / ORA-29400 ...
    So your database is on a UNIX box, but the file is on your Windows desktop machine? How is it that you are making your file (on your desktop) visible to your database (on the UNIX box)?
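    The underlying point is that an external table reads its file through a directory object on the database server, not from the client PC. A minimal sketch, assuming a hypothetical server-side path /data/loads on the UNIX box and a file customers.csv copied there:
    -- created by a DBA (or a user with CREATE ANY DIRECTORY); the path is on the DB server
    create or replace directory load_dir as '/data/loads';
    grant read, write on directory load_dir to scott;

    create table customers_ext (
      customer_id   number,
      customer_name varchar2(100)
    )
    organization external (
      type oracle_loader
      default directory load_dir
      access parameters (
        records delimited by newline
        fields terminated by ',' optionally enclosed by '"'
      )
      location ('customers.csv')
    )
    reject limit unlimited;
    The ORA-29400 "error opening file ... .log" in the original post is consistent with the directory path not being readable or writable from the database server's point of view.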

  • Loading multiple .csv files into a table using SSIS

    Hi,
    I have a requirement where I have 200+ CSV files to be loaded into a Netezza table using SSIS.
    The issue I am facing is that the files have different numbers of columns; for example, file 1 has columns A,B,C and file 2 has columns C,D,E. My target table has all columns from A to E in it.
    But when I use a Foreach Loop container, only the file whose filepath+filename I specified in the loop variable gets loaded. No data is loaded from the remaining files, yet the package executes successfully.
    Any help is appreciated.
    Regards,
    VT

    If you want to iterate through files, then all the files should be in the same folder and you should use the file enumerator type within the Foreach Loop. Inside the loop you might need a Script Task to generate the data flow on the fly, based on the input columns
    available in each file, and do the mapping in the destination. I assume you put NULLs (or some default value) into the columns missing from a given file.
    http://blog.quasarinc.com/ssis/best-solution-to-load-dynamically-change-csv-file-in-ssis-etl-package/
    Please mark this as answer if it helps to solve the issue.
    Visakh
    http://visakhm.blogspot.com/ | https://www.facebook.com/VmBlogs

  • Loading data from .csv file into existing table

    Hi,
    I have taken a look at several threads which talk about loading data from a .csv file into an existing or new table. I have also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
    I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
    timesheet_entry_id,time_worked,timesheet_date,project_key .
    The csv columns are :
    project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
    What I need to know is: before the CSV data is loaded into the timesheet table, is there any way to validate the project key (which is the primary key of the projects table) against the projects table? I need to perform similar validations on other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent tables. Has anyone done this kind of loading through the APEX data load utility, or is there another method of accomplishing the same?
    Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
    Thanks,
    Anjali

    Hi Anjali,
    Take a look at these threads which might outline different ways to do it -
    File Browse, File Upload
    Loading CSV file using external table
    Loading a CSV file into a table
    You can also create hidden items on the page and use them to validate the records against the parent tables before inserting the data.
    Hope this helps,
    M Tajuddin
    http://tajuddin.whitepagesbd.com
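    As a rough illustration of the parent-key validation idea (all names here are hypothetical, assuming the parsed CSV rows have first been placed in a staging table timesheet_stage with the CSV's columns):
    -- load only rows whose parent row exists
    insert into timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
    select timesheet_seq.nextval, s.hours_worked, s.timesheet_date, s.project_key
    from   timesheet_stage s
    where  exists (select 1 from projects p where p.project_key = s.project_key);
    -- analogous EXISTS checks can be added for customer_id against customers, etc.

    -- report the rejected rows separately
    select s.*
    from   timesheet_stage s
    where  not exists (select 1 from projects p where p.project_key = s.project_key);
    The same checks could equally be written as APEX validations against the upload collection before the insert runs.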

  • Loading data from .csv file into Oracle Table

    Hi,
    I have a requirement where I need to populate data from a .csv file into an Oracle table.
    Is there any mechanism I can follow to do this?
    Any help would be appreciated.
    Thanks and regards

    You can use SQL*Loader or external tables for your requirement.
    Missed Karthick's post ... already there :)
    Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM

  • How to load an XML file into a table

    Hi,
    I've been working with Oracle for many years, but for the first time I have been asked to load an XML file into a table.
    As an example, I found this on the web, but it doesn't work.
    Can someone tell me why? I hoped this example could help me.
    The file acct.xml is this:
    <?xml version="1.0"?>
    <ACCOUNT_HEADER_ACK>
    <HEADER>
    <STATUS_CODE>100</STATUS_CODE>
    <STATUS_REMARKS>check</STATUS_REMARKS>
    </HEADER>
    <DETAILS>
    <DETAIL>
    <SEGMENT_NUMBER>2</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>3</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic administration</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>4</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic finance</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>5</SEGMENT_NUMBER>
    <REMARKS>rp polytechnic logistics</REMARKS>
    </DETAIL>
    </DETAILS>
    <HEADER>
    <STATUS_CODE>500</STATUS_CODE>
    <STATUS_REMARKS>process exception</STATUS_REMARKS>
    </HEADER>
    <DETAILS>
    <DETAIL>
    <SEGMENT_NUMBER>20</SEGMENT_NUMBER>
    <REMARKS> base polytechnic</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>30</SEGMENT_NUMBER>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>40</SEGMENT_NUMBER>
    <REMARKS> base polytechnic finance</REMARKS>
    </DETAIL>
    <DETAIL>
    <SEGMENT_NUMBER>50</SEGMENT_NUMBER>
    <REMARKS> base polytechnic logistics</REMARKS>
    </DETAIL>
    </DETAILS>
    </ACCOUNT_HEADER_ACK>
    For the two tags HEADER and DETAILS I have this table:
    create table xxrp_acct_details(
    status_code number,
    status_remarks varchar2(100),
    segment_number number,
    remarks varchar2(100)
    );
    Beforehand I created a directory:
    create directory test_dir as 'c:\esterno'; -- where I have my acct.xml
    Now, can you give me a script for loading the data by using XMLTABLE?
    I've tried this but it doesn't work:
    DECLARE
    acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    BEGIN
    insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
    ) x2;
    END;
    This should allow me to get something like this:
    select * from xxrp_acct_details;
    STATUS_CODE  STATUS_REMARKS     SEGMENT_NUMBER  REMARKS
    100          check              2               rp polytechnic
    100          check              3               rp polytechnic administration
    100          check              4               rp polytechnic finance
    100          check              5               rp polytechnic logistics
    500          process exception  20              base polytechnic
    500          process exception  30
    500          process exception  40              base polytechnic finance
    500          process exception  50              base polytechnic logistics
    but I get:
    Error report:
    ORA-06550: line 19, column 11:
    PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
    ORA-06550: line 4, column 2:
    PL/SQL: SQL Statement ignored
    06550. 00000 -  "line %s, column %s:\n%s"
    *Cause:    Usually a PL/SQL compilation error.
    and if I try to change the script without using the column HEADER_NO to keep track of the header rank inside the document:
    DECLARE
    acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
    BEGIN
    insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '/ACCOUNT_HEADER_ACK/DETAILS'
      passing acct_doc
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
    ) x2;
    END;
    I get this message:
    Error report:
    ORA-19114: error during parsing the XQuery expression: 
    ORA-06550: line 1, column 13:
    PLS-00201: identifier 'SYS.DBMS_XQUERYINT' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at line 4
    19114. 00000 -  "error during parsing the XQuery expression: %s"
    *Cause:    An error occurred during the parsing of the XQuery expression.
    *Action:   Check the detailed error message for the possible causes.
    My Oracle version is 10gR2 Express Edition.
    I need a script for loading XML files into a table as soon as possible. Please give me a simple example that is easy to understand and that works on 10gR2 Express Edition.
    Thanks in advance!

    The reason your first SQL statement
    select x1.status_code,
            x1.status_remarks,
            x2.segment_number,
            x2.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
      passing acct_doc as "d",
              x1.header_no as "hn"
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
    ) x2
    returns the error you noticed
    PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
    is because Oracle is expecting XML to be passed in.  At the moment I forget if it requires a certain format or not, but it is simply expecting the value to be wrapped in simple XML.
    Your query actually runs as is on 11.1 as Oracle changed how that functionality worked when 11.1 was released.  Your query runs slowly, but it does run.
    As you are dealing with groups, is there any way the input XML can be modified to be like
    <ACCOUNT_HEADER_ACK>
    <ACCOUNT_GROUP>
    <HEADER>....</HEADER>
    <DETAILS>....</DETAILS>
    </ACCOUNT_GROUP>
      <ACCOUNT_GROUP>
      <HEADER>....</HEADER>
      <DETAILS>....</DETAILS>
      </ACCOUNT_GROUP>
    </ACCOUNT_HEADER_ACK>
    so that it is easier to associate a HEADER/DETAILS combination?  If so, it would make parsing the XML much easier.
    Assuming the answer is no, here is one hack to accomplish your goal
    select x1.status_code,
            x1.status_remarks,
            x3.segment_number,
            x3.remarks
    from xmltable(
      '/ACCOUNT_HEADER_ACK/HEADER'
      passing acct_doc
      columns header_no      for ordinality,
              status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS'
    ) x1,
    xmltable(
      '$d/ACCOUNT_HEADER_ACK/DETAILS'
      passing acct_doc as "d",
      columns detail_no      for ordinality,
              detail_xml     xmltype       path 'DETAIL'
    ) x2,
    xmltable(
      'DETAIL'
      passing x2.detail_xml
      columns segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS') x3
    WHERE x1.header_no = x2.detail_no;
    This follows the approach you started with.  Table x1 creates a row for each HEADER node and table x2 creates a row for each DETAILS node.  It assumes there is always a one and only one association between the two.  I use table x3, which is joined to x2, to parse the many DETAIL nodes.  The WHERE clause then joins each header row to the corresponding details row and produces the eight rows you are seeking.
    There is another approach that I know of, and that would be using XQuery within the XMLTable.  It should require using only one XMLTable but I would have to spend some time coming up with that solution and I can't recall whether restrictions exist in 10gR2 Express Edition compared to what can run in 10.2 Enterprise Edition for XQuery.
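    For reference, here is an untested sketch of that single-XMLTable XQuery idea. It correlates each HEADER with the DETAILS element in the same position, and it assumes your release accepts positional variables and computed element constructors in XQuery (10gR2 Express Edition may not):
    select x.status_code,
           x.status_remarks,
           x.segment_number,
           x.remarks
    from xmltable(
      'for $h at $pos in /ACCOUNT_HEADER_ACK/HEADER,
           $d in /ACCOUNT_HEADER_ACK/DETAILS[$pos]/DETAIL
       return element R { $h/STATUS_CODE, $h/STATUS_REMARKS,
                          $d/SEGMENT_NUMBER, $d/REMARKS }'
      passing acct_doc
      columns status_code    number        path 'STATUS_CODE',
              status_remarks varchar2(100) path 'STATUS_REMARKS',
              segment_number number        path 'SEGMENT_NUMBER',
              remarks        varchar2(100) path 'REMARKS'
    ) x;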

  • Getting Issue while uploading CSV file into internal table

    Hi,
    The CSV file data format is as below:
         a             b               c              d           e               f
    2.01E14     29-Sep-08     13:44:19     2.01E14     SELL     T+1
    The actual value of column A is 201000000000000
    and the value of column D is 201000000035690.
    I am uploading the above CSV file into an internal table using
    the code below:
    TYPES: BEGIN OF TY_INTERN.
            INCLUDE STRUCTURE  KCDE_CELLS.
    TYPES: END OF TY_INTERN.
    CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
        EXPORTING
          I_FILENAME      = P_FILE
          I_SEPARATOR     = ','
        TABLES
          E_INTERN        = T_INTERN
        EXCEPTIONS
          UPLOAD_CSV      = 1
          UPLOAD_FILETYPE = 2
          OTHERS          = 3.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    I am getting all the columns' data into the internal table;
    the problem is with columns A and D: for both of them I get the value 2.01E+14 in the internal table. How can I get the actual values without modifying the CSV file format?
    waiting for your reply...
    thanks & regards,
    abhi

    Hi Saurabh,
    Thanks for your reply.
    I can't even double-click on those columns,
    because the program needs to be executed in the background and there can be a lot of CSV files in one folder. No manual interaction with those CSV files is possible.
    regards,
    abhi

  • Question about reading csv file into internal table

    Someone in this forum (thanks, nice guys!) suggested that I use FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a CSV file into an internal table. However, it can only be used to read a local file.
    How can I read a CSV file from the application server into an internal table?
    I can't simply use SPLIT, as there may be commas in the content, e.g.
    "abc","aaa,ab",10,"bbc"
    My expected output:
    abc
    aaa,ab
    10
    bbc
    Thanks again for your help.

    Hi Gundam,
    Try this code. I have made a custom parser to read the fields in the record and split them accordingly. I have also tested it with your provided test cases and it works fine.
    OPEN DATASET dsn FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET dsn INTO record.
      IF sy-subrc <> 0.
        EXIT. "stop at end of file
      ENDIF.
      PERFORM parser USING record.
    ENDDO.
    CLOSE DATASET dsn.
    *DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
    *DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
    *DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
    FORM parser USING str.
    DATA field(12).
    DATA field1(12).
    DATA field2(12).
    DATA field3(12).
    DATA field4(12).
    DATA cnt TYPE i.
    DATA len TYPE i.
    DATA temp TYPE i.
    DATA start TYPE i.
    DATA quote TYPE i.
    DATA rec_cnt TYPE i.
    len = strlen( str ).
    cnt = 0.
    temp = 0.
    rec_cnt = 0.
    DO.
    *  Start at the beginning
      IF start EQ 0.
        "string just ENDED start new one.
        start = 1.
        quote = 0.
        CLEAR field.
      ENDIF.
      IF str+cnt(1) EQ '"'.  "Check for qoutes
        "CHECK IF quotes is already set
        IF quote = 1.
          "Already quotes set
          "Start new field
          start = 0.
          quote = 0.
          CONCATENATE field '"' INTO field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            CONDENSE field.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
    *      WRITE field.
        ELSE.
          "This is the start of quotes
          quote = 1.
        ENDIF.
      ENDIF.
      IF str+cnt(1) EQ ','. "Check end of field
        IF quote EQ 0. "This is not inside quote end of field
          start = 0.
          quote = 0.
          CONDENSE field.
    *      WRITE field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDIF.
      CONCATENATE field str+cnt(1) INTO field.
      cnt = cnt + 1.
      IF cnt GE len.
        EXIT.
      ENDIF.
    ENDDO.
    WRITE: field1, field2, field3, field4.
    ENDFORM.
    Regards,
    Wenceslaus.

  • Example for loading a csv file into diadem from a labview application

    Hi everyone, I'm using LabVIEW 8.2 and DIAdem 10.1.
    I've been searching in the NI Example Finder but I have had no luck so far.
    I have already downloaded the LabVIEW connectivity VIs.
    Can anyone provide an example that can help me load a CSV file into DIAdem from a LabVIEW application?
    Thanks

    Hi Alexandre.
    I attach an example for you.
    Best Regards.
    Message edited by R_Duval on 01-15-2008 02:44 PM
    Romain D.
    National Instruments France
    Attachments:
    Classeur1.csv ‏1 KB
    Load CSV to Diadem.vi ‏15 KB

  • Uploading CSV file into internal table

    Hi,
    I want to upload a CSV file into an internal table. The flat file has values like the following:
    'AAAAA','2003-10-11 07:52:37','167','Argentina',NULL,NULL,NULL,NULL,NULL,'MX1',NULL,NULL,'AAAA BBBB',NULL,NULL,NULL,'1',NULL,NULL,'AR ',NULL,NULL,NULL,'ARGENT','M1V','MX1',NULL,NULL,'F','F','F','F','F',NULL,'1',NULL,'MX','MMI ',NULL
    'jklhg','2004-06-25 08:01:57','456','hjllajsdk','MANAGUA   ',NULL,NULL,'265-5139','266-5136 al 38','MX1',NULL,NULL,'hjgkid GRÖBER','sdfsdf dfs asdfsdf 380 ad ased,','200 as ads, sfd sfd abajao y 50 m al sdf',NULL,'1',NULL,NULL,'NI ',NULL,NULL,NULL,'sdfdfg','M1V','dds',NULL,NULL,
    Here I cannot simply split at ',' because some of the values are unquoted NULLs and some quoted values contain commas themselves.
    The text delimiter is a quote and the separator is a comma here.
    Can anyone help on this?
    Thanks.
    Edited by: Ginger on Jun 29, 2009 9:08 AM

    As long as there can be a comma inside a text literal, you are right that the SPLIT command doesn't help. However, there is one way to attack this, under two assumptions:
    - A comma outside a text delimiter is always considered a separator
    - A comma inside a text delimiter is always considered a comma that is part of the text
    You have to read your file line by line and then travel along the line string character by character, setting a flag or counter for the text delimiters:
    e.g.
    "Text","Text1, Text2",NULL,NULL,"Text"
    String Index  1: EQ " => lv_delimiter = 'X'
    String Index  2: EQ T => text literal (because lv_delimiter = 'X')
    String Index  3: EQ e => text literal (because lv_delimiter = 'X')
    String Index  4: EQ x => text literal (because lv_delimiter = 'X')
    String Index  5: EQ t => text literal (because lv_delimiter = 'X')
    String Index  6: EQ " => lv_delimiter = ' ' (because it was 'X' before)
    String Index  7: EQ , => This is a separator because lv_delimiter = ' '
    String Index  8: EQ " => lv_delimiter = 'X' (because it was ' ' before)
    String Index  9: EQ T => text literal (because lv_delimiter = 'X')
    String Index 10: EQ e => text literal (because lv_delimiter = 'X')
    String Index 11: EQ x => text literal (because lv_delimiter = 'X')
    String Index 12: EQ t => text literal (because lv_delimiter = 'X')
    String Index 13: EQ 1 => text literal (because lv_delimiter = 'X')
    String Index 14: EQ , => text literal (because lv_delimiter = 'X')
    String Index 15: EQ T => text literal (because lv_delimiter = 'X')
    Whenever you hit a 'real' separator (lv_delimiter = ' ') you pass the value of the string before that up to the previous separator into the next structure field.
    This is not an easy way to do it, but if you can have commas in your text literals and NULL values, I guess it is probably the only way to go.
    Hope that helps,
    Michael

  • Loading an XML file into a table without creating a directory

    Hi,
    I want to load an XML file into a table column, but I must not create a directory on the server, place the XML file there, and give that path in the insert query. Can anybody help me here?
    Thanks in advance.

    You could write a Java stored procedure that retrieves the file into a CLOB, wrap that in a function call, and use it in your insert statement.
    This solution requires read privileges granted by SYS and is therefore only feasible if the top-level directory or directories are known, or if you get read access to everything.

  • Load CSV file into a table when a button is clicked by the user

    Hello,
    Can anyone please help me out with this issue? I have a form where a user uploads a CSV file and clicks a button; when the button is clicked, the CSV file data should be loaded into the corresponding columns of the database table.
    Can anyone please suggest a possible solution or approach?
    Thanks,
    Orton
    Edited by: orton607 on May 5, 2010 2:00 PM

    Thanks for replying.
    I have tried your changes but it is not working. One more issue is that I have a column which contains commas; when I try to load the file it fails, and I think the commas are the problem. So I changed the code to use the REPLACE function for that column, but it still doesn't work. Can anyone please suggest a possible approach? Below is my source code for your reference.
    DECLARE
    v_blob_data BLOB;
    v_blob_len NUMBER;
    v_position NUMBER;
    v_raw_chunk RAW(10000);
    v_char CHAR(1);
    c_chunk_len NUMBER := 1;
    v_line VARCHAR2 (32767):= NULL;
    v_data_array wwv_flow_global.vc_arr2;
    v_rows NUMBER;
    v_sr_no NUMBER := 1;
    l_cnt BINARY_INTEGER := 0;
    l_stepid NUMBER := 10;
    BEGIN
    delete from sample_tbl;
    -- Read data from wwv_flow_files
    select blob_content into v_blob_data from wwv_flow_files
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);
    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Convert comma to : to use wwv_flow_utilities
    v_line := REPLACE (v_line, ',', ':');
    -- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line);
    -- Insert data into target table
    EXECUTE IMMEDIATE 'insert into sample_tbl(col1..col12)
    values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,:12)'
    USING
    v_sr_no,
    v_data_array(1),
    v_data_array(2),
    v_data_array(3),
    v_data_array(4),
    v_data_array(5),
    v_data_array(6),
    v_data_array(7),
    v_data_array(8),
    REPLACE(v_data_array(9), ':', ','),
    to_date(v_data_array(10),'MM/DD/YYYY'),
    v_data_array(11);
    -- Clear out
    v_line := NULL;
    v_sr_no := v_sr_no + 1;
    l_cnt := l_cnt + SQL%ROWCOUNT;
    END IF;
    END LOOP;
    COMMIT;
    l_stepid := 20;
    IF l_cnt = 0 THEN
    apex_application.g_print_success_message := apex_application.g_print_success_message || ' Please select a file to upload ' ;
    ELSE
    apex_application.g_print_success_message := apex_application.g_print_success_message || 'File uploaded and processed ' || l_cnt || ' record(s) successfully.';
    END IF;
    l_stepid := 30;
    EXCEPTION WHEN OTHERS THEN
    ROLLBACK;
    apex_application.g_print_success_message := apex_application.g_print_success_message || 'Failed to upload the file. '||REGEXP_REPLACE(SQLERRM,'[('')(<)(>)(,)(;)(:)(")('')]{1,}', '') ;
    END;
    Below is the function which I am using to convert hex to decimal
    create or replace function hex_to_decimal( p_hex_str in varchar2 ) return number
    is
    v_dec number;
    v_hex varchar2(16) := '0123456789ABCDEF';
    begin
    v_dec := 0;
    for indx in 1 .. length(p_hex_str)
    loop
    v_dec := v_dec * 16 + instr(v_hex,upper(substr(p_hex_str,indx,1)))-1;
    end loop;
    return v_dec;
    end hex_to_decimal;
    thanks,
    Orton
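    Since the root problem is a comma inside a quoted field, replacing every comma with ':' before calling wwv_flow_utilities.string_to_table will always corrupt that column. Below is a minimal, hypothetical sketch of a quote-aware splitter that could be applied to v_line instead (csv_split is made up for this example and is not an APEX API; doubled quotes inside a field are not handled):
    create or replace function csv_split (p_line in varchar2)
      return wwv_flow_global.vc_arr2
    is
      l_out     wwv_flow_global.vc_arr2;
      l_field   varchar2(32767);
      l_char    varchar2(1 char);
      l_inquote boolean := false;
      l_idx     pls_integer := 1;
    begin
      for i in 1 .. nvl(length(p_line), 0) loop
        l_char := substr(p_line, i, 1);
        if l_char = '"' then
          l_inquote := not l_inquote;      -- toggle the "inside quotes" state
        elsif l_char = ',' and not l_inquote then
          l_out(l_idx) := l_field;         -- a separator outside quotes ends the field
          l_field := null;
          l_idx := l_idx + 1;
        else
          l_field := l_field || l_char;    -- ordinary character, including quoted commas
        end if;
      end loop;
      l_out(l_idx) := l_field;             -- flush the last field
      return l_out;
    end csv_split;
    In the page process above, v_data_array := csv_split(rtrim(v_line, chr(13) || chr(10))); would then replace the REPLACE / string_to_table pair.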

  • How to load an XML file into a table using PL/SQL

    Hi Guru,
    I have a requirement to create a procedure or a package in PL/SQL to load an XML file into a table.
    How can we achieve this?

    ODI_NewUser wrote:
    Hi Guru,
    I have a requirement to create a procedure or a package in PL/SQL to load an XML file into a table.
    How can we achieve this?
    Not a perfectly framed question. How do you want to load the XML file? Assuming you want to parse the XML file and load it into a table, you can do this.
    This is the xml file
    karthick% cat emp_details.xml
    <?xml version="1.0"?>
    <ROWSET>
    <ROW>
      <EMPNO>7782</EMPNO>
      <ENAME>CLARK</ENAME>
      <JOB>MANAGER</JOB>
      <MGR>7839</MGR>
      <HIREDATE>09-JUN-1981</HIREDATE>
      <SAL>2450</SAL>
      <COM>0</COM>
      <DEPTNO>10</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7839</EMPNO>
      <ENAME>KING</ENAME>
      <JOB>PRESIDENT</JOB>
      <HIREDATE>17-NOV-1981</HIREDATE>
      <SAL>5000</SAL>
      <COM>0</COM>
      <DEPTNO>10</DEPTNO>
    </ROW>
    </ROWSET>
    You can write a query like this.
    SQL> select *
      2    from xmltable
      3         (
      4            '/ROWSET/ROW'  passing xmltype
      5            (
      6                 bfilename('SDAARBORDIRLOG', 'emp_details.xml')
      7               , nls_charset_id('AL32UTF8')
      8            )
      9            columns empno    number      path 'EMPNO'
    10                  , ename    varchar2(6) path 'ENAME'
    11                  , job      varchar2(9) path 'JOB'
    12                  , mgr      number      path 'MGR'
    13                  , hiredate varchar2(20)path 'HIREDATE'
    14                  , sal      number      path 'SAL'
    15                  , com      number      path 'COM'
    16                  , deptno   number      path 'DEPTNO'
    17         );
         EMPNO ENAME  JOB              MGR HIREDATE                    SAL        COM     DEPTNO
          7782 CLARK  MANAGER         7839 09-JUN-1981                2450          0         10
          7839 KING   PRESIDENT            17-NOV-1981                5000          0         10
    SQL>
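    Since the requirement is a procedure, the query above can simply drive an INSERT ... SELECT. A minimal sketch, assuming a hypothetical target table emp_stage with matching columns and the same directory object and file as above:
    create or replace procedure load_emp_xml
    is
    begin
      insert into emp_stage (empno, ename, job, mgr, hiredate, sal, com, deptno)
      select empno, ename, job, mgr,
             to_date(hiredate, 'DD-MON-YYYY'),  -- dates arrive as text; assumes an English date language
             sal, com, deptno
      from xmltable(
             '/ROWSET/ROW'
             passing xmltype( bfilename('SDAARBORDIRLOG', 'emp_details.xml')
                            , nls_charset_id('AL32UTF8') )
             columns empno    number       path 'EMPNO'
                   , ename    varchar2(6)  path 'ENAME'
                   , job      varchar2(9)  path 'JOB'
                   , mgr      number       path 'MGR'
                   , hiredate varchar2(20) path 'HIREDATE'
                   , sal      number       path 'SAL'
                   , com      number       path 'COM'
                   , deptno   number       path 'DEPTNO'
           );
      commit;
    end load_emp_xml;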

  • How to load an Excel CSV file into an Oracle database

    Hi,
    I wanted to load an Excel file saved with a .csv extension (named trial.csv) into an Oracle database table (named dept); I have documented my experiment below.
    I am getting the following error; how should I rectify it?
    For the ID column I defined the datatype as number(5), and in my control file I defined it as INTEGER EXTERNAL(5), so why does the datatype for the ID column appear as CHARACTER in the error log file?
    1) My Oracle database table is:
    SQL> desc dept;
    Name Null? Type
    ID NUMBER(5)
    DNAME CHAR(20)
    2) My data file is (trial.csv):
    ID     DNAME
    11     production
    22     purchase
    33     inspection
    3) My control file is (trial.ctl):
    LOAD DATA
    INFILE 'trial.csv'
    BADFILE 'trial.bad'
    DISCARDFILE 'trial.dsc'
    APPEND
    INTO TABLE dept
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (ID POSITION(1:5) INTEGER EXTERNAL(5)      
    NULLIF ID=BLANKS,
    DNAME POSITION(6:25) CHAR
         NULLIF DNAME=BLANKS)
    4) My syntax at the cmd prompt is:
    c:\>sqlldr scott/tiger@xxx control=trial.ctl direct=true;
    Load completed - logical record count 21.
    5) My log file error message is:
    Column Name    Position   Len  Term  Encl  Datatype
    ID             1:5          5  ,     O(")  CHARACTER
      NULL if ID = BLANKS
    DNAME          6:25        20  ,     O(")  CHARACTER
      NULL if DNAME = BLANKS
    Record 1: Rejected - Error on table DEPT, column ID.
    ORA-01722: invalid number
    Record 21: Rejected - Error on table DEPT, column ID.
    ORA-01722: invalid number
    Table DEPT:
    0 Rows successfully loaded.
    21 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Bind array size not used in direct path.
    Column array rows : 5000
    Stream buffer bytes: 256000
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 21
    Total logical records rejected: 21
    Total logical records discarded: 0
    Total stream buffers loaded by SQL*Loader main thread: 0
    Total stream buffers loaded by SQL*Loader load thread: 0
    6) Result:
    SQL> select * from dept;
    no rows selected
    by
    balamuralikrishnan.s

    Hi everyone!
    The following are the steps to load an Excel sheet into an Oracle table. I have tried to keep it as simple as possible.
    Thanks
    Step # 1
    Prepare a data file (Excel sheet) to be uploaded to the Oracle table.
    "Save As" the Excel sheet in ".csv" (comma separated values) format.
    Then save it as a ".dat" file.
    e.g. datafile.dat
    1,Category Wise Consumption Summary,Expected Receipts Report
    2,Category Wise Receipts Summary,Forecast Detail Report
    3,Current Stock List Category Wise,Forecast rule listing
    4,Daily Production Report,Freight carrier listing
    5,Daily Transactions Report,Inventory Value Report
    Step # 2
    Prepare a control file that defines the data file to be loaded, the column separator, and the names and types of the table columns into which the data is to be loaded. The control file extension should be ".ctl"!
    e.g. I want to load the data into the tasks table from the datafile.dat file, using the control file controlfile.ctl.
    SQL> desc tasks;
    Name Null? Type
    TASK_ID NOT NULL NUMBER(14)
    TASK_NAME VARCHAR2(120)
    TASK_CODE VARCHAR2(120)
    controlfile.ctl:
    LOAD DATA
    INFILE 'e:\datafile.dat'
    INTO TABLE tasks
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (TASK_ID INTEGER EXTERNAL,
    TASK_NAME CHAR,
    TASK_CODE CHAR)
    The above is the structure for a control file.
    Step # 3
    The final step is to run the sqlldr command to execute the load.
    sqlldr userid=scott/tiger@abc control=e:\controlfile.ctl log=e:\logfile.log
    Message was edited by:
    user578762
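    Coming back to the original trial.csv: the first record in the file is the header row (ID DNAME), and the values appear to be separated by a delimiter rather than sitting at fixed positions, which would explain why every record was rejected with ORA-01722 when POSITION(1:5) was used. A hedged sketch of a control file that skips the header row and splits on commas (this assumes trial.csv really is comma separated, as a .csv saved from Excel usually is):
    OPTIONS (SKIP=1)   -- skip the "ID DNAME" header record
    LOAD DATA
    INFILE 'trial.csv'
    BADFILE 'trial.bad'
    DISCARDFILE 'trial.dsc'
    APPEND
    INTO TABLE dept
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (ID    INTEGER EXTERNAL NULLIF ID=BLANKS,
     DNAME CHAR             NULLIF DNAME=BLANKS)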
