How to read a CSV file and Insert data into an Oracle Table

Hi All,
I have a CLOB as an IN parameter in my PROC. The file content is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
Please give me some suggestions on this.
Thanks,
Chandra R

jeneesh wrote:
And, please don't "hijack" a 5-year-old thread. Better to start a new one.
I've just split it off into a thread of its own. ;)
@OP,
I have a CLOB file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
You don't have a "CLOB file" as there's no such thing. CLOB is a datatype for storing large character-based objects. A file is something on the operating system's filesystem.
So, why have you stored comma separated data in a CLOB?
Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
If you really do have to parse a CLOB of data to pull out the comma separated values, then you're going to have to write something yourself to do that, reading "lines" by looking for the newline character(s), and then breaking up the "lines" into the component data by looking for commas within it, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
Like I say... it's much easier with SQL*Loader or External Tables as these are designed to parse such CSV type data.
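For illustration, here is a minimal PL/SQL sketch of that approach. The target table (target_tab) and its three columns are made-up names, and it assumes unquoted values with CHR(10) line terminators, so the quoted-string complication mentioned above is deliberately ignored:

CREATE OR REPLACE PROCEDURE load_csv_clob (p_csv IN CLOB) IS
  v_len   PLS_INTEGER := DBMS_LOB.GETLENGTH(p_csv);
  v_pos   PLS_INTEGER := 1;
  v_nl    PLS_INTEGER;
  v_line  VARCHAR2(32767);

  PROCEDURE insert_line (p_line IN VARCHAR2) IS
  BEGIN
    IF TRIM(p_line) IS NOT NULL THEN
      -- break the line up on the commas; note that an empty field between two
      -- commas would shift the remaining columns with this pattern
      INSERT INTO target_tab (col1, col2, col3)
      VALUES (REGEXP_SUBSTR(p_line, '[^,]+', 1, 1),
              REGEXP_SUBSTR(p_line, '[^,]+', 1, 2),
              REGEXP_SUBSTR(p_line, '[^,]+', 1, 3));
    END IF;
  END insert_line;
BEGIN
  WHILE v_pos <= v_len LOOP
    v_nl := DBMS_LOB.INSTR(p_csv, CHR(10), v_pos);
    IF v_nl = 0 THEN
      v_nl := v_len + 1;                      -- last line has no trailing newline
    END IF;
    v_line := DBMS_LOB.SUBSTR(p_csv, v_nl - v_pos, v_pos);
    insert_line(RTRIM(v_line, CHR(13)));      -- strip a trailing CR from Windows files
    v_pos := v_nl + 1;
  END LOOP;
  COMMIT;
END load_csv_clob;
/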

Similar Messages

  • 1.is it possible to read a .xls file and load it into an oracle table

    1. Is it possible to read a .xls file and load it into an Oracle table, using the Oracle database or Oracle Forms, i.e. either UTL_FILE, or TEXT_IO, or any other Oracle tool?
    As far as I know we need a CSV file or a txt (tab delimited) file?
    2. Are there any Windows tools for the same?

    Hi,
    If you want to use the DDE package to read the XLS file then yes, you will need to know the number of rows and columns in the input file.
    i.e. How will you know:
    1) How many columns are there in the input file?
    If I have an XLS file with the following data:
    R1C1 R1C2 R1C3 R1C4 R1C5 R1C6 R1C7
    xxx xx x
    Where R represents row and C represents column, then how will you know that each row has 7 columns? If you know the answer upfront, then it's not an issue.
    Using the DDE approach, you will have to specify the RowNum and the ColumnNo of each individual cell to read/write data from the xls sheet.
    Look at the syntax in my earlier post.
    Using the other approach (i.e. a comma delimited text file - a CSV file), you need not know the number of columns as you can loop through the input record till the last column is read.
    All you have to do is look for the 'n' occurrences of the field delimiter, say ',', and do a SUBSTR from the current position to the point where the ',' was found.
    This process is to be repeated in a loop till all columns are read.
    The TEXT_IO package can trap for EOF (End Of File); see the sketch at the end of this message.
    Hope I made myself clear.
    -- Shailender Mehta --
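    As a rough illustration of that TEXT_IO approach (the file name c:\abc.csv is a placeholder and the "use the field" step is left as a comment), EOF is trapped here via the NO_DATA_FOUND exception that TEXT_IO.GET_LINE raises when no more lines are left:
    DECLARE
      in_file  TEXT_IO.FILE_TYPE;
      linebuf  VARCHAR2(4000);
      fld      VARCHAR2(4000);
      p1       PLS_INTEGER;
      p2       PLS_INTEGER;
    BEGIN
      in_file := TEXT_IO.FOPEN('c:\abc.csv', 'r');
      LOOP
        TEXT_IO.GET_LINE(in_file, linebuf);      -- raises NO_DATA_FOUND at end of file
        p1 := 1;
        LOOP
          p2 := INSTR(linebuf, ',', p1);
          IF p2 = 0 THEN
            fld := SUBSTR(linebuf, p1);          -- last column on the line
          ELSE
            fld := SUBSTR(linebuf, p1, p2 - p1);
          END IF;
          -- ... use fld here, e.g. assign it to a block item or bind it into an INSERT ...
          EXIT WHEN p2 = 0;
          p1 := p2 + 1;
        END LOOP;
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        TEXT_IO.FCLOSE(in_file);                 -- EOF reached
    END;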

  • Please recommend if we have options to read xml file and insert data into table without a temporary table.

    Please recommend options to read an XML file and insert data into a table without using a temporary table.

    DECLARE @data XML;
    SET @data =N'<Root>
    <List RecordID="946236" />
    <List RecordID="946237" />
    <List RecordID="946238" />
    <List RecordID="946239" />
    <List RecordID="946240" />
    </Root>'
    INSERT INTO t (id) SELECT T.customer.value('@RecordID', 'INT') AS id
    FROM @data.nodes('Root/List')
     AS T(customer);
    Best Regards, Uri Dimant, SQL Server MVP

  • Drop, create and insert data into few intermediate tables

    Hi All,
    I need to schedule a process to drop, create and insert data into few intermediate tables on a weekly basis. Here is what i need to do in the stored procedure, which can be scheduled weekly.
    DROP TABLE TABLE_NAME1;
    DROP TABLE TABLE_NAME2;
    DROP TABLE TABLE_NAME3;
    CREATE TABLE TABLE_NAME1
    CREATE TABLE TABLE_NAME2
    CREATE TABLE TABLE_NAME3
    INSERT INTO TABLE_NAME1 SELECT ....;
    INSERT INTO TABLE_NAME2 SELECT ....;
    INSERT INTO TABLE_NAME3 SELECT ....;
    Any suggestions, examples or code on how to accomplish this task would be very helpful. If you have any questions, please let me know.
    Thanks in advance.

    I am using the intermediate tables in an extract process. The idea was that the table would be created prior to calling the extract procedure and, once the data written to the intermediate table had been processed, the table would be dropped. This would be repeated each time the extract process is run. From a DBA's point of view, would it be better to just leave the table on the database and truncate it after each run, or is removing it entirely best? (A truncate-and-reload sketch follows below.)
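    Here is a rough sketch of that truncate-and-reload variant, scheduled weekly; the SELECT lists are left as placeholders from the original post, and the job name and schedule are arbitrary.
    CREATE OR REPLACE PROCEDURE refresh_intermediate_tables IS
    BEGIN
      -- TRUNCATE is DDL, so inside PL/SQL it has to go through dynamic SQL
      EXECUTE IMMEDIATE 'TRUNCATE TABLE table_name1';
      EXECUTE IMMEDIATE 'TRUNCATE TABLE table_name2';
      EXECUTE IMMEDIATE 'TRUNCATE TABLE table_name3';
      -- the INSERT INTO ... SELECT statements from the original post go here,
      -- e.g. INSERT INTO table_name1 SELECT ... FROM some_source_table;
      COMMIT;
    END refresh_intermediate_tables;
    /
    -- run it every week via the scheduler
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'REFRESH_INTERMEDIATE_TABLES_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'REFRESH_INTERMEDIATE_TABLES',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=WEEKLY; BYDAY=SUN; BYHOUR=2',
        enabled         => TRUE);
    END;
    /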

  • How to read an ascii file and record data in a 2d array

    HI everyone,
    I have an experimental data file in ascii format. It contains 10 data sets.
    I'm trying to read the ASCII file and record the data in a 2D array for further analysis,
    but I still could not figure out how to do it.
    Please help me to get this done.
    Here I have attaced the ascii file.
    -Sam
    Attachments:
    data.asc (123 KB)
    2015-01-27_18-01-31_fourier_A2-F-abs.zip (728 KB)

    Got it!
    Thank you very much !
    -Pamsath

  • SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table

    Hello everyone,
    I was assigned a work item wherein I have a dataflow task in a For Each Loop container at the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note, the tables where I would like to push the data from each CSV file also have the same names as the CSV file names.
    On the dataflow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file. Now, here my question comes: how can I move the data to the destination SQL table using the same variable name?
    I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns in it. These files need to be migrated to SQL tables in the optimum way.
    Does anybody know which is the best way to setup the Dataflow task for this requirement?
    Also, I cannot use Bulk insert task here as we would like to keep a log of corrupted rows.
    Any help would be much appreciated. It's very urgent.
    Thanks, Ankit Shah (Inkey Solutions, India)

    The standard Data Flow Task supports only static metadata defined during design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and it supports dynamic metadata at runtime. You can process all your input CSV files using a single Data Flow Task Plus. No programming skills are required.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • Using servlets to read from text file and insert data

    Hi,
    Can I use a servlet to read from a space-delimited text file on the client computer and use that data to insert into a table in my database? I want to make it easy for my users to upload their data without having them use SQL*Loader. If so, can someone give me a hint as to how to get started? I appreciate it.
    Thanks,
    Colby

    Create a page for the user to upload the file to your webserver and send a message (containing the file location) to a server app that will open the file, parse it, and insert it into your database. Make sure you secure the page.
    or
    Have the user paste the file into a simple web form that submits to a servlet that parses the data and inserts it into your db.

  • Using .csv file and adding data into database

    hi,
    I'm working on a project which shows all the share prices from the FTSE 100 on a webpage.
    My problem is: I connect to yahoo.co.uk to get my share price information, which is updated every 15 minutes, and they return a .csv file to me. At the moment I am just printing the information onto my website, but is there any way I could store this information in a database if I needed the data to be used elsewhere? Thanks for the help in advance.

    Below is the code I used to get my info onto the web. I'd just like to know how I would use this to store the data in a database.
    import java.net.*;
    import java.io.*;
    import java.util.*;
    public class SharePrice {
        private String line;
        private int maxShares = 101;   // maximum shares a user can have
        private int details = 5;       // five details: name, date, time, price, change
        public String[][] shareData = new String[maxShares][details];
        public SharePrice(String[] shares) throws Exception {
            getShare(shares);
        }
        // returns a 2D array containing the data of each share as a separate row
        public String[][] getShare(String[] sh) throws Exception {
            for (int i = 0; i < sh.length; i++) {
                // a null entry means we have reached the end of the array
                if (sh[i] != null) {
                    String share = sh[i];
                    // base URL of the resource; the share symbol is appended so that
                    // particular share's data is retrieved
                    String address = "http://uk.finance.yahoo.com/d/quotes.csv?s=" + share;
                    System.out.println(address);
                    try {
                        // connect to the resource and open an input stream to read the data
                        URL url = new URL(address);
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(url.openStream()));
                        line = in.readLine();
                        in.close();
                    } catch (Exception e) {
                        System.err.println("Exception: " + e.getMessage());
                        e.printStackTrace();
                    }
                    if (line == null) {
                        continue;   // nothing was read for this share
                    }
                    // the line of data retrieved is split on the commas and placed
                    // in a single row of the array
                    StringTokenizer t = new StringTokenizer(line, ",");
                    int count = t.countTokens();
                    System.out.println(" count= " + count);
                    for (int j = 0; j < count && j < details && t.hasMoreTokens(); j++) {
                        shareData[i][j] = t.nextToken();
                    }
                }
            }
            return shareData;
        }
    }

  • How to read a txt file then insert them into db tables

    Hi,
    Please help me to solve this problem. This is what I need to do:
    1. read a text file which holds the values to be inserted into some database tables. For example it holds ename, deptno, sal, dname for emp and dept tables. These are in a text file now.
    2. Then I need to insert these data into db tables like emp and dept, and check for duplicates or constraint violations. The records to be inserted usually number more than 100,000, so I am also looking for a fast way to do this.
    3. This operation happens regularly, so I need to schedule the job. I know this can be done by using procedures and dbms_jobs.
    Any help will be greatly appreciated.

    Generally, in 8i, folks would use sql loader to accomplish this. Sql loader does exactly what you want, but is an external application, so it would probably be easiest to schedule it to run using your operating system's scheduler (i.e. cron on Unix, Task Scheduler on Windows). You could certainly write a PL/SQL procedure that called the external sql loader application, and use the Oracle scheduler to schedule it, but that strikes me as more effort than it's worth.
    If you're using 9i, you could look into external tables, which would simplify the process greatly.
    Justin
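    As a rough sketch of the external table route Justin mentions (the directory path, file name, comma delimiter and column list are assumptions; adjust them to the real file layout):
    CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/loads';
    CREATE TABLE emp_dept_ext (
      ename   VARCHAR2(30),
      deptno  NUMBER,
      sal     NUMBER,
      dname   VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('emp_dept.txt')
    )
    REJECT LIMIT UNLIMITED;
    -- the load is then plain SQL, and duplicates can be filtered in the SELECT
    INSERT INTO emp (ename, deptno, sal)
    SELECT e.ename, e.deptno, e.sal
    FROM   emp_dept_ext e
    WHERE  NOT EXISTS (SELECT 1 FROM emp x WHERE x.ename = e.ename);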

  • How to open an Excel file and write data into it.

    Hi All,
    I have an Excel template, which has graphs and some tables containing the corresponding data. If I change the data in the tables it changes the graphs. So, if I have a template on the server, is it possible for me to open this Excel file and change the data in the tables to change the graphs? How can I go to different worksheets, go to different cells, change the values and save the file?
    Thanx in advance
    Cheers
    Pej

    You can setup an ODBC connection to the Excel file and update the file with JDBC, using the JDBC-ODBC bridge.
    Hope this helps

  • Reading file and dump data into database using BPEL process

    I have to read CSV files and insert the data into a database. To achieve this, I have created an asynchronous BPEL process, added a File Adapter and associated it with a Receive activity, then added a DB Adapter and associated it with an Invoke activity. Two Receive activities are available in the process; when I tried to test it through EM, only the first Receive activity completed and it is waiting on the second Receive activity. Please suggest how to proceed.
    Thanks, Manoj.

    Deepak, thanks for your reply. As per your suggestion I created a BPEL composite with the template "Define Service Later". I followed the steps below; please correct me if I am wrong or missing anything. Your help is highly appreciated.
    Step 1-
    Created File adapter and corresponding Receive Activity (checkbox create instance is checked) with input variable.
    Step 2 - Then in composite.xml, dragged the web service under "Exposed Services" and linked the web service with the BPEL process.
    Step 3 - Opened the .bpel file and added the DB adapter with the corresponding Invoke activity, and created an input variable. The web service is created of type "Service" with an existing WSDL (first option against WSDL URL). Also added an Assign activity between the Receive and Invoke activities.
    I deployed the composite to the server. When I tried to test it manually through EM, it prompts for input like "subElmArray Size"; I entered the value 1 with corresponding values for the two elements and clicked the Test Web Service button. The process completes in error. The error is:
    Error Message:
    Fault ID
    service:80020
    Fault Time
    Sep 20, 2013 11:09:49 AM
    Non Recoverable System Fault :
    Correlation definition not registered. The correlation set definition for operation Read, process default/FileUpload18!1.0*soa_3feb622a-f47e-4a53-8051-855f0bf93715/FileUpload18, is not registered with the server. The correlation set was not defined in the process. Redeploy the process to the container.

  • Insert data into msql thru table trigger

    Hi,
    I installed dg4msql 11.2 and I would like to insert some data into our MSSQL database from an after-insert trigger on one of our Oracle tables.
    I used the syntax below:
    insert into test@dg4msql ("PN") values (:new.PN)
    but I get the error below when I insert data into my Oracle table:
    ORA-02054: transaction 4.17.15898 in-doubt
    ORA-028500, connect from ORACLE to a non-Oracle system returned this message
    [Oracle][ODBC SQL Server Driver][SQL Server] Invalid object name "RECOVER.HS_TRANSACTION_LOG
    Do I have to create the HS_TRANSACTION_LOG table in my MSSQL server to resolve this problem?
    Thanks
    Vincent

    Vincent,
    If the gateway insert is being done as part of a trigger then it will almost certainly be part of a distributed transaction so yes, you will have to create the transaction log table and the recovery user in the SQL*Server database.
    As well as the gateway documentation have a look at this note in My Oracle Support -
    Note 227011.1 - How to Setup DG4MSQL to Use Distributed Transactions (Doc ID 227011.1)
    Regards,
    Mike

  • Read a CSV file and dynamically generate the insert

    I have a requirement where there are multiple CSVs which need to be loaded into a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, the insert statement, when passed as a parameter to $cmd.CommandText, does not evaluate the values.
    How do I evaluate the string in PowerShell?
    Import-Csv -Path $FileName.FullName | % {
        # Insert statement.
        $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
        $valCols = ''
        $DataCols = ''
        $lists = $ReqColumns.split(",")
        foreach ($l in $lists) {
            $valCols = $valCols + '$($_.' + $l + ')'',''' 
        }
        # Generate the values statement
        $DataCols = ($DataCols + $valCols + ')').replace(",')", "")
        $insertStr = @("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
        # The above statement generates the following insert statement:
        # INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
        $cmd.CommandText = $insertStr    # does not evaluate the values
        # If the same statement is passed as below then it executes successfully:
        # $cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
        # Execute Query
        $cmd.ExecuteNonQuery() | Out-Null
    }
    jyeragi

    Hi Jyeragi,
    To convert the data to the SQL table format, please try this function out-sql:
    out-sql Powershell function - export pipeline contents to a new SQL Server table
    If I have any misunderstanding, please let me know.
    Best Regards,
    Anna
    TechNet Community Support

  • Read a csv file and read the fiscal yr in the 4th pos?

    Hello ABAP Experts,
    How do I write code to read a CSV file and read the fiscal year in the 4th position?
    any suggestions or code highly appreciated.
    Thanks,
    BWer

    Hi Bwer,
    Declare table itab with the required fields...
    Use GUI_UPLOAD to get the contents of the file (say abc.csv) into itab if the file is on the presentation server:
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename        = 'C:\abc.csv'
        filetype        = 'ASC'
      TABLES
        data_tab        = itab
      EXCEPTIONS
        file_open_error = 1
        file_read_error = 2
        OTHERS          = 3.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    Use OPEN DATASET if the file is on the application server.
    After that, use the SPLIT command at the comma to get the contents of the 4th field.
    Regards,
    Tanveer.

  • How to insert data into the mysql table by giving as a text file

    Hi,
    Does anyone know how to insert data into a MySQL table by giving a text file as the input in JSP? Please respond ASAP.
    Thanks:)

    At least you can try StringTokenizer to parse your text files. Or download a text JDBC driver to parse your files, for instance, HXTT Text(www.hxtt.net) or StelsCSV(www.csv-jdbc.com).
