CSV file read

Is there a standard FM (function module) that handles CSV file reads? I am currently using SPLIT AT to separate values, but this fails when some strings (within quotes) contain commas.
Eg:
333,khdfs, "Company name", 87348, " Name1, Name2"
In this scenario, the last quoted field gets split into two. I cannot handle this in the program because the last field does not always contain a comma. Any suggestions?

Hi Suker,
First remove all the quotes, then split at the comma (,).
I mean to say:
REPLACE ALL OCCURRENCES OF '"' IN <string_name> WITH SPACE.
Now split the string at the comma:
SPLIT  AT  ....
Regards
Pinaki
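
If the quotes must be preserved, or a quoted field can itself contain a comma (as in the example above), the split has to be quote-aware rather than a plain SPLIT ... AT ','. The thread is about ABAP, but as a rough, language-neutral sketch of the idea (written in Java to match the parser posted further down; the class name is made up for illustration):

import java.util.ArrayList;
import java.util.List;

public class QuoteAwareSplit {

    // Split one CSV record on commas, ignoring commas inside double quotes.
    // Illustration only: it does not handle escaped quotes ("") or fields
    // that span several lines.
    static List<String> splitRecord(String record) {
        List<String> fields = new ArrayList<>();
        StringBuilder field = new StringBuilder();
        boolean inQuote = false;
        for (char ch : record.toCharArray()) {
            if (ch == '"') {
                inQuote = !inQuote;               // toggle quoted state, drop the quote itself
            } else if (ch == ',' && !inQuote) {
                fields.add(field.toString().trim());
                field.setLength(0);               // start the next field
            } else {
                field.append(ch);
            }
        }
        fields.add(field.toString().trim());      // last field
        return fields;
    }

    public static void main(String[] args) {
        // The record from the question: the last quoted field contains a comma.
        System.out.println(splitRecord("333,khdfs, \"Company name\", 87348, \" Name1, Name2\""));
        // prints [333, khdfs, Company name, 87348, Name1, Name2]
    }
}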

Similar Messages

  • BULK INSERT from a text (.csv) file - read only specific columns.

    I am using Microsoft SQL 2005 and I need to do a BULK INSERT from a .csv file I just downloaded from PayPal. I can't edit some of the columns that are given in the report. I am trying to load specific columns from the file.
    BULK INSERT Orders
    FROM 'C:\Users\*******\Desktop\DownloadURL123.csv'
    WITH
    (
        FIELDTERMINATOR = ',',
        FIRSTROW = 2,
        ROWTERMINATOR = '\n'
    );
    So where would I state which column names (from row #1 of the .csv file) map to which specific columns in the table?
    I saw this on one of the sites, which seemed to guide me towards the answer, but I failed. Here you go, it might help you:
    FORMATFILE [ = 'format_file_path' ]
    Specifies the full path of a format file. A format file describes the data file that contains stored responses created using the bcp utility on the same table or view. The format file should be used in cases in which:
    The data file contains greater or fewer columns than the table or view.
    The columns are in a different order.
    The column delimiters vary.
    There are other changes in the data format. Format files are usually created by using the bcp utility and modified with a text editor as needed. For more information, see bcp Utility.

    Date, Time, Time Zone, Name, Type, Status, Currency, Gross, Fee, Net, From Email Address, To Email Address, Transaction ID, Item Title, Item ID, Buyer ID, Item URL, Closing Date, Reference Txn ID, Receipt ID,
    "04/22/07", "12:00:21", "PDT", "Test", "Payment Received", "Cleared", "USD", "321", "2.32", "3213', "[email protected]", "[email protected]", "", "testing", "392302", "jdal32", "http://ddd.com", "04/22/03", "", "",
    "04/22/07", "12:00:21", "PDT", "Test", "Payment Received", "Cleared", "USD", "321", "2.32", "3213', "[email protected]", "[email protected]", "", "testing", "392932930302", "jejsl32", "http://ddd.com", "04/22/03", "", "",
    Do you need more than 2 rows? I did not include all the columns from the actual .csv file, but most of them. I am planning on taking these specific columns into the first table: date, to email address, transaction ID, item title, item ID, buyer ID, item URL.
    For the other table I don't have any values here because I did not list them, but if you do this for me I could probably figure the other table out.
    Thank you very much.

  • CSV file reading, and voltage and current plotting with respect to time samples (XY plotting)

    Hello,
    I've been struggling with reading a comma-separated value (CSV) file from another instrument (attached). I need to plot this data for analysis. I have 5 columns of data with a number of rows; the first three rows are measurement information. I want to read the 4th row as strings and the rest of the rows as numbers. I want to plot the 2nd column (i1) with respect to TIMESTAMP and the 4th column (u2) with respect to TIMESTAMP, and finally plot i1 (x-axis) vs. u2 (y-axis) in LabVIEW. Could anyone help me?
    In Excel it's so easy to plot, but I don't know how it's done in LabVIEW.
    Attachments:
    labview forum test.csv (30 KB)
    excel plot.jpg (88 KB)

    Start by opening the file.  Then use the Read Text File function.  You can right-click on it and configure it to read lines.  First make it read 3 lines (this is your extra header data).  Then make it read a single line.  That will give you your channel names.  Then read the rest of the file (disable the read by line and wire a -1 into the number of bytes to read).  Then use the Spreadsheet String to Array function to give you your data.
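    The same read sequence (skip the three information rows, take the fourth row as the channel names, then treat the remaining rows as numbers) can also be sketched in plain Java, purely to make the file layout concrete; the file name is just the attachment's name and the parsing is deliberately naive:
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;
    public class CsvChannelRead {
        public static void main(String[] args) throws IOException {
            List<String> lines = Files.readAllLines(Paths.get("labview forum test.csv"));
            // Rows 1-3: measurement information (skipped here).
            // Row 4: channel names.
            String[] channels = lines.get(3).split(",");
            // Remaining rows: one value per channel. Assumes the data rows are
            // purely numeric, as described in the question; a real timestamp
            // column would need its own parsing.
            int rows = lines.size() - 4;
            double[][] data = new double[rows][channels.length];
            for (int r = 0; r < rows; r++) {
                String[] parts = lines.get(r + 4).split(",");
                for (int c = 0; c < channels.length && c < parts.length; c++) {
                    data[r][c] = Double.parseDouble(parts[c].trim());
                }
            }
            System.out.println("Channels: " + String.join(" | ", channels));
            System.out.println("Sample rows read: " + rows);
        }
    }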
    I would recommend going through the LabVIEW tutorials if you are really new at this.
    LabVIEW Basics
    LabVIEW 101

  • CSV file reading using UTL_FILE at run time

    Hi,
    I have to read a CSV file using UTL_FILE,
    but the folder contains many CSV files
    and I don't know their names, so I have to determine the file names at run time.
    Please let me know how we should achieve this.
    Thanks

    place the following in a shell script, say "list_my_files.ksh"
    ls -l > my_file_list.dat
    then run the shell script using dbms_scheduler:
    begin
    dbms_scheduler.create_program (program_name   => 'a_test_proc'
                                  ,program_type   => 'EXECUTABLE'
                                  ,program_action => '/home/bluefrog/list_my_files.ksh'
                                  ,number_of_arguments => 0
                                  ,enabled => true);
    end;
    /
    then open "my_file_list.dat" using UTL_FILE, read all the file names and choose the one you require.
    P;

  • CSV FILE READING

    Hi all,
    I got this CSV parser from the net. It is giving a runtime error, "IO FILE Exception".
    Actually there are 3 files included in it:
    CSVFile
    import java.util.ArrayList;
    import java.io.BufferedReader;
    import java.io.FileReader;
    * holds the file object of records
    public class CSVFile
    * arraylist of records, each one containing a single record
    private ArrayList records = new ArrayList();
    * What to replace a row delimiter with, on output.
    private String replacementForRowDelimiterInTextField = " "; // Change if needed.
         * debug, > 0 for output.
        public int debug = 5;
        private boolean debugLoading = true; //true when debugging load cycle
    *Return the required record
    *@param index the index of the required record
    *@return a CSVRecord, see #CSVRecord
    public CSVRecord getRecord (int index)
        if (this.debug > 3 && !debugLoading) {
         System.err.println("CSVFile getRecord ["+index+"]"+ ((CSVRecord)this.records.get(index)).getFields(3));
         return (CSVRecord)this.records.get(index);
    *Get the number of records in the file
    *@return 1 based count of records.
    public int count()
         return this.records.size();
         // ----- Constructor -----
    *Constructor; create a file object
    *@param details  a propertyFile object, see #propertyFile
    *@param csvFile filename of csv file
         public CSVFile(propertyFile details, String csvFile)
             try{
              BufferedReader reader = new BufferedReader (new FileReader (csvFile));
              //StringBuilder sbBuffer = new StringBuilder( reader.ReadToEnd() );
              StringBuffer buf=new StringBuffer();
              String text;
              try {
                  while ((text=reader.readLine()) != null)
                   buf.append(text + "\n");
                  reader.close();
              }catch (java.io.IOException e) {
                  System.err.println("Unable to read from csv file "+ csvFile);
                  System.exit(2);
              String buffer;
              buffer = buf.toString();
              buffer = buffer.replaceAll("&", "&amp;");  // escape XML special characters for later XML output
              buffer = buffer.replaceAll("<", "&lt;");
              boolean inQuote = false;
              String savedRecord = "";
              String curRecord = "";
              if (debug > 2) {
                  System.err.println("csvFile: setup");
                  System.err.println("Read int from src CSV file");
              //Split entire input file into array records, using row delim.
              String records[] =  buffer.split( details.rowDelimiter() );
              //Iterate over each split, looking for incomplete quoted strings.
              for (int rec=0; rec <records.length; rec++)
                   curRecord = savedRecord + records[rec];
                   if (debug > 4) {
                       System.out.println("csvFile: saved rec" + savedRecord);
                       System.out.println("csvFile: current rec " + curRecord);
                       System.out.println("csvFile: currRecLth: " + curRecord.length());
                   for (int i = 0; i < curRecord.length(); i ++ )
                        char ch = curRecord.charAt(i);
                        char prev = ( i != 0? curRecord.charAt(i-1): ' ');
                        char nxt = ( i < (curRecord.length()-2)? curRecord.charAt(i+1): ' ');
                        if ( !inQuote && ch == '"' )
                            inQuote = true;
                        else
                            if ( inQuote && ch == '"' )
                             if ( i + 1 < curRecord.length() )
                                 inQuote = (nxt == '"')
                                  || (prev == '"');
                             else
                                 inQuote = false;
                   if ( inQuote )
                        // A space is currently used to replace the row delimiter
                        //when found within a text field
                        savedRecord = curRecord + replacementForRowDelimiterInTextField;
                        inQuote = false;
                   else
                        this.records.add( new CSVRecord(details, curRecord) );
                        savedRecord = "";
              catch (java.io.FileNotFoundException e) {
                  System.out.println("Unable to read CSV file, quitting");
                  System.exit(2);
         // ----- Private Methods -----
         private String[] SplitText(String textIn, String splitString)
              String [] arrText = textIn.split(splitString);
              return arrText;
    *Get all records in the csvfile
    *@return array of CSVRecords, see #CSVRecord
    public CSVRecord[] GetAllRecords()
    CSVRecord[] allRecords = new CSVRecord[ this.records.size() ];
    for (int i = 0; i < this.records.size(); i++ )
         allRecords[i] = (CSVRecord)this.records.get(i);
    return allRecords;
      public static void main(String args[])
         propertyFile path=new propertyFile("C:\\bea\\jdk142_05\\bin");
        CSVFile  a=new CSVFile(path,"C:\\bea\\jdk142_05\\bin\\xxx.csv");
    CSVRecord
    import  java.util.ArrayList;
    *Represents a single record of a CSV file
    public class CSVRecord
         *Debug
        private int debug = 0;
         * Arraylist of fields of the record
        private ArrayList fields = new ArrayList();
         *get the field with index index
         *@param index of field required
         *@return String value of that field
        public String getFields (int index)
         if ( index < fields.size())
         return (String)this.fields.get(index);
         else return ("");
         *get the number of fields
         *@return int number of fields in this file
        public int count()
         return this.fields.size();
         *Create a csv record from the input String, using the propertyfile.
         *@param  details , the property file
         *@see <a href="propertyFile.html">propertyFile</a>
         *@param  recordText , the record to be added to the arraylist of records
        public  CSVRecord(propertyFile details, String recordText)
          * true if within a quote
         boolean inQuote = false;
          * temp saved field value
         String savedField = "";
          * current field value
         String curField = "";
          * field being built
         String field = "";
          * array of records.
          * split it according to the field delimiter.
          * The default String.split() is not accurate according to the M$ view.
         String records[] =  recordText.split( details.fieldDelimiter() );
         for (int rec=0; rec <records.length; rec++)
              field = records[rec];
              //Add this field to currently saved field.
              curField = savedField + field;
              //Iterate over current field.
              for (int i = 0; i < curField.length(); i ++ ){
                   char ch = curField.charAt(i); //current char
                   char nxt = ((i==
                             curField.length() -1)
                            ? ' ' : curField.charAt(i+1)); //next char
                   char prev = (i==0? ' ': curField.charAt(i-1)); //prev char
                   if ( !inQuote && ch == '"' )
                       inQuote = true;
                   else
                       if ( inQuote && ch == '"' )
                        if ( (i + 1) < curField.length() )
                            inQuote = (nxt == '"') || (prev == '"');
                        else
                            inQuote = (prev == '"');
              }//end of current field
              if ( inQuote )
                   savedField = curField + details.fieldDelimiter() + " ";
                   inQuote = false;
              else if (!inQuote && curField.length() > 0)
                   char ch = curField.charAt(0); //current char
                   char lst = curField.charAt(curField.length()-1);
                   if (ch   == '"' &&
                       lst == '"')
                        //Strip leading and trailing quotes
                         curField = curField.substring(1, curField.length() - 1);
                        //curField = curField.Replace( "\"\"", "\"" );
                        curField =curField.replaceAll("\"\"", "\"");
                   this.fields.add( curField );
                   savedField = "";
              else if(curField.length() == 0){
                  this.fields.add("");
              if (debug > 2)
                  System.out.println("csvRec  Added:" + curField);
             }//   end of for each record
    propertyFile
    import java.util.ArrayList;
    import java.io.BufferedReader;
    import java.io.FileReader;
    * This class holds the data from a Property file.
    public class propertyFile
        // ----- Private Fields -----
         *Comments from the file
        private String comment;
         * Delimiter for individual fields
        private String fieldDelimiter; // was char
         *   Delimiter for each row
        private String rowDelimiter;
         * Root element to use for output XML
        private String xmlRootName;
         * Element to use for each row
        private String recordName;
         *How many fields are there -  Note: This is 1 based, not zero based.
        private int fieldCount;
         * array of fields
        private ArrayList fields = new ArrayList(88);
         *Set to int > 0 for debug output
        private int  debug=0;
    /** A single instance of this will hold all the relavant details for ONE PropertyFile.
        *@param filePath String name of the property file.
        public  propertyFile(String filePath)
         //StreamReader reader = new StreamReader( filePath );
         try {
         BufferedReader reader = new BufferedReader (new FileReader (filePath));
         String line = null;
         while ( (line = reader.readLine()) != null )
              if ( line.length() != 0 )   //was != ""
                   if (debug> 0)
                       System.err.println("String is: " + line + "lth: " + line.length());
                   if ( line.charAt(0) != '[' && !( line.startsWith("//") ) )
                        String propertyValue = line.split("=")[1];
                        // Assign Comment
                        if ( line.toUpperCase().startsWith("COMMENT=") )
                            this.comment = propertyValue;
                        // Assign Field Delimter
                        if ( line.toUpperCase().startsWith("FIELDDELIMITER") )
                            this.fieldDelimiter = propertyValue.substring(0);
                        // Assign Row Delimiter
                        if ( line.toUpperCase().startsWith("ROWDELIMITER") )
                             if ( propertyValue.substring(0,1).toUpperCase() ==
                                  "\\" && propertyValue.toUpperCase().charAt(1) == 'N')
                                 this.rowDelimiter = "\r\n";
                             else
                                 this.rowDelimiter = propertyValue;
                        // Assign Root Document Name
                        if ( line.toUpperCase().startsWith("ROOTNAME") )
                            this.xmlRootName = propertyValue;
                        // Assign Record Name
                        if ( line.toUpperCase().startsWith("RECORDNAME") )
                            this.recordName = propertyValue;
                        // Assign Field Count
                        if ( line.toUpperCase().startsWith("FIELDS") )
                            this.fieldCount =  Integer.parseInt(propertyValue);
                   else
                        if ( line.toUpperCase().startsWith("[FIELDS]") )
                             while ( (line = reader.readLine()) != null )
                                  if ( line.length() == 0)
                                      break;
                                  else{
                                      if (debug > 0)
                                       System.err.println("Adding: "+line.split("=")[1]);
                                      this.fields.add( line.split("=")[1] );
                             break;
         reader.close();
         } catch (java.io.IOException e) {
             System.out.println("**** IO Error on input file. Quitting");
             System.exit(2);
         * Return the comment int the property file
         *@return String, the comment value, if any
        public String comment ()
         return this.comment;
         * The delimiter to be used for each field, often comma.
         *@return String, the character(s)
        public String fieldDelimiter()
         return this.fieldDelimiter;
         * Row Delimiter - often '\n'
         *@return String, the character(s)
        public String rowDelimiter ()
         return this.rowDelimiter;
        * The XML document root node.
        * @return String, the element name
        public String XMLRootName()
         return this.xmlRootName;
        /** <summary>
        ** The node name for each record
        public String recordName()
         return this.recordName;
        ** Number of Fields per record/node
        *@return integer count of number of fields, 1 based.
        public int fields()
         return this.fieldCount;
         // ----- Public Methods -----
         ** The value of the nth field, 0 based.
         ** @param index Which field to return
         * @return String the field value
        public String fieldNames(int index)
         if (index <this.fields.size())
             return (String)this.fields.get(index); //was .toString()
         else
              System.err.println("PropertyFile: Trying to get idx of :"
                           + index
                           + "\n when only "
                           //+ (this.fields.size() -  1)
                           + this.fieldCount
                           + " available"
              System.exit(2);
         return "";
         *Test entry point, this class
         *@param argv  cmd line arg of property file
        public static void main (String argv[]) {
              if ( argv.length != 1) {
               System.out.println ("Md5 <file>") ;
               System.exit (1) ;
        propertyFile p = new propertyFile(argv[0]);
    Please help, as I am a novice in file handling, especially CSV files.

    > **** IO Error on input file. Quitting
    Press any key to continue . . .
    OK, there is no compiler error, but it seems that the file named by filePath (the String name of the property file) isn't there.
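    One quick way to confirm that the path, rather than the parser, is the problem is to check the files before handing them to propertyFile and CSVFile, and to print the real exception instead of the generic message. A small sketch using the hard-coded paths from the posted main():
    import java.io.File;
    public class PathCheck {
        public static void main(String[] args) {
            // Paths taken from the posted main(); adjust to your machine.
            String[] paths = {
                "C:\\bea\\jdk142_05\\bin",          // passed to propertyFile (note: a directory, not a file,
                                                    // which would also make new FileReader(...) fail)
                "C:\\bea\\jdk142_05\\bin\\xxx.csv"  // passed to CSVFile
            };
            for (String p : paths) {
                File f = new File(p);
                System.out.println(p + "  exists=" + f.exists()
                        + "  isFile=" + f.isFile()
                        + "  canRead=" + f.canRead());
            }
            // In the catch blocks, printing e.getMessage() (or e.printStackTrace())
            // instead of the fixed "IO Error" text shows which path actually failed.
        }
    }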

  • SSIS CSV FILE READING ISSUE

    Hi, can you reply to the post below?
    I am using a Flat File connection manager to read the CSV file, with the row delimiter set to {CR}{LF}.
    Suddenly, while looping through the files, the package failed because it could not read one of the CSV files.
    I then changed the row delimiter to {LF}, and it worked for the file that had the issue with the {CR}{LF} delimiter.
    Now I want to know why the package is failing because of the row delimiter.
    Can anyone help me with this?
    Please tell me what the actual difference is between the two.

    Please tell me what the actual difference is between the two.
    CR = Carriage Return = CHAR(13) in SQL.
    This character is used in classic Mac OS as the new line.
    When this character is used, the cursor goes to the first position of the line.
    LF = Line Feed = CHAR(10) in SQL.
    This character is used in Unix as the new line.
    When this character is used, the cursor goes to the next line (like an old typewriter when the paper moves up).
    CR LF
    New line on Windows systems: a combination of CR and LF.
    The best thing is to open the test flat file in Notepad++, enable Show Symbols > Show All Characters, and see exactly what you have as the row delimiter.
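    For reference, the two characters are simply code points 13 and 10; a tiny generic check (Java here, just to make the values concrete):
    public class LineEndingCodes {
        public static void main(String[] args) {
            System.out.println("CR = " + (int) '\r'); // 13, carriage return
            System.out.println("LF = " + (int) '\n'); // 10, line feed
            // A Windows-style row "a,b,c\r\n" and a Unix-style row "a,b,c\n"
            // differ only in that trailing CR, which is why a {CR}{LF} row
            // delimiter fails on a file that actually uses bare {LF}.
            System.out.println("a,b,c\r\n".trim().equals("a,b,c\n".trim())); // true once trimmed
        }
    }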
    Cheers,
    Vaibhav Chaudhari

  • How can I have a CSV file read from Excel into LabVIEW?

    Hi,
    I would like to read multiple CSV files from Excel into LabVIEW, creating a duplicate of the Excel tables, which would allow me to then draw some graphs for data analysis and comparison between the two.
    Are there any examples that could be useful for what I am trying to do?
    Thanks

    Patel33 wrote:
    From one of the csv files, I only require 3 of the columns. Is there a way to only read that part of the csv file?
    No. The characters in a file are just one long string, and delimiters and linefeeds are special characters that define where fields and lines start and end. As such, columns are interleaved into the file and consist of many small sections, whose positions depend on the number of characters in each field, which is typically variable. You really need to read the entire file, then only look at the interesting columns later.
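    A rough text-based illustration of "read the entire file, then pick out the columns you care about" (a Java sketch, since LabVIEW block diagrams cannot be pasted as text; the file name and column indices are made up):
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;
    public class PickColumns {
        public static void main(String[] args) throws IOException {
            // Read the whole file first; columns only exist once each line is split.
            List<String> lines = Files.readAllLines(Paths.get("data.csv")); // hypothetical file
            int[] wanted = {0, 2, 5};                                       // example column indices
            for (String line : lines) {
                String[] fields = line.split(",");                          // naive split, no quoted commas
                StringBuilder row = new StringBuilder();
                for (int idx : wanted) {
                    if (idx < fields.length) {
                        row.append(fields[idx]).append('\t');
                    }
                }
                System.out.println(row.toString().trim());
            }
        }
    }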

  • 10g - disconnected analytics, encryption of CSV files on the client machine

    Hi, experts,
    I found that the Disconnected Analytics client first downloads the data from the server and saves it as CSV files,
    but everyone can view the content of the CSV files.
    It is unsafe.
    Can any encryption be done for this issue?
    Thank you very much!

    user11861756 wrote:
    I am trying to get a CSV file onto the client machine. My database server is Oracle 10g on a Unix
    box. I am able to create a CSV file on the server using PL/SQL code and get it onto my local machine by
    FTP, but the problem is that we don't have any permission to create files in our production environment.
    Is there any process or utility, building on my previous approach, with which I can get my CSV file onto my
    client machine directly?
    The answer to this question is a question: what is client-server?
    PL/SQL, SQL, Oracle, database ... all these equal server.
    SQL*Plus, TOAD, web browsers... all these equal client.
    Who handles the presentation of data on the client? Who has access to the client platform's keyboard, mouse, screen, printer, disk drives, etc.? The client.
    So how can a server process or component then hack across the client-server architecture and into a client component and write data (e.g. CSV files) there?
    This is a fundamental concept that needs to be clearly understood when dealing with all aspects of client-server.
    What also needs to be understood that client-server is a software architecture. This means that client-server can run on a single machine. Client can run on a PC. Server can run on a big Unix machine. Or server can run on the PC and the client can run on the big Unix machine.
    Which means that you can have PL/SQL code (on the server platform) acting as a client.. and use server software on your client platform to service that PL/SQL session.
    For example: PL/SQL creates a CSV "file" as a CLOB. PL/SQL uses an FTP package and opens a client FTP session to an FTP server on your client PC. Now your PC is the FTP server and PL/SQL is the FTP client. The client then proceeds to FTP the CLOB as a CSV file.

  • Importing CSV file and parsing it

    First of all, I am very new to writing PowerShell code, so my question may be very rudimentary, but I cannot find an answer, so please help.
    I'm trying to read a CSV file and parse it. I cannot figure out how to access the nth element without hardcoding its name.
    $data = Import-Csv $file   #import CSV file
    # read column headers (manually read the first row of the data file, or import it from other source, or ...)
    $file_dump = Get-Content $file  #OK, I'm sure there is another way to get just the first line, but that's not relevant
    $name_list = $file_dump[0].split(",")
    # access element
    $temp = $data[$i].Name  # works - but that's HARDCODING the column name into the script - what if someone changes it?
    #but what I want to do is
    $temp = $data[$i].$name_list[0]
    How do I do this in PowerShell?

    So you're asking how to get the first data point from the first column, no matter what the header is?
    Why won't you know what your input file looks like?
    You can always drop the first line of the file to remove the existing headers and then use the -Header parameter of Import-Csv to give yourself known headers to reference (this will only work if you know how many columns to expect, etc.).
    http://ss64.com/ps/import-csv.html

  • Submit quiz results to one single .csv file

    How can I submit quiz results (over 200 people will be taking my Captivate quiz) to a single .csv file?
    Right now, the quizzes are submitted to my email address and attached to the email as a POSTDATA.ATT file. I have to manually go into Outlook and save the attachment as "FnameLname.csv". So each quiz taker will have an individual .csv file, and I will have over 100 emails and over 100 .csv files!
    How can I make the quiz results submit to a single Quiz_Results.csv file on my web server instead?

    The way I would do this is to submit the scores into a database. In between Captivate and the database you'll need middleware (.asp, ASP.NET, ColdFusion, etc.). This middleware receives your data from Captivate and processes it, submitting it into the database. You can then write another middleware page that produces a report (a web page table, or an exported .csv file) from the data stored in the database.
    Another possibility is to use Captivate's built-in SCORM functionality and submit user scores into an LMS, then run reports and export .csv files from your LMS.
    Sorry, I don't think the functionality to join multiple records into one .csv file is built into Captivate.

  • Color in .CSV file

    Hi,
    How can we add color to the header of a .csv file that is to be sent to the Unix server?
    Currently I am using the following code:
      open dataset l_direc for output in text mode encoding default.
      if sy-subrc = 0.
        loop at ifile.
          transfer ifile to l_direc.
        endloop.
      endif.
      close dataset l_direc.
    where the first record in the internal table 'ifile' is the header record.
    Please send your suggestions,
    Swapna.

    arun14 wrote:
    Subject: Re: how to manipulate ,csv file using java
    how to highlight a particular column in .csv file with a particular color
    There is no formatting information in a .csv file.

  • Parser - CSV files to Oracle database

    Hello all,
    I wrote a CSV parser that parses CSV files into a Microsoft Access database. Now I need to export the dataset to an Oracle database. This is part of my exporter class:
    class MdbExporter : IExporter
    /// <summary>
    /// Exports the DataSet to an .mdb file.
    /// </summary>
    /// <param name="ds">Zu exportierendes DataSet.</param>
    public void Write(DataSet ds, string[] names)
    string conStr = "Provider=Microsoft.Jet.OLEDB.4.0;" +
    "Data Source=" + names[0] + ";";
    Console.WriteLine("Exporting to database {0} ...", names[0]);
    Any ideas how I can do that?
    Thanks for any reply in advance.

    I wrote a parser that goes through several folders containing CSV files, reads them, and creates a dataset that contains 15 tables (which I need for an intranet application afterwards). I managed to write an MDB exporter that exports this dataset to a Microsoft Access database. Now I need an exporter that will export my dataset to an Oracle database. Below is the full code for my MDB exporter:
    public void Write(DataSet ds, string[] names)
    {
        string conStr = "Provider=Microsoft.Jet.OLEDB.4.0;" +
                        "Data Source=" + names[0] + ";";
        Console.WriteLine("Exporting to database {0} ...", names[0]);
        DbConnection connection = new OleDbConnection(conStr);
        try
        {
            connection.Open();
        }
        catch (DbException e)
        {
            ConsoleEx.WriteException(true, e, "Unable to open database: {0}", names[0]);
            throw;
        }
        DbCommand command = connection.CreateCommand();
        foreach (DataTable table in ds.Tables)
        {
            Console.WriteLine("\tDeleting table: {0}", table.TableName);
            // delete old tables
            command.CommandText = string.Format("drop table {0}", table.TableName);
            TryExecute(command, false);
            // create new
            Console.WriteLine("\tCreating new table: {0}", table.TableName);
            string[] columnStrings = new string[table.Columns.Count];
            for (int i = 0; i < table.Columns.Count; i++)
                columnStrings[i] = "`" + table.Columns[i].ColumnName + "`" + " varchar";
            command.CommandText = string.Format("create table {0} ({1})",
                table.TableName, string.Join(", \n", columnStrings));
            TryExecute(command, true);
            // add rows
            for (int row = 0; row < table.Rows.Count; row++)
            {
                for (int col = 0; col < table.Columns.Count; col++)
                    columnStrings[col] = "'" + Convert.ToString(table.Rows[row].ItemArray[col]) + "'";
                command.CommandText = string.Format("insert into {0} values ({1})",
                    table.TableName, string.Join(", \n", columnStrings));
                TryExecute(command, true);
            }
        }
        connection.Close();
    }
    I need a similar exporter for the Oracle database. Starting over with SQL*Loader is really not a good option in my opinion, since I am almost done with this approach.
    Any help would be appreciated.
    Regards,
    Sven

  • How to add colors to a CSV file?

    Hi there,
    I'm exporting data to a CSV file using printWriter.
    Question: is it possible to add colors, bold text or stretch the length of a column in the produced CSV file?
    This, of course, should already be part of the produced file (not something done manually once the file is opened).
    Thank you!

    xianwinwin wrote:
    Question: is it possible to add colors, bold text or stretch the length of a column in the produced CSV file?
    Of course, if you assumed that you'd be looking at the file with a fixed-width font, you could insert a bunch of tabs or spaces to get columns to match up, but I strongly advise you not to do that. I think of CSV as a poor man's Excel data file or perhaps a flat database file, but most importantly it holds data in a standard format that can be read and manipulated by many different programs, including ones you write yourself. It is not meant for pretty visualization. Try to pretty it up and you possibly ruin its main reason for existing. For nice visualization I'd create a small app that reads the CSV, formats it nicely, and either displays the data or stores it in some pretty format as a non-CSV file.

  • Csv file loading issue

    I am trying to load data from a CSV file into an Oracle table.
    The interface executes successfully, but there is a problem:
    there are 501 rows in the original CSV file,
    yet when I load it in the file model it shows only 260 rows.
    What is the problem? Why are not all the rows loaded?

    Just forget about the interface.
    I am creating a new datastore of file type.
    In the resource name, I am giving my file's path.
    When I reverse-engineer it and check the data, it shows only 260 rows.
    But there are 501 records in my CSV file.

  • Read a CSV file and dynamically generate the insert

    I have a requirement where there are multiple CSVs which need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, when the insert statement is passed as a parameter to $cmd.CommandText, it does not evaluate the values.
    How do I evaluate the string in PowerShell?
    Import-Csv -Path $FileName.FullName | % {
    # Insert statement.
    $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
    $valCols='';
    $DataCols='';
    $lists = $ReqColumns.split(",");
    foreach($l in $lists)
    $valCols= $valCols + '$($_.'+$l+')'','''
    #Generate the values statement
    $DataCols=($DataCols+$valCols+')').replace(",')","");
    $insertStr =@("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
    #The above statement generate the following insert statement
    #INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
    $cmd.CommandText = $insertStr #does not evaluate the values
    #If the same statement is passed as below then it execute successfully
    #$cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
    #Execute Query
    $cmd.ExecuteNonQuery() | Out-Null
    jyeragi

    Hi Jyeragi,
    To convert the data to the SQL table format, please try this function out-sql:
    out-sql Powershell function - export pipeline contents to a new SQL Server table
    If I have any misunderstanding, please let me know.
    Best Regards,
    Anna
    TechNet Community Support
