Reading CSV on Unix

Hi,
I want to read a CSV file that sits in a folder on a UNIX box. I don't have an ODBC DSN, so how can I read it?
Thanks in Advance
Jyoti

I am reading it from the hard disk, but I don't want to parse the CSV myself. I am using the following code:
String dbUrl = "C:\\pl\\New Folder\\data.xls";
Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
String lConnectStr = "jdbc:odbc:Driver={Microsoft Excel Driver (*.txt)};DBQ=" + dbUrl + ";DriverID=22;READONLY=false";
Connection conn = DriverManager.getConnection(lConnectStr );
Statement sql = conn.createStatement();
ResultSet result = sql.executeQuery("SELECT * from data.xls ");
but I am getting the error:
Exception: java.sql.SQLException: [Microsoft][ODBC Excel Driver] The Microsoft Jet database engine could not find the object 'xls'. Make sure the object exists and that you spell its name and the path name correctly.
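On a UNIX box there is normally no Microsoft ODBC driver at all, so the JDBC-ODBC bridge approach will not work there. If reading the file directly is acceptable after all, a minimal plain-java.io sketch is shown below (the path /home/jyoti/data.csv and the simple comma split are assumptions; fields containing quoted commas would need a real CSV parser):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
public class CsvDump {
    public static void main(String[] args) throws IOException {
        // Hypothetical path on the UNIX box; adjust to the real folder.
        String path = "/home/jyoti/data.csv";
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Naive split; does not handle commas inside quoted fields.
                String[] fields = line.split(",");
                System.out.println(String.join(" | ", fields));
            }
        }
    }
}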

Similar Messages

  • Drivers to read CSV file

    Hi,
    In one of my applications, I need to treat a CSV file as a database and read the columns from it. Are there any drivers that accept a CSV file as a data source so that I can query against it?
    I know this will work on Windows, but I am working on UNIX and need the drivers to be installed on UNIX.
    Does anyone know how to do this, or can anyone provide links? That would be very helpful for me.
    Please help me resolve this problem.

    Hi
    You need to use a Type 3 JDBC driver for this task.
    Swaraj

  • Question about reading csv file into internal table

    Someone in this forum (thanks, those nice guys!) suggested using FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a CSV file into an internal table. However, it can only read a local file.
    How can I read a CSV file on the application server into an internal table?
    I can't simply use SPLIT, as there may be commas inside the field content, e.g.
    "abc","aaa,ab",10,"bbc"
    My expected output:
    abc
    aaa,ab
    10
    bbc
    Thanks again for your help.

    Hi Gundam,
    Try this code. I wrote a custom parser that reads the record and splits the fields accordingly. I have tested it with your test cases and it works fine.
    OPEN DATASET dsn FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET dsn INTO record.
      IF sy-subrc NE 0.
        EXIT. "stop at end of file
      ENDIF.
      PERFORM parser USING record.
    ENDDO.
    CLOSE DATASET dsn.
    *DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
    *DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
    *DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
    *DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
    FORM parser USING str.
    DATA field(12).
    DATA field1(12).
    DATA field2(12).
    DATA field3(12).
    DATA field4(12).
    DATA cnt TYPE i.
    DATA len TYPE i.
    DATA temp TYPE i.
    DATA start TYPE i.
    DATA quote TYPE i.
    DATA rec_cnt TYPE i.
    len = strlen( str ).
    cnt = 0.
    temp = 0.
    rec_cnt = 0.
    DO.
    *  Start at the beginning
      IF start EQ 0.
        "string just ENDED start new one.
        start = 1.
        quote = 0.
        CLEAR field.
      ENDIF.
      IF str+cnt(1) EQ '"'.  "Check for quotes
        "Check whether the quote flag is already set
        IF quote = 1.
          "Already quotes set
          "Start new field
          start = 0.
          quote = 0.
          CONCATENATE field '"' INTO field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            CONDENSE field.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
    *      WRITE field.
        ELSE.
          "This is the start of quotes
          quote = 1.
        ENDIF.
      ENDIF.
      IF str+cnt(1) EQ ','. "Check end of field
        IF quote EQ 0. "This is not inside quote end of field
          start = 0.
          quote = 0.
          CONDENSE field.
    *      WRITE field.
          IF field IS NOT INITIAL.
            rec_cnt = rec_cnt + 1.
            IF rec_cnt EQ 1.
              field1 = field.
            ELSEIF rec_cnt EQ 2.
              field2 = field.
            ELSEIF rec_cnt EQ 3.
              field3 = field.
            ELSEIF rec_cnt EQ 4.
              field4 = field.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDIF.
      CONCATENATE field str+cnt(1) INTO field.
      cnt = cnt + 1.
      IF cnt GE len.
        EXIT.
      ENDIF.
    ENDDO.
    WRITE: field1, field2, field3, field4.
    ENDFORM.
    Regards,
    Wenceslaus.

  • Loading data from CSV to Unix database

    Hi All
    We copied CSV files onto a UNIX box and tried to load the data from them into Oracle. We are able to load the data, but while some CSVs load perfectly, others load extra characters into the columns.
    I also tried putting the CSV files on Windows and loading the data into the UNIX database, and I face the same problem.
    But if I use the same CSVs and load the data into a Windows database, it works fine.
    Can anybody suggest a solution?
    Regards,
    Kumar.

    ... oh, what a confusion. I already answered in the ITtoolbox group:
    "Hi,
    the problem is the different character sets in the Windows and UNIX environments, so some of the characters are misinterpreted.
    Is the database where you first loaded the CSVs also on UNIX? How did you copy the files? Via FTP? In ASCII mode, I hope?"
    Regards,
    Detlef
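    To make the character-set point concrete, here is a minimal Java sketch (the file name and the ISO-8859-1/UTF-8 pairing are assumptions) that reads the same file with two explicit charsets, so you can see where the extra characters come from:
    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;
    public class CharsetCheck {
        static void dump(String path, Charset cs) throws IOException {
            // Read the file with an explicit charset instead of the platform default.
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(new FileInputStream(path), cs))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(cs + ": " + line);
                }
            }
        }
        public static void main(String[] args) throws IOException {
            dump("data.csv", StandardCharsets.ISO_8859_1);
            dump("data.csv", StandardCharsets.UTF_8);
        }
    }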

  • Use servlet read .csv file by ie, but why two more " ?

    Hi, friends,
    Below is the key method in my servlet used to send a .csv file to IE. After the file reaches the client side, I find that a double quote has been added at both ends of every line, so the CSV file ends up with only one column (many columns are merged into one), which is not what I want. Does anyone have an idea? Thanks for your enthusiasm!
    void serveRemoteFile(String sFileName, String sContentType, HttpServletRequest req,
                         HttpServletResponse res, StringBuffer sbLog, Runtime rt) {
        FileInputStream in = null;
        ServletOutputStream out = null;
        byte[] bBuf = null;
        int nLen;
        if (isBlank(sContentType) || isBlank(sFileName))
            return;
        res.setStatus(HttpServletResponse.SC_OK);
        res.setContentType(sContentType);
        res.setHeader("Content-Disposition", "inline; filename=temp.csv");
        try {
            in = new FileInputStream(sFileName);
            out = res.getOutputStream();
            bBuf = new byte[1024];
            // Copy the file to the response unchanged, one buffer at a time.
            while ((nLen = in.read(bBuf, 0, 1024)) != -1) {
                out.write(bBuf, 0, nLen);
            }
            out.flush();
            out.close();   // close only after the whole file has been written
            in.close();
        } catch (FileNotFoundException e) {
            errorPage(req, res, sbLog);
        } catch (IOException e) {
            errorPage(req, res, sbLog);
        }
    }

    Excel uses a weird CSV file format. You can find more information about it and a Java library to read data from it here:
    http://ostermiller.org/utils/ExcelCSV.html
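    For context, the quoting rule that Excel-style CSV follows is roughly this: wrap a field in double quotes when it contains a comma, a double quote, or a line break, and double any embedded quotes. A small sketch of that rule (this is only an illustration, not the API of the library linked above):
    public class CsvQuote {
        // Quote a single field the way Excel-style CSV expects.
        static String csvField(String value) {
            if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
                return "\"" + value.replace("\"", "\"\"") + "\"";
            }
            return value;
        }
        public static void main(String[] args) {
            System.out.println(csvField("plain"));       // plain
            System.out.println(csvField("a,b"));         // "a,b"
            System.out.println(csvField("say \"hi\""));  // "say ""hi"""
        }
    }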

  • How do I load Adobe reader on a unix operating system?

    I am not very computer literate, so keep that in mind. How can I load Adobe Reader on a UNIX-based PC?

    Try this link:
    http://xmodulo.com/2013/07/how-to-install-adobe-reader-on-linux.html
    I hope it helps.

  • Memory problem reading CSV to array

    I've written a CSV reader to load training and test data into an array of arrays. The way I'm doing it, I use a StringTokenizer to fill a string array from the CSV file and then convert that to a float array. When I convert a string to a float I set the string to null, which I hoped would keep memory use down, but when I get to files of about 65,000 rows and 20 columns it runs out of memory. I'm sure this could be solved by increasing the heap size, but the whole float-to-string-to-float round trip seems a little inefficient. My question is this: is there a better way to convert a CSV of floats into an array of floats, i.e. without first converting to a string and then back to a float?
    Error:
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
         at InputTokenizer.csvReader(InputTokenizer.java:33)
         at run.main(run.java:49)
    import java.io.*;
    import java.util.*;
    public class InputTokenizer {
        public String[][] numbers;
        float[][] floatInputs;
        public float[][] csvReader(String fileLocation, int numRows, int numCols) {
            File file = new File(fileLocation);
            numbers = new String[numRows][numCols];
            String line = null;
            int row = 0;
            int col = 0;
            try {
                BufferedReader bufRdr = new BufferedReader(new FileReader(file));
                while ((line = bufRdr.readLine()) != null) {
                    StringTokenizer st = new StringTokenizer(line, ";");
                    while (st.hasMoreTokens()) {
                        // get next token and store it in the string array
                        numbers[row][col] = st.nextToken();
                        col++;
                    }
                    row++;
                    col = 0;
                }
                bufRdr.close();
                // second pass: convert each string to a float and release the string
                floatInputs = new float[numRows][numCols];
                for (int i = 0; i < numRows; i++) {
                    for (int j = 0; j < numCols; j++) {
                        Float buffer = new Float(numbers[i][j]);
                        numbers[i][j] = null;
                        floatInputs[i][j] = buffer;
                    }
                }
            } catch (Exception e) {
                System.out.println("Exception while reading csv file: " + e);
            }
            return floatInputs;
        }
    }

    Stupid question; I changed the code to this and it seems to work = )
    import java.io.*;
    import java.util.*;
    public class InputTokenizer {
        float[][] floatInputs;
        public float[][] csvReader(String fileLocation, int numRows, int numCols) {
            File file = new File(fileLocation);
            floatInputs = new float[numRows][numCols];
            String line = null;
            int row = 0;
            int col = 0;
            try {
                BufferedReader bufRdr = new BufferedReader(new FileReader(file));
                while ((line = bufRdr.readLine()) != null) {
                    StringTokenizer st = new StringTokenizer(line, ";");
                    while (st.hasMoreTokens()) {
                        // get next token, parse it, and store it directly in the float array
                        floatInputs[row][col] = new Float(st.nextToken());
                        col++;
                    }
                    row++;
                    col = 0;
                }
                bufRdr.close();
            } catch (Exception e) {
                System.out.println("Exception while reading csv file: " + e);
            }
            return floatInputs;
        }
    }
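    If the boxing overhead of new Float(...) ever matters, Float.parseFloat returns a primitive float directly. A tiny hedged sketch of a per-line parser in that style (semicolon delimiter as in the code above):
    import java.util.StringTokenizer;
    public class LineParser {
        // Parse one semicolon-delimited line into a primitive float array without creating Float objects.
        static float[] parseLine(String line, int numCols) {
            float[] row = new float[numCols];
            StringTokenizer st = new StringTokenizer(line, ";");
            int col = 0;
            while (st.hasMoreTokens() && col < numCols) {
                row[col++] = Float.parseFloat(st.nextToken());
            }
            return row;
        }
        public static void main(String[] args) {
            float[] row = parseLine("1.5;2.25;3.0", 3);
            System.out.println(row[0] + " " + row[1] + " " + row[2]);
        }
    }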

  • Reading Data from Unix file and write into an Internal table

    Dear all,
    I have a requirement to read data from a UNIX file and write it into an internal table. How can I do that? Experts, please help me in this regard.

    Hi,
    do it like this:
    PARAMETERS: p_unix LIKE rlgrap-filename OBLIGATORY.
    DATA: v_buffer(2047) TYPE c.
    DATA: BEGIN OF i_buffer OCCURS 0,
            line(2047) TYPE c,
    END OF i_buffer.
    * Open the UNIX file.
    OPEN DATASET p_unix FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc NE 0.
    * Error message: "Unable to open file".
    ELSE.
      DO.
        CLEAR: v_buffer.
        READ DATASET p_unix INTO v_buffer.
        IF sy-subrc NE 0.
          EXIT.
        ENDIF.
        MOVE v_buffer TO i_buffer.
        APPEND i_buffer.
      ENDDO.
    ENDIF.
    CLOSE DATASET p_unix.
    Reward points if it helps,
    Satish

  • ODI reading csv file issue

    Hi,
    I am on ODI 11.1.1, which is our ELT tool for metadata loads for all Planning apps.
    One issue I am facing: a data extract comes out of an Essbase cube via a calc script as a .csv file.
    When I try to read that file in ODI, it is unable to read it. Only after opening the file and saving it manually is ODI able to read the file.
    In the datastore under the model, I have defined the file type as delimited with a comma, header 2, and record separator MS-DOS.
    Has anyone come across this issue?
    Thanks,
    KP

    Hi,
    Most probably it is due to a difference between the platform where the file was created and the platform where it is read.
    Say you created a file on Unix and then try to read it as MS-DOS; ODI will not be able to read the file unless you set the record separator to Unix.
    Thanks,
    Sutirtha
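    To check which record separator a file actually uses, a quick Java sketch like the one below can help (the file name extract.csv is an assumption); matching CR and LF counts suggest an MS-DOS style file, while LF only suggests Unix style:
    import java.io.FileInputStream;
    import java.io.IOException;
    public class SeparatorCheck {
        public static void main(String[] args) throws IOException {
            // Count CR (0x0D) and LF (0x0A) bytes in the extract.
            int cr = 0, lf = 0, b;
            try (FileInputStream in = new FileInputStream("extract.csv")) {
                while ((b = in.read()) != -1) {
                    if (b == '\r') cr++;
                    if (b == '\n') lf++;
                }
            }
            System.out.println("CR: " + cr + ", LF: " + lf);
        }
    }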

  • How to read csv Data and save it with no format changes

    Hi,
    First of all, I am not very familiar with DIAdem.
    I want to read in a CSV file, do some calculations with the data, and save the changed data back to the same CSV file. An example file is attached (496888_edit.csv).
    Therefore I wrote these lines:
    Dim i
    Dim Delimiter
    Dim FilePath
    Dim FileParameters
    Delimiter = ";"
    FilePath = "496888_edit.csv" ' path of the attached example file
    Call DataFileLoad(FilePath, "CSV", "Load")
    ' Do some calculations
    FileParameters = "<filename>" & FilePath & "</filename>" & "<delimiter>" & Delimiter & "</delimiter>"
    Call DataFileSave(FileParameters, "CSV")
    After running those lines, the CSV file looks like the other attached file (496888_after.csv).
    For reasons I cannot explain, DIAdem appears to round the numbers. I want both files to look the same.
    What can I do?
    A possible extension would be to read in only some columns, do the calculations, and afterwards save those columns back to the file in place of the originals. (The real CSV files are much bigger than the two examples.)
    Thanks,
    Jens 
    Attachments:
    496888_edit.csv 1 KB
    496888_after.csv 1 KB

    The only thing that can be changed when writing float64 values using the CSV plugin is the decimal point ('.' or ',').
    The doubles are not rounded; the CSV writer simply writes only the relevant digits.
    It uses up to 15 digits, which is the resolution of float64, and it switches to scientific notation if necessary.
    So there is no way to force exactly 6 digits, and trailing zeros are left out. If you fear that you will lose precision, that will not happen
    (apart from the usual problems of expressing a base-2 binary value in a base-10 text string).
    If you want a fixed float format, the only solution is to write the file directly with VBS and do the formatting yourself, which is slow.

  • Read CSV file into a 1-D array

    Hi
    I would like to read a csv file into a cluster of 4 elements which would then be read into a 1-D array.
    My cluster contains a typedef, a double, a boolean, and another typedef.
    Basically it could be seen as:
    Bob Runs, 4, T, Bob
    Mary sits, 5, F, Mary
    Bob Sits, 2, F, Bob
    Mary Runs, 9, T, Mary
    (keeps growing)
    Are there any good examples of what I am trying to put together that I could leverage, or is it better to use a different input file format than CSV? I am trying to make my program more flexible and easier to adjust even after the executable is created. My line items seem to be growing exponentially and are getting difficult to manage in the LabVIEW window.
    Thanks
    Solved!
    Go to Solution.

    Unless your CSV file is huge, I'd use "Read from Spreadsheet File" with the delimiter set as "," and the type as string.  This will give you a 2D array of strings.  You could then separate out each column of the array, convert to the appropriate data type, and use Index & Bundle Cluster Array to build your array of clusters.  Something like this (except I'm using a string constant in place of reading from the file).

  • Read csv file and insert the data to a list in sharePoint

    Hi everyone,
    I wrote code that reads all the data from a CSV file, but I also need to insert all the data into a new list in SharePoint.
    How can I do this?
    Also, I need to read the CSV file once a day at a specific hour. How can I do that? Thank you so much!

    Did you look at the example I posted above?
    ClientContext in CSOM will allow you to get a handle on the SharePoint objects;
    http://msdn.microsoft.com/en-us/library/office/ee539976(v=office.14).aspx
    w: http://www.the-north.com/sharepoint | t: @JMcAllisterCH | YouTube: http://www.youtube.com/user/JamieMcAllisterMVP

  • SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table

    Hello everyone,
    I was assigned a work item in which I have a Data Flow Task inside a Foreach Loop container in the control flow of an SSIS package. The Foreach Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables into which I want to push the data have the same names as the CSV files.
    In the Data Flow Task I have a Flat File source component that uses the above variable to read the data of a particular file. Now here is my question: how can I move the data to the destination SQL table using the same variable?
    I have tried to set up the OLE DB destination component dynamically, but it works only the first time; it does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns, and these files need to be migrated to SQL tables in the optimal way.
    Does anybody know which is the best way to setup the Dataflow task for this requirement?
    Also, I cannot use Bulk insert task here as we would like to keep a log of corrupted rows.
    Any help would be much appreciated. It's very urgent.
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com

    The standard Data Flow Task supports only static metadata defined during design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and it supports dynamic metadata at runtime. You can process all your input CSV files using a single Data Flow Task Plus. No programming skills are required.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • Reading csv files efficiently..

    Hi..
    I need to read, parse, and insert the required data from CSV files of about 1 GB in size into a database efficiently.
    What is the best approach?
    Thanks,

    I did this search:
    http://www.google.com/search?q=site%3Ajakarta.apache.org+csv+parser
    and found this:
    http://jakarta.apache.org/commons/sandbox/csv/
    It took only a few seconds.
    Every Java project should start with a Google search against apache.org.
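    For a file of that size, the main points are to stream it line by line rather than loading it into memory, and to batch the inserts. A rough plain-JDBC sketch is below (the connection URL, table name, column layout, and the naive comma split are all assumptions; a real CSV parser such as the one linked above would handle quoted fields):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    public class CsvLoader {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection("jdbc:oracle:thin:@host:1521:orcl", "user", "pw");
                 PreparedStatement ps = con.prepareStatement("INSERT INTO target_table (col1, col2) VALUES (?, ?)");
                 BufferedReader in = new BufferedReader(new FileReader("big.csv"))) {
                con.setAutoCommit(false);
                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split(",");   // naive split; does not handle quoted commas
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.addBatch();
                    if (++count % 1000 == 0) {
                        ps.executeBatch();          // send rows to the database in chunks
                    }
                }
                ps.executeBatch();
                con.commit();
            }
        }
    }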

  • Problem in reading csv file in servlet

    Hi everyone,
    I am getting a ClassCastException while importing a .csv file into a database.
    It works fine in the case of .xls.
    I am using jxl, and the code is as follows.
    Thanks in advance.
    FileItemFactory factory1 = new DiskFileItemFactory();
    ServletFileUpload upload1 = new ServletFileUpload(factory1);
    List items1 = upload1.parseRequest(request);
    Iterator iter = items1.iterator();
    while (iter.hasNext()) {
        FileItem item = (FileItem) iter.next();
        InputStream uploadedStream = item.getInputStream();
        importCSV(uploadedStream);
    }
    and in the import method:
    Workbook w;
    w = Workbook.getWorkbook((FileInputStream)is);
    On the above line I am getting the ClassCastException.
    Please help.

    Thanks gimbal2.
    w = Workbook.getWorkbook(is); throws a FileNotFoundException.
    I solved it by tokenizing the input stream on ',' (comma) and then using the tokens for each column in the database.
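    For reference, the tokenizing approach described above might look roughly like the sketch below (the class name and List return type are illustrative; a plain split still breaks on quoted fields that contain commas):
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.util.ArrayList;
    import java.util.List;
    public class CsvUploadParser {
        // Split each line of the uploaded CSV stream into its columns.
        static List<String[]> importCSV(InputStream uploadedStream) throws IOException {
            List<String[]> rows = new ArrayList<String[]>();
            BufferedReader reader = new BufferedReader(new InputStreamReader(uploadedStream));
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(line.split(","));   // one array entry per CSV column
            }
            return rows;
        }
    }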
