Splitting CSV file; output a simple tokenised string of doubles - HELP

Okay, here it goes. I have a file that contains comma-delimited double values in the form:
2,3
2.8,9.0
4.8,9.0
I am trying to get the values into a simple string of "2 3 2.8 9.0 4.8 9.0". I have tried to replace all commas with a space and then split on whitespace, with no success.
The other way is to split on whitespace with string.split("\\s+"), create a String[] array, and then loop through each element of the array, creating a multidimensional array of separated values. These I could then concatenate to form a string.
My question is: where is my code going wrong when splitting the string?
package LineGraph;
import java.util.*;
import java.io.*;
public class ReadData extends Thread {
     private Cdf cdf = null;
     private String fileName = null;
     String rawData = "";
     double[][] data = createData();
     static double[][] createData() {
          String rawData = GUI.jta.getText();
          System.out.println("rawData: " + rawData);
          // Replace each comma with a literal space. Note: the second argument
          // of replaceAll() is a replacement string, not a regex, so "\\s+"
          // there would insert the literal text "s+" -- use " " instead.
          String dataSet = rawData.replaceAll(",", " ");
          // "\\s+" as the split pattern matches any run of whitespace,
          // including the line breaks between rows.
          String[] point = dataSet.trim().split("\\s+");
          for (int i = 0; i < point.length; i++) {
               System.out.println("point: " + point[i]);
          }
          // Pair the tokens up: tokens 0 and 1 form the first point,
          // tokens 2 and 3 the next, and so on.
          double[][] data = new double[point.length / 2][2];
          for (int i = 0; i < data.length; i++) {
               data[i][0] = Double.parseDouble(point[2 * i]);
               data[i][1] = Double.parseDouble(point[2 * i + 1]);
          }
          System.out.println("data.length: " + data.length); // see how much data is being created
          return data;
          //cdf = new Cdf(data, fileName, PrintMessage);
     }
     public Cdf getCdf() {
          return this.cdf;
     }
}

The point being, there are two delimiters to split on: the first is the comma, the second is the line break, thus creating an array of single values.
I have solved the issue by using replaceAll to replace all commas with spaces and then split on the space.
Still makes no sense. A line break doesn't contain spaces, so how does replacing commas with spaces allow you to split on the line break?
If you are appending each line to a string (to build one long string) then append the data with a comma, not a space.
I really get the idea you guys are not enjoying my code?
It's your requirements we don't understand. You obviously aren't explaining them correctly.
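For reference, here is a minimal, self-contained sketch of the replace-then-split approach discussed in this thread, using the sample data from the original post. The answer to the line-break confusion is that the split pattern "\\s+" is a regex that matches any run of whitespace, newlines included, so replacing the commas with spaces leaves only whitespace delimiters and one split handles everything.

```java
public class SplitDemo {
    public static void main(String[] args) {
        String raw = "2,3\n2.8,9.0\n4.8,9.0";
        // Replace commas with spaces, then split on runs of whitespace.
        // "\\s+" matches '\n' as well as ' ', so the line breaks between
        // rows are consumed by the same split as the former commas.
        String[] tokens = raw.replaceAll(",", " ").trim().split("\\s+");
        System.out.println(String.join(" ", tokens));
        // -> 2 3 2.8 9.0 4.8 9.0
    }
}
```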

Similar Messages

  • Splitting csv file

    Hi,
    I have a procedure which stores my retrieved records into a csv file using the mime type. I want to split my resulting file into 2 depending on the records retrieved. Is this possible? Please reply asap.
    Thanks in advance
    Bharat


  • Oscilloscope .csv file output?

I'm relatively new to this and am learning as I go. I don't have any background in electronic engineering.
I'm using an Agilent Infiniium MSO8104A to record the voltage of the stepper motor control within a CNC milling machine, and to record the x-axis and y-axis movements of the machine. I have circuits set up and probes attached. I am also using a pre-amplifier circuit with a micro magnetic sensor to read the magnetic field of a PCB while the other axes are being read.
I am hoping to take an image of the PCB's magnetic field, just for research purposes.
    Problem:
    I have everything set up and the Oscilloscope is reading everything accurately
    Green = X-Axis
    Purple = Y-Axis
    Yellow = Magnetic field
    http://i.imgur.com/8eYEecd.png
When I click File->Save->Waveform and save as a .CSV file, then open the file, I expected to see 3 or 4 columns: one column for the voltages detected on each channel and a fourth column for time. However, when I opened it I only had 2 columns of voltage data. Even when I have only 1 channel active taking a reading and save it to the .CSV file, it still has 2 columns of data in the file.
I can tell by the slight voltage fluctuations in one of the columns that it must be the magnetic sensor readings, as the sensor would not be as accurate as the probes.
I have spent the last week trying to figure out why only 2 columns of data (assuming they represent 2 channels) are being recorded and not all 3 channels at the same time. Can anyone lend some advice on what I am doing wrong? I have played around with the oscilloscope settings, read through most of the manual, and I feel I am wasting far too much time trying to figure this out. I have talked to a few people who are educated and qualified in electronic engineering and more than familiar with oscilloscopes, but I have only been able to describe my issue to them. The lab I am working in is restricted, so I cannot bring one in to show them what I am doing and have them walk me through fixing the issue. They have advised that it really depends on how the oscilloscope is set up and what I am trying to do. This oscilloscope was set up for me prior to my commencing, so I have assumed it was set up for what I needed to be doing.

I don't see anything in this about actually controlling the instrument. You might have better luck going to an Agilent forum and/or Agilent tech support, since this is strictly about their scope and operating it manually.

  • Extract Members List of "Selected AD Groups" :: Input: CSV File :: OUTPUT: CSV File (URGENT REQUIREMENT)

    Hello Everyone,
I am looking for a script which extracts AD group members (sourced from a CSV/TXT file) and outputs to a CSV/TXT file.
Can someone help me find a customized script for this purpose?
A quick response is much appreciated.
    Thanks & Regards,
    Amit Kumar

Create a CSV with your headers and use this:
Import-Module ActiveDirectory
$Groups = Import-Csv -Path "C:\Users\seimi\Documents\ADGroups.csv"
foreach ($Entry in $Groups) {
    # $Path = "C:\Users\seimi\Documents\" + $Entry.groupname + ".csv"  # use this instead if you want one file per group
    $Users = (Get-ADGroupMember -Identity $Entry.groupname | Select-Object -ExpandProperty Name) -join ","
    Add-Content -Path "C:\Users\seimi\Documents\PipeGroup.csv" -Value ($Entry.groupname + ";" + $Users)
}
    Seidl Michael | http://www.techguy.at |
    twitter.com/techguyat | facebook.com/techguyat

  • CSV file output!!

Hello, I have a working select statement, and I have tried to adapt an old CSV output file script I have used before, but I am getting the following error:
    SQL> @$ORBEXE/sup.sql
    TYPE wk_tab IS TABLE OF read_input%ROWTYPE
    ERROR at line 98:
    ORA-06550: line 98, column 4:
    PL/SQL: ORA-00933: SQL command not properly ended
    ORA-06550: line 8, column 1:
    PL/SQL: SQL Statement ignored
There are probably a few more mistakes, but I'll start with this one.
Thanks, CP
    DECLARE
         v_count NUMBER(10) := 0;
         EXEC_file UTL_FILE.FILE_TYPE;
       CURSOR read_input
       IS
    SELECT COALESCE(ilv1.btr_pla_refno,ilv2.pla_refno,ilv3.PLP_PLA_REFNO) refno,  ilv1.par_per_surname, ilv1.adr_line_all, ilv1.description, ilv1.SCHEME_NO, ilv1.btr_cpa_cla_refno, ilv1.susp
    ,ilv2.pla_par_refno,  ilv3.PLP_BAK_ACCOUNT_NUMBER
    FROM (
    select distinct benefit_transactions.btr_cpa_cla_refno
                         ,parties.par_per_surname
                   ,addresses.adr_line_all
                   ,rbx151_schemes_data.description
                      ,rbx151_schemes_data.SCHEME_NO
                   ,btr_pla_refno
                      ,nvl2 (claim_parts.cpa_suspended_date, 'Y', 'N')        AS SUSP
      from fsc.address_usages
          ,fsc.address_elements
          ,fsc.addresses
          ,fsc.parties
          ,fsc.properties
          ,claim_periods
          ,benefit_transactions
          ,rbx151_schemes_cl
          ,rbx151_schemes_data
          ,claim_roles
          ,claim_property_occupancies
            ,claim_hb_payment_schemes
            ,claims
              ,claim_parts
    where address_elements.ael_street_index_code = addresses.adr_ael_street_index_code
       and addresses.adr_refno = address_usages.aus_adr_refno
       and properties.pro_refno = address_usages.aus_pro_refno
       and properties.pro_refno = claim_property_occupancies.cpo_pro_refno
       and rbx151_schemes_cl.scheme_no = rbx151_schemes_data.scheme_no
       and claim_roles.cro_crt_code = 'CL'
       and claim_roles.cro_end_date is null
       and claim_periods.cpe_cpa_cla_refno = claim_roles.cro_cla_refno
       and parties.par_refno = claim_roles.cro_par_refno
       and claim_property_occupancies.cpo_cla_refno = claim_periods.cpe_cpa_cla_refno
       and claim_property_occupancies.cpo_cla_refno = benefit_transactions.btr_cpa_cla_refno
       and claim_periods.cpe_cpa_cla_refno = benefit_transactions.btr_cpa_cla_refno
       and benefit_transactions.btr_cpa_cla_refno = rbx151_schemes_cl.claim_no
       and claim_roles.cro_cla_refno = claim_property_occupancies.cpo_cla_refno
       and claim_periods.cpe_cpo_pro_refno = rbx151_schemes_cl.pro_refno
       and claim_periods.cpe_cpa_cpy_code = 'HB'
       and claim_periods.cpe_cps_code = 'A'
       and claim_periods.cpe_cpa_cpy_code = benefit_transactions.btr_cpa_cpy_code
       and rbx151_schemes_cl.claim_no like '406%'
      -- and benefit_transactions.btr_cpa_cla_refno = '307801231'
    --   and parties.par_per_surname is not null
       and claim_property_occupancies.cpo_pro_refno = rbx151_schemes_cl.pro_refno
       and claim_periods.cpe_cpa_cla_refno = claim_parts.cpa_cla_refno   --MORE ADDED CODE!!
       and claims.cla_refno = claim_hb_payment_schemes.chp_cla_refno  --ADDED CODE!!!
       AND claims.cla_refno = claim_roles.cro_cla_refno  --ADDED CODE!!!
       and (claim_hb_payment_schemes.chp_pty_code ='CL' or claim_hb_payment_schemes.chp_pty_code ='LL') --ADDED CODE
       and claim_periods.cpe_created_date =
              (select max(c2.cpe_created_date)
                 from claim_periods c2
                where c2.cpe_cpa_cla_refno = claim_periods.cpe_cpa_cla_refno
                  and claim_periods.cpe_cpa_cpy_code = c2.cpe_cpa_cpy_code )
       and claim_property_occupancies.cpo_created_date =
              (select max(cp2.cpo_created_date)
                 from claim_property_occupancies cp2
                where cp2.cpo_cla_refno = claim_property_occupancies.cpo_cla_refno)
       and benefit_transactions.btr_created_date =
              (select max(b2.btr_created_date)
                 from benefit_transactions b2
               where b2.btr_cpa_cla_refno = benefit_transactions.btr_cpa_cla_refno)
      and claim_parts.CPA_CREATED_DATE =
              (select max(c1.CPA_CREATED_DATE)
                 from claim_parts c1
               where c1.CPA_CREATED_DATE = claim_parts.CPA_CREATED_DATE)) ilv1
    full outer join
    (select            private_ll_accounts.pla_refno,
                 private_ll_accounts.pla_par_refno
       from        private_ll_accounts
      where  private_ll_accounts.pla_created_date =
              (select max(p2.pla_created_date)
                from private_ll_accounts p2
              where p2.pla_refno = private_ll_accounts.pla_refno
                 and private_ll_accounts.pla_refno = p2.pla_refno (+))) ilv2
                ON (ilv1.btr_pla_refno = ilv2.pla_refno)
    full outer JOIN
    (select distinct private_ll_pay_schemes.PLP_PLA_REFNO, private_ll_pay_schemes.PLP_BAK_ACCOUNT_NUMBER
    from   private_ll_pay_schemes
    where  private_ll_pay_schemes.PLP_START_DATE =
             (select max(p1.PLP_START_DATE)
                from private_ll_pay_schemes p1
               where p1.PLP_PLA_REFNO = private_ll_pay_schemes.PLP_PLA_REFNO
                 and private_ll_pay_schemes.PLP_PLA_REFNO = p1.PLP_PLA_REFNO (+))) ilv3
    ON (ilv2.pla_refno =ilv3.PLP_PLA_REFNO)
    WHERE (ilv1.par_per_surname IS not NULL)
    --or ilv1.btr_pla_refno IS NULL and ilv3.PLP_PLA_REFNO IS NOT NULL)
    --and ( -- OR ilv2.pla_refno IS NOT NULL OR ilv3.PLP_PLA_REFNO IS NOT NULL);
       TYPE wk_tab IS TABLE OF read_input%ROWTYPE
          INDEX BY PLS_INTEGER;
       wk   wk_tab;
    BEGIN
       exec_file := utl_file.fopen('/spp/spool/RBlive/rr_output', 'sup.txt', 'W');
       OPEN read_input;
       LOOP
          EXIT WHEN read_input%NOTFOUND;
          FETCH read_input
          BULK COLLECT INTO wk LIMIT 100;
          FOR i IN 1 .. wk.count
          LOOP
         v_count :=0;
             utl_file.put_line(exec_file, wk(i).refno||','||wk(i).par_per_surname||','||wk(i).adr_line_all||','||wk(i).description||','||wk(i).SCHEME_NO||','||wk(i).btr_cpa_cla_refno||','||wk(i).susp||','||wk(i).PLA_PAR_REFNO||','||wk(i).PLP_BAK_ACCOUNT_NUMBER);
          END LOOP;
       END LOOP;
       CLOSE read_input;
       utl_file.fclose(exec_file);
    END;
    /

Hello, first off, you need to change:
WHERE (ilv1.par_per_surname IS not NULL)
--or ilv1.btr_pla_refno IS NULL and ilv3.PLP_PLA_REFNO IS NOT NULL)
--and ( -- OR ilv2.pla_refno IS NOT NULL OR ilv3.PLP_PLA_REFNO IS NOT NULL);
to:
WHERE (ilv1.par_per_surname IS not NULL);
--or ilv1.btr_pla_refno IS NULL and ilv3.PLP_PLA_REFNO IS NOT NULL)
--and ( -- OR ilv2.pla_refno IS NOT NULL OR ilv3.PLP_PLA_REFNO IS NOT NULL);
(You were missing a semicolon, since it was on a commented-out line.)

  • Extract Members of "Selected AD Groups" :: Input: CSV File :: OUTPUT: CSV File

    Dear Leaders,
I am looking for a script that extracts AD group members (sourced from a CSV/TXT file) and outputs to a CSV/TXT file.
Can someone help me with this customized script?
    In particular I am looking for a script which generates SINGLE OUTPUT file in following format.
    GroupName    GroupMembers
    ADGroup1    Member1,Member2,Member3
    ADGroup2    Member1,Member2,Member3
    ADGroup3    Member1,Member2,Member3
    Thank You Very much in advance !!
    Regards,
    Amit Kumar Rao

Get-ADGroup -Filter "GroupCategory -eq 'Security'" -SearchBase "OU=Organization-Unit"
For help: Get-Help Get-ADGroup -Examples
If the examples are not there: Update-Help

  • CSV  FILE READING

    Hi all,
I got the CSV parser from the net. It is giving the runtime error "IO FILE Exception".
Actually, there are 3 files included in it.
    CSVFile
    import java.util.ArrayList;
    import java.io.BufferedReader;
    import java.io.FileReader;
    * holds the file object of records
    public class CSVFile
    * arraylist of records, each one containing a single record
    private ArrayList records = new ArrayList();
    * What to replace a row delimiter with, on output.
    private String replacementForRowDelimiterInTextField = " "; // Change if needed.
         * debug, > 0 for output.
        public int debug = 5;
        private boolean debugLoading = true; //true when debugging load cycle
    *Return the required record
    *@param index the index of the required record
    *@return a CSVRecord, see #CSVRecord
    public CSVRecord getRecord (int index)
        if (this.debug > 3 && !debugLoading) {
         System.err.println("CSVFile getRecord ["+index+"]"+ ((CSVRecord)this.records.get(index)).getFields(3));
         return (CSVRecord)this.records.get(index);
    *Get the number of records in the file
    *@return 1 based count of records.
    public int count()
         return this.records.size();
         // ----- Constructor -----
    *Constructor; create a file object
    *@param details  a propertyFile object, see #propertyFile
    *@param csvFile filename of csv file
         public CSVFile(propertyFile details, String csvFile)
             try{
              BufferedReader reader = new BufferedReader (new FileReader (csvFile));
              //StringBuilder sbBuffer = new StringBuilder( reader.ReadToEnd() );
              StringBuffer buf=new StringBuffer();
              String text;
              try {
                  while ((text=reader.readLine()) != null)
                   buf.append(text + "\n");
                  reader.close();
              }catch (java.io.IOException e) {
                  System.err.println("Unable to read from csv file "+ csvFile);
                  System.exit(2);
              String buffer;
              buffer = buf.toString();
              buffer = buffer.replaceAll("&amp;", "&"); // un-escape HTML entities in the input
              buffer = buffer.replaceAll("&lt;", "<");
              boolean inQuote = false;
              String savedRecord = "";
              String curRecord = "";
              if (debug > 2) {
                  System.err.println("csvFile: setup");
                  System.err.println("Read int from src CSV file");
              //Split entire input file into array records, using row delim.
              String records[] =  buffer.split( details.rowDelimiter() );
              //Iterate over each split, looking for incomplete quoted strings.
              for (int rec=0; rec <records.length; rec++)
                   curRecord = savedRecord + records[rec];
                   if (debug > 4) {
                       System.out.println("csvFile: saved rec" + savedRecord);
                       System.out.println("csvFile: current rec " + curRecord);
                       System.out.println("csvFile: currRecLth: " + curRecord.length());
                   for (int i = 0; i < curRecord.length(); i ++ )
                        char ch = curRecord.charAt(i);
                        char prev = ( i != 0? curRecord.charAt(i-1): ' ');
                        char nxt = ( i < (curRecord.length()-2)? curRecord.charAt(i+1): ' ');
                        if ( !inQuote && ch == '"' )
                            inQuote = true;
                        else
                            if ( inQuote && ch == '"' )
                             if ( i + 1 < curRecord.length() )
                                 inQuote = (nxt == '"')
                                  || (prev == '"');
                             else
                                 inQuote = false;
                   if ( inQuote )
                        // A space is currently used to replace the row delimiter
                        //when found within a text field
                        savedRecord = curRecord + replacementForRowDelimiterInTextField;
                        inQuote = false;
                   else
                        this.records.add( new CSVRecord(details, curRecord) );
                        savedRecord = "";
              catch (java.io.FileNotFoundException e) {
                  System.out.println("Unable to read CSV file, quitting");
                  System.exit(2);
         // ----- Private Methods -----
         private String[] SplitText(String textIn, String splitString)
              String [] arrText = textIn.split(splitString);
              return arrText;
    *Get all records in the csvfile
    *@return array of CSVRecords, see #CSVRecord
    public CSVRecord[] GetAllRecords()
    CSVRecord[] allRecords = new CSVRecord[ this.records.size() ];
    for (int i = 0; i < this.records.size(); i++ )
         allRecords[i] = (CSVRecord)this.records.get(i);
    return allRecords;
      public static void main(String args[])
         propertyFile path=new propertyFile("C:\\bea\\jdk142_05\\bin");
        CSVFile  a=new CSVFile(path,"C:\\bea\\jdk142_05\\bin\\xxx.csv");
    CSVRecord
    import  java.util.ArrayList;
    *Represents a single record of a CSV file
    public class CSVRecord
         *Debug
        private int debug = 0;
         * Arraylist of fields of the record
        private ArrayList fields = new ArrayList();
         *get the field with index index
         *@param index of field required
         *@return String value of that field
        public String getFields (int index)
         if ( index < fields.size())
         return (String)this.fields.get(index);
         else return ("");
         *get the number of fields
         *@return int number of fields in this file
        public int count()
         return this.fields.size();
         *Create a csv record from the input String, using the propertyfile.
         *@param  details , the property file
         *@see <a href="propertyFile.html">propertyFile</a>
         *@param  recordText , the record to be added to the arraylist of records
        public  CSVRecord(propertyFile details, String recordText)
          * true if within a quote
         boolean inQuote = false;
          * temp saved field value
         String savedField = "";
          * current field value
         String curField = "";
          * field being built
         String field = "";
          * array of records.
          * split it according to the field delimiter.
          * The default String.split() is not accurate according to the M$ view.
         String records[] =  recordText.split( details.fieldDelimiter() );
         for (int rec=0; rec <records.length; rec++)
              field = records[rec];
              //Add this field to currently saved field.
              curField = savedField + field;
              //Iterate over current field.
              for (int i = 0; i < curField.length(); i ++ ){
                   char ch = curField.charAt(i); //current char
                   char nxt = ((i==
                             curField.length() -1)
                            ? ' ' : curField.charAt(i+1)); //next char
                   char prev = (i==0? ' ': curField.charAt(i-1)); //prev char
                   if ( !inQuote && ch == '"' )
                       inQuote = true;
                   else
                       if ( inQuote && ch == '"' )
                        if ( (i + 1) < curField.length() )
                            inQuote = (nxt == '"') || (prev == '"');
                        else
                            inQuote = (prev == '"');
              }//end of current field
              if ( inQuote )
                   savedField = curField + details.fieldDelimiter() + " ";
                   inQuote = false;
              else if (!inQuote && curField.length() > 0)
                   char ch = curField.charAt(0); //current char
                   char lst = curField.charAt(curField.length()-1);
                   if (ch   == '"' &&
                       lst == '"')
                        //Strip leading and trailing quotes
                        curField = curField.substring(1,curField.length()-2);
                        //curField = curField.Replace( "\"\"", "\"" );
                        curField =curField.replaceAll("\"\"", "\"");
                   this.fields.add( curField );
                   savedField = "";
              else if(curField.length() == 0){
                  this.fields.add("");
              if (debug > 2)
                  System.out.println("csvRec  Added:" + curField);
             }//   end of for each record
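The doubled-quote handling that the CSVRecord constructor above is wrestling with can be sketched more directly with a single character scan. This is a hypothetical standalone helper, not part of the posted parser: a comma is a field separator only when we are outside quotes, and a `""` pair inside a quoted field stands for one literal quote character.

```java
import java.util.*;

public class QuoteAwareSplit {
    // Split one CSV record on commas, keeping commas that fall inside
    // double-quoted fields; a doubled quote ("") inside a quoted field
    // becomes a single literal quote in the output.
    static List<String> splitRecord(String record) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuote = false;
        for (int i = 0; i < record.length(); i++) {
            char ch = record.charAt(i);
            if (ch == '"') {
                if (inQuote && i + 1 < record.length() && record.charAt(i + 1) == '"') {
                    cur.append('"'); // escaped quote: emit one " and skip the pair
                    i++;
                } else {
                    inQuote = !inQuote; // opening or closing quote
                }
            } else if (ch == ',' && !inQuote) {
                fields.add(cur.toString()); // field boundary
                cur.setLength(0);
            } else {
                cur.append(ch);
            }
        }
        fields.add(cur.toString()); // last field
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(splitRecord("a,\"b,c\",\"say \"\"hi\"\"\""));
        // -> [a, b,c, say "hi"]
    }
}
```

Note this only handles a single record; row splitting still has to avoid line breaks inside quoted fields, which is what the saved-record loop in CSVFile is for.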
    propertyFile
    import java.util.ArrayList;
    import java.io.BufferedReader;
    import java.io.FileReader;
    * This class holds the data from a Property file.
    public class propertyFile
        // ----- Private Fields -----
         *Comments from the file
        private String comment;
         * Delimiter for individual fields
        private String fieldDelimiter; // was char
         *   Delimiter for each row
        private String rowDelimiter;
         * Root element to use for output XML
        private String xmlRootName;
         * Element to use for each row
        private String recordName;
         *How many fields are there -  Note: This is 1 based, not zero based.
        private int fieldCount;
         * array of fields
        private ArrayList fields = new ArrayList(88);
         *Set to int > 0 for debug output
        private int  debug=0;
    /** A single instance of this will hold all the relavant details for ONE PropertyFile.
        *@param filePath String name of the property file.
        public  propertyFile(String filePath)
         //StreamReader reader = new StreamReader( filePath );
         try {
         BufferedReader reader = new BufferedReader (new FileReader (filePath));
         String line = null;
         while ( (line = reader.readLine()) != null )
              if ( line.length() != 0 )   //was != ""
                   if (debug> 0)
                       System.err.println("String is: " + line + "lth: " + line.length());
                   if ( line.charAt(0) != '[' && !( line.startsWith("//") ) )
                        String propertyValue = line.split("=")[1];
                        // Assign Comment
                        if ( line.toUpperCase().startsWith("COMMENT=") )
                            this.comment = propertyValue;
                        // Assign Field Delimter
                        if ( line.toUpperCase().startsWith("FIELDDELIMITER") )
                            this.fieldDelimiter = propertyValue.substring(0);
                        // Assign Row Delimiter
                        if ( line.toUpperCase().startsWith("ROWDELIMITER") )
                             if ( propertyValue.substring(0,1).toUpperCase() ==
                                  "\\" && propertyValue.toUpperCase().charAt(1) == 'N')
                                 this.rowDelimiter = "\r\n";
                             else
                                 this.rowDelimiter = propertyValue;
                        // Assign Root Document Name
                        if ( line.toUpperCase().startsWith("ROOTNAME") )
                            this.xmlRootName = propertyValue;
                        // Assign Record Name
                        if ( line.toUpperCase().startsWith("RECORDNAME") )
                            this.recordName = propertyValue;
                        // Assign Field Count
                        if ( line.toUpperCase().startsWith("FIELDS") )
                            this.fieldCount =  Integer.parseInt(propertyValue);
                   else
                        if ( line.toUpperCase().startsWith("[FIELDS]") )
                             while ( (line = reader.readLine()) != null )
                                  if ( line.length() == 0)
                                      break;
                                  else{
                                      if (debug > 0)
                                       System.err.println("Adding: "+line.split("=")[1]);
                                      this.fields.add( line.split("=")[1] );
                             break;
         reader.close();
         } catch (java.io.IOException e) {
             System.out.println("**** IO Error on input file. Quitting");
             System.exit(2);
         * Return the comment int the property file
         *@return String, the comment value, if any
        public String comment ()
         return this.comment;
         * The delimiter to be used for each field, often comma.
         *@return String, the character(s)
        public String fieldDelimiter()
         return this.fieldDelimiter;
         * Row Delimiter - often '\n'
         *@return String, the character(s)
        public String rowDelimiter ()
         return this.rowDelimiter;
        * The XML document root node.
        * @return String, the element name
        public String XMLRootName()
         return this.xmlRootName;
        /** <summary>
        ** The node name for each record
        public String recordName()
         return this.recordName;
        ** Number of Fields per record/node
        *@return integer count of number of fields, 1 based.
        public int fields()
         return this.fieldCount;
         // ----- Public Methods -----
         ** The value of the nth field, 0 based.
         ** @param index Which field to return
         * @return String the field value
        public String fieldNames(int index)
         if (index <this.fields.size())
             return (String)this.fields.get(index); //was .toString()
         else
              System.err.println("PropertyFile: Trying to get idx of :"
                           + index
                           + "\n when only "
                           //+ (this.fields.size() -  1)
                           + this.fieldCount
                           + " available"
              System.exit(2);
         return "";
         *Test entry point, this class
         *@param argv  cmd line arg of property file
        public static void main (String argv[]) {
              if ( argv.length != 1) {
               System.out.println ("Md5 <file>") ;
               System.exit (1) ;
        propertyFile p = new propertyFile(argv[0]);
Please help, as I am a novice at file handling, especially CSV files.

    > **** IO Error on input file. Quitting
    Press any key to continue . . .
OK, no compiler error, but it seems that the property file named by the filePath String isn't there.

  • How to read a CSV file into the portal

    hi all,
    I want to read the content of CSV file that is avaliable in my desktop.
    plz help me with suitable code
    Regards
    Savitha
    Edited by: Savitha S R on Jun 1, 2009 8:25 AM

    Please use this code for that
    REPORT  znkp_upload_csv line-size 400.
    DATA : v_filename TYPE string.
    PARAMETER : p_file LIKE rlgrap-filename DEFAULT 'C:\Documents and Settings\Administrator\Desktop\JV.csv' .
    *Types for Reading CSV file
    TYPES : BEGIN OF x_csv,
              text(400) TYPE c,
            END OF x_csv.
    *Global internal table for reading CSV file
    DATA : lt_csv TYPE TABLE OF x_csv.
    *Global work area for reading CSV file
    DATA : wa_csv LIKE LINE OF lt_csv.
    v_filename = p_file.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename                      = v_filename
*   FILETYPE                      = 'ASC'
*   HAS_FIELD_SEPARATOR           = ' '
*   HEADER_LENGTH                 = 0
*   READ_BY_LINE                  = 'X'
*   DAT_MODE                      = ' '
*   CODEPAGE                      = ' '
*   IGNORE_CERR                   = ABAP_TRUE
*   REPLACEMENT                   = '#'
*   CHECK_BOM                     = ' '
*   VIRUS_SCAN_PROFILE            =
*   NO_AUTH_CHECK                 = ' '
* IMPORTING
*   FILELENGTH                    =
*   HEADER                        =
  TABLES
    data_tab                      = lt_csv
  EXCEPTIONS
    file_open_error               = 1
    file_read_error               = 2
    no_batch                      = 3
    gui_refuse_filetransfer       = 4
    invalid_type                  = 5
    no_authority                  = 6
    unknown_error                 = 7
    bad_data_format               = 8
    header_not_allowed            = 9
    separator_not_allowed         = 10
    header_too_long               = 11
    unknown_dp_error              = 12
    access_denied                 = 13
    dp_out_of_memory              = 14
    disk_full                     = 15
    dp_timeout                    = 16
    OTHERS                        = 17.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    DATA : BEGIN OF item OCCURS 0 ,
            t1(20) TYPE c,
            t2(20) TYPE c,
            t3(20) TYPE c,
            t4(20) TYPE c,
            t5(20) TYPE c,
            t6(20) TYPE c,
            t7(20) TYPE c,
            t8(20) TYPE c,
           END OF item.
    DATA : txt(1) TYPE c. " 1-Header 2-Item
    LOOP AT lt_csv INTO wa_csv.
      SPLIT wa_csv-text AT ',' INTO item-t1
                                    item-t2
                                    item-t3
                                    item-t4
                                    item-t5
                                    item-t6
                                    item-t7
                                    item-t8.
      APPEND item.
      CLEAR item.
    ENDLOOP.
    Check the ITEM table after the loop.
    Regards
    Naresh
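For readers coming from the Java threads above: the `SPLIT ... AT ',' INTO t1 ... t8` pattern fans each CSV line out into a fixed number of fields, padding the missing ones. The same idea in Java looks roughly like this (a sketch; all names are mine):

```java
import java.util.Arrays;

public class SplitDemo {
    // Mimic ABAP's SPLIT wa_csv-text AT ',' INTO t1 ... t8:
    // always yield exactly 8 fields, padding missing ones with "".
    public static String[] splitIntoFields(String line) {
        String[] fields = new String[8];
        Arrays.fill(fields, "");
        // limit -1 keeps trailing empty fields, much like ABAP's SPLIT
        String[] parts = line.split(",", -1);
        for (int i = 0; i < fields.length && i < parts.length; i++) {
            fields[i] = parts[i];
        }
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(splitIntoFields("a,b,c")));
    }
}
```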

  • Loading a CSV file and accessing the variables

    Hi guys,
    I'm new to AS3 and dealt with AS2 before (I was just getting the grasp of it when they changed it).
    Is it possible in AS3 to load an Excel .csv file into Flash using URLLoader (or something else?) and use the data as variables?
    I can get the .csv to load and trace the values (cell1,cell2,cell3...), but I'm not sure how to collect the data and place it into variables.
    Can I just create an array and access it like so... myArray[0], myArray[1]? If so, I'm not sure why it's not working.
    I must be on the completely wrong path. Here's what I have so far...
    var loader:URLLoader = new URLLoader();
    loader.dataFormat = URLLoaderDataFormat.VARIABLES;
    loader.addEventListener(Event.COMPLETE, dataLoaded);
    var request:URLRequest = new URLRequest("population.csv");
    loader.load(request);
    function dataLoaded(evt:Event):void {
        var myData:Array = new Array(loader.data);
        trace(myData[i]);
    }
    Thanks for any help,
    Sky

    just load your csv file (with dataFormat set to URLLoaderDataFormat.TEXT, so loader.data is a plain String) and use the Flash string methods to allocate those values to an array:
    var myData:Array = loader.data.split(",");
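One caveat with split(",") on the whole file: the loaded data also contains line breaks, so the last cell of one row and the first cell of the next come back fused (e.g. "cell2\ncell3"). Splitting on commas and newlines together avoids that. The same logic, sketched in Java (names are mine):

```java
import java.util.Arrays;

public class CsvCells {
    // Split a whole CSV payload into flat cells, treating commas
    // and line breaks (\r\n, \n, \r) all as separators.
    public static String[] cells(String data) {
        return data.trim().split("[,\\r\\n]+");
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(cells("cell1,cell2\ncell3,cell4")));
    }
}
```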

  • Import data from excel/csv file in web dynpro

    Hi All,
    I need to populate a WD table by first importing an Excel/CSV file through a Web Dynpro screen and then reading through the file. I am using the FileUpload element from NW04s.
    How can I read/import data from an Excel/CSV file into the Web Dynpro table context?
    Any help is appreciated.
    Thanks a lot
    Aakash

    Hi,
    Here are the basic steps needed to read data from an Excel spreadsheet using the Java Excel API (jExcel API).
    jExcel API can read a spreadsheet from a file stored on the local file system or from an input stream. Ideally, the steps while reading are as follows:
    Create a workbook from a file on the local file system, as illustrated in the following code fragment:
              import java.io.File;
              import java.util.Date;
              import jxl.*;
             Workbook workbook = Workbook.getWorkbook(new File("test.xls"));
    On getting access to the workbook, one can use the following code to access individual sheets. These are zero-indexed - the first sheet being 0, the second sheet being 1, and so on. (You can also use the API to retrieve a sheet by name.)
              Sheet sheet = workbook.getSheet(0);
    After getting the sheet, you can retrieve a cell's contents as a string by using the convenience method getContents(). In the example code below, A1 is a text cell, B2 is a numerical value and C2 is a date. The contents of these cells may be accessed as follows:
    Cell a1 = sheet.getCell(0,0);
    Cell b2 = sheet.getCell(1,1);
    Cell c2 = sheet.getCell(2,1);
    String stringA1 = a1.getContents();
    String stringB2 = b2.getContents();
    String stringC2 = c2.getContents();
    // perform operations on strings
    However, if we need to access the cell's contents as the exact data type, i.e. as a numerical value or as a date, then the retrieved Cell must be cast to the correct type and the appropriate methods called. The code below illustrates how JExcelApi may be used to retrieve a genuine Java double and a java.util.Date object from an Excel spreadsheet. For completeness the label is also cast to its correct type. The snippet also shows how to verify that a cell is of the expected type - this is useful when validating a spreadsheet for the presence of correct data types.
      String stringA1 = null;
      double numberB2 = 0;
      Date dateC2 = null;
      Cell a1 = sheet.getCell(0,0);
      Cell b2 = sheet.getCell(1,1);
      Cell c2 = sheet.getCell(2,1);
      if (a1.getType() == CellType.LABEL) {
          LabelCell lc = (LabelCell) a1;
          stringA1 = lc.getString();
      }
      if (b2.getType() == CellType.NUMBER) {
          NumberCell nc = (NumberCell) b2;
          numberB2 = nc.getValue();
      }
      if (c2.getType() == CellType.DATE) {
          DateCell dc = (DateCell) c2;
          dateC2 = dc.getDate();
      }
      // operate on dates and doubles
    It is recommended to use the close() method (as in the code below) when you are done processing all the cells. This frees up memory allocated while reading the spreadsheet and is particularly important when reading large spreadsheets.
              // Finished - close the workbook and free up memory
              workbook.close();
    The API class files are packaged in 'jxl.jar', which is available for download.
    Regards
    Raghu

  • How to Compare 2 CSV file and store the result to 3rd csv file using PowerShell script?

    I want to do the below task using powershell script only.
    I have 2 csv files; I want to compare those two files and store the comparison result in a 3rd csv file. Please look at the following snap:
    This image is a csv file only.
    Could anyone please help me?
    Thanks in advance.
    By
    A Path finder 
    JoSwa
    If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful"
    Best Online Journal

    Not certain this is what you're after, but this :
    #import the contents of both csv files
    $dbexcel=import-csv c:\dbexcel.csv
    $liveexcel=import-csv C:\liveexcel.csv
    #prepare the output csv and create the headers
    $outputexcel="c:\outputexcel.csv"
    $outputline="Name,Connection Status,Version,DbExcel,LiveExcel"
    $outputline | out-file $outputexcel
    #Loop through each record based on the number of records (assuming equal number in both files)
    for ($i=0; $i -le $dbexcel.Length-1; $i++)
    {
        # Assign the yes / null values to equal the word equivalent
        if ($dbexcel[$i].isavail -eq "yes") {$dbavail="Available"} else {$dbavail="Unavailable"}
        if ($liveexcel[$i].isavail -eq "yes") {$liveavail="Available"} else {$liveavail="Unavailable"}
        #create the line of csv content from the two input csv files
        $outputline=$dbexcel[$i].name + "," + $liveexcel[$i].'connection status' + "," + $dbexcel[$i].version + "," + $dbavail + "," + $liveavail
        #output that line to the csv file
        $outputline | out-file $outputexcel -Append
    }
    should do what you're looking for, or give you enough to edit it to your exact need.
    I've assumed that the dbexcel.csv and liveexcel.csv files live in the root of c:\ for this, that they include the header information, and that the outputexcel.csv file will be saved to the same place (including headers).
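The core of the script above is a row-by-row merge: look up the same index in both files, translate the yes/null flag into a word, and emit one combined CSV line. That kernel can be sketched in Java like this (the column names and the "yes" convention follow the PowerShell; everything else is my own naming):

```java
public class CsvMerge {
    // Word equivalent of the yes / null availability flag,
    // mirroring the if/else in the PowerShell above.
    public static String availability(String isAvail) {
        return "yes".equals(isAvail) ? "Available" : "Unavailable";
    }

    // Build one output row from fields of the two input rows.
    public static String outputLine(String name, String connStatus,
                                    String version, String dbAvail, String liveAvail) {
        return String.join(",", name, connStatus, version,
                availability(dbAvail), availability(liveAvail));
    }

    public static void main(String[] args) {
        System.out.println(outputLine("srv1", "Connected", "1.2", "yes", null));
    }
}
```

Like the script, this assumes both inputs have the same number of rows in the same order; a real merge would key on the name column instead of the index.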

  • How to get sql query data and write into csv file?

    I am writing to seek help with creating a .bat script which can execute the following logic:
    connection to the database
    run sql query
    create CSV file
    output query data, into CSV file
    save the CSV file
    osql
    -S 84.18.111.111
    -U adw
    -P rem
    -i "c:\query.sql"
    -o "c:\MCI_04Dec2014.csv"
    send ""
    sed -e 's/,\s\+/,/g' MCI_04Dec2014.csv > localNoSpaces.csv
    This is what I have so far, and I am struggling a little with the logic after creating the CSV file. I am unable to get the above script to work; please advise where I may be going wrong.
    Can you create if-statement logic within a Windows script, to check for null parameters or data feeds?
    Any hints would be most appreciated.

    Thank you for your reply. 
    Apologies for posting code irrelevant to the forum, as I am still a novice scripting user.
    My goal is to create a Windows script which can compute the above logic and send the final output file (csv) to an FTP folder.
    Can this logic be implemented via a .bat script? If so, is there an example or tutorial I could follow in order to achieve my task?
    Any help would be much appreciated. 
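If the pure .bat/osql pipeline proves awkward, the query-to-CSV step can also be done in a few lines of JDBC. This is only a sketch: the server address, credentials, query, and output filename are taken from the post above as placeholders, and the quoting/escaping of field values is deliberately naive (no handling of commas inside values).

```java
import java.io.PrintWriter;
import java.sql.*;

public class QueryToCsv {
    // Join fields with commas; nulls become empty cells.
    // (No quoting/escaping -- naive on purpose for this sketch.)
    public static String csvLine(String[] fields) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(fields[i] == null ? "" : fields[i]);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // placeholder server, credentials, and query -- substitute your own
        try (Connection con = DriverManager.getConnection(
                 "jdbc:sqlserver://84.18.111.111", "adw", "rem");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM some_table");
             PrintWriter out = new PrintWriter("MCI_04Dec2014.csv")) {
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            String[] row = new String[cols];
            for (int i = 1; i <= cols; i++) row[i - 1] = md.getColumnLabel(i);
            out.println(csvLine(row));                 // header row
            while (rs.next()) {
                for (int i = 1; i <= cols; i++) row[i - 1] = rs.getString(i);
                out.println(csvLine(row));             // data rows
            }
        }
    }
}
```

A .bat wrapper would then only need to run `java QueryToCsv` and hand the file to the FTP step.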

  • How to test if a csv file is empty or not and email it?

    Hi,
    I am new to PowerShell scripting and I have a small task: I have a csv file which will sometimes be empty (it will have a header row even if it's empty) and sometimes will have some rows of data. Basically I have to write a PowerShell script which will look at the csv file and, if it's empty, do nothing; if it's not empty, email it to someone with the csv file attached.
    Can someone please help me with it?
    Thanks

    Hi mike,
    Yeah, sure, I will keep an eye on it. And coming back to your way of doing it, here is how I am doing it:
    If ((Import-Csv D:\App\ODS\File\Share\StoreTrafficException.csv).Count -gt 1)
           Send-MailMessage -To "[email protected]"
           -From "[email protected]"
           -Attachments D:\App\ODS\File\Share\StoreTrafficException.csv
           -Subject 'Check on this file'
           -SmtpServer "int-smtp1.jjill.com"
    and if I run this one I get the following error:
    Missing expression after unary operator '-'.
    At D:\App\Email.ps1:5 char:9
    +        - <<<< Attachments D:\App\ODS\File\Share\StoreTrafficException.csv
        + CategoryInfo          : ParserError: (-:String) [], ParentContainsErrorRecordException
        + FullyQualifiedErrorId : MissingExpressionAfterOperator
    And all the Send-MailMessage parameters and values are correct, because I am using the same ones in my other code, which, as I said, worked fine. Can you suggest any changes and explain why it's failing? I just want to know the reason, and it would be great if I could have it this way also.
    Thanks

  • Zero is getting truncated while XI is creating a CSV file

    Hi,
    The scenario is IDoc to File.
    In the IDoc I get a serial number such as 00000007856465; in the output CSV file I need this value as 07856465.
    When I run this map in the ESR message mapping I can see the value 07856465 fine, but when posting to the output directory the leading zeros are truncated:
    in the output CSV file the value is 7856465.
    Could you please help me out in solving this issue.
    Thanks,
    --Sai.

    Hi Sai Krishna,
    As said by Bhaskar, use the FormatNum function or a UDF in your mapping, so that you get the leading 0's in the output.
    I would prefer to go with FormatNum function.
    Check the UDF provided by Raja Shekar in the below thread:
    UDF for leading zero's in message mapping
    Thanks,
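The underlying rule, in plain Java terms: leading zeros only survive if the value stays a string end to end; once it is parsed as a number they are gone and must be re-padded to a fixed width. A sketch of the re-padding half (the 8-digit width is taken from the example value 07856465; this illustrates what FormatNum/a UDF does, it is not the mapping function itself):

```java
public class LeadingZeros {
    // Re-pad a numeric value to a fixed 8-character width,
    // restoring leading zeros lost by numeric parsing.
    public static String pad8(long value) {
        return String.format("%08d", value);
    }

    public static void main(String[] args) {
        System.out.println(pad8(7856465L));
    }
}
```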

  • JDBC wrapper for CSV files?

    I wrote my own method to read CSV files into a table structure (String[][]). For big CSV files, I added several features to ignore specific data lines that have specific values. All this looks quite similar to a database table that I do a select * on and then reduce the resulting rows via WHERE clause criteria. So I wonder: is there already such a JDBC wrapper around CSV files?

    Yes. I believe the JDBC-ODBC bridge can use an Excel URL to read in a CSV. Though don't quote me on that one.
    However, why not simply use your RDBMS data-import utility? You can invoke it from a scheduler or from Runtime.exec(). It should perform MUCH better than middleware for a huge CSV file. If manipulation needs to occur for the data, write it first to a temp table, then manipulate it.
    - Saish
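The hand-rolled "select * ... WHERE" over a String[][] that the question describes boils down to filtering rows with a predicate. A sketch (all names are mine):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class CsvSelect {
    // SELECT * FROM table WHERE predicate(row), over an in-memory CSV table.
    public static String[][] where(String[][] table, Predicate<String[]> predicate) {
        List<String[]> kept = new ArrayList<>();
        for (String[] row : table) {
            if (predicate.test(row)) kept.add(row);
        }
        return kept.toArray(new String[0][]);
    }

    public static void main(String[] args) {
        String[][] table = {{"a", "1"}, {"b", "2"}, {"a", "3"}};
        // WHERE col0 = 'a'
        String[][] result = where(table, r -> r[0].equals("a"));
        System.out.println(result.length);
    }
}
```

For large files this keeps everything in memory, which is exactly the scalability limit a real database import (as suggested above) avoids.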
