Using a .csv file and adding data into a database

Hi,
I'm working on a project that shows all the FTSE 100 share prices on a web page.
My problem: I connect to yahoo.co.uk to get my share price information, which is updated every 15 minutes, and it is returned to me as a .csv file. At the moment I am just printing the information onto my website, but is there any way I could store this information in a database so the data can be used elsewhere? Thanks for the help in advance.

Below is the code I use to get the information onto the web. I'd just like to know how I would use this to store the data in a database.
import java.net.*;
import java.io.*;
import java.util.*;

public class SharePrice {

    private String line;
    private int maxShares = 101;   // maximum number of shares a user can have
    private int details = 5;       // five details: name, date, time, price, change
    public String[][] shareData = new String[maxShares][details];

    public SharePrice(String[] shares) throws Exception {
        getShare(shares);
    }

    // returns a two-dimensional array containing the data of each share as a separate row
    public String[][] getShare(String[] sh) throws Exception {
        for (int i = 0; i < sh.length; i++) {
            // if the entry is null we have reached the end of the array
            if (sh[i] != null) {
                String share = sh[i];
                // base URL of the resource; the share symbol is appended so that
                // particular share's data is retrieved
                String address = "http://uk.finance.yahoo.com/d/quotes.csv?s=" + share;
                System.out.println(address);
                try {
                    // a connection is created to the resource and an input stream opened to read the data
                    URL url = new URL(address);
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(url.openStream()));
                    line = in.readLine();
                    in.close();
                } catch (Exception e) {
                    System.err.println("Exception: " + e.getMessage());
                    e.printStackTrace();
                }
                // the retrieved line is split and placed in a single row of the array;
                // because each piece of data is separated by commas it is easily tokenized
                if (line != null) {
                    StringTokenizer t = new StringTokenizer(line, ",");
                    int count = t.countTokens();
                    System.out.println(" count= " + count);
                    for (int j = 0; j < count && j < details; j++) {
                        shareData[i][j] = t.nextToken();
                    }
                }
            }
        }
        return shareData;
    }
}
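
One way to store each retrieved row in a database is with JDBC. The sketch below is only a minimal example under a few assumptions: a MySQL database reachable at a placeholder URL, and a table named shares with five VARCHAR columns (name, trade_date, trade_time, price, change_pct). Swap in your own driver, connection details, table and column names.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ShareStore {

    // Hypothetical connection details - replace with your own database, user and password.
    private static final String DB_URL  = "jdbc:mysql://localhost:3306/sharesdb";
    private static final String DB_USER = "user";
    private static final String DB_PASS = "password";

    // Inserts every populated row of shareData into the assumed "shares" table.
    public static void store(String[][] shareData) throws Exception {
        String sql = "INSERT INTO shares (name, trade_date, trade_time, price, change_pct) "
                   + "VALUES (?, ?, ?, ?, ?)";
        try (Connection con = DriverManager.getConnection(DB_URL, DB_USER, DB_PASS);
             PreparedStatement ps = con.prepareStatement(sql)) {
            for (String[] row : shareData) {
                if (row[0] == null) {
                    continue;   // unused slot in the fixed-size array
                }
                for (int i = 0; i < row.length; i++) {
                    ps.setString(i + 1, row[i]);   // store the raw CSV fields as strings
                }
                ps.addBatch();
            }
            ps.executeBatch();   // one round trip for all rows
        }
    }
}

After SharePrice has filled shareData, a call such as ShareStore.store(sp.shareData) would persist the prices; any other page or program can then read them back with an ordinary SELECT instead of re-fetching the CSV from Yahoo.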

Similar Messages

  • Reading a file and dumping data into a database using a BPEL process

    I have to read CSV files and insert the data into a database. To achieve this, I created an asynchronous BPEL process, added a File Adapter and associated it with a Receive activity, then added a DB Adapter and associated it with an Invoke activity. Two Receive activities are available in the process; when I try to test it through EM, only the first Receive activity completes, and it waits on the second Receive activity. Please suggest how to proceed.
    Thanks, Manoj.

    Deepak, thanks for your reply. As per your suggestion I created a BPEL composite with the template "Define Service Later". I followed the steps below; please correct me if I am wrong or missing anything. Your help is highly appreciated.
    Step 1 - Created a File Adapter and a corresponding Receive activity (the "create instance" checkbox is checked) with an input variable.
    Step 2 - Then, in composite.xml, dragged the web service under "Exposed Services" and linked the web service with the BPEL process.
    Step 3 - Opened the .bpel file and added the DB Adapter with a corresponding Invoke activity and created an input variable. The web service is created of type "Service" with an existing WSDL (the first option against WSDL URL). Added an Assign activity between the Receive and Invoke activities.
    Deployed the composite to the server. When I tried to test it manually through EM, it prompted for input like "subElmArray Size"; I entered the value 1 with corresponding values for the two elements and clicked the Test Web Service button. The process completes in error. The error is:
    Error Message:
    Fault ID
    service:80020
    Fault Time
    Sep 20, 2013 11:09:49 AM
    Non Recoverable System Fault :
    Correlation definition not registered. The correlation set definition for operation Read, process default/FileUpload18!1.0*soa_3feb622a-f47e-4a53-8051-855f0bf93715/FileUpload18, is not registered with the server. The correlation set was not defined in the process. Redeploy the process to the container.

  • How to read a CSV file and Insert data into an Oracle Table

    Hi All,
    I have a CLOB as an IN parameter in my procedure. The file contents are comma separated. I need a procedure that parses this CLOB variable and populates an Oracle table.
    Please give me some suggestions on this.
    Thanks,
    Chandra R

    jeneesh wrote:
    And please don't "hijack" a 5-year-old thread. Better to start a new one. I've just split it off to a thread of its own. ;)
    @OP,
    "I have a CLOB as an IN parameter in my procedure. The file contents are comma separated. I need a procedure that parses this CLOB variable and populates an Oracle table." You don't have a "CLOB file", as there's no such thing. CLOB is a datatype for storing large character-based objects. A file is something on the operating system's filesystem.
    So, why have you stored comma separated data in a CLOB?
    Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
    If you really do have to parse a CLOB of data to pull out the comma separated values, then you're going to have to write something yourself to do that: reading "lines" by looking for the newline character(s), and then breaking the "lines" up into their component data by looking for the commas within them, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
    Like I say, it's much easier with SQL*Loader or External Tables, as these are designed to parse such CSV-type data.

  • Please recommend if we have options to read xml file and insert data into table without a temporary table.

    Please recommend whether we have options to read an XML file and insert its data into a table without a temporary table.

    DECLARE @data XML;
    SET @data =N'<Root>
    <List RecordID="946236" />
    <List RecordID="946237" />
    <List RecordID="946238" />
    <List RecordID="946239" />
    <List RecordID="946240" />
    </Root>'
    INSERT INTO t (id) SELECT T.customer.value('@RecordID', 'INT') AS id
    FROM @data.nodes('Root/List')
     AS T(customer);
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • How can FMS create a text file and write data into it in the Server application folders?

    Recently, I wrote a program that creates a text file and writes data into it in the server application folder. My code is as follows:
        var fileObj = new File("/MyApp/test.txt");
        if (fileObj != null)
        {
            if (fileObj.open("text", "append"))
            {
                fileObj.write("———— Chat Info Backup ————\r\n");
                fileObj.close();
                trace("Chat info backup document: " + fileObj.name + " has been created successfully!");
            }
        }
    But when I run it, FMS throws the following error: File operation open failed; TypeError: fileObj has no properties.
    Can you help me? Thanks in advance.
    Supplement: the text file named test.txt does not exist before fileObj, an instance of the File class, is created.

    Is MyApp the name of the application directory, or is it a child of the application directory? If myApp is the app name, just use test.txt as the path flag in the file constructor.

  • Is there a way to select a certain box of elements from a csv file and read that into LabVIEW?

    Hello all, I was wondering if there is a way to select only a certain "box" of elements from a .csv file in LabVIEW? I have LabVIEW 2011 and my main goal is to take two arrays and graph them against each other. I can import the .csv file just fine and separate each row and each column into its own array, but say I have an 8x8 array and want to graph the middle 4x5, or something like that. Is there any way to extract a sub-array without starting at the beginning and without ending at the end? Thank you in advance.
    Solved!
    Go to Solution.

    Hi Szklanam,
    as a CSV file is just a TXT file with a different suffix, you can read a certain number of lines of that file, so you can limit the number of rows in your resulting array. To limit the number of columns you still have to use ArraySubset, so maybe it's a lot easier to read the full CSV file and pick the interesting spots with ArraySubset...
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • How to activate another worksheet in excel file and write data into it

    Hi,
    I am writing an automation program to collect test data and write the data to an excel file.
    The excel file has several worksheets and now I can only write data to one sheet. Can anyone please let me know how to activate another worksheet and write data into it? Thank you very much.

    You can do a search in the Example Finder for more Excel VIs.
    They will give you a clearer idea of how to go about doing things in the way you need.
    - Partha
    LabVIEW - Wires that catch bugs!

  • Error while using Rule file in loading data into Essbase through ODI

    Hi Experts,
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to the existing values I get an error.
    test is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I already checked that no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws an error only when using the rule file.
    Please help on this.
    Thanks,
    Chinnu

  • After the upgrade to Mozilla 32 using csv files to import data does not work anymore. What should I do?

    Hello,
    After the latest upgrade to Mozilla 32, a previously recorded iMacros script in which data was being imported into Mozilla from a CSV file keeps erroring out. The same problem occurred with the upgrade to Mozilla 30 and was fixed when Mozilla was upgraded to 31. Can you please advise on how to correct the problem so that CSV files can be used in conjunction with iMacros scripts? I would like to point out that iMacros still works without the use of CSV files, but when you introduce a CSV file to the script, iMacros does not work.
    Thank you,
    Eugene

    Hello Eugene, please directly contact the iMacros extension developers through the means provided at their [http://forum.iopus.com/viewforum.php?f=11 forum]; they can likely give you more detailed guidance and are the only ones who can fix bugs or make the necessary adjustments in the add-on for new Firefox versions. Thank you...

  • SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table

    Hello everyone,
    I was assigned a work item wherein I have a Data Flow Task inside a For Each Loop container in the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables where I would like to push the data from each CSV file also have the same names as the CSV file names.
    In the Data Flow Task, I have a Flat File source component; this component uses the above variable to read the data of a particular file. Now, here is my question: how can I move the data to the destination SQL table using the same variable name?
    I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in an optimal way.
    Does anybody know which is the best way to setup the Dataflow task for this requirement?
    Also, I cannot use Bulk insert task here as we would like to keep a log of corrupted rows.
    Any help would be much appreciated. It's very urgent.
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com

    The standard Data Flow Task supports only static metadata defined during design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and it supports dynamic metadata at runtime. You can process all your input CSV files using a single Data Flow Task Plus. No programming skills are required.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • To read & save data from serial port (write to .csv file), and plot data as waveform

    Hi,
    I've been stuck with this problem for weeks and really need some help. I've been searching a lot but I can't find a proper solution.
    I am trying to use Labview to plot and display some parameters of a data acquisition system for a project.
    The data acquisition system is connected to a computer via serial connection through COM 1 port.
    The data acquisition system sends a serial stream of data (ASCII string with baud rate 19200), as shown below (from HyperTerminal),
    01:00:26,933.24,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    01:00:27,931.54,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    The serial data stream consists of a time stamp (the first value), followed by 16 individual channel values, separated by commas ",", with each stream ending in a return "\n". Each value ranges from 1 to 6 digits (a value of -5000 to 15000).
    I have no problem reading these values correctly using HyperTerminal, but I have a problem plotting and displaying the individual parameters of this serial string in LabVIEW. I want to save the entire data stream in a ".csv" file covering all 16 channels with the time stamp, and plot each individual channel in a graph with time on the X-axis.
    I am relatively new to Labview hope I can learn a lot from your help.
     Thank you

    Thank you Dennis.........
    Now I have modified the program, incorporating a 1D array for the gauge display, but I currently have a new problem. The program is updating the file, waveform graph and gauge very slowly. Approximately once every 15 seconds, 16 data streams get updated in a bunch, with the gauge sticking to the last value rather than indicating the real-time value.
    eg.
    12:33:41,-3728.78,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:42,-3730.47,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:43,-3729.81,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:44,-3733.80,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:45,-3733.57,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:46,-3729.27,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:47,-3725.97,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:48,-3730.14,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:49,-3733.20,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:50,-3733.60,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:51,-3732.24,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:52,-3734.51,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:53,-3728.07,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:54,-3732.54,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:55,-3732.24,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    12:33:56,-3731.58,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,
    gets updated in a bunch after 15 seconds, rather than every second. I thought the problem was with the while loop, but I realised there is no delay associated with it. Can you kindly help me modify the program so that it can update the waveform, file data and gauge in real time? (I have attached the modified VI file "suitcase new.vi" and the data gathered after executing it in "Trail.txt".)
    Thank You,
    Rahul
    Attachments:
    trail.txt ‏85 KB
    suitcase new.vi ‏44 KB

  • How to open an Excel file and write data into it.

    Hi All,
    I have an Excel template which has graphs and some tables containing the corresponding data. If I change the data in the tables, it changes the graphs. So, if I have the template on the server, is it possible for me to open this Excel file and change the data in the tables to change the graphs? How can I go to different worksheets, go to different cells, change the values and save the file?
    Thanks in advance
    Cheers
    Pej

    You can set up an ODBC connection to the Excel file and update the file with JDBC, using the JDBC-ODBC bridge.
    Hope this helps
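
    For reference, here is a minimal sketch of that approach. It assumes a Windows machine with an ODBC data source named "ExcelDSN" pointing at the workbook (the DSN name, sheet name and column names below are made up for illustration), and it relies on the old JDBC-ODBC bridge, which only exists in Java 7 and earlier:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ExcelUpdate {
        public static void main(String[] args) throws Exception {
            // Load the JDBC-ODBC bridge driver (available in Java 7 and earlier only).
            Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
            // "ExcelDSN" is an assumed ODBC data source configured for the workbook,
            // with "Read Only" unchecked so that updates are allowed.
            Connection con = DriverManager.getConnection("jdbc:odbc:ExcelDSN");
            Statement st = con.createStatement();
            // Each worksheet is addressed as a table named "<SheetName>$"; the column
            // names ("Label", "Amount") come from the sheet's header row and are
            // placeholders here.
            st.executeUpdate("UPDATE [Sheet2$] SET Amount = 42 WHERE Label = 'Total'");
            st.close();
            con.close();
        }
    }

    (A library such as Apache POI is a more modern alternative that avoids the ODBC dependency, but the bridge keeps the reply's JDBC suggestion intact.)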

  • Using csv files and automatic activation

    We have set up our hybrid with Office 365; how can I use a CSV file to migrate and activate the users at the same time? I can migrate the users, but they are not activated automatically. Actually, we plan to activate ONLY Exchange, not the other products like Lync and SharePoint, at the moment.

    Hi,
    Do you migrate mailboxes to Exchange Online with a staged migration?
    When you migrate mailboxes to Exchange Online with a staged migration, you can create a CSV file to create a migration batch.
    After a migration batch has finished running, you need to convert the on-premises mailboxes in the migration batch to mail-enabled users by downloading the related scripts and running them.
    I'm afraid there is no way to migrate and activate the users at the same time using a single CSV file.
    Here is a related article for your reference.
    http://technet.microsoft.com/en-GB/library/jj874018(v=exchg.150).aspx
    Best regards,
    Belinda Ma
    TechNet Community Support

  • Saving files and copying data into final cut express

    I have read that you can create duplicate copies of your iMovie project and work between them, so you can make changes but always have a master file, but I have not been able to figure out how to do that. Also, after I am finished with my project, can I export it to Final Cut Express for some additional editing and video layering? Is that true? Can I do the same with Final Cut Pro, or just Final Cut Express? Also, is there any quality loss in this method? Thanks in advance.

    Hi tnt,
    You write on a piece of paper; you copy that paper; you continue to write on the first one... how does the copy "know" what you've written?
    Sorry, who gave you that advice of copying projects? Back up frequently, yep, that makes sense with ANY data on a computer... but with a backup you always go back in time.
    FCE is able to open iMovie projects directly (don't know about FCP, don't own it, $$$); BUT then the file structure is broken, and you can NOT re-import the project to iM, only the movie...
    You know the difference? A movie is one single block; a project makes it possible to edit transitions, to edit titles etc.
    For sure, you can export and import from and to iM and FCE/P without loss of quality as long as you stay with "dv stream" (the native file format of both/all three apps). I'm doing this a lot, using iM for editing and FCE "just" for "special effects", advanced titles etc.
    Hope I could be helpful

  • Sql loader - loading data into database problem

    Hi,
    I am facing a problem loading data into an Oracle DB using a control (.ctl) file and a data file.
    The table I am inserting into has the following structure:
    CREATE TABLE "ENRCO07"."TEST"
    "NAME" VARCHAR2(50 BYTE),
    "MOD_DATE" DATE DEFAULT CURRENT_TIMESTAMP NOT NULL ENABLE
    My ctl file has the following structure:
    OPTIONS (DIRECT=FALSE, ERRORS=1000)
    LOAD DATA
    APPEND
    INTO TABLE TEST
    truncate
    FIELDS TERMINATED BY ";"
    TRAILING NULLCOLS
    (
      NAME,
      MOD_DATE DATE 'YYYYMMDDHH24MISS'
    )
    I tried a lot with the MOD_DATE format, since it was showing the error that null cannot be inserted, and other errors were also encountered.
    My problem statement is:
    I want the default date to be current_timestamp in the format 'YYYYMMDDHH24MISS', or else the date that comes from the data file, to be loaded into the Oracle DB.
    I can't alter the DDL; only the ctl file can be altered to get to the solution.
    I am new to this, kindly help.
    Thanks
    Abhinav

    Thanks for the reply, but the problem remains.
    If my data file has records such as:
    abhi1;20120416151900
    abhi2;20120417151700
    abhi3;
    abhi4;20120416151900
    and the ctl as:
    OPTIONS (DIRECT=FALSE, ERRORS=1000)
    LOAD DATA
    APPEND
    INTO TABLE TEST
    truncate
    FIELDS TERMINATED BY ";"
    TRAILING NULLCOLS
    (
      USER_NAME CHAR NULLIF (USER_NAME=BLANKS),
      MOD_DATE DATE 'YYYYMMDDHH24MISS' NULLIF (MOD_DATE=BLANKS)
    )
    The data entered in the DB is:
    abhi1     16-APR-12
    abhi2     17-APR-12
    abhi4     16-APR-12
    The record with the missing date is not loaded.
    Thanks
