Best way to load in an Excel script for cRIO (streaming?)

Right now my program reads from an Excel file to create custom tests for the cRIO to perform. I insist on Excel as the input method since it is easy for non-programmers to use and manipulate. The Excel files are simple: the first column is a string describing an action (e.g., "Close Relay 5"), and the second column is the amount of time to wait before the next line. But these tests can range from 5 to 200 lines, and in some setups there are 10-20 tests.
Reading from Excel is slow, so if I read the Excel file while the test is executing it throws off the timing. Loading at the beginning of every test is possible but cumbersome. Loading everything into a gigantic 3-dimensional string array at the start is the best solution I've found so far, but I am not happy with it. A 20x200x2 string array is just... unnecessary, and I believe it may even be causing some performance issues and inconsistencies.
So I'm trying to get the data into a better format, but there are so many possibilities it's almost overwhelming: TDMS, LVM, Storage VIs, .dat/.csv.
Even if someone could just point me in the right direction I could take it from there, but I don't know where to start. I also have access to LabVIEW Self-Paced Training modules for just about everything offered, so that could be a resource if I knew where to look. I need to just take the classes, if only I could find the time.

It would help to understand your setup a bit more. Obviously a cRIO running standalone can't read an Excel file, so what are you actually doing? Is the cRIO running standalone, or is it just being used for networked data acquisition while the application itself runs on the same computer as Excel? If the cRIO is standalone, how are you transferring the tests to it?
How long does each test take to execute?
You said that reading from Excel is "slow", but it's not that slow. If you already have the Excel file open, I can't imagine it takes more than a second to read two cells, and I'd expect it to take much less. If the speed really is an issue, can you read the next test while the current one is still executing?
Are you saving the Excel data as a text document? Instead of loading an entire text file as a string (or string array), you could read it line by line as you need it. That, too, would be fairly quick. Or you could read line by line and parse each line, as you read it, into a more computation-friendly format; a rough sketch of that parsing step follows below.
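A quick sketch of that pre-parse step, written in Java only to show the shape of the data (the same logic maps onto LabVIEW's file and string functions). The file name test1.csv, the two-column CSV layout, and an integer millisecond wait value are assumptions based on the description above; the record type needs Java 16+:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class TestStepLoader {
    // One parsed step: the action text and the delay (ms) to wait before the next step.
    record Step(String action, long delayMs) {}

    public static void main(String[] args) throws IOException {
        List<Step> steps = new ArrayList<>();
        // "test1.csv" stands in for one Excel sheet saved as CSV.
        for (String line : Files.readAllLines(Paths.get("test1.csv"))) {
            if (line.isBlank()) continue;
            String[] cols = line.split(",", 2);  // column 1: action, column 2: wait time
            steps.add(new Step(cols[0].trim(), Long.parseLong(cols[1].trim())));
        }
        // The whole test is now a flat list of typed steps, parsed once up front,
        // so nothing slow happens inside the timed execution loop.
        for (Step s : steps) {
            System.out.println(s.action() + " -> wait " + s.delayMs() + " ms");
        }
    }
}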

Similar Messages

  • Best way to do an Excel data file load

    Hi
    I need to load an Excel file's data into an Oracle table (on a frequent basis). While loading it, I need to validate each column's contents (using PL/SQL code). Are there any packages/procedures/APIs provided by Oracle to do this kind of activity? What would be the best way to do an Excel file load within Oracle? FYI, I have Visual Basic code that reads data from the Excel file and loads it into a temporary Oracle table, and I then validate the data in that temporary table using PL/SQL code in a stored procedure. I am trying to avoid the front-end VB part of this process and want to do the whole thing within Oracle itself. Please let me know if you have any ideas.
    Your help is greatly appreciated!!
    Thanks in advance,
    Ram

    If you are running on Windows, you could try COM Automation, which would mean moving your VB process into a stored procedure. I've never tried this myself, having been quite satisfied with Heterogeneous Connectivity.
    Tak

  • Best way to load initial TimesTen database

    I have a customer who wants to use TimesTen as a pure in-memory database. This IMDB has about 65 tables, some with upwards of 6 million rows. What is the best way to load this data? There is no Cache Connect option being used. I am thinking INSERT is the only option here. Are there any other options?
    Thanks

    You can also use the TimesTen ttbulkcp command-line utility; this tool is similar to SQL*Loader except that it handles both import and export of data.
    For example, the following command loads the rows listed in file foo.dump into a table called foo in database mydb, placing any error messages into the file foo.err.
    ttbulkcp -i -e foo.err dsn=mydb foo foo.dump
    For more information on the ttbulkcp utility you can refer to the Oracle TimesTen API Reference Guide.

  • Best way to load configuration

    Hi All
    I've developed an application and I want to give my client the possibility to change some configuration data (paths, language, etc.).
    What is the best way to store this kind of data, and what is the best way to load it?
    I don't want a simple book or manual reference. I'd appreciate words of experienced wisdom. :)
    Thanks in advance
    <jl>

    It used to be done through java.util.Properties, but I've found the java.util.prefs.Preferences classes to be almost too easy to use - pretty sweet stuff! (And that should be enough of a hint for you right there... a minimal sketch is included below.)

    I'll read up on the Preferences tips in the API reference.
    Thanks.
    <jl>
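    For reference, here is what that Preferences approach might look like; the node path, keys, and default values are made up purely for illustration:

    import java.util.prefs.Preferences;

    public class ConfigDemo {
        public static void main(String[] args) {
            // The node path is arbitrary; by convention it mirrors your package name.
            Preferences prefs = Preferences.userRoot().node("com/example/myapp");

            // Read settings, falling back to defaults on the first run.
            String exportPath = prefs.get("export.path", System.getProperty("user.home"));
            String language = prefs.get("language", "en");
            System.out.println("export.path=" + exportPath + ", language=" + language);

            // Write settings; the JVM persists them to a platform-specific backing
            // store (the registry on Windows, files elsewhere), so there is no file
            // handling code to write yourself.
            prefs.put("export.path", "/data/exports");
            prefs.put("language", "es");
        }
    }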

  • Best way to load data from an external data source to the ADC

    Hi Guy,
    I am new to BAM. I am currently working on an existing system that keeps all of its data in its own database, to which I only have view access.
    So I am just wondering: what is the best way to load the data so that it populates the Active Data Cache?
    Thanks.

    Hello,
    Use an External Data Source (EDS). For reference: http://docs.oracle.com/cd/E14571_01/integration.1111/e10224/bam_extl_data_sources.htm
    Regards
    Siva Sankar

  • What's The Best Way to Load a Replacement iPod Touch?

    I just received a replacement iPod Touch 3G for my daughter. Her iTunes library for the Touch that we returned to Apple is on her MacBook Pro. What is the best way to load the new Touch? She plans on using the same name for it. Should I restore from a previous backup or just plug it in and let it sync? Do I need to set anything in iTunes for disk management?
    Thank you,
    Bruce

    If she wants it set up like her old one, restore from the previous backup; if there is anything she wants to change, though, now would be a perfect time to just resync it and start over. It really depends on what she wants.

  • What's the best way to load FieldPoint measurement data into PI System?

    I am looking for the best way to load data collected by NI FieldPoint (FP2220) into the PI system of our power plant.
    I found pieces of information about a FieldPoint OPC server on NI.com. I am not sure whether it comes with the FieldPoint hardware, is sold by NI as a separate product, or is actually a non-standard NI product. In any case, I know that there exists a thing called the FieldPoint OPC server.
    The PI system I mentioned has OPC client software called the PI OPC interface. It is able to communicate with a standard OPC DA server. If that FieldPoint OPC server is a standard OPC DA server that provides the data collected by FieldPoint in compliance with the OPC standard, then that's perfect.
    Anyone familiar with the PI system and NI products, please advise whether the above will work, or whether there is a better way to get FieldPoint data into PI.

    Hi Eric,
    This information is really helpful, thanks. Regarding the NI OPC server for NI FieldPoint, I have another query.
    In my setup there are two FieldPoint sets located in two different locations on my Ethernet network. They are going to be controlled by a single PC. If I connect both FieldPoint sets via the OPC standard, how many NI OPC servers for FieldPoint do I need to connect to? Are there two NI OPC servers, each serving one FieldPoint set? Or is there only one NI OPC server, serving both FieldPoint sets?
    I am concerned about the number of NI OPC server instances running, because the number of OPC client licenses I need to purchase depends on how many OPC servers I need to connect to. If one NI OPC server serves both of my FieldPoint sets, I only need to buy one OPC client license; otherwise, I need to purchase two. In the future I am going to add another two FieldPoint sets, so the answer to my query determines how many OPC client licenses I eventually need to purchase: one or four. A huge price difference.
    Looking forward to your reply.
    Regards,
    Roger

  • Best way to load a local XML

    What's the best way to load a local XML file into a Datagrid
    without using the HTTP service?
    Thanks

    Use Class.getResource() and Class.getResourceAsStream(); when you eventually jar it up, just put the same hierarchy in the jar and the resources can still be found. For example (where MyClass stands for any class in your application):
    BufferedInputStream in = new BufferedInputStream(MyClass.class.getResourceAsStream("/foo/bar/yimg.jpg"));
    That should work out of the jar too.

    Thanks very much - I got it all going some days ago, but it all fell apart when I ran the code from within the jar. I had used similar ClassLoader routines and found that if you first get a URI for the resource, it comes back as a fully qualified file URI. That works fine when running directly against files on disk, but once the resources are inside a .jar it all turns to custard, giving "java.lang.IllegalArgumentException: URI is not hierarchical".
    I followed your suggestion to use Class.getResourceAsStream and it now works perfectly - thanks again!

  • Best way to load messages - properties file or database?

    Hi Guys,
    I am having a debate with my colleague about the best way to load and keep GUI messages.
    As we know, all of those messages are in a properties file, and the web tier handles loading the messages into the GUI.
    However, my colleague suggested that we could save all the messages in a database table; when the application starts up, it would load all the messages from the database.
    Please help me figure out the pros and cons of both approaches. What is the best way to deal with message loading?
    Thanks

    allemande wrote:
    Please help me figure out pros/cons for both ways?
    There is no big difference with regard to performance and memory use; both are, after all, cached in memory anyway. I wouldn't worry about it.
    The biggest difference is in maintainability and reusability. Properties files are just text-based files: you can edit the messages in any text editor. It is only a bit harder to wrap them in a UI-like tool, but you can achieve a lot with the java.util.Properties API and its load() and store() methods (a small sketch follows below). Another concern is that they cannot be reused if you switch platform language (e.g., Java to C#) without writing a custom tool to read Java-style properties files. Databases, on the other hand, require SQL knowledge or some UI tool to edit the contents, so you would have to build a UI-like tool anyway if you want to let someone with less knowledge edit the messages. That is more time consuming, but a database can be used universally by any application or platform through standard SQL.

  • What is the best way to replace inline views for better performance?

    Hi,
    I am using Oracle 9i.
    What is the best way to replace inline views for better performance? I am seeing a lot of performance problems with inline views in my queries.
    Please suggest.
    Raj

    A WITH clause plus the /*+ MATERIALIZE */ hint can do you some good here.
    See the test case below.
    SQL> create table hx_my_tbl as select level id, 'karthick' name from dual connect by level <= 5
    2 /
    Table created.
    SQL> insert into hx_my_tbl select level id, 'vimal' name from dual connect by level <= 5
    2 /
    5 rows created.
    SQL> create index hx_my_tbl_idx on hx_my_tbl(id)
    2 /
    Index created.
    SQL> commit;
    Commit complete.
    SQL> exec dbms_stats.gather_table_stats(user,'hx_my_tbl',cascade=>true)
    PL/SQL procedure successfully completed.
    Now, this is a normal inline view:
    SQL> select a.id, b.id, a.name, b.name
    2 from (select id, name from hx_my_tbl where id = 1) a,
    3 (select id, name from hx_my_tbl where id = 1) b
    4 where a.id = b.id
    5 and a.name <> b.name
    6 /
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=7 Card=2 Bytes=48)
    1 0 HASH JOIN (Cost=7 Card=2 Bytes=48)
    2 1 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    3 2 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    4 1 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    5 4 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    Now I use WITH together with the MATERIALIZE hint:
    SQL> with my_view as (select /*+ MATERIALIZE */ id, name from hx_my_tbl where id = 1)
    2 select a.id, b.id, a.name, b.name
    3 from my_view a,
    4 my_view b
    5 where a.id = b.id
    6 and a.name <> b.name
    7 /
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=8 Card=1 Bytes=46)
    1 0 TEMP TABLE TRANSFORMATION
    2 1 LOAD AS SELECT
    3 2 TABLE ACCESS (BY INDEX ROWID) OF 'HX_MY_TBL' (TABLE) (Cost=3 Card=2 Bytes=24)
    4 3 INDEX (RANGE SCAN) OF 'HX_MY_TBL_IDX' (INDEX) (Cost=1 Card=2)
    5 1 HASH JOIN (Cost=5 Card=1 Bytes=46)
    6 5 VIEW (Cost=2 Card=2 Bytes=46)
    7 6 TABLE ACCESS (FULL) OF 'SYS_TEMP_0FD9D6967_3C610F9' (TABLE (TEMP)) (Cost=2 Card=2 Bytes=24)
    8 5 VIEW (Cost=2 Card=2 Bytes=46)
    9 8 TABLE ACCESS (FULL) OF 'SYS_TEMP_0FD9D6967_3C610F9' (TABLE (TEMP)) (Cost=2 Card=2 Bytes=24)
    Here you can see that the table is accessed only once; after that, only the result set generated by the WITH clause is accessed.
    Thanks,
    Karthick.

  • Best way to start a new catalog for 2009?

    Hi,
    What's the best way to start a new catalog for the new year, with all the keywords and all the module presets from the previous 2008 catalog (without having to export all the folders)?
    Thanks,
    Dominique

    Well, there is a cost, too, in not being able to find all your images in a single step, and inconsistencies soon develop - e.g. a keyword is plural in one catalogue, singular in another. Speed or stability issues are not simply related to catalogue size, and I've seen decently performing catalogues 50% bigger than yours (as well as slow ones of a few hundred images). Have you optimized the catalogue recently?
    But if you think it's a good idea to fragment control of your picture collection... Presets will be carried over to a new catalogue, as they belong to the machine (unless you have the save-with-catalogue preference turned on). Keywords can be moved via the Metadata > Export Keywords and Import Keywords commands. If you have lots of collections and smart collections, then maybe make a copy of your existing catalogue and then remove all the items from it - making sure you don't trash them, of course.
    John

  • Hi all! What is the best way to create the correct space for baseball jersey names and numbers, along with making sure they are the right size for large printing?

    What is the best way to create the correct space for baseball jersey names and numbers, along with making sure they are the right size for large printing?

    Buying more hard drive space is a very valid option here. Editing takes up lots of room; you should never discount the idea of adding more when you need it.
    Another possibility is exporting to MXF OP1a using the AVC-I codec. It's not lossless, but it is master quality, and the file size is a LOT smaller, so it may suit your needs.

  • Best way to create an IPhone Application for my Blog

    What's the best way to create an iPhone application for my blog? I've seen several blogs that have their own application.
    Could use some help,
    Used Car parts Guy
    <Edited by Moderator>

    Thanks for this info... I too am interested in creating my own application... Would love to hear from others...
    Do you think it brings in traffic?
    Are you charging for your application, or is it free?
    Thanks,
    <Edited by Moderator>

  • Best Way to Load Data in Hash Partition

    Hi,
    I have hash partitioning on a large table of 5 TB. We have to load more than 500 GB of data into that table daily from ETL.
    What is the best way to load data into that big table with hash partitions?
    Regards
    Sahil Soni

    Do you have any specific requirements to match records against lookup tables, or is it just a straight load, that is, an insert?
    Do you have any specific performance requirements?
    The easiest and fastest way to load data into Oracle is via an external file (external table) and parallel query/parallel insert. Remember that parallel DML is not enabled by default; you have to enable it with an ALTER SESSION command. You can leverage multiple CPU cores and direct-path operations to perform the load; a rough sketch is given below.
    Assuming your database is on a Linux/Unix server, you could NFS-mount the file if it is on a remote system, but then you will most likely be limited by network transfer speed.
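    As a rough illustration only (every name, URL, and credential below is a placeholder, the Oracle JDBC driver is assumed to be on the classpath, and the same two statements could just as well be run from SQL*Plus or any other client):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ParallelLoad {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/mydb", "etl_user", "etl_password");
                 Statement stmt = conn.createStatement()) {

                // Parallel DML is off by default, so enable it for this session first.
                stmt.execute("ALTER SESSION ENABLE PARALLEL DML");

                // Direct-path (APPEND), parallel insert from an external-table staging
                // source; big_table and ext_stage are hypothetical names.
                stmt.execute("INSERT /*+ APPEND PARALLEL(t, 8) */ INTO big_table t "
                           + "SELECT * FROM ext_stage");

                conn.commit();
            }
        }
    }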

  • Best way to load "Action Essentials"

    Kinda new to using Macs... need to know the BEST way to load Action Essentials?

    Well, I was really over-thinking it! Ha.
    Thanks so much Rick!
