What is the best way to write 10 channels of data each sampled at 4kHz to file?

Hi everyone,
I have developed a VI with about 8 AI channels and 2 AO channels. The VI uses a number of parallel while loops to acquire, process, and display continuous data. All data are read at 400 points per loop iteration and synchronously sampled at 4 kHz.
My question is: which is the best way of writing the data to file - the "Write Measurement To File.vi", or the low-level "Open/Create File" and "Close File" functions? From my understanding there are limitations with both approaches, which I have outlined below.
The "Write Measurement To File.vi" is simple to use, and because it closes the file after each iteration, a crash would not necessarily lose all the data; however, closing and reopening the file every iteration consumes the processor and takes time. This may cause lags or lost data, which I absolutely do not want.
The low-level "Open/Create File" and "Close File" functions involve a bit more coding, but they do not require the file to be closed and reopened each iteration, so processor consumption is reduced and the lag associated with continuous open/close operations will not occur. However, if the program crashes while data is being acquired, ALL data in the buffer yet to be written will be lost. That is risky to me.
Does anyone have any comments or suggestions about which way I should go? At the end of the day, I want to be able to start and stop the write-to-file process from within a running while loop. To do this, can the Open/Create File and Close File functions even be used (given that they will need to be inside a while loop)?
I think I am OK with the coding; I just need some help clarifying which direction to go and the pros and cons of each.
Regards,
Jack
Attachments:
TMS [PXI] FINAL DONE.vi 338 KB

One thing you have not mentioned is how you are consuming the data after you save it.  Your solution should be compatible with whatever software you are using at both ends.
Your data rate (40 kS/s) is relatively slow.  You can achieve it using just about any format, from ASCII to raw binary to TDMS, provided you keep your file open and close operations out of the write loop.  I would recommend a producer/consumer architecture to decouple the data collection from the data writing.  This may not be necessary at the low rates you are using, but it is good practice and would enable you to scale to hardware-limited speeds.
TDMS was designed for logging and is a safe format (<fullDisclosure> I am a National Instruments employee </fullDisclosure>).  If you are worried about power failures, you should flush it after every write operation, since TDMS buffers data and writes it in larger chunks to give better performance and smaller file sizes.  This will make it slower, but that should not be an issue at your write speeds.  Make sure you read up on how and when TDMS buffers data so you can be sure your implementation does what you would like it to do.
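Purely to illustrate the producer/consumer idea outside of LabVIEW, here is a minimal text-language sketch (Python, standard library only; the DAQ read is faked). The file is opened once before the loop and flushed after every write - the same safety/speed trade-off as flushing TDMS:

```python
# Producer/consumer logging sketch: the acquisition loop only enqueues data;
# the writer owns the file, which is opened once and closed once.
import queue, struct, threading

data_q = queue.Queue(maxsize=100)   # bounded queue decouples the two loops
STOP = object()                     # sentinel to shut the consumer down

def producer(n_blocks):
    for i in range(n_blocks):
        block = [float(i)] * (400 * 10)  # fake read: 400 samples x 10 channels
        data_q.put(block)                # blocks if the writer falls behind
    data_q.put(STOP)

def consumer(path):
    with open(path, "wb") as f:          # open ONCE, outside the write loop
        while True:
            block = data_q.get()
            if block is STOP:
                break
            f.write(struct.pack(f"{len(block)}d", *block))
            f.flush()                    # like a TDMS flush: speed for safety

writer = threading.Thread(target=consumer, args=("log.bin",))
writer.start()
producer(50)
writer.join()
```

The bounded queue is the important part: if the disk hiccups, the producer blocks briefly instead of silently losing samples, and starting or stopping logging is just a matter of starting and stopping the consumer.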
If you have further questions, let us know.
This account is no longer active. Contact ShadesOfGray for current posts and information.

Similar Messages

  • What is the best way to kill/stop a data load?

    Hi.
    What is the best way to kill/stop a data load?
I have a data load from my QA R/3 system that is extracting 115,000,000+ records. The problem is that the selection in the function module used in the data source does not work, and the problem was not detected because of the nature of the data on the development system.
I could kill processes owned by my background user (on both R/3 and BW), but I risk killing other loads, and sometimes the job seems to restart if I just kill processes. If I remove transactional RFCs in SM58 the load does not terminate; I only skip one or more data packages. I have also tried changing the QM status in the monitor to red, but that does not stop the load either...
So isn't there a nice foolproof way of stopping a data load?
    Best regards,
    Christian Frier

    Hi,
There are two ways to kill the job.
One is to use transaction RSMO: locate the job, display the Status tab, and double-click the yellow light shown on the total line. A pop-up titled 'Set overall status' appears; select the desired status (red) and save it. Then return to the monitor page, select the Header tab, double-click the data target, right-click, and go to 'Manage'. There should be a request sitting there, probably with yellow lights; highlight the line with the faulty request, click the Delete button, then click the Refresh button.
The second is to go to SM37, click on the active selection, enter the job name, and click Execute. The particular job should appear; highlight the job name, then click the Stop icon on the toolbar (third from the left).
Hope it is clear.
    Regards-
    Siddhu

  • What's the best way to write freehand with InDesign?

    I have a Wacom and want to place some handwriting on my document - what's the best way to do this?

    Try the pen or pencil tools or do it in Photoshop and place it.
    Bob

  • What is the best way to write management pack modules?

I have written many modules using PowerShell scripts, but when I deploy the management pack on SCOM it throws many errors saying the PowerShell script was dropped due to a timeout.
My MP has a lot of PowerShell scripts that get data from a service. The script has to execute for each and every instance, and the MP has around 265 instances.
How can I improve the scripts?
Do I need to use some other scripting language, like JavaScript or VBScript, in the management pack to execute the different modules?
What is the best practice for writing modules?
I have even used cookdown for the multi-instance data gathering.
    Thanks & Regards, Suresh Gaddam


• What is the best way to copy and paste data from one book to another?

I have 18 sheets in 5 different books from which I want to extract data from specific cells. What is the best way to do this? Example: one sheet is called Numbers E-O1, with data in 13:WXYZ. The data updates and moves up one row every time I enter a new number. So let's say I enter the number 12. Through a lot of calculations the output goes into 13:WXYZ, into what I call a counter, which is a 4-digit number. Anyway, how can I send that 4-digit number to a totally different sheet? To bullet what I'm talking about:
data in cells row 13:WXYZ in the book called Numbers, sheet E-O1
send data to the book called "Vortex Numbers", sheet E-O, row 2001:CDEF
What formula or macro can I use to make this work?
Thank you!

    Hello Larbec,
    Syntax:
    '[BookName]SheetName'!Range
    Enter in cell  2001:CDEF:
    ='[Numbers]E-O1'!13:WXYZ
    This assumes that the file is open in Excel. Otherwise you need to add the path:
    'ThePath[BookName]SheetName'!Range
Best regards, George

• What is the best way to stack DAQ-acquired data in LabVIEW?

I'm developing an application to work with an M Series DAQ card and LabVIEW 8.5 to output a signal and then record on 8 differential inputs for a short period of time (~10 ms). I need to stack my data, however, because the incoming signal will be very, very small, even after amplification. So basically I'm running a slightly modified version of the Multi-Function Synch AI-AO.vi (included with the install of DAQmx). What is the best way for me to rerun this VI a set number of times and add new data directly to the old data (not concatenating or anything; rather |sample 1 of run 1| + |sample 1 of run 2| = stacked sample 1)?
A slightly modified version of the Multi-Function Synch AI-AO.vi is attached.
    Attachments:
des_v2_Multi-Function-Synch AI-AO.vi 143 KB

    Hi LSU,
see the attachment on how to "stack" several measurements. I simply add the waveforms and use a shift register to keep the last iteration's value.
Writing to files in each iteration is extremely CPU-consuming - especially with Express VIs. Using for loops for just one iteration is senseless. You could enable the conditional terminal of the for loop to realize your stop feature.
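    In text form, the "stacking" the shift register does is just element-wise addition across runs - roughly this (NumPy sketch, all sizes invented):

    ```python
    # Sample-wise stacking of repeated runs: each run's waveform is added
    # element-by-element to a running total (the shift register's job in G).
    import numpy as np

    n_runs, n_samples = 20, 80                # invented sizes
    total = np.zeros(n_samples)               # plays the role of the shift register

    for run in range(n_runs):
        data = np.random.normal(size=n_samples)  # stand-in for one AI read
        total += data          # stacked sample i = sum over runs of sample i

    average = total / n_runs                  # optional: averaged waveform
    ```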
    For your message 4:
Have you ever tried all the things you asked for? Sometimes it's easiest to just use trial & error.
And for the "n=n+x" question: it really helps to take the free online courses offered by NI!
    Message Edited by GerdW on 11-11-2009 06:27 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
    Attachments:
des_v2_Multi-Function-Synch AI-AO.vi 128 KB

  • What's the best way to handle all my data?

I have a black box system that connects directly to a PC and sends 60 words of data at 10 Hz (worst-case scenario). The black box continuously transmits these words, which contain a large amount of data that is continuously updated from up to 50 participants (again, worst-case scenario),
i.e. 60 words * 16 bits * 10 Hz * 50 participants = 480 kbps.  All of this is via a UDP Ethernet connection.
    I have LabVIEW reading the data without any problem. I now want to manipulate this data and then distribute it to other PCs on a network via TCP/IP.
My question is: what is the best way of storing my data locally on the interface PC so that I can then have clients request the information they require via TCP/IP? Each message that comes in via Ethernet will relate to one of the participants, so I need to be able to check whether I already have data about that participant - if I do then I can just update it, if I don't I need to create a record for the participant, and if I haven't heard from one for a while I will need to delete it. I don't want to create unnecessary network traffic. I also want to avoid global variables if possible - especially considering that I may have up to 3000 variables to play with.
I'm not after a solution, just some ideas about how to tackle this problem... I thought I could perhaps create a database and have LabVIEW update a table with the data, adding a record for each participant. Alternatively, is there a better way of storing all the data in memory besides global variables?
    Thanks in advance.

    Hi russelldav,
    one note on your data handling:
When each of the 50 participants sends the same 60 "words", you don't need 3000 global variables to store them!
You can reorganize those data into a cluster for each participant and use an array of clusters to keep all the data in one "block".
You can initialize this array at the start of the program for the maximum number of participants; there is no need to (dynamically) add or delete elements from this array...
    Edited:
    When all "words" have the same representation (I16 ?) you can make a 2D array instead of an array of cluster...
    Message Edited by GerdW on 10-26-2007 03:51 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

• What is the best way to upload icons I created in Illustrator onto a Photoshop file of a website design?

I am currently in the process of designing a website in Photoshop. I have designed some icons for the website using Illustrator. What is the best way to bring these icons into the Photoshop file and keep them looking crisp and high quality? I have tried a number of ways but all seem to come out pixelated.

What you need to do is make sure you are creating the Illustrator icons at the same ppi as the Photoshop document and at the correct size. If you have created the icons at, say, 300 ppi but the website document has been set up at 72 ppi, they won't carry the same properties and will distort.
Once you have mastered that, then if you think there may be changes to the icons later down the line, you are best off creating .ai files for each icon and placing them into the Photoshop document; if you then make any changes to the source .ai file, it will update within your website design.
If you don't foresee the icons changing, you can always just copy and paste directly from Illustrator to Photoshop and make any changes on the fly, making sure you keep it as a smart object.

  • What is the best way to load and convert data from a flat file?

    Hi,
I want to load data from a flat file and convert dates, numbers, and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
The rows where all to_number, to_date, and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises an exception) should go into table STG_ERR.
What is the best and easiest way to achieve this?
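    In pseudocode, the routing I am after is essentially try-convert, catch, divert (a Python sketch with invented column names, just to make the requirement concrete):

    ```python
    # Rows whose conversions all succeed go to STG_OK; any failure
    # diverts the whole row to STG_ERR.
    from datetime import datetime

    stg_ok, stg_err = [], []           # stand-ins for the two staging tables

    def convert(row):                  # invented columns: amount, created, flag
        return {
            "amount": float(row["amount"]),                            # to_number
            "created": datetime.strptime(row["created"], "%Y-%m-%d"),  # to_date
            "flag": {"0": "N", "1": "Y"}[row["flag"]],                 # 0,1 -> N,Y
        }

    for row in [{"amount": "1.5", "created": "2024-01-31", "flag": "1"},
                {"amount": "oops", "created": "2024-01-31", "flag": "0"}]:
        try:
            stg_ok.append(convert(row))
        except (ValueError, KeyError):   # any failed conversion
            stg_err.append(row)
    ```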
    Thanks,
    Carsten.

    Hi,
    thanks for your answers so far!
    I gave them a thought and came up with two different alternatives:
    Alternative 1
    I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
    The columns of the staging table have the target format (date, number).
    The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
    Alternative 2
    The columns of the staging table are all of type varchar2 regardless of the target format.
    I define data rules for all columns that require a later conversion.
    I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
    The rows that cannot be loaded go automatically into the error table.
    When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
    What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
    Further, I would prefer using expressions in the mapping for converting the data.
    What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format, but that's another mapping and a lot of I/O.
    As far as I know I need the data quality option for using data rules, is that true?
    Is there another alternative without any of these drawbacks?
    Otherwise I think I will go for alternative 1.
    Thanks,
    Carsten.

  • What's the best way to load FieldPoint measurement data into PI System?

I am looking for the best way to load data collected by NI FieldPoint (FP2220) into the PI system of our power plant.
I found pieces of information about the FieldPoint OPC server on NI.com. I am not sure whether it comes with FieldPoint hardware, is sold by NI as a separate product, or is actually a non-standard NI product. Anyway, I know that there exists a thing called the FieldPoint OPC server.
The PI system I mentioned has OPC client software called the PI-OPC interface. It is able to communicate with a standard OPC DA server. If the FieldPoint OPC server is a standard OPC DA server providing data collected by FieldPoint in compliance with the OPC standard, then that's perfect.
Anyone familiar with the PI system and NI products, please help confirm whether the above will work or whether there is a better way to put FieldPoint data into PI.

    Hi Eric,
This information is really helpful, thanks. Regarding the NI OPC server for NI FieldPoint, I have another query.
In my setup, there are two sets of FieldPoint hardware located in two different locations on my Ethernet network. They are going to be controlled by a single PC. If I connect both my FieldPoint sets via the OPC standard, how many NI OPC servers for FieldPoint do I need to connect to? Are there two NI OPC servers, each serving one FieldPoint set? Or is there only one NI OPC server, which serves both FieldPoint sets?
I am concerned about the number of NI OPC server instances running, because the number of OPC client licenses I need to purchase depends on how many OPC servers I need to connect to. If one NI OPC server serves both my FieldPoint sets, I only need to buy one OPC client license; otherwise, I need to purchase two. In the future, I am going to have another two sets of FieldPoint hardware, so the answer to my query determines how many OPC clients I need to purchase eventually - one or four. A huge price difference.
    Looking forward to your reply.
    Regards,
    Roger

  • What is the best way to put LabVIEW DSC data into an Oracle database?

I have been collecting data using LabVIEW DSC 7.0 for several years and have always accessed the data from the Citadel database via the Historical Data Viewer.  I would now like to begin putting this data into an Oracle database.  My company stores all its data in Oracle, and this would give me all the benefits of the existing infrastructure, such as automated backups, data-mining tools, etc.
My initial thought is to use "Read Trace.vi" in LabVIEW to pull historical data from the Citadel database at regular intervals (e.g. 1 minute) and insert this data into Oracle via ODBC.  This way, I do not need to track value changes in order to know when to write to Oracle.  I also considered replicating the Citadel database using some other method, but I recall that the tables used by Citadel are somewhat complicated.  I only need a simple table with columns for channel, timestamp, and data, and "Read Trace.vi" will provide the data in this format.
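    To make the shape of that loop concrete, here is roughly what I have in mind (a Python sketch; sqlite3 stands in for Oracle, and read_trace fakes "Read Trace.vi"):

    ```python
    # Poll historical data at an interval and insert (channel, timestamp,
    # value) rows. sqlite3 is only a stand-in for the real Oracle target.
    import sqlite3, time

    def read_trace(since):             # fake stand-in for "Read Trace.vi"
        return [("chan1", since + 30.0, 42.0)]

    db = sqlite3.connect("history.db")
    db.execute("CREATE TABLE IF NOT EXISTS history"
               "(channel TEXT, ts REAL, value REAL)")

    last = time.time() - 60.0
    for _ in range(3):                 # in practice: loop with a 1-minute wait
        db.executemany("INSERT INTO history VALUES (?, ?, ?)", read_trace(last))
        db.commit()
        last = time.time()
        time.sleep(1)                  # shortened from 60 s for the sketch
    db.close()
    ```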
I do not need to update the Oracle database in real time; a few minutes' delay is acceptable. If anyone has a better idea or additional insight, please let me know. Thanks.

In terms of connectivity, you want to use ADO, not ODBC. Beyond that, it all depends on the structure of the data and what you are going to want to do with it. This is a very big question for which you should get some in-depth assistance.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • What is the best way to extract/consume HR data to and from Peoplesoft?

I have to build an integration between PeopleSoft and Oracle ERP to send HR data from PeopleSoft to Oracle HRMS. I will use Oracle Fusion Middleware to build the interface. I don't have much experience working with PeopleSoft tools such as PeopleTools, App Engine, Integration Broker, Component Interface, and the Oracle adapter for PeopleSoft. Can somebody please let me know the most popular, recommended, or best way to extract HR data from PeopleSoft?
The second interface will consume some HR data from Oracle. Once again, I need to know the different ways to insert data into PeopleSoft.
I would appreciate it if you could share your experience with the above two scenarios.

If you plan on buying an Apple laptop you have two options. You can use Time Machine, or pull the data off directly and put it onto the external drive. I advise the second choice: this way you do not have stuff you don't need filling up the space on your new drive. Get your applications, music, documents, photos, videos, and perhaps downloads and desktop; put all of these on your external. Then reinstall OS X onto your drive.
*** If buying a Windows machine next, check the format of the drive in Disk Utility. You will need a program called Paragon NTFS to format the drive for Windows compatibility. You have to pay for the full version, but you can download a trial version for free.
http://www.paragon-software.com/home/ntfs-mac/
Did I leave anything unanswered?

  • What is the best way to write a math book?

Should I use a graphics tablet, such as a Wacom or a smartpen, or should I type it using a program like MathType?
Thanks for the replies.

Should I type it using a program like MathType?
    Yes.
    http://m10lmac.blogspot.com/2008/12/typing-equations-and-formulas.html
But for serious math publications, I think LaTeX is often used.
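    For example, a minimal LaTeX document with one displayed formula:

    ```latex
    \documentclass{book}
    \begin{document}
    The roots of $ax^2 + bx + c = 0$ are
    \[ x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}. \]
    \end{document}
    ```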

  • What is the best way to back up an external drive with all my referenced files?

I just backed everything up using Carbon Copy Cloner, but when I launch Aperture none of the links work, even when I try to relink.
This is really frustrating.
    m

    Tell us what happened, exactly, when you went through the Reconnect dialogue in Aperture when referenced images were not found.  I have done it successfully numerous times.
    Ernie

• When traveling abroad, what's the best way to go about voice and data services?

I will be traveling for 5 weeks in Turkey and I would like to be able to use my iPhone to make emergency calls and maybe get 3G coverage for navigation and information purposes.
I understand that service fees in this kind of situation can be astronomical, but does anybody know if one can sign up for local service, since the iPhone recognizes foreign networks and links to them when in another country?

    Thank you all for your help.
Unfortunately my iPhone is locked to Telcel, and I just got a special voice plan for Turkey from them at about $4 USD per minute, which, while expensive, is about half of what the charge would be without it.
But there is no data deal, so I think I will be limited to WiFi only.
