Reducing the data rate written to an LVM file

I have a LabVIEW 7.1 question regarding data logging. I have designed a system that reads the pressure from six 4-20 mA pressure transducers. This data is then filtered to remove noise and displayed to the user on the computer's screen.
I am currently reading the data at 1000 Hz using the DAQ Assistant, which seems to give a clear picture of the pressure changes. My supervisor has asked me to allow the system to log the results for what may be 3 or 4 days. As you can imagine, 48 hours at 1000 samples a second is a lot of data! I need some method to reduce the data rate written to the file. Ideally, I would like to take one second of data, find the highest reading, and then write a single entry with a timestamp, at an interval of 1 second. So, for each hour, there would be 3600 rows with 7 columns (1 timestamp and 6 sensors). It would be nice if the 1-second interval were adjustable, so that if long-term logging is to be done (a week, for example), the interval can be changed to 5 seconds or whatever.
I have tried reducing the data read rate as much as possible, but it seems that 100 Hz is the lowest possible. That is still too high, as the hard drive would fill up in no time.
Any help in this area would be appreciated. I have tried a few things, all without success. I have included a copy of the code for anyone to review.
Hardware:
- 1 X P3 Laptop running LabVIEW 7.1 data acquisition software
- 1 X NI DAQCard-6062E interface PC Card
- 1 X NI SC-2345 data acquisition hardware with
- 3 X SCC-CI20 current input module
- 6 X Omega PX 0-600PSIA 4-20mA pressure transducer
thanks so much!
Andrew
Attachments:
Testing.vi ‏812 KB

You should first talk with your supervisor to determine what he intends to do with the log data and what degree of resolution he actually needs. You probably want the 1000 Hz sampling rate so you can do decent filtering on the signal (hopefully your pressures aren't actually changing faster than that). I'm assuming you are returning a single filtered result from those 1000 readings for each sensor.

Specify some file logging duration of n seconds (or minutes, or whatever). Between file writes, pop each filtered measurement into an array (either one 2D array with six columns or six 1D arrays). After n seconds have passed, determine the min, average, and max values from each sensor's array and log those with your timestamp. So if you set your log timer to 1 minute, you would log a single min, max, and average of 60 readings for each of the 6 sensors (this only requires one row with, say, a timestamp and 3 x 6 = 18 columns for the sensors' min, max, and average data). After a 24-hour period you will have logged 1440 rows of data; in 3 days, only 4320 rows.

All of that is as easy as using a timer and a case structure around your logging function, triggered every n seconds. Everything else you're doing stays the same. None of this really has much to do with LabVIEW; it is more a logical explanation of how and when to acquire and log your data. What method are you using for storing your data: CSV, binary, etc.? If you also want to display the data in a chart, I would recommend charting the same data you're logging; otherwise your chart will probably crash your system at 1000 samples/second over 60 hours.

Once again, it depends on how your supervisor is analyzing the logged data. Make your log duration programmable and change it until he (she?) is happy. If he's using Excel, your minimum log interval would be about 9 seconds (that equates to ~6.67 logs per minute, ~400 logs per hour, ~9600 logs per day, for a total of 28,800 rows over 3 days; older versions of Excel are limited to 65,536 rows per sheet and 32,000 points per chart series).
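A minimal sketch of the min/avg/max decimation described above, written in Python rather than LabVIEW purely to illustrate the bookkeeping; `read_filtered_sample`, the column names, and the CSV layout are hypothetical stand-ins:

```python
import csv
import random
import time

NUM_SENSORS = 6

def read_filtered_sample():
    """Hypothetical stand-in for one filtered reading per sensor."""
    return [random.uniform(0.0, 600.0) for _ in range(NUM_SENSORS)]

def summarize(buf):
    """Min, average, and max of one sensor's buffered readings."""
    return (min(buf), sum(buf) / len(buf), max(buf))

def log_min_avg_max(path, duration_s, interval_s=60.0):
    """Accumulate readings and write one summary row per interval."""
    buffers = [[] for _ in range(NUM_SENSORS)]
    end = time.monotonic() + duration_s
    next_flush = time.monotonic() + interval_s
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        # One timestamp column plus min/avg/max per sensor.
        w.writerow(["timestamp"] + [f"s{i}_{stat}"
                                    for i in range(NUM_SENSORS)
                                    for stat in ("min", "avg", "max")])
        while time.monotonic() < end:
            sample = read_filtered_sample()
            for buf, value in zip(buffers, sample):
                buf.append(value)
            if time.monotonic() >= next_flush:
                row = [time.strftime("%Y-%m-%d %H:%M:%S")]
                for buf in buffers:
                    row.extend(summarize(buf))
                    buf.clear()
                w.writerow(row)
                next_flush += interval_s
```

At a 1-minute interval this yields 1440 rows per day; exposing `interval_s` as a front-panel control gives the adjustable logging period the original question asks for.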

Similar Messages

  • LTE oversaturated? 3G data rates on LTE in Charlotte?

    For the last couple of days I have noticed extremely slow LTE rates that are often even slower than 3G in some cases, particularly in the uptown area, but also in the North Charlotte area where I live. I have confirmed with multiple people who also have Thunderbolts that they are experiencing the same drastically reduced data rates; however, I don't think this is phone-specific. Just last week I could pull down at least 5-10 Mbps and send back even more. The rates I've been seeing recently are <=0.1 Mbps down and about the same up. That's absurd. I can't even stream music at low quality at that speed.
    What's going on Verizon? It's formal announcement time to keep your customers informed.

    Well, I may have figured it out partially. Something has happened to Verizon's coverage in various parts of uptown Charlotte; it's practically nonexistent. I can walk up and down College St and find that some areas have next to no bandwidth whatsoever while others are full strength. This has absolutely nothing to do with the buildings, as I have been all over uptown in the year I have owned this phone and never had a problem getting full LTE anywhere uptown. I would guess that the towers in the area have been re-prioritized, or perhaps Verizon is no longer leasing nearly as much bandwidth in certain areas? Whatever the reason, I have been reduced to practically zero coverage and now have ~1X data speeds unless I walk a block or two away. Awesome.

  • H.264 data rate difference between local and distributed encode

    Doing some testing today and noticed this curious behavior with compressor 3.5.1:
    Set video encode for a 50-second 1080i uncompressed file to H.264 at 50,000 kbps (50 Mbps). Audio is 24-bit, 48 kHz stereo with no compression.
    1. Sending the job to "This computer" for a local-only encode gives me a file size of 327 MB
    2. Sending the same job to the same computer using distributed encoding gives me a file size of 211 MB
    The data rate for the distributed encoded file is about 34 Mbps, which isn't what was specified in the encode settings in Compressor. The data rate for the locally encoded file is about 52 Mbps, which is what I would expect once the audio is taken into account.
    Why the difference? Both encoded files are the same duration, so there aren't any missing frames. They both look good and play well but this is giving me cause for pause.
    Also, another curious difference I noticed... The distributed file has the More Info fields populated with useful information in the Finder (dimensions, codecs, duration, channel count, total bit rate). The locally encoded file does not have this information.
    -Matt

    Hi,
    Not sure if this can be corrected with customizing. Please see SAP Note 1065932 for background on currencies in asset accounting. Some explanations of differences in currencies can be found there.
    Regards,
    Andre

  • What is my actual data rate of FLV?

    I need to be able to confirm the data rate of my final FLV file. I am using the Sorenson Squeeze VP6 codec and have my video data rate set at 150 kbps. When I play it back using QuickTime or an SWF player (on OS X) it says my data rate is 1400 kbps. But when I play it back on Windows XP using FLV Player from Applian and check the properties, it says that I am at the correct bit rate. What gives? How do I double-check that Sorenson has encoded my media at the correct bit rate?
    My basic settings in Sorenson are 2-pass VBR at 150 for video and 48 for audio.
    Thanks sooo much for your help!

    OK, I've figured out the physics of this question.
    The Cache Read data rate is always larger than the Cache Write data rate, because the computer would have to be rendering to cache faster than real time for the write rate to be higher, which would make it unnecessary to render to cache in the first place. So I'm really only worried about the Cache Read data rate. Does Adobe have a paper that tells us what the data rate is for different sequences?
    my 3 common workflows are
    Canon H.264 1080 24p
    AVCHD 1080 24p from my GH2 with a 44mb
    and
    R3D 5K Epic footage 24p (this is painful to edit)
    anyone know where this info is?
    thx,
    Jayson
    youtube.com/AWDEfilms

  • How to increase the rate at which data is written to a file?

    I have a program that reads data at 200 kHz and I would like the program to write the data at the same rate. However, the fastest I seem to be able to get my program to write is about 5 kHz. I have been using the DAQmx Read and Format Into File functions to read and write the data.
    I have tried moving the write function into a separate loop, so that the data would be written to a file after the data collection was complete. However, this did not change the rate at which data was written. Any suggestions for increasing the rate at which data is being written to a file? Thanks for looking over my problem!
    Attachments:
    SampleWrite_Read.vi ‏58 KB

    Well, writing to a file is always slower, since it takes some time to access the hard drive. I noticed in your program that you are writing to an ASCII file. That is also slower than writing to a binary file. There are several examples that ship with LabVIEW that do high-speed datalogging (I believe "High-Speed Datalogging" is actually the name of the examples). Those examples come in pairs: one does the datalogging, and another helps you read the file back. I recommend taking a look at them.
    The previous suggestion by Les Hammer is a great idea: instead of acquiring one sample at a time, try acquiring 100 or 1000 samples and writing them to the file in one call.
    I hope that helps!
    GValdes
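The contrast in the reply above (per-sample ASCII formatting versus chunked binary writes) can be sketched outside LabVIEW as well. This Python version is only an illustration of the pattern; the function names are made up:

```python
import struct

def write_ascii_per_sample(path, samples):
    # The slow pattern: format and write one value per call.
    with open(path, "w") as f:
        for s in samples:
            f.write(f"{s:.6f}\n")

def write_binary_chunked(path, samples, chunk=1000):
    # The fast pattern: pack many float64 samples and write each
    # block with a single call, cutting per-write overhead.
    with open(path, "wb") as f:
        for i in range(0, len(samples), chunk):
            block = samples[i:i + chunk]
            f.write(struct.pack(f"{len(block)}d", *block))

def read_binary(path):
    # Read the binary log back as a list of float64 values.
    with open(path, "rb") as f:
        data = f.read()
    return list(struct.unpack(f"{len(data) // 8}d", data))
```

The binary file is also smaller and loses no precision, at the cost of needing a matching reader like `read_binary` above.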

  • Problems displaying data read from a .lvm file

    Hi all.
    I acquire data with the PCMCIA card 6036E. I acquire online in LabVIEW 7 and store the data in a .lvm file. When I try to display the same data I acquired before with the "Read LVM File" Express VI, the waveform chart redraws itself after an undefined time; sometimes it redraws faster and sometimes it takes longer. Why? I have only one header per acquisition, and if I restart the acquisition a new file is written! I have tried almost everything. Is this a bug in LabVIEW 7?
    I really appreciate your help.
    best regards,
    Bernd

    Hi Khalid,
    Here is a simplified version of my VI and also two .lvm files from the logged data. Sorry for the size, but I acquire with a sample rate of 20 kS/s and my main frequency is only 0.3 Hz. In the 6 MB file I acquired a little more than 3 periods. When you run the VI with this file, at the beginning it redraws very often and very fast; then after some time it draws about 1 period, and then it redraws again, and so on. I want the whole data from this file to be displayed at once. The 900 kB file was acquired at 10 kS/s and covers about 1.5 periods. This is the largest size at which the display still works. Do you think it is possible that the VI only works up to 1 MB? But my data is usually much bigger than that. I hope you can use the data; it is zipped, because otherwise it wouldn't have been possible to post it.
    Thank you very much for your help, I really appreciate it!
    Best regards,
    Bernd
    Attachments:
    simplified_VI_for_offline_data_display.vi ‏79 KB
    2004-07-01_Messung9.zip ‏191 KB
    2004-07-01_Messung4.zip ‏680 KB

  • How do I read my waveform data back from an LVM file?

    I collected a waveform and saved it using the LVM file format. I would like to read the waveform back into LabVIEW and display it with its timestamp and assigned name. The only way I could read the data back into LabVIEW was to convert it to a number array. I figured that if you can write a waveform and save all its data, you should be able to read it back rather easily. I've included my LVM file and two simple programs. The program I'm using is much larger, but these two programs are representative of what I'm trying to accomplish.
    Thanks
    Solved!
    Attachments:
    Write To LVM.vi ‏96 KB
    Read From LVM.vi ‏67 KB
    Test Data_08-12-11_1252.txt ‏29 KB

    Hi Knoebel,
    To display the waveform data, you'll need to change a couple things.
    1. Open the "Convert from Dynamic Data" VI and change the conversion to have a resulting data type of 1D array of waveform, as this is the data type you are writing with "Write To LVM.vi". Currently you are converting to an array of scalars here, which is why you are losing the timestamp data.
    2. Open the "Read from Measurement File" Express VI and change the Time Stamps setting to Absolute (date and time) rather than relative.
    3. If you want to display the timestamp on the waveform graph, pull up the properties window for the waveform graph, change the Display Format of the X-axis to Absolute Time, and then check the Scales tab to be sure you have unchecked "Ignore waveform timestamp on x-axes".
    After making the change from dynamic data to a 1D array of waveform, you can also probe the wire going into the waveform graph to check the t0 and dt values of the waveform as you read from the file.
    Lastly, if you look at the Write To LVM.vi block diagram, you will see a little red coercion dot between the waveform data wire and the Write To Measurement File data input terminal. It would be better to use the "Convert to Dynamic Data" Express VI to make this conversion.
    Hope this helps!
    Sherrie 

  • How do you fix the error message "The data rate for this file is too high for DVD. You must replace this file with one of a lower data rate"?

    When trying to burn a DVD, it goes through the encoding step, and at 98% we see the message "The data rate for this file is too high for DVD. You must replace this file with one of a lower data rate." We need help correcting this so we can finish burning the DVD.

    What did you export from Premiere?
    Did you use the MPEG2-DVD preset... and did you make any changes to the preset?
    CS5-thru-CC PPro/Encore tutorial list http://forums.adobe.com/thread/1448923 may help

  • Error -20315 appears on data saved in .lvm files

    Dear all,
    I use an NI 9205 as the data acquisition board and the DAQ Assistant in LabVIEW to get the signals. The sampling frequency was set to 250 Hz, which means dt should equal 0.004. The data were saved to .lvm files using Write to File. When I read the data back, error -20315 shows up. I found that in the saved files dt is always 0; opening a small .lvm file in Notepad, I see Delta_X 0.000000. The data files I have already acquired are huge, which means I cannot change Delta_X manually. Do you have any good ideas on how to handle these files in LabVIEW for signal processing? So far I can use almost nothing because of the error.
    Thank you very much in advance.
    Kiwi

    Based on your description I have created a VI and attached the log file. I haven't encountered the -20315 error.
    Please run this on your computer and let me know how it goes.
    Thanks!
    Dennis Morini
    Field Sales Engineer
    National Instruments Denmark
    http://www.ni.com/ask
    Attachments:
    Datalog.vi ‏187 KB
    test.zip ‏109 KB
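If the files cannot be re-acquired, one possible workaround is to patch the Delta_X header line in each .lvm file with a small script. This Python sketch assumes the header is tab-delimited text with one Delta_X value per channel, as the question describes; verify it against one of your own files before running it on everything:

```python
def patch_delta_x(in_path, out_path, dt):
    """Rewrite every value on the Delta_X header line of an .lvm file.

    Assumes the tab-delimited .lvm text layout with per-channel
    Delta_X values (all currently 0.000000, per the question).
    """
    with open(in_path, "r") as src, open(out_path, "w") as dst:
        for line in src:
            if line.startswith("Delta_X"):
                fields = line.rstrip("\n").split("\t")
                # Replace each per-channel value with the correct dt.
                fields[1:] = [f"{dt:.6f}"] * (len(fields) - 1)
                line = "\t".join(fields) + "\n"
            dst.write(line)
```

Since 250 Hz sampling gives dt = 1/250 = 0.004 s, `patch_delta_x("in.lvm", "out.lvm", 0.004)` would restore the correct increment.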

  • PGC has an error--data rate of this file is too high for DVD

    Getting one of those seemingly elusive PGC errors, though mine seems to be different from many of the ones listed here. Mine is telling me that the data rate of my file is too high for DVD. Only problem is, the file it says has too high a data rate is a slideshow which Encore built from imported JPEG files. I got the message, tried going into the slideshow and deleting the photo at the particular spot in the timeline where it said it had the problem, and am now getting the same message again at a different timecode spot in the same slideshow. The pictures are fairly big, but I assumed that Encore would automatically resize them to fit an NTSC DVD timeline. Do I need to open all the pictures in Photoshop and scale them down to 720x480 before I begin with the slideshows?

    With those efforts, regarding the RAM, it would *seem* that physical memory was not the problem.
    I'd look to how Windows is managing both the RAM addresses and also its Virtual Memory. To the former, I've seen programs/Processes that lock certain memory addresses upon launch (may be in startup), and do not report this to Windows accurately. Along those lines, you might want to use Task Manager to see what Processes are running from startup on your machine. I'll bet that you've got some that are not necessary, even if IT is doing a good job with the system setup. One can use MSCONFIG to do a trial of the system, without some of these.
    I also use a little program, EndItAll2, for eliminating all non-necessary programs and Processes when doing editing. It's freeware, has a tiny footprint, and usually does a perfect job of surveying your running programs and Processes to shut them down. You can also modify its list, in case it wants to shut down something that IS necessary. I always Exit from my AV, spyware, popup-blocker, etc., as these programs will lie to EndItAll2 and say that they ARE necessary, as part of their job. Just close 'em out in the Tasktray, then run EndItAll2. Obviously, you'll need to do this with the approval of IT, but NLE machines need all available resources.
    Now, to the Virtual Memory. It is possible that Windows is not doing a good job of managing a dynamic Page File. Usually, it does, but many find there is greater stability with a fixed size at about 1.5 to 2.5x the physical RAM. I use the upper end with great results. A static Page File also makes defragmenting the HDD a bit easier too. I also have my Page File split over two physical HDD's. Some find locating to, say D:\ works best. For whatever reason, my XP-Pro SP3 demanded that I have it on C:\, or split between C:\ and D:\. Same OS on my 3 HDD laptop was cool having it on D:\ only. Go figure.
    These are just some thoughts.
    Glad that you got part of it solved and good luck with the next part. Since this seems to affect both PrPro and En, sounds system related.
    Hunt
    PS Some IT techs love to add all sorts of monitors to the computers, especially if networked. These are not usually bad, but are usually out of the mainstream, in that most users will never have most of these. You might want to ask about any monitors. Also, are you the only person with an NLE computer under the IT department? In major business offices, this often happens. Most IT folk do not have much, if any, experience with graphics or NLE workstations. They spend their days servicing database, word processing, and spreadsheet boxes.

  • The data rate of this file is too high encore

    Hi,
    I'm new to all this, and I need a real quick solution as this needs to be done in the next few days. Please help.
    I get a "the data rate of this file is too high for DVD" message in Encore when I try to burn to disc.
    I assume this refers to the bitrate I used when exporting from Premiere? I only have one file to go on the DVD; what bitrate should I use?
    Thanks

    What did you export from Premiere?
    Did you use the MPEG2-DVD preset... and did you make any changes to the preset?
    CS5-thru-CC PPro/Encore tutorial list http://forums.adobe.com/thread/1448923 may help

  • Getting info about a Flash file's data rate (and more) on Mac

    In QuickTime one can get all sorts of info like frame rate, data rate, codecs and more, but Flash players don't do this. Right? I see there is special software for this on a PC, but what about the Mac? Is there any way to get this info for files I didn't make myself?

    I am referring to video files that I am making out of .mov files. I am trying to find the best compression options, but to do that it would be nice to study others' files.

  • Files failing to adhere to set data rate on export from Quicktime

    I am hoping somebody can shed some light on this.
    I open an uncompressed .mov file (output from Final Cut) with QuickTime, with the intention of compressing it for the web. After File > Export, in the options dialog box, in the field where you can restrict the data rate, I put a number, say 488 kbit/s. But every time I do this, the finished compressed file has a wildly higher data rate, like 2000 kbit/s, and so is too large for my purposes.
    I feel like I used to do this all the time; I have plenty of .mov files with data rates around 500 kbit/s that I posted on my website, so this must have worked at some time in the past.
    Anybody? Thanks

    Just a bump-

  • Data rate of AAC file converted with iTunes 7.5.0.20

    I have just noticed that the data rates of files converted to AAC with iTunes version 7.5.0.20 deviate slightly and randomly from the preset conversion data rate. For example, the browser shows files converted with a 192 kbit/s preset as 193 or 195 kbit/s, and ones with a 320 kbit/s preset as 321, 322, or 325 kbit/s, etc.
    Is this an encoder bug, or a data rate adjustment that depends on the music content?
    Anyone a suggestion?
    Thanks and best regards

    Refer to one of the many threads already posted about this:
    http://discussions.apple.com/thread.jspa?threadID=1238169

  • How to name the data for each column I am acquiring in an .lvm file

    Does anybody have a hint on how to name the data for each column I am acquiring in an .lvm file?
    I want to tag or name, e.g., "temperature" at the top of the column which shows the temperature readings. I am writing to a LabVIEW measurement file.
    Thanks

    Use Set Waveform Attribute on each channel of your data. Set an attribute with the name "NI_ChannelName". The value is a string containing the name you wish to call the channel.
    This account is no longer active. Contact ShadesOfGray for current posts and information.
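For readers outside LabVIEW, the end result of naming channels (a header row labeling each column ahead of the data) can be sketched like this. The Python below is a simplified stand-in for what the measurement file gets when NI_ChannelName is set, not the exact .lvm header format:

```python
def write_named_columns(path, channel_names, rows):
    """Write tab-separated data with a one-line column header.

    A simplified stand-in for the named channels an .lvm file gets
    when NI_ChannelName is set; not the full .lvm header layout.
    """
    with open(path, "w") as f:
        # Header row: time column followed by one name per channel.
        f.write("\t".join(["Time"] + list(channel_names)) + "\n")
        for t, values in rows:
            f.write("\t".join([f"{t:.3f}"]
                              + [f"{v:.6f}" for v in values]) + "\n")
```

Opening such a file in a spreadsheet shows "Temperature" (or whatever name you chose) at the top of its column, which is the effect being asked for.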
