LOGGING RATE IN LABVIEW 7.0

Is there an easy way to set a different logging rate than the scan rate when moving data from the DAQ Assistant VI to the Write LabVIEW Measurement File VI?

Hi Saridi,
From what I understand, you're trying to acquire data at a certain rate, say X Samples/s, and then record only a subset of the data. For example, if your logging rate is X/2 Samples/s, you would only record every other data point returned by the board.
The data is returned by the DAQ Assistant as the dynamic data type. What you'll need to do is manipulate this data to eliminate the samples you don't want before you pass it to the Write LabVIEW Measurement File VI. You can convert the dynamic data type to a waveform, which you can then manipulate, using the Convert from Dynamic Data VI. If you double-click this VI you can choose the type of data you want to convert the dynamic data to. Once you have the data in the waveform type, you can extract the data from the waveform using the Get Waveform Components function. You will then have the data in the form of an array, and you can use the Decimate 1D Array function to pick out every other (or every third, etc.) element. You then need to rebuild the waveform so that you can pass it into the Write to LabVIEW Measurement File VI. One thing to remember is that the dt for this new waveform is different from the original, because you only have half as many samples. Please take a look at the attached screenshot for an example of the method I just described.
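For illustration, here is a minimal sketch of the same decimation idea in Python/numpy (the thread itself uses LabVIEW's dynamic-data and waveform VIs; the 1 kS/s rate and sine data are assumed stand-ins):

    import numpy as np

    N = 2                                  # keep every Nth sample (log at half the scan rate)
    dt = 1.0 / 1000                        # original waveform timing, assuming 1 kS/s
    y = np.sin(2 * np.pi * 5 * np.arange(1000) * dt)  # stand-in for the acquired waveform

    y_logged = y[::N]                      # the "Decimate 1D Array" step
    dt_logged = dt * N                     # rebuild the waveform with the new, larger dt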
I hope this helps!
Sarah Miracle
National Instruments
Attachments:
df_5_24_04.bmp (252 KB)

Similar Messages

  • Independent Sample and log rates

    I would like to log data at 0.5 second intervals while the DAQ Assistant acquires data at 1 kHz.
    I tried using the Write to Measurement File Express VI but couldn't separate the sampling and logging rates.
    Any suggestions, with code, are always appreciated.
    Thanks !!
    Attachments:
    logger rates.vi (87 KB)

    Your VI is saved in LabVIEW 2013, which I can't open; either save it in a previous version and upload it, or share a snapshot of the block diagram.
    Well, have you tried running two parallel loops, one for acquisition and the other for data logging? (See the sketch after the references below.)
    Refer to these:
    1. Application Design Patterns: Producer/Consumer
    2. Data Acquisition Reference Design for LabVIEW
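    A minimal Python sketch of the producer/consumer idea (the linked design patterns implement this with LabVIEW queues; the rates, block size, and file name here are assumptions for illustration):

        import queue, threading, time, random

        q = queue.Queue()

        def acquire():                       # producer: fast acquisition loop
            while True:
                time.sleep(0.5)              # one 0.5 s block of 1 kHz data per iteration
                q.put((time.time(), [random.random() for _ in range(500)]))

        def log():                           # consumer: writes one averaged line per block
            with open("log.txt", "a") as f:
                while True:
                    stamp, block = q.get()
                    f.write(f"{stamp:.3f}\t{sum(block) / len(block):.6f}\n")

        threading.Thread(target=acquire, daemon=True).start()
        threading.Thread(target=log, daemon=True).start()
        time.sleep(5)                        # let the loops run for a few seconds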
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.

  • I want to play video on my computer to analyze its frames; the problem is that I can't change the video frame rate using LabVIEW, but I can change it outside LabVIEW using another program

    Hi All,
    I want to play video on my computer to perform some analysis on its frames. The problem I face is that I can't change the video frame rate using LabVIEW, though I can change the frame rate outside LabVIEW using another program.
    I used the IMAQ AVI Read Frame VI.
    For example, I have an AVI video with a frame rate of 25 fps; my image-processing code is fast enough to process more than 25 fps, so I want to accelerate the video acquisition.

    Hi abdelhady,
    I looked into this further, and reading an AVI file into LabVIEW faster than its frame rate won't be possible. LabVIEW could read in frames faster than 25 fps, but because it will be pulling whichever frame is available at that point in time, this would just give you duplicate frames. If you want to read in frames at faster than 25 fps, you would need to speed up your AVI file before reading it into LabVIEW.
    There's a good shipping example to show how to read in from an AVI file, "Read AVI File.vi". You'll notice that they add timing to make sure that the while loop runs at the right speed to match up with the frames per second of the file being read. This is to make sure you're not reading duplicate frames.
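    As an illustration of the pacing in that shipping example, here is a rough Python sketch using OpenCV as a stand-in for IMAQ (the file name is assumed; this is not the thread's API):

        import time
        import cv2

        cap = cv2.VideoCapture("video.avi")       # hypothetical file
        fps = cap.get(cv2.CAP_PROP_FPS)           # e.g. 25 fps

        while True:
            start = time.time()
            ok, frame = cap.read()
            if not ok:
                break
            # ... per-frame analysis here ...
            # pace the loop so it matches the file's frame rate
            time.sleep(max(0.0, 1.0 / fps - (time.time() - start)))
        cap.release()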
    Thank you,
    Emily C
    Applications Engineer
    National Instruments

  • Ace Module logging rate limit

    Hi All,
    I have tried to configure the above parameter but it doesn't seem to be working.
    The version running on the ACE is 2.3.4 and I am running multiple contexts.
    The below configuration was tried on one of the contexts, not being Admin.
    The command I used was :
    logging rate-limit 42 60 message 251010
    What I am trying to achieve here is receive notification that a rserver has failed its connectivity check, therefore alerting the relevant people.
    The issue I am encountering is that every second I receive all the alerts again.
    I am only wanting to receive the alert once if possible, and again once the rserver has come back online.
    Is this possible, if so please explain how I can do it?
    TIA.
    Jack.

    Your rate limit should be giving you 42 of those messages per 60 seconds. But this is a health probe failure which, depending on the fail count, does not necessarily mean the server is down. It is also a level 6 message. The messages you really want are:
    Error Message: %ACE-4-442001: Health probe probe_name detected real_server_name (interface interface_name) in serverfarm sfarm_name changed state to UP
    Explanation: The state of a real server changed from down to up.
    Recommended Action: None required.
    Error Message: %ACE-4-442002: Health probe probe_name detected real_server_name (interface interface_name) in serverfarm sfarm_name changed state to DOWN
    I suggest you log at level 4, and you will only see the message when the server state changes.

  • Reverse Engineering Portmon log to obtain Labview serial configuration settings

    Hi there,
    I am attempting to interface LabVIEW with a custom-designed power supply device through a USB-RS232 adapter. The power supply came with a "debugging interface" programmed in Java, with which one can send and receive commands to the board (detect, set voltage, get voltage, stop, reset, etc.) through the USB-RS232 adapter, and that works fine.
    However, I am not experienced with Java in the least and want to streamline command sequences with LabVIEW, as I am interfacing the rest of my lab equipment with LabVIEW. I don't have experience connecting serial devices, though I do have USBTMC & GPIB VISA interfaces with the other equipment.
    Nonetheless, I've tried the advanced serial communication example provided by NI (http://www.ni.com/example/27665/en/). I have set it to COM3, as Device Manager (and MAX) find the correct port for the device, along with the baud rate, data bits, stop bits, parity, flow control, XON/XOFF characters, input buffer size, whether to end read/write with a termination character, which termination character, and the timeout.
    Yet from my Portmon log, I'm seeing that the Java interface sets a few other parameters that I (at the moment) do not have access to through the LabVIEW VI (or from MAX). How do I set the output buffer size? What are EOF? EVT? Replace? XON/OFF limits? WC? How do I set them?
    Both Portmon log files are attached.
    Attachments:
    javaserial device config log.txt (4 KB)
    configlabview.LOG (6 KB)

    Output Buffer Size - This is set with the VISA Set I/O Buffer Size VI that is in the example you were looking at. The documentation says it is just the Input buffer size but it is actually the I/O buffer size.
    This forum post answers your question about XON/OFF Limits:
    http://forums.ni.com/t5/LabVIEW/Can-I-set-XonLimit-and-XoffLimit-for-serial-port-communication/td-p/...
    I am not sure yet about the other commands you have asked about. Have you looked around in the VISA Serial property node options? Have you made any progress?
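    For reference, the same configuration expressed with PyVISA (a hedged sketch, not the thread's LabVIEW VIs; the resource name and values are assumptions, and viSetBuf is PyVISA's equivalent of the Set I/O Buffer Size VI):

        import pyvisa
        from pyvisa import constants

        rm = pyvisa.ResourceManager()
        inst = rm.open_resource("ASRL3::INSTR")   # COM3
        inst.baud_rate = 115200                   # assumed value
        inst.data_bits = 8
        inst.parity = constants.Parity.none
        inst.stop_bits = constants.StopBits.one
        inst.flow_control = constants.ControlFlow.xon_xoff

        # the "Set I/O Buffer Size" equivalent: set read and write buffers to 4 KB
        rm.visalib.set_buffer(inst.session,
                              constants.VI_READ_BUF | constants.VI_WRITE_BUF, 4096)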
    Applications Engineer
    National Instruments

  • Web server log files under LabVIEW RT

    Hello,
    Are there any log files for the web server under LabVIEW RT running on a PXI? The log file checkbox is disabled in the target options.
    Also, is there a way to find out if the web server is running on the target?
    I'm asking because I can't access a remote panel (or even the root of the web server) on my PXI anymore. I reinstalled everything and I still have the issue. I'm pretty sure that IT changed some network settings, but they say no. I need to validate that everything is working on my side before taking further action.
    Thanks,
    Patrick

    Hi Patrick, 
    I would suggest having the log file checkbox enabled in the target options.  That should show if errors are occurring while it is running. Are you running remote front panels and web services, or just remote front panels?
    Can you provide a picture of all the software you have installed on your PXI?  You should be able to find this in MAX under the PXI in Remote Systems.
    Scott A
    SSP Product Manager
    National Instruments

  • Data log rate of Counter Period acquisition

    I'm measuring and recording the period of a digital input pulse train (input to NI9421 card on a 'C' series DAQ chassis and recorded via LabVIEW SignalExpress 2009).  The 'Counter Period' acquisition step is configured to use '1 Counter' measurement method.
    In order to record the data I have to use '1 Sample (On Demand)' Acquisition mode.  
    When I review the recorded data, how can I determine:
    a) the sample frequency of the recorded data points   ?
    b) the absolute timestamp of any data point   ?
    The log file has no information regarding the recording frequency, and from my experiments this seems to change based on the measured period of the input.
    Any ideas??

    Hi a.yearsley,
    The counters on the 1st generation cDAQ chassis (9172) do not support sample-clocked period measurements.  I'm assuming you are using this chassis since you mentioned that you have the module in slot 5 or 6.
    Your edge count workaround is very similar to the High Frequency 2 counter method.  In either case, you specify a time duration and count the edges of your external signal during this time.  The 2nd counter is used to generate the gate signal for the specified time duration.  Your measurement error with this method is up to 1 period of the external signal, so it is more commonly used with higher frequency signals (I'm not sure what the frequency of your signal is).
    The standard 1 counter method counts the number of ticks of an internal timebase (80 MHz) during one period of your input signal.  Since the signal itself is what is gating the measurement, the sample is latched into the buffer on the edge of the signal and is not clocked independently.  If desired, you could configure Implicit Timing, which would give you a period measured for every edge of the input signal.
    Taking the above idea one step further, you could just configure an edge count task with the 80 MHz timebase as the source.  Use the external signal as a sample clock.  The only difference between this and using the standard period measurement with implicit timing is that the counter is not reset after every sample.  This might make it easier to keep track if you want to log a sample every x seconds (i.e. once the total count passes a certain value).  You can find the period by subtracting consecutive counts and multiplying by the period of the timebase (12.5 ns).  The counter would rollover after about 53.69 seconds, but if you read the count as U32 there won't be any problem with the subtraction (0000 - FFFF = 1 if the numbers are U32).
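    A quick numeric sketch of that rollover-safe subtraction (Python/numpy; the two counts are made-up values straddling a rollover):

        import numpy as np

        TIMEBASE_HZ = 80_000_000                  # 80 MHz timebase, 12.5 ns per tick
        counts = np.array([0xFFFFFFF6, 0x0000000A], dtype=np.uint32)

        delta_ticks = np.diff(counts)[0]          # U32 arithmetic wraps modulo 2**32: 20 ticks
        period_s = float(delta_ticks) / TIMEBASE_HZ
        print(delta_ticks, period_s)              # 20 ticks -> 250 ns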
    If you are on a 2nd generation cDAQ chassis (e.g. 9174, 9178, 9188), then you actually do have support for a sample-clocked period measurement.  You can choose whether or not to enable averaging.  The 9178/9174 user manual has diagrams showing the sample-clocked frequency measurement which is essentially the same thing (the driver inverts the period measurement to obtain frequency).  You must guarantee at least 1 edge of your external signal between sample clocks if using this method.  The clock can come from a number of sources--I would probably recommend using another counter to generate it.
    Best Regards,
    John Passiak

  • Logging data with LabView and FP-DI-330

    Hi,
    I have a problem.
    I am monitoring 6 ports of the FP-DI-330 via LabVIEW.
    I have it running in a while loop that decides the update frequency. The data then gets stored in a txt file with the time in hours, minutes and seconds. I need the time in milliseconds to distinguish between the different input times, because different inputs can arrive within the same second.
    I have attached the VI file.
    Anyone have a suggestion?
    Attachments:
    FP_DI_330.vi (75 KB)

    First of all...
    I will throw away the Boolean display. But the spreadsheet part is what formats the logfile, is it not? I will need that to keep the data nicely organized.
    I don't know how to get milliseconds displayed anywhere. If I use, for example, the Tick Count (ms) function, it starts with a number around 3287101; should it not start at 1 and then count up?
    "Further in your VI, you forgot to connect the current time to the Get Date/Time String. Can't you use the millisecond values for building the time values?"
    How can I connect the current time to the Get Date/Time String? And how should I use the ms timer values for building time values?
    File info of the log file? What?
    Thanks for the ideas, but I could really use some help.
    I would think that it could not be that difficult to add a simple ms count to the time for every line of the logfile, but I can't get it to work.
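    The thread is about LabVIEW's time functions, but the idea of a millisecond-resolution timestamp per log line looks like this as a Python sketch (the file name is made up):

        from datetime import datetime

        def log_line(values, path="fp_di_330_log.txt"):
            stamp = datetime.now().strftime("%H:%M:%S.%f")[:-3]   # HH:MM:SS.mmm
            with open(path, "a") as f:
                f.write(stamp + "\t" + "\t".join(str(v) for v in values) + "\n")

        log_line([0, 1, 1, 0, 0, 1])   # six digital inputs from the FP-DI-330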

  • Slow Log Rate Needed using cDAQ-9174

    All,
    I am logging strain data for a thermal sweep we are doing on one of our products. The sweep occurs over a long period of time (48 hours), so I realistically want to log one sample per second. I tried using a continuous sample and then subsetting it, but I can't get it to 1 sample per minute, which is where I want it to be. Does anyone have any suggestions on using the subset (or another method) to slow down the capturing of data for the log?
    Thanks,
    Adam

    Nice, sounds like you've figured it out.
    The 9235 does indeed have a minimum sample rate of 794 S/s when using the internal timebase.
    For future knowledge, here's what I would do to simplify things a little:
    Set your "Samples to Read" to match your "Sample Rate".
    -Since we have a limitation of nothing less than 794 S/s just set them both to 1k for simplicity.
    Create an Amplitudes and Levels step and select your strain channels as the "Export to DC Value"
    -This is a step I use for almost every test, strain especially. It takes your raw waveform signal and turns it into a scalar signal. Honestly, I'm not sure what's going on behind the scenes, but there's some averaging between the samples to read and the sample rate. Basically it works out to be a smoothing filter, which yields a much cleaner final output signal.
    To determine your actual sample rate when recording scalar signals, divide the sample rate by the samples to read.
    In our case, 1000 Hz divided by 1000 samples to read is 1 S/s. Try it; I think it'll be in line with what you're looking for. I just ran a test file to be sure, and with the above settings I recorded for 10 seconds and got 10 data points in my data file.
    I never save my data to a log file either; I save to ASCII exclusively, but the results should be the same whether you use the "Record" option or the "Save to ASCII" step.
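    In other words (a small Python sketch of the arithmetic, with a random block standing in for real strain samples):

        import numpy as np

        SAMPLE_RATE = 1000        # S/s; the 9235 can't go below 794 S/s internally
        SAMPLES_TO_READ = 1000    # one block per read

        # Averaging each block to a scalar gives an effective logging rate of
        # SAMPLE_RATE / SAMPLES_TO_READ = 1 S/s.
        block = np.random.randn(SAMPLES_TO_READ)  # stand-in for one block of strain data
        dc_value = block.mean()                    # one logged point per block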
    Hope that helps!

  • Problems with Sampling rate in LabVIEW with 1/4 bridge strain gage

    Hello,
    I have a question. I have an SCXI-1001 and am trying to read strain gages and thermocouples. In slot 1 I have an 1122/1322 with thermocouples that works great. In slot 2 I have an 1121/1321 that needs help. I cannot figure out how to make a LabVIEW program read the strain gage.
    Is computer hardware a factor? I have a relatively old Dell, 1.7 GHz, 256 MB RAM, and all of my programs bog my machine down. My thermocouple program with 1 functioning channel takes almost 2 minutes to load.
    Anyway, I program the DAQ Assistant the way I think it should be done, but I am having problems with the sampling rate. Every time I run the program I get an error message saying that it could not get the recommended samples in time. I have had the delay time on my while loop as low as 0.25 s and as high as 1.5 s, and I have changed my sampling rate from 1 Hz to 1000 Hz.
    I would really appreciate any and all help on this matter. I am still very new to LabVIEW. <I could not contain my excitement when my thermocouple program worked>
    Thanks,
    CDawgttu

    Hi CDawgttu,
    I think the best thing that we can do for the moment is to try and see what the maximum rate you can get on the 1121 is, when working on its own.
    According to the DataSheet the 1122 is more likely to be what's slowing you down.  There could also be some concerns with how the internal jumpers are set on the 1121, so you may want to make sure that the settings there match how you have configured your board in Measurement and Automation Explorer. 
    -You can set this up by right clicking on the SCXI module and adjusting the Properties.  On the Jumpers Tab you should then make sure the values match what you've set internally (SCXI-1121 User Manual - Chapter 2)-
    When you use just the 1121, see if that makes a difference in your overall application speed.
    Finally, it may be easier to use an example program than the DAQ Assistant since you are using such an old machine.  You can find Strain Gauge examples by opening LabVIEW and following this path:
    Help > Find Examples > Browse > Hardware Input and Output > DAQmx > Analog Measurements > Strain > Cont Acq Strain Samples.vi
    I hope some of these tips get you pointed down the right path.
    Regards,
    Otis
    Training and Certification
    Product Support Engineer
    National Instruments

  • Is there a way to output a buffered pattern on port 0 and input on ports 1,2, and 3 of the 6534 card while using an external clock to vary scan rate using LabVIEW?

    I am using a 6713 board to clock my 6534 board, and I want to output a pattern from the buffer continuously on port 0 and read from ports 1, 2, and 3 at the same time and clock speed.

    Hello;
    Unfortunately, you can't create a 24-bit group by bundling 3 ports in the same group. Valid group sizes are 8, 16, and 32 bits only. To accomplish that task, you will need an extra 6534 board; configure that board to do a 32-bit pattern input acquisition and discard the data from one of the four ports.
    You can definitely synchronize both 6534s with the 6713 through the RTSI bus. To synchronize all boards together, just route the AO Update clock from the 6713 board to one of the RTSI lines, and use two more instances of Route.vi to route from that same RTSI line to the PCLK lines of the two 6534s.
    Hope this helps.
    Filipe A.
    Applications Engineer
    National Instruments

  • Labview serial update rate

    Hello,
    Currently I'm developing a BLDC controller with a Microchip dsPIC; at this point I'm designing and tuning my control loop.
    For that I need some data monitoring; I want to use LabVIEW for this in combination with RS-232.
    So the question is how fast LabVIEW can receive data. Currently I'm using the basic serial example, which I changed to a timed loop.
    With this I can receive data at 500 Hz, but the DSP can put the data out at a higher rate than LabVIEW keeps up with.
    I used termination characters.
    So my question is: is it possible to receive data at 1 or 2 kHz? Max baud rate is 921600 (Windows limited).
    And if possible, what can I do to make it work?

    Ok, we were typing at the same time.
    Also, are you sending just one character each way at a time? Do you have to process the data, like the remote sends a character, and the LabVIEW program reads it and sends an appropriate response? As Mike said, a timed loop gives you better timing accuracy than a simple loop with a Tick Count function, but running under Windows, macOS, or the like, it can have big variations if the OS decides to service another thread, say when the operator wiggles the mouse rapidly. Accurate timing requires either doing it in hardware (or a variation of that) or running under a real-time operating system, with hardware being the more accurate of the two.
    How are you determining how many characters to read? Are you using the bytes at port function, and reading only 6 at a time? That allows you to "sync" your reads to the source, and if you put it into a producer-consumer pattern (shown in the LabVIEW examples) you can even be analyzing the returned data while receiving it.
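    As a rough Python/pyserial sketch (not LabVIEW) of reading terminated, fixed-size messages in sync with the sender; the port name and message length are assumptions:

        import serial

        ser = serial.Serial("COM3", baudrate=921600, timeout=0.1)

        while True:
            msg = ser.read_until(b"\n")   # termination character keeps reads in sync
            if len(msg) != 7:             # e.g. 6 data bytes + terminator; skip to resync
                continue
            # ... hand msg off to a consumer loop (queue) for analysis/logging ...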
    Putnam
    Certified LabVIEW Developer
    Senior Test Engineer
    Currently using LV 6.1-LabVIEW 2012, RT8.5
    LabVIEW Champion

  • How to measure and log frequency with fieldpoint CTR

    Hi,
    I am developing a data acquistion and control system for an engine dynamometer using the fieldpoint modules and Labview.  One of the most important signals is the engine speed, measured in RPM.  The RPM signal is a 0-12V pulse where one pulse equals one revolution of the engine.  As well as being an important piece of data for later analysis, engine RPM will also be in the input into a PID controller, so the signal must be both accurate and have a high measurement frequency. 
    Currently I am using the FP-CTR500 module to measure the frequency of the signal. I am already aware of the included frequency measurement VI example, as well as the one posted before for low-frequency measurements, and I have gotten both to work with my setup. I would be using the low-frequency VI because the max frequency measurement would be in the 200 Hz range.
    The first problem I am having is with the structure of the VI and how the data is output. The case structure in the VI activates when the counter is read and resets the counter, then switches to the next case. I would like the RPM number to pass out of the case structure into a write-to-file VI and the PID controller input. The problem is that when the case switches, the counter is reset to 0, which gets recorded in the written file.
    This is some example output data (RPM):
    1232
    0
    2321
    0
    2400
    0
    2521
    0
    The data is being written correctly, but of course I can't have 0 readings when the case structure changes.  This would be especially problematic when input into a controller VI. 
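    (For illustration, the fix amounts to holding the last valid reading whenever a reset 0 appears; a Python sketch of that sample-and-hold logic, assuming 0 only ever means "counter just reset":)

        def hold_last_valid(readings, initial=0):
            held, last = [], initial
            for r in readings:
                if r > 0:            # assumption: 0 only occurs on counter reset
                    last = r
                held.append(last)
            return held

        print(hold_last_valid([1232, 0, 2321, 0, 2400, 0, 2521, 0]))
        # -> [1232, 1232, 2321, 2321, 2400, 2400, 2521, 2521]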
    The next problem I am having is with sampling rate. If I were to use the low-frequency measurement VI, the sampling rate of RPM would vary based upon the speed of the signal. Alternatively, the original frequency measurement VI has an adjustable sampling rate. Of course, in my system there are a number of other signals that need to be recorded at the same time. I have found that as I am collecting data, the "write to file" frequency is entirely dependent on the read frequency of the frequency measurement: if I set the read frequency VI to read at 1 Hz, data will be recorded only at 2 Hz. Ideally I would like an overall measurement frequency for all channels (a mix of analog and digital) of between 20 and 50 Hz, but it seems I am limited by the frequency measurement.
    Any ideas on how to solve this problem, either through FieldPoint or LabVIEW?
    I can post my VI if it helps.
    Thanks,
    Huang

    Thanks for your reply. 
    I should probably describe my current setup before I go into any more detail on the problem. I am using an FP-1000 connected with an AIO600, AI110, CTR500, and TC120, all running through the RS-232 line to a desktop running LabVIEW 8. The counter module is reading a tachometer signal output from a separate engine controller; the output is a 0-12V, 50% duty cycle signal. For data logging, I am simply using a "Write to Measurement File" Express VI. I have a while loop which holds all of the Express VIs that access the FieldPoint I/O, and these are all routed to the "write to" VI.
    As for the specifics of the data logging problem: as I said, when I set the count frequency of the "FieldPoint Frequency Measurement" VI to 1 Hz (which means the VI calls the case structure at 2 Hz), the "Write to Measurement File" VI is called at the same rate (2 Hz), which means that the overall logging rate of the VI is only 2 Hz. Is there some way to decouple this?
    I was able to solve the problem of reading the frequency variable from the case structure by using a local variable which is read outside the case structure.
    And now I have been having a lot of problems with reading the actual frequency from my engine controller. The actual signal will only range from 0 to around 200 Hz. I noticed, by comparing the actual signal to what was being read in my LabVIEW program, that above around 80 Hz the read signal is around 1.5 times the actual signal, i.e. actual signal = 100 Hz, read signal = 150 Hz. Trying to fix this, I changed the noise filter setting to 200 Hz. That worked for frequencies up to around 150 Hz, but beyond that the filter attenuates the signal to the point where an actual signal of 160 Hz is read as 100 Hz. My question is whether there is a way to change the filter setting outside of the 2 given setpoints (200 Hz and 40 kHz), or if you have any other suggestions on how to fix this problem. I was thinking of creating a noise filter input in MAX so that I could play with the values in LabVIEW, but am I limited to only those two filters?
    Thanks again for your help,
    Huang

  • NI-DAQmx frequency sampling rate

    Hi there!
    I'm working on setting up a data acquisition LabVIEW VI to measure different signals on a test rig.
    I'm using the NI-DAQmx Assistant (the Express VI) to continuously measure analog signals (variable current, voltage and temperatures). This is working just fine, and I can change the sampling rate by writing to the Express VI. The idea is that the user can change the sampling rate from around 1 to 500 Hz.
    We do however have a sensor that transmits digital signals (a frequency), and we are using an NI-9423 module to read it. As this is a digital signal, another NI-DAQmx Express VI is needed to handle it (that's OK), but so far we can't figure out how to alter the sampling rate - it's apparently locked at 1 kHz.
    Since we want to merge the analog and digital signals into one array, we are receiving overflow errors from the "analog" DAQ if it's not set at exactly 1 kHz.
    So, in short - is it possible to change the sampling rate of a DAQmx task receiving frequencies, so that the two DAQ Assistants have the same sampling rate?
    Help would be greatly appreciated!
    - Nicklas
    Attachments:
    DAQissue.PNG (64 KB)

    Unlike voltage measurements, which tend to be (more or less) instantaneous, frequency measurements take a finite (and often variable) amount of time.
    If it is a slow signal, you measure the number of counts of your reference clock that occur in one period of your input signal; as your input signal varies in frequency, so does the measurement rate. If it is a fast signal, you can either measure how long it takes to get n cycles of your input (again variable), or you can count how many cycles of your input occur in a fixed time period.
    The NI help on frequency measurements describes three different ways you can configure a counter to measure frequency.
    The long and short of this is that counter measurements generally come at variable measurement rates, which can be problematic to fit into a fixed-rate logging system. If the measurement period is much smaller than your desired rate, then you can wait and trigger a measurement at regular intervals. If not, you can let the counter run at its own rate, placing the latest result on a notifier, and in another loop just read the latest measurement from the notifier each time you want to record a result. Depending on whether your counter is running faster or slower than your desired logging rate, you will end up with either missed samples or repeated samples. There are inherent timing inaccuracies in both approaches because, unlike analog measurements, the counter measurement is not made at 'that exact time, now!' but over a period of time which may be long or short compared to your logging rate.
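    A minimal Python sketch of that latest-value ("notifier") pattern, with made-up rates standing in for the real measurement:

        import threading, time, random

        latest = {"freq": None}                 # shared latest measurement

        def counter_loop():                     # runs at its own, variable rate
            while True:
                latest["freq"] = 100 + random.random()   # stand-in measurement
                time.sleep(random.uniform(0.005, 0.05))  # variable measurement time

        threading.Thread(target=counter_loop, daemon=True).start()

        for _ in range(10):                     # fixed-rate logging loop (10 S/s)
            time.sleep(0.1)
            print(latest["freq"])               # may repeat or skip counter results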

  • Labview VI

    Problem:
    I mistakenly logged data in LabVIEW SignalExpress without a buffer, i.e. one sample taken at a 1 Hz sampling rate. Each new data point has a new header, and as a result the file cannot be viewed or converted to a text file. I have seen VIs that can maximize the buffer and rewrite the file with a single header, but I have been unsuccessful in entering all the correct information into the actual VI.
    Can someone please advise or walk me through the process?
    Thank you

    Hi Alpha,
    Problem: insufficient information...
    What's the file format you have right now and what file format do you need? Can you attach examples or the real file?
    What have you tried so far?
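    Until the real file is posted, here is a heavily hedged Python sketch of the usual fix, assuming an LVM-style file where every sample repeats an identical header block ending in ***End_of_Header*** (an assumption, not confirmed by the thread):

        def merge_headers(src, dst, marker="***End_of_Header***"):
            with open(src) as f:
                lines = f.readlines()
            end = next(i for i, l in enumerate(lines) if marker in l)
            header = set(lines[: end + 1])                  # the first header block
            with open(dst, "w") as out:
                out.writelines(lines[: end + 1])            # keep one header
                # drop later repeats of those same header lines, keep data rows
                out.writelines(l for l in lines[end + 1:] if l not in header)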
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
