Frequency measurement

I am using a 12-bit 6024E series card along with a BNC-2090. I am using this setup to measure voltages from different sensors, and it works fine. Now I want to measure frequency from a vortex meter. I do not know where I should plug my frequency signal into the BNC-2090 or the DAQ 6024E series card. Do I need some other signal conditioning unit to measure this frequency, or can my existing setup be used for this purpose? If yes, how? Should I use the same differential voltage input channels for measuring frequency signals? I have no idea.
Thanks

Hi Dost,
If you want to measure frequency in hardware, then you would need additional signal conditioning such as the SCXI-1126. This is documented in the tutorial linked below:
http://zone.ni.com/devzone/conceptd.nsf/webmain/fe39f389967440cc86256a6f005dee7c
Otherwise, as long as you sample fast enough for the frequency you are measuring, you should be able to measure the frequency in software.
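For the software approach, one common technique is to acquire a block of samples on a regular analog input channel and estimate the frequency from level crossings. Below is a minimal sketch in plain Python (not NI driver code); it assumes the voltages have already been read back from the card as a list, and the function name is illustrative:

```python
import math

def estimate_frequency(samples, sample_rate):
    """Estimate signal frequency from rising crossings of the mean level."""
    mean = sum(samples) / len(samples)
    # Indices where the signal rises through its mean level.
    crossings = [
        i for i in range(1, len(samples))
        if samples[i - 1] < mean <= samples[i]
    ]
    if len(crossings) < 2:
        return 0.0
    # Whole periods elapsed between the first and last crossing.
    periods = len(crossings) - 1
    span = crossings[-1] - crossings[0]
    return sample_rate * periods / span

# Simulated 50 Hz sine sampled at 5 kS/s (well above the signal frequency).
rate = 5000.0
samples = [math.sin(2 * math.pi * 50 * n / rate) for n in range(1000)]
print(estimate_frequency(samples, rate))  # close to 50 Hz
```

The key constraint from the post above is the "sample fast enough" part: the crossing count only works if there are many samples per signal period.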

Similar Messages

  • [Need Help] DC Measurement: Need more than 1 value/sec.

    Hi,
    I am measuring a dc voltage signal from a sensor using NI 6251. My goal is to acquire a DC measurement of the signal and save it in a spreadsheet file.
    In LabVIEW, I used the Express DAQ Assistant with Samples to Read set to 1k and the Sampling Rate set to 1 kHz. I then connected it to a low-pass filter, then to the Express Amplitude and Level Measurements VI for a DC measurement. Finally, I connected it to Save Measurement File.
    My problem is this: I only get one DC value per second in my spreadsheet file. I wish to have at least 100 DC values per second, but I do not know how to do this. I am aware that the DC value is essentially an average of a set of samples. Maybe I could increase the sampling rate to 10 kHz and get a DC value for every 100 samples, but I have no idea how to implement this.
    Could anyone help me?
    With best regards,
    Jason

    You need to apply some basic arithmetic. With a sample rate of 1000 S/s and 1000 samples per read, each acquisition takes 1 second, and the Express VI then reduces it to a single value. You are getting exactly what you have programmed. The same arithmetic tells you that if you requested 100 samples, each read would take 0.1 second and you would get 10 readings per second in your file. Request 10 samples, which takes 0.01 second, and voila: 100 readings per second.
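That relationship can be written out directly. A tiny sketch (plain Python, names are illustrative) of how the sample rate and the samples-to-read setting determine the number of DC values per second:

```python
def readings_per_second(sample_rate, samples_per_read):
    """Each Express-VI read averages one block of samples into one DC value."""
    return sample_rate / samples_per_read

print(readings_per_second(1000, 1000))  # 1.0   -> one value per second
print(readings_per_second(1000, 100))   # 10.0
print(readings_per_second(1000, 10))    # 100.0 -> the requested 100 values/sec
```

Raising the sample rate to 10 kHz with 100 samples per read would also give 100 values per second, just with more averaging per value.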

  • How can I get all the options of Field Point Explorer 3.0.2 in Measurement and Automation Explorer 3.1?

    I am a student using LabVIEW and Compact Field Point to implement a
    senior design project in Electrical and Computer Engineering.  We
    have LabVIEW 7.1 and a cFP-2020 and additional modules.  We are
    trying to use the CTR-500 module to output a pulse train to drive a
    stepper motor.  We have found instructions online of how to do
    this, but they all use Field Point Explorer.  The instructions
    make use of options in Field Point Explorer that are not included in
    Measurement and Automation Explorer (the software included with the
    cFP).
    I have downloaded FieldPoint Explorer 3.0.2 and followed the directions.  Now, in LabVIEW, when I use the newly created .iak file, I get a dialogue asking to find the SubVI 'FP Read (Float Array
    -IO).vi'.  Included with LabVIEW are SubVIs for FP Read, but not the float array variant.  So I figure I cannot use .iak files created by FieldPoint Explorer with LabVIEW 7.1.  So how can I access
    the same options in Measurement and Automation Explorer that I was able to find
    in FieldPoint Explorer?
    Thanks

    Hi,
    FieldPoint Explorer is no longer used to configure FieldPoint systems. However, I do not think that configuring your system in FieldPoint Explorer is causing the error.
    FieldPoint systems are now set up in Measurement and Automation Explorer (MAX).  Information on setting up FieldPoint systems in MAX can be found in the MAX help under: Installed Products >> FieldPoint. I also recommend upgrading to the latest FieldPoint driver and version of MAX.  The FieldPoint VIs will be slightly different when you upgrade, so there is a good chance that this will eliminate the error.
    Regards,
    Hal L.

  • Measurement, visualization and saving data in parallel: Performance question

    Hello,
    I have written an application with 3 loops running in parallel.
    The first loop only measures and analyzes values from a DAQmx device (3 analog input signals from 3 sensors at 1000 Hz).
    The second loop only handles the visualization, with one graph per sensor, continuously. The data is sent from the first loop through a queue.
    The third loop only saves the data to a file after a measurement has finished. That data is also sent from the first loop, at the end of a measurement.
    There are 3 measurements running asynchronously.
    That means it could be that only one sensor will be read, but it also could be that 3 sensors will be read. The duration of each measurement phase and the beginning/end is asynchronous.
    Now I have the following problem:
    Measurement 1 starts
    A short time later measurement 2 starts
    Measurement 2 will end, the measurement values will be saved into a binary file
    Measurement 1 is still running but the visualization of measurement 1 stops for about 1 second during the saving process. After the data is saved, the visualization runs normally again (no data is lost because of the queue).
    Why does the graph stop updating during the saving process (I have a dual-core CPU)?
    How can I do this in a way that the user does not see any lags?
    It all works fine, but the "interrupts" look very unprofessional.
    Regards
    Matthias

    Hello,
    I'm using the producer/consumer pattern.
    Maybe the DLL calls I'm using for saving interrupt the whole program: http://lavag.org/files/file/212-sqlite-labview/
    When I use the LabVIEW File I/O VIs, all is fine. But when I use these database VIs, my application lags.
    Any ideas why this is so? Could it be that the DLL calls freeze the application during the saving process (LabVIEW 2011)?
    Here are the DLL settings:
    Attachments:
    dll_settings.PNG ‏50 KB

  • Performance measurements

    Hello. I'm trying to verify the CMRR specification for the NI PXI-5922 digitizer using the two channels in differential mode, but I can't reach the specification. The signal source is a PXI-5441 AWG.  The CMRR is measured by applying the same signal to both channels on the digitizer via a T-coupling and quite short coaxial cables of the same length from the AWG. The resulting voltage is then measured to estimate CMRR = 20*LOG(Vcm/Vout), where Vcm is the applied common-mode voltage and Vout is the resulting output. The results I get are about 94 dB at 1 kHz and about 75 dB at 100 kHz. According to the spec, the 5922 should have over 90 dB CMRR up to 300 kHz.  I was using the examples provided in LabVIEW to generate the signal and make the measurement. So why doesn't our 5922 meet the spec? Is there a better way to validate the CMRR? Or might something be wrong with the digitizer?
    Best regards,
    Stefan Johansson, SP

    Thank you for the answer! If the spec that you are linking to is the latest, then my measurements are not that far from the "typical" curve shown in figure 1. The spec that I have is more optimistic, showing a flat curve all the way up to 200 kHz. The 50 dB that you mention is specified when the channels are used in unbalanced differential mode. This means that one signal is applied to the "shield" of the coaxial cable, giving the 5922 two differential channels. I want to verify the CMRR when two channels are used as one "real" differential channel, and the CMRR for that is in figure 1.
    Best regards,
    Stefan Johansson

  • Frequency/Period measurement with USB-6218

    I have been trying to get a USB-6218 to measure either frequency or period and have been running into issues. I have attempted to connect to each counter gate (PFI1 or PFI2) and have not had any luck. I am attempting to measure lower-frequency signals (1 Hz to 1 kHz), so the low-frequency, one-counter approach seems to be the one I need. The end goal is to get it working in LabWindows. I have made attempts using both sample LabWindows programs and MAX. In each of them, it appears that the frequency measurement is actually coming back with the pulse widths of the low and high portions of the signal separately. For instance, if I am trying to measure the period of a 100 Hz signal with a 75% duty cycle, it toggles between 0.0075 and 0.0025 seconds for the period. If I increase the frequency of the signal to 10 kHz, it measures the frequency well. Is there something I am missing? Based on the manual, I should be able to measure lower-frequency signals using this method. Any help is greatly appreciated.
    Thanks.

    Can you upload a screenshot of how your task in MAX is set up? Is this happening at all lower frequencies or just at 100 Hz? Let's make sure we can get this working in MAX before adding LabWindows/CVI on top of it.
    Sometimes this can happen if the rise/fall time of the signal is not within the specifications of the device. This KnowledgeBase article shows the acceptable rise/fall times. 
    Applications Engineer
    National Instruments
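If the device really is returning the high and low times alternately, the true period can at least be recovered in software by summing adjacent readings while the root cause is investigated. A workaround sketch in plain Python (not the documented fix, and the pairing assumes the first reading starts a period):

```python
def periods_from_semiperiods(readings):
    """Pair alternating high/low times into full periods."""
    return [readings[i] + readings[i + 1] for i in range(0, len(readings) - 1, 2)]

# Alternating readings from a 100 Hz, 75% duty-cycle signal, as in the post.
readings = [0.0075, 0.0025, 0.0075, 0.0025]
periods = periods_from_semiperiods(readings)
print(periods)                      # two full 0.01 s periods
print([1.0 / p for p in periods])   # roughly 100 Hz each
```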

  • How can I see my USB device in the Measurement & Automation Explorer?

    I'm trying to use a USB device with LabVIEW 7.0.
    I followed the manual "Using NI-VISA 3.0 to Control Your USB Device", which I found on the NI webpage.
    I generated an .inf file using the VISA Driver Development Wizard and installed it in the INF folder of my system (I'm using Win2000).
    The following problems occurred: after I connect my USB device, the "Add New Hardware" wizard pops up and asks for a driver for the new hardware. When I direct it to the folder with the generated .inf file, it doesn't accept this file as a driver.
    Second: the USB device doesn't appear in the Measurement & Automation Explorer. I assume that as long as my device doesn't appear there, it is not recognized by NI-VISA and the communication from LabVIEW won't work?
    How can I make my USB device visible in the Meas. & Aut. Explorer?
    Regards
    Tobias

    Tobias,
    In the VISA Driver Development Wizard, you are required to specify the Vendor and Product ID. These numbers are what Windows will use to determine whether a specific driver is appropriate for a given device. If these numbers do not match between your device and the Windows driver (INF file) that you generate, you will see the behavior that you describe. Check to make sure that you are using the appropriate values. Also, did you right click on the INF file and install it as directed in step 2.2.2?
    Page 5-87 of the VISA Programmer's reference manual specifies how to do a viOpen to a USB device in RAW mode, but you still need the same sorts of Product ID and Mfg ID that you needed for the INF file generation. Furthermore, if you are communicating with
    your instrument directly, you will need to know exactly what command set it expects, so make sure you get that from the manufacturer as well. Finally, I want to mention that if this device complies with the USBTMC standard, then none of this is necessary--the steps above are for setting up a RAW USB connection. See this link for information.
    Scott

  • File name and Measurement in the same table?

    Hi,
    I have written the following VI that opens up each image file from a folder and measures a certain dimension on the image. It then puts the result of the measurement in a table.
    The table has one column and N rows since I have N images in my folder. How can I modify this VI so that it puts the filename in the same table as well as the measurement?
    I want the table to look like:
    filename1 measurement1
    filename2 measurement2
    and so on.
    Currently, it just shows
    measurement1
    measurement2
    and so on.
    Here is the VI:
    Solved!
    Go to Solution.
    Attachments:
    measurement.vi ‏68 KB

    You can't use the Express Table. As the properties page says, it is for numeric data. You will have to use a little bit of actual LabVIEW to create the table. A table is just a 2D string array. So, convert the numeric to a string, build a 1D string array from the file name and the numeric, and use a shift register to build the 2D array. Something like the code below.
    Message Edited by Dennis Knutson on 05-07-2009 01:43 PM
    Attachments:
    Build Table.PNG ‏9 KB
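The same row-per-file idea looks like this outside LabVIEW. A Python sketch (the filenames and values are made up for illustration; a table is just a 2D array of strings):

```python
# Hypothetical results: one measured dimension per image file.
measurements = {"image1.png": 12.5, "image2.png": 13.1}

table = []
for filename, value in measurements.items():
    # Convert the numeric to a string and build one [filename, value] row.
    table.append([filename, str(value)])

# Print as a two-column, tab-separated table.
for row in table:
    print("\t".join(row))
```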

  • What's the best way to get averaged measurements while scanning?

    I currently have a PXI-4071 DMM and a PXI-2575 mux. I've played around with the examples of how to do scanning of channels, which all works out fine. However, the scanning only takes one measurement every time it steps down the scan list. What I need is for every scan step to take X measurements and average them before moving on to the next scan step. I've played around with the Sample Count and Trigger Count but cannot seem to get this done correctly. I've gotten it to the point where it runs through the scan list multiple times and I can average from that, but I don't want to tax the relays that much. Can anyone point me in the right direction for this?

    Hey Duke,
    I can say with great confidence that the 407x DMM is much more accurate than any DAQ product NI sells; if you're seeing unexpected accuracy errors after acquiring one data point with the 407x DMM, I think the best method is to figure out the discrepancy, not average it out.  I recommend you verify the DUT signal with a scope (you can use the DMM in digitizer mode) to see if the signal is still settling as you take your first measurement.  If that is the case, each successive measurement comes much closer to the value you expect, so averaging 5 samples allows the system to settle at the later points and gets close to the actual value.  If so, I recommend you increase the settle time property (Configuration >> Advanced >> Settle Time) enough that your DUT signal has settled, and then take a single measurement (with increased aperture time, if desired, as mentioned previously).  Note that the value you obtain will differ from the old system if the above conditions exist: the 407x measurements are more accurate.
    Obviously, if you scope the signal and it's steady during your first measurement, then my previous post would solve the problem while still taking only one measurement.  For my own curiosity, what type of measurement are you performing?
    Keep us up to date on your progress and don't hesitate to post back with any additional questions.
    -John Sullivan
    Analog Engineer

  • Number of points used in FFT calculations using Spectral Measurements

    Hi folks,
    I'm a pretty new user of LabVIEW (version 7.1) and I am trying to perform spectral analysis of power systems to ensure that they comply with stated standards. I'm using a PXI-1002 system with PXI-6025E DAQ cards. I am able to get the analogue data into the program and display the time- and frequency-domain data on screen; however, I require a specific resolution in the spectral analysis to comply with the standard. Obviously I can set the sampling frequency, but I am unable to set the number of points of the actual FFT using either the "Spectral Measurements" function or other specific FFT VIs.
    Can the number of points be set manually, or do the functions somehow decide the best number of points depending on the amount of data passed to them? I've found that increasing the "scans to read at a time" value of the AI Config VI I'm using seems to increase the resolution, but I don't know how the FFT functions deal with "scans to read at a time" values that are not of the form 2^n.
    Cheers
    The Fat Controller
    90% of all experts agree that 1 out of 10 experts is wrong

    Hi,
    The answer lies in the LabVIEW help file. If you dig deep enough through the hierarchy of the Spectral Measurements Express VI, you will eventually end up at Power Spectrum.vi or Real FFT.vi, which can be found in the Analyze >> Signal Processing >> Frequency Domain palette (see the screenshot attached).
    The computation details are given in the help for those VIs.
    I will let you go through those files for details, but basically, when the number of samples in the input signal is a valid power of 2, the VIs compute the fast Fourier transform using a fast radix-2 FFT algorithm.
    When the number of samples in the input sequence is not a valid power of 2 but is factorable as the product of small primes, the VIs compute the discrete Fourier transform using an efficient DFT algorithm according to the type of transform being executed (i.e., real or complex).
    But the help file explains it better than I can.
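The power-of-2 distinction the VIs make, and the resolution question that prompted it, come down to simple arithmetic. An illustrative sketch (plain Python helpers, not NI's internal implementation): the FFT bin spacing is the sample rate divided by the block size, so you control resolution by controlling how many samples you feed in.

```python
def is_power_of_two(n):
    """True when n is a positive power of 2 (the fast radix-2 FFT case)."""
    return n > 0 and (n & (n - 1)) == 0

def next_power_of_two(n):
    """Smallest power of 2 >= n, e.g. to pick a fast FFT block size."""
    p = 1
    while p < n:
        p *= 2
    return p

def fft_resolution(sample_rate, block_size):
    """Frequency spacing between adjacent FFT bins, in Hz."""
    return sample_rate / block_size

print(is_power_of_two(1024))        # True  -> radix-2 FFT path
print(is_power_of_two(1000))        # False -> general DFT path
print(next_power_of_two(1000))      # 1024
print(fft_resolution(5000.0, 1024)) # about 4.88 Hz per bin
```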
    Hope this helps,
    Cyril Bouton
    Active LabVIEW Developer
    Attachments:
    ScreenShot008.gif ‏29 KB

  • Visual C++ 6.0 error with Measurement Studio ActiveX controls

    Hi,
    I'm trying an evaluation version of Measurement Studio for VC++ 6.0 for my company.
    I'm working on a single-document application with a menu bar.
    I have several dialogs linked to the menu bar but, at run time, the application crashes with a memory-leak error.
    This problem appears only if I attach a Measurement Studio control to a dialog.
    If I add a control on the main document, this error doesn't appear.
    Could you help me?
    Is there a Measurement Studio patch that solves my problem?
    Thanks a lot
    Federico Bettin
    Applied Materials - Baccini spa 

    Thanks, Lucius, for your help, but I'm not developing any software code!
    I think this problem depends on the installation or the software version... I think and I hope!
    I'm very doubtful because I would like to buy Measurement Studio for my company... obviously, I have to solve this problem first.
    Bye
    Federico Bettin
    Applied Materials - Baccini spa 

  • Why do I get a MissingManifestResourceException when I try to use an MFC Control Wrapper in a Windows Forms Project with Measurement Studio 7.0?

    I can use the .NET Windows Forms Controls without problems, but when I try to add an MFC control, everything appears to work fine until runtime. The program throws the aforementioned exception claiming that the resources haven't been added to the compiled project. Any help is appreciated. Thanks.
    Victor

    That is correct that you will need to create an interop wrapper for the 3D graph ActiveX control if you want to use the 3D graph in a Windows Forms project. However, data types like CNiReal64Vector are only in the C++ interfaces and are not in the ActiveX interface, hence you can use normal .NET data types and will not need to create the C++ data types. The interop wrapper will convert the .NET data types to the expected ActiveX data types and will pass those data types on to the ActiveX control. For example, here is a C# example that demonstrates how to plot a dual sine surface on the interop wrapper of the 3D graph:
    const int size = 40;
    double[] xData = new double[size];
    double[] yData = new double[size];
    double[,] zData = new double[size, size];
    for (int i = 0; i < size; ++i)
        xData[i] = yData[i] = ((i - 20.0) / 20.0) * Math.PI;
    for (int i = 0; i < size; ++i)
        for (int j = 0; j < size; ++j)
            zData[j, i] = Math.Sin(xData[i]) * Math.Cos(yData[j]) + 2.0;
    graph3D.Plot3DSurface(xData, yData, zData, Type.Missing);
    - Elton

  • How do I get a timestamp for each DAQ counter measurement?

    Hello, I am currently using DAQmx to read a PWM signal and calculate the pulse width using the implicit timer. Is it possible to also get the timestamp of each pulse width measurement? I have been able to do this with an analog signal by using a waveform output, but I can't figure out how to do this with a counter measurement. Thanks in advance for the help!

    I think you have asked a great question. Something that many engineers may not be aware of is that the DAQ device itself does not actually create the timestamp. It is created by the computer system when the data is retrieved from the DAQ card. Because of this, the timestamp is actually delayed by a significant amount. This delay may be less than the resolution for timestamps (one millisecond), but it can sometimes be greater than that. This Knowledge Base article discusses this in more detail: http://ae.natinst.com/public.nsf/web/searchinternal/5d42ccb17a70a06686256dba007c5eea?OpenDocument
    If your pulses are closely spaced, then generating a timestamp for each acquisition is not practical. But if your pulses are spread apart by a significant amount (more than 0.1 seconds or so, depending on the specifics), then the timestamp might be accurate enough to be practical. One way to do that with counters would be to use the Get Date/Time In Seconds VI in LabVIEW just after the DAQmx Read VI. Then, once the counter input measurement is complete, LabVIEW will create the timestamp very quickly afterward. Note that there will still be some delay between the completion of the read and the creation of the timestamp.
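The pattern described (grab the host clock immediately after each read returns) looks like this in plain Python, with the DAQmx counter read replaced by a stand-in function, since the real read needs hardware:

```python
import time

def read_pulse_width():
    # Stand-in for the DAQmx counter read; pretends a 10 ms wait for an edge
    # and returns a fixed pulse width in seconds.
    time.sleep(0.01)
    return 0.0075

records = []
for _ in range(3):
    width = read_pulse_width()
    stamp = time.time()  # host timestamp taken right after the read completes
    records.append((stamp, width))

# Each record pairs a host-side timestamp with one measurement; as the post
# explains, the timestamp slightly lags the actual signal edge.
```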
    Jeremy P. 
    Applications Engineer
    National Instruments

  • How can I configure the CTR 1 GATE to output a pulse (10us long) and then immediately take a pulse width measurement?

    I'm using the BNC-2120 DAQ and LabVIEW for interfacing with an ultrasonic position sensor.  The sensor is the "PING))) ultrasonic range finder."  It measures the distance from PING))) to some object directly in front of it.  It has 3 pins (5V, ground, and a signal pin).  The 5V and ground are easily taken care of with the 5V and digital ground outputs on the DAQ.  The signal (SIG) pin works in the following way:
    1) Send a 10us, 5V pulse to SIG.  This triggers PING))).
    2) Wait 200us.  PING))) takes a distance measurement.
    3) SIG outputs a square wave with a specific pulse width. 
    The pulse width varies with the distance of the object from PING))).  I've tried using CTR 1 OUT to generate the pulse and then using CTR 1 GATE to measure the pulse width.  However, the measurement is stuck at 0 V because CTR 1 OUT is on the same pin!  PING))) tries to output a pulse but cannot, because CTR 1 OUT forces the SIG voltage to 0 V.  So I need to use just one counter I/O line to both trigger and measure.  How can I do this?

    Hi Matttastica, 
    What DAQ card are you using? The reason I ask is that the PFI lines on some cards can be used as both PFI lines and digital I/O lines, while on others they can only be used as PFI lines. The pinouts and diagrams for our cards can be found at www.ni.com/manuals. You can accomplish what you are looking to do if your PFI line is a DIO line as well.
    I would suggest using two tasks, one for counter output and one for counter input. (Note that these will not be on the same line.) The counter output will be used to do a single pulse generation, while the counter input will be used to read back the period.
    This will work because, at first, we route the line from the counter output to the counter input line by using a 'DAQmx Connect Terminals' (basically making the input line an output line for a moment). First, set up the tasks and the DAQmx connect, start the tasks, and have the counter output do a 'wait until done' to ensure the pulse is sent. After this is done, do a DAQmx Disconnect Terminals (turning the input back from an output to an input), and then do a DAQmx Read for your counter input (period measurement). You may look at the shipping examples for pulse generation and period measurement in LabVIEW (Help » Find Examples... » Search tab).
    One note: since this is software-timed, it may not be fast enough to meet your 200us timing. If this ends up being the case, please look at the 6552, as it can do per-clock-cycle direction changes very quickly.
    David L.
    Systems Engineering
    National Instruments

  • Low sample-rate measurements on the PCI-6115 DAQ card

    I need to measure an analog signal at a sampling rate of a few tens to hundreds of Hz, in sync with the rising edge of an external clock. I have a PCI-6115 DAQ card with LabVIEW 6.1 and NI-DAQ 6.9.2. The PCI-6115 is a high-speed card and has a minimum sample rate of 10 kS/s. Is there any way of implementing a low-sampling-rate measurement using the PCI-6115 in sync with an external clock?
    Thanks in advance.

    Kuldeep,
    It is possible to do what you are describing above (in fact, I don't think an external clock is required), but bear in mind that the reason for this minimum sampling frequency is the ADCs used on this high-speed board. They are pipelined ADCs, meaning that when a signal is digitized, it is digitized in distinct stages within the ADC (in the case of the 6115, I think there are 3 stages involved). Data is moved from one stage of the ADC to the next each time a sample-clock pulse is received. If too much time elapses between these clock edges, the signal being digitized can actually 'leak' off the ADC. This can result in improper digitization, which can lead to less accurate measurements. So, while it is possible to make the device sample below its minimum rate, it may be advisable to sample faster than the rates required by your application and either average multiple data points per measurement or throw away the extra points.
    I hope this helps,
    Dan
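The oversample-and-reduce workaround Dan describes can be done entirely in software: acquire at the board's minimum rate, then average each block of samples down to the effective low rate. A sketch in plain Python with simulated data (the acquisition itself is not shown):

```python
def decimate_by_average(samples, factor):
    """Average each block of `factor` samples into one output point."""
    return [
        sum(samples[i:i + factor]) / factor
        for i in range(0, len(samples) - factor + 1, factor)
    ]

# 0.1 s of data acquired at 10 kS/s, reduced by a factor of 100 to an
# effective 100 S/s; averaging also suppresses noise vs. discarding points.
acquired = [1.0] * 1000
low_rate = decimate_by_average(acquired, 100)
print(len(low_rate))  # 10 points at the effective 100 S/s rate
```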
