LabVIEW function to acquire high-frequency data

Hi,
In a program created with LV 4.1, we use "AI Sample Channel" in a while loop to acquire data from 3 channels on a PCI-6025E card, but the sample period is limited to about 0.08 s. We are now porting the program to LV 6.1. My question is: which LV function should we use to acquire at the highest rate? We found "AI Acquire Waveforms" in LV's standard library, but switching to it implies a lot of modifications and we don't know whether it is the fastest method.
Thanks.

I would advise you to study the data acquisition examples and the LV manual.
By using a different way of doing DAQ, together with a slight modification of your program, you should be able to get at least a 10- to 100-fold increase in throughput.
The trick is that you don't have to re-initialise all the channels, sample rates, gains, etc. on every read. That re-initialisation is what happens when you use some of the basic DAQ input VIs, but these are not intended for measuring data in a fast, repeating way.
Just let your LabVIEW application sample the 3 channels as a background job.
Reading out the buffers will give you the acquired values; this is one of the strengths of NI-DAQ.
Patrick de Boevere
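As a sketch of the configure-once, read-many pattern described above, here is a plain Python illustration with a hypothetical MockDAQ class (an invented stand-in, not the real NI-DAQ API):

```python
from collections import deque

class MockDAQ:
    """Hypothetical stand-in for a DAQ device driver (illustration only)."""
    def __init__(self):
        self.buffer = deque()

    def configure(self, channels, sample_rate):
        # Expensive setup (channels, sample rate, gains) -- done ONCE.
        self.channels = channels
        self.sample_rate = sample_rate

    def acquire_in_background(self, n_samples):
        # The hardware fills the buffer while the program does other work.
        for _ in range(n_samples):
            self.buffer.append([0.0] * len(self.channels))

    def read(self, n):
        # Reading drains the buffer without touching the configuration.
        return [self.buffer.popleft() for _ in range(min(n, len(self.buffer)))]

daq = MockDAQ()
daq.configure(channels=["ai0", "ai1", "ai2"], sample_rate=10_000)  # once
daq.acquire_in_background(1000)

total = 0
while daq.buffer:
    total += len(daq.read(100))   # repeated reads, no re-initialisation
print(total)  # 1000
```

In classic LabVIEW DAQ terms, the configure step roughly corresponds to AI Config, the background acquisition to AI Start, and the repeated reads to AI Read inside the while loop.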

Similar Messages

  • Acquiring High resolution data from usb mouse movement

    Is there a way of acquiring high-resolution data from the movement of a USB mouse? I have an NI PCI-6221 DAQ card to use, and a Pentium 4 PC with 1 GB of RAM. I need to get the position, velocity and acceleration of the mouse.
    Is there a way to do it with the above hardware?
    Thanks in advance

    I don't see how you could use a PCI-6221 to get high resolution mouse movement measurements. The PCI-6221 can acquire voltages at up to 250kS/s, but what voltage would you measure? It could also read in digital data at 1MHz, but you would have to be able to understand the USB communication protocol your mouse uses, and I doubt that your mouse vendor will give out that information. You might be able to take your mouse apart and hook up leads to the sensors around the trackball and do a sort of quadrature encoder application, but there's no guarantee we're dealing with TTL digital signals here (you might even be using an optical mouse!).
    Your best option - and I have no idea how good this option is - is to use the driver already installed for your usb mouse. What software application are you going to use to program your application that will measure the mouse movements?
    If you would consider using LabVIEW, you could set up an event structure to capture mouse movements across the front panel. Each event registered includes a timestamp from the CPU's millisecond clock as well as the coordinates of the mouse. If you stored up a buffer of previous mouse positions and times, you could infer a velocity, perhaps with the help of interpolation functions.
    All of this would have somewhere on the level of millisecond timing accuracy. Were you envisioning something else?
    Jarrod S.
    National Instruments
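    The buffer-and-differentiate approach described above reduces to finite differences over the stored (timestamp, x, y) events. A minimal Python sketch of that arithmetic (the sample values are invented):

```python
# (t_ms, x, y) tuples, e.g. buffered from mouse-move events
samples = [(0, 0.0, 0.0), (10, 3.0, 4.0), (20, 9.0, 12.0)]

def velocities(samples):
    """Forward differences between consecutive events -> (vx, vy) in units/s."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = (t1 - t0) / 1000.0           # ms -> s
        out.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return out

v = velocities(samples)
print(v)  # [(300.0, 400.0), (600.0, 800.0)]
```

    Acceleration follows by differencing the velocity list the same way; with roughly millisecond timestamp resolution the estimates will be noisy, which is where the interpolation mentioned above can help.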

  • When I upload large streams of high frequency data the x axis scales incorrectly

    I have data at 5 kHz for 20 seconds. When I upload this data into the report tab, the x axis scales incorrectly and cuts off either the head or the tail of the data. The x axis always rounds to the nearest second, while I need it to start with the data. Any ideas?

    Hi wpruitt3,
    Take a look at this knowledge base article.
    Best,
    Kristen

  • VI for capturing data at high frequency

    Hi,
    I have been trying to build a VI that reads data from a machine at around 50 kHz. I am accepting one analog signal from this machine, converting it into digitized values and storing ALL of the digitized values in a file. I cannot afford to lose even one data point. I would really appreciate it if anyone has an example of recording data at such a high frequency.
    Thanks in advance
    Dhawal

    One channel running at 50 kHz is simple - in fact, in the world of DAQ it's not even very fast. Check out the examples for seamless data acquisition. Basically, the idea is to set up a circular buffer that the DAQ writes to and LV reads from. As long as LV extracts the data faster than the DAQ inserts it, the process can run forever.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
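    The circular-buffer idea is language-agnostic; this Python producer/consumer sketch (a bounded queue standing in for the DAQ's circular buffer) illustrates why no samples are lost as long as the reader keeps pace with the writer:

```python
import queue
import threading

buf = queue.Queue(maxsize=1000)      # stands in for the DAQ circular buffer
N_BLOCKS, BLOCK = 50, 100            # 5000 samples total
received = []

def producer():                      # the DAQ filling the buffer
    for i in range(N_BLOCKS):
        buf.put(list(range(i * BLOCK, (i + 1) * BLOCK)))
    buf.put(None)                    # end-of-acquisition marker

def consumer():                      # the LV read loop draining it
    while True:
        block = buf.get()
        if block is None:
            break
        received.extend(block)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
print(len(received))  # 5000, in order, with no gaps
```

    If the consumer falls behind, the bounded queue makes the producer block; on real hardware the equivalent failure is a buffer-overwrite error from the driver.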

  • High frequency Labview-Simulink Interface

    Hi,
    I am building a Simulink and LabVIEW model to work with each other using SIT. However, before I even begin, I would like to know whether this LabVIEW-Simulink interface will be able to run at a relatively high frequency - possibly around 80 Hz. Also, any tips or hints on how to make the program run faster, such as running Simulink and LabVIEW on different computers connected via LAN, would be highly appreciated. If it is of any help, I am using a P4 2.0 GHz computer with 256 MB of RAM.
    Thanks and have a nice day!

    Lan,
    A large part of the speed will depend on the size of the model. The CPU has to be divided between simulating the model and doing the updates, but since you are looking for high speed it is probably a simple model. I ran a very simple model and tested how often I could receive values from the NI Sink in the model as well as write to a parameter of the model. In this simple setup, I received updates from the NI Sink at a rate faster than 1000 Hz (less than 1 ms per update on average), and writing to the parameter happened a bit more slowly, about every 25 ms. Since this is Windows, these numbers are only averages and individual iterations will vary.
    If you want to provide the running model with a stimulus from an external input (like a DAQ card), then you will want to use a real-time system. This functionality is also part of SIT: you use Real-Time Workshop to build a DLL and example VIs for running the model on a LabVIEW Real-Time target. Building a DLL this way, you can still use the same user interface to view the parameters and values in the model, but the inports and outports of the model will be connected to hardware. In this type of simulation the rates can be significantly faster.
    Carl L
    National Instruments
    www.ni.com/ask

  • How do I use the High Speed Data Logger with multiple I/O devices?

    I am using the High Speed Data Logger vi to read from a 16 channel A/D card (NI PCI-MIO-16E). The project may require more than 16 channels. How can I use High Speed Data Logger to read from two A/D cards? Will it be able to write the data to one file?

    The High Speed Data Logger VI will not acquire from and write to multiple DAQ boards at the same time without modification. LabVIEW is more than capable of doing what you are trying to do, but you will have to modify the code.
    Regards,
    Anuj D.

  • How to send TTL output AND acquire AI voltage data using USB-6211

    Hello,
    I am relatively new to LabVIEW, so please bear with me. I have a research application involving 2 pressure transducers and a high-speed camera. I wish to acquire analog voltage data from the 2 pressure transducers. However, at the start of the acquisition, I need to send a single TTL output to trigger the camera. This TTL pulse must be sent out at exactly the same time that the AI acquisition begins, in order to ensure that my 2 pressure measurements and camera images are 'synchronized' in time.
    Is this possible on the USB-6211 running with LabVIEW 8.20? I currently have a fairly simple LabVIEW VI that uses a software trigger to start an AI acquisition - I have attached it in the hope that it may help anyone willing to assist me. I would prefer to simply add something to it so that it outputs a TTL pulse at the start of the acquisition.
    Thank you in advance.
    Regards, Larry
    Message Edited by Larry_6211 on 12-19-2008 11:24 AM
    Attachments:
    USB6211_v1.vi ‏212 KB

    Hi All,
    I'd like to clear a few things up. First, you'll find that if you try to set the delay from AI start trigger and the delay from AI sample clock to 0, you'll get an error. Due to hardware synchronization and delays, the minimum you can set is two. Note that when I say two, I am referring to two ticks of the AI sample clock timebase, which for most acquisitions is the 20 MHz timebase. I modified a shipping example so you can play around with those delays if you want to; I find that exporting the signals and looking at them with a scope helps me visualize what is going on. The manual has some good timing diagrams as well, but it looks like you've already hit that. The defaults would give you a delay of 250 ns from the start trigger - is this too high for your situation? What is an acceptable delay? I tend to think that "exactly the same time" is a measure of how precise rather than an absolute (think of delays due to cable length making a difference).
    With all that in mind, I see a few options:
    Start your camera off of the AI start trigger (an internal signal) and just know it is 250 ns before your first convert. 
    Export the convert clock to use as a trigger. This assumes your camera can ignore the next set of convert clocks.
    More complicated option: internally you have an AI start trigger, a sample clock and a convert clock. From your start trigger to the first convert is 250 ns, but if you export your convert clock you're going to get future convert clocks as well. One option would be to generate a single triggered pulse using a counter (start with the Gen Dig Pulse-Dig Start.vi example) with the AI start trigger as the trigger for the counter, an initial delay of 250 ns, and a high time of whatever you want it to be. This should give you a single pulse at very close to the same time (on the order of path delays) as your first convert clock.
    Hope this helps, 
    Andrew S
    MIO DAQ Product Support Engineer
    Getting Started with NI-DAQmx
    Measurement Fundamentals
    Attachments:
    Acq&Graph Voltage-Int Clk.vi ‏37 KB
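    The tick-to-time arithmetic above is easy to check. Assuming the 20 MHz AI sample clock timebase mentioned in the reply, one tick is 50 ns:

```python
TIMEBASE_HZ = 20_000_000             # 20 MHz AI sample clock timebase

def ticks_to_ns(ticks):
    """Convert timebase ticks to nanoseconds."""
    return ticks * 1e9 / TIMEBASE_HZ

print(ticks_to_ns(2))  # 100.0 -> the two-tick minimum delay
print(ticks_to_ns(5))  # 250.0 -> five ticks, consistent with the ~250 ns default
```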

  • Best method for collecting low frequency data

    Hello everyone,
    I'm looking for suggestions on the best way to collect relatively low-frequency data (about 1 Hz). I know there are a few different ways to do so in LabVIEW, such as the DAQ Assistant, or using DAQmx and making your own virtual channel. Also, there is an abundance of different settings to choose from. I'm using an NI 9215 DAQ card and am collecting voltages. I would be interested to hear any opinions on a method for doing this, and maybe the settings you would use.
    The reason I'm asking is because I'm just using the DAQ assistant but I'm really not sure if that's what I want to be using. I feel like there is a better way.
    Thank you all!

    winterfresh11 wrote:
    Is this different from triggering? Because this particular DAQ card can't be triggered.
    There is a big difference between triggering and sample clock.  The trigger tells the DAQ to start acquiring data.  The sample clock tells the DAQ when to take a sample.  You trigger once per acquisition.  The sample clock just keeps on going until the acquisition is complete (either aborted or desired number of samples is acquired).
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Problems with displaying high frequency sine wave

    Hi all,
    I don't know why LabVIEW has a problem displaying high-frequency signals.
    Can someone please explain how to get a better trace on the graph?
    Check the waveform generation.vi function in the examples (go to the search and type "waveform") and set the frequency to 1 kHz.
    Thanks and regards

    I think this is less a LabVIEW issue than a consequence of the Shannon sampling theorem: if you want to describe a periodic function, you need to take enough samples. First, expand the x scale to see a delta t of 0.001 s (i.e. one period of your 1000 Hz signal). Remember to set the autoscale property to OFF. Then go to "sampling info" and increase the sampling frequency. See the result...
    CC
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        
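    The sampling-theorem point is easy to demonstrate numerically: sample a 1 kHz sine at a comfortable rate and at an insufficient one, then estimate the frequency actually present in the samples (a crude zero-crossing estimate, for illustration only):

```python
import math

def apparent_freq(f, fs, n=1500):
    """Frequency apparent in n samples of a sine at f Hz sampled at fs S/s,
    estimated from positive-going zero crossings."""
    x = [math.sin(2 * math.pi * f * i / fs) for i in range(n)]
    crossings = sum(1 for a, b in zip(x, x[1:]) if a < 0 <= b)
    return crossings * fs / n

print(round(apparent_freq(1000, 10_000)))  # 993 -> well sampled, reads ~1 kHz
print(apparent_freq(1000, 1_500))          # 500.0 -> undersampled, aliased
```

    At 1500 S/s the 1000 Hz tone violates the Nyquist criterion and shows up at the alias frequency fs - f = 500 Hz, which is exactly the kind of misleading graph described above.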

  • Help Counting high frequency voltage spikes

    Equipment:  NI USB-6229 (250 kS/s analog input, 16-bit ADC, 32-bit counters, internal clocks up to 80 MHz)
                LabVIEW 14
    Problem:
    I have an experimental application where I need to count voltage spikes (an integer count) caused by electrons hitting a sensor. These spikes can be as frequent as 500,000 counts/s. The spikes will not be at the same voltage every time, but they will be visible above the noise, so I need to allow the user to select a threshold voltage that triggers a real count rather than counting noise spikes.
    Attempts:
    To count at such a high frequency, I deduce that I need to use a counter input to read fast enough. HOWEVER, I wasn't able to find a way to set a threshold voltage for a counter input, because I believe counters expect a TTL signal, which I won't have. To set the threshold, I realize that analog input reads can be triggered at a selected level, which is great, but the analog input sampling rate is only 250 kS/s, which won't catch every count in my project.
    I have a program that uses the Count Edges channel, and it is accurate to within 3% of the expected number of counts. I was testing it with a function generator, and the program doesn't count unless the signal's voltage is above 2-3 V, which won't work for my application. I will post what I have. Does anybody know of a way to trigger only at selected voltage levels using counters, or of a way to filter through the noise to get real spikes?
    Thanks!
    Solved!
    Go to Solution.
    Attachments:
    ElectronCountsTest.vi ‏29 KB

    Thanks for the reply, johnsold. I didn't think to use a comparator, but it is good to know that I have that option. I was, and still am, hoping for some kind of trick to do this programmatically. One other idea I was playing with is offsetting two or three different reads of the same signal to sample it at different times. If that is possible, it might double or triple my sample rate to 500 or 750 kS/s. Anyone else have any ideas on this?
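    For rates the analog input can actually sustain, a software threshold detector with hysteresis (so noise riding on a spike is not double-counted) could be sketched like this; as discussed above, at 500,000 counts/s this post-processing approach will miss spikes on a 250 kS/s board, so treat it as an illustration only:

```python
def count_spikes(samples, threshold, rearm=None):
    """Count rising crossings of `threshold`. A lower `rearm` level adds
    hysteresis: after a count, the detector re-arms only once the signal
    falls below `rearm`, so jitter near the threshold is not re-counted."""
    if rearm is None:
        rearm = threshold
    count, armed = 0, True
    for v in samples:
        if armed and v >= threshold:
            count += 1
            armed = False
        elif v < rearm:
            armed = True
    return count

sig = [0.0, 0.1, 1.2, 1.1, 0.2, 0.05, 1.5, 1.4, 1.6, 0.1]
print(count_spikes(sig, threshold=1.0, rearm=0.5))  # 2
```

    The external-comparator suggestion does the same thing in hardware: the comparator turns "above the user's threshold" into a clean TTL edge that the counter can count at full speed.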

  • High frequency power measurements

    Hey,
    I'd like to know if there have been developments in measuring high-frequency electrical signals with LabVIEW without using an external power analyzer. At the moment I'm using a Yokogawa power analyzer, but I'd like to know if it's possible to log HF signals without one... Are there NI products on the market for this purpose?
    Thx,
    Andy

    Hello,
    For frequencies up to 200 kHz, NI can provide several solutions using 'standard' data-acquisition boards or digitizers (scopes), from low- to very-high-accuracy solutions.
    A good solution is one of the high-speed M-series boards (PCI-625x). These boards have up to 32 multiplexed channels with a resolution of 16 bits at a speed of 1 MS/s (500 kHz).
    A better solution would be an S-series board (product numbers PCI-61xx). These data-acquisition boards sample all input channels simultaneously. There are boards with 2, 4 and 8 channels and sample frequencies of 10 MS/s (up to 5 MHz if needed).
    The best solution is to use a digitizer (scope). Here too there are many possibilities, from lower to higher bandwidth and resolution.
    The most flexible is the PXI-5922, a 24-bit digitizer when sampling at a maximum of 500 kS/s. This board only exists on the PXI platform.
    NI also has 8-bit digitizers (the normal resolution for scopes) with 15 to 125 MHz bandwidth. If you need higher resolution, there are solutions up to 14 bits (very high for scopes) at 100 MHz.
    Please give your local NI Office a phone call.
    They have technical engineers who can discuss your needs and provide you a solution.
    Best regards,
    Joeri
    National Instruments
    Applications Engineering
    http://www.ni.com/ask
    Make our forums great:
    If you like the answer, don't forget to "Kudos!".
    "Accept the Solution" if your question is answered!

  • High frequency

    Hi everyone, I'm pretty new to LabVIEW and I need some help. I would like to simulate a square wave with a high frequency (40 MHz).
    1. Can anyone tell me how to display the signal (test2.vi) on a moving (time) axis?
    2. I would like to capture the data and write it to a .txt or Excel file. I found this example (0807-LVM_Beispiel.vi). Can I get the data like this from my 40 MHz frequency generator?
    A little help would be nice. Thanks.
    Attachments:
    test.jpg ‏525 KB
    test2.vi ‏29 KB
    0807-LVM_Beispiel.vi ‏88 KB

    Thanks for your help.
    I've been searching, and I've now changed my test2.vi into test CIC.vi and modified it. As you may see, I'm trying to simulate a 40 MHz signal with a sampling frequency of 800 MHz and a record length of 1000 samples, or at most ~300,000 samples (and save it to a .txt file). Actually, my task is to write a LabVIEW program that can take as many points as possible, possibly non-periodic. We are going to use an NI card (in the near future; not bought yet) to take input data and capture all the samples over a high frequency range. Is it possible to write such a program (to capture the data) without knowing in advance which NI card we are going to use? I believe we are going to buy a digitizer, digital I/O, and perhaps an MXI controller (I've looked at the offered device list) from NI. Is it much easier after buying the devices (e.g. getting the device's driver and maybe a program that supports it)? Thanks a bunch.
    One other question: in my test CIC.vi I'm saving my points to a .txt file (only the amplitude). What is the trick to also save the x-axis (time) points in the same .txt file?
    Attachments:
    test CIC.vi ‏49 KB
    CIC 1 sub.vi ‏21 KB
    Save 1Data.vi ‏18 KB

  • Bandpass filter amplifies only noise at high frequencies and saturates the opamp output

    I have designed a fourth-order bandpass filter using TL081C op-amps. It is designed to operate at a center frequency of 40 kHz with a bandwidth of 2 kHz and a gain of -12.5. I am using an SCB-68 board for data transmission and reception, one channel each for the transmitter and receiver. I have grounded the channels adjacent to both the transmitter and the receiver in order to avoid crosstalk. However, at the output I get a signal that looks like an amplitude-modulated signal and is, most probably, noise. The amplitude of this signal is always around 9 V, no matter what the amplitude of the input is (even if the input is 0 V, the result is the same). If I scale the whole BPF down to operate at a center frequency of around 20 kHz, everything works fine. Therefore, I think the problem lies not in the design of the BPF but in the use of the board at higher frequencies. Please reply as soon as possible.

    You have done something wrong. You should read the manuals for both the SCB-68 and your DAQ card once more. In such cases I often use MAX and a battery as a source. When a DAQ input is not connected (floating), you will see measurements like the ones you are getting now: some high-level voltage, often modulated with the mains frequency.
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)

  • NI Mydaq gain loss at Higher frequencies

    Hi There,
    I am using a myDAQ for my student project, as an audio sine wave generator and analysis tool.
    I am sampling well above the Nyquist rate of my highest frequency (20 kHz), at a 100 kS/s sample rate on both the output DAC and input ADC, and I have buffers of the same size on the input and output.
    I am using the audio L/R inputs and outputs (AC coupled); these are looped back for the purpose of this exercise.
    When I generate frequencies (via the sine wave VI in LabVIEW, "NI_MABase.lvlib: sine waveform.vi", with both Fs and #s set to 10000), I get a drop in level above approximately 10 kHz, starting at about -1 dB and reaching -3 dB at 20 kHz.
    Can anyone help me track down the cause of this?
    Regards

    Hi,
    Just for my own sanity, have I got these values right?
    - Sampling at a frequency of 100kHz. (Analogue Input and Output)
    - Intending of sampling a 20kHz signal.
    The specifications of the NI myDAQ can be found here. They state that the device can sample at up to 200 kS/s on both analogue input and analogue output, meaning the highest clean sinusoid we can both generate and acquire should be 20 kHz, with 10 samples representing each period (20 kHz * 10 S = 200 kS/s). At higher frequencies there would be fewer samples per period, the signals would look more chopped, and aliasing would occur. To measure the 20 kHz wave appropriately, we need to choose a sample frequency up to the 200 kHz maximum for clean waves. With this in mind, a 100 kHz sample rate should be okay.
    Is it possible for me to check the code you have written so far? What I imagine could be happening is that we're seeing these issues due to the slowness of software. If the DAQ VIs in your code continuously have to check what values to write, rather than having these values delegated down to the hardware layer before the task runs, this could be the source of the issue. In software there is a maximum update rate of 1 kHz; this is why all of the base clocks in LabVIEW software are 1 kHz clocks. If we constantly have to poll what output values we want to write, this will affect the consistency of the generated signals.
    Kind Regards,
    Alex Thomas, University of Manchester School of EEE LabVIEW Ambassador (CLAD)
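    For reference, the -1 dB and -3 dB figures above convert to amplitude ratios via gain_dB = 20 * log10(Vout / Vin), so a quick check of what those levels mean:

```python
import math

def gain_db(v_out, v_in):
    """Amplitude gain in decibels."""
    return 20 * math.log10(v_out / v_in)

def ratio_from_db(db):
    """Amplitude ratio corresponding to a dB figure."""
    return 10 ** (db / 20)

print(round(ratio_from_db(-1.0), 3))  # 0.891 -> ~11% amplitude loss near 10 kHz
print(round(ratio_from_db(-3.0), 3))  # 0.708 -> ~29% amplitude loss at 20 kHz
```

    A smooth roll-off reaching -3 dB at the top of the band is also the classic signature of an analog anti-aliasing or AC-coupling filter, which is worth ruling out alongside the software-timing explanation.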

  • FYI - High precision data issue with sdo_anyinteract on 11.2.0.3 with 8307

    For anyone that may happen to be experiencing issues with SDO_ANYINTERACT on 11.2.0.3 with high-precision geodetic data, I currently have a service request in to fix it.
    The issue we have is with locating small polygons ("circles" from 0.5"-4") in the 8307 space. The metadata we have specifies a 1 mm tolerance for this data, which has worked fine since (as I remember) 10.1. Support verified it works fine up to 11.2.0.2, and that it is broken in 11.2.0.3.
    So if you are pulling your hair out - stop. ;-) The SR# is 3-5737847631, and the bug# (will be) 14107534.
    Bryan

    Here is the resolution to this issue...
    Oracle came back and said that what we have at that tolerance is unsupported and we were just lucky it worked all these years. They are not going to fix anything because it technically isn't broken. We pointed out that the documentation is a little unclear about what exactly supports higher precision, and they noted that for future updates.
    When asked if they would entertain a feature request for a set of high-precision operators (basically the old code) in a future release, they basically said no. So for the few items where we must have higher precision, we are on our own.
    What still makes us puzzled is that apparently no one else is using high-precision data in lat/lon. Amazing, but I guess true.
    Anyhow, here is what we used to use (up to 11.2.0.3), which worked fine at a 1 mm tolerance:
    Where mask_geom is:
    mask_geom      :=
             sdo_geometry (2001,
                           8307,
                           sdo_point_type (x_in, y_in, 0),
                           NULL,
                           NULL);
    SELECT copathn_id
      INTO cpn
      FROM c_path_node a
    WHERE     sdo_anyinteract (a.geometry_a2, mask_geom) = 'TRUE'
           AND node_typ_d = 'IN_DUCT'
           AND ROWNUM < 2;
    Basically this finds indexed geometry and compares it to a single mask geometry (a simple point for the given x/y). Only one row is returned (in case duct openings overlapped - not normal).
    Since this no longer returns any rows reliably for items less than 5cm in size, here is our work-around code:
    SELECT copathn_id
      INTO cpn
      FROM (  SELECT copathn_id,
                     node_typ_d,
                       ABS (ABS (x_in) - ABS (sdo_util_plus.get_mbr_center (a.geometry_a2).sdo_point.x))
                     + ABS (ABS (y_in) - ABS (sdo_util_plus.get_mbr_center (a.geometry_a2).sdo_point.y))
                        distdiff
                FROM c_path_node a
               WHERE sdo_nn (a.geometry_a2,
                             mask_geom,
                             'distance=0.05 unit=m') = 'TRUE'
            ORDER BY distdiff)
    WHERE node_typ_d = 'IN_DUCT'
       AND ROWNUM < 2;
    Essentially we use sdo_nn to return all results (distance usually is 0) at the 5 cm level. At first we thought just this would work; then we found that in many cases it would return multiple results all stating a distance of 0 (not true).
    For those results we then use our own get_mbr_center function that returns the center point for each geometry, and basically compute a delta from the given x_in,y_in and that geometry.
    Then we order the results by that delta.
    The outer select then makes sure the row is of the correct type, and that we only get one result.
    This works, and is fast (actually it is quicker than the original code).
    Bryan
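    The tie-break in the work-around (order the sdo_nn candidates by the Manhattan delta of their MBR centers from the query point) can be mirrored outside SQL. A Python sketch of the same distdiff logic, using hypothetical candidate rows:

```python
def pick_nearest(candidates, x_in, y_in):
    """Pick the candidate whose MBR center has the smallest Manhattan
    delta from (x_in, y_in), mirroring the distdiff expression above."""
    def delta(row):
        _id, cx, cy = row
        return abs(abs(x_in) - abs(cx)) + abs(abs(y_in) - abs(cy))
    return min(candidates, key=delta)[0]

# (id, mbr_center_x, mbr_center_y) rows, as sdo_nn might return them
rows = [("A", -71.0601, 42.3581), ("B", -71.0604, 42.3582)]
print(pick_nearest(rows, -71.0601, 42.3581))  # A
```

    Note that, like the SQL, this compares absolute values of the coordinates, so it assumes all candidates lie in the same quadrant (true for a 5 cm neighborhood).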

    I noted the following iWeb page plays music when it opens: http://homepage.mac.com/kkirkster/maya_test/ How does one change the source code in iWeb before it is published so that such a change as he did works in opening a blog page? Dennis Power Mac