Scan rate using hardware timing.

How can I set the scan rate accurately to 1000 Hz using hardware timing?

Wire 1000.00 to the Scan Rate input of the AI Start VI. That will set up the scan clock on the DAQ board to run at 1000 Hz. Note that it will only work on NI DAQ boards.
Daniel L. Press
PrimeTest Corp.
www.primetest.com
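For readers working in a text-based API rather than LabVIEW, the same hardware-timed setup can be sketched with the nidaqmx Python package. This is a minimal sketch under assumptions: the nidaqmx package is installed and the device is named "Dev1" (the original answer concerns the LabVIEW AI Start VI, not this API).

```python
# Hedged sketch: assumes the nidaqmx Python package and an NI DAQ device
# named "Dev1"; the original answer wires 1000.00 to the AI Start VI instead.
SCAN_RATE_HZ = 1000.0               # hardware scan clock frequency
SCAN_PERIOD_S = 1.0 / SCAN_RATE_HZ  # 1 ms between scans, timed on the board

def acquire(samples_per_channel=1000):
    """Program the board's onboard sample clock at 1000 Hz and read one buffer."""
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        # cfg_samp_clk_timing programs the board's hardware clock, so the
        # 1000 Hz rate does not depend on the host OS's software timing.
        task.timing.cfg_samp_clk_timing(
            rate=SCAN_RATE_HZ,
            sample_mode=AcquisitionType.FINITE,
            samps_per_chan=samples_per_channel,
        )
        return task.read(number_of_samples_per_channel=samples_per_channel)
```

As in the LabVIEW case, this only works on NI DAQ hardware, because the rate is clocked on the board itself.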

Similar Messages

  • Is there a way to output a buffered pattern on port 0 and input on ports 1,2, and 3 of the 6534 card while using an external clock to vary scan rate using LabVIEW?

    I am using a 6713 board to clock my 6534 board and want to continuously output a pattern from the buffer on port 0 and read from ports 1, 2, and 3 at the same time and clock speed.

    Hello;
    Unfortunately, you can't create a 24-bit group by bundling 3 ports in the same group. Valid group sizes are 8 bits, 16 bits, and 32 bits only. To accomplish that task, you will need an extra 6534 board; configure that board to do a 32-bit pattern input acquisition and discard the data from one of the four ports.
    You can definitely synchronize both 6534s with the 6713 through the RTSI bus. To synchronize all the boards together, just route the AO Update clock from the 6713 board to one of the RTSI lines, then use two more instances of Route.vi to route from that same RTSI line to the PCLK lines of the two 6534s.
    Hope this helps.
    Filipe A.
    Applications Engineer
    National Instruments

  • PXIe-4141 hardware timed 4- channel sweep

    In the attached example of FET/BJT characterisation using a hardware-timed two-channel sweep, for each gate voltage the drain values are swept.
    If I understand the trigger settings correctly:
    For each drain value, V is applied and I is measured. So if there are 2 Vg values and 3 Vd values, the PXI fetches 6 Vd and Id values and 2 Vg and Ig values.
    To put it more bluntly,
    seq start
     Vg = 3V Ig = .....
    seq start
    Vd = 0.5,Id = ....(Ig = ?)
    Vd = 1,Id = ....(Ig = ?)
    Vd = 1.5 Id = ....(Ig = ?)
    seq complete
    Vg = 4V, Ig = ......
    seq start
    seq complete
    What if we wish to measure Ig for all iterations of Vd as well? My guess is to configure the gate channel's 'MeasureWhen' trigger to the drain channel's 'SourceComplete' event. Is that doable/feasible in sequence source mode, or should one apply each value in single-point source mode? Ultimately I wish to iterate 5 values on each of 4 cascaded channels.

    Hello,
    Last time I asked for your code, but I think you need more information about the hardware configuration of your PXI. That is why I found some material in the help file of the PXIe-4141, which is given below.
    I hope this helps!
    Once the sequence runs Sequence Loop Count times, the operation is complete, and NI-DCPower generates the Sequence Engine Done event, as shown in the following figures.
    The following figure illustrates a sequence when the niDCPower Measure When property is set to Automatically After Source Complete or the NIDCPOWER_ATTR_MEASURE_WHEN attribute is set to NIDCPOWER_VAL_AUTOMATICALLY_AFTER_SOURCE_COMPLETE.
    You can find the figure in the attached screenshot.
    Regards,
    Hossein
    Attachments:
    Help PXIe-4141.png ‏250 KB

  • Unable to measure frequency below 20 Hz on a NIDAQ 9178 chassis with NI 9402 even while using a hardware timed delay

    Hello,
    I am trying to measure frequency using NI 9402 in NI cDAQ9178 chassis. I am setting the clock for my counter channel to be my chassis ai Sample Clock.
    I am able to measure frequency above 20 Hz. For frequencies less than 20Hz, I get the following error:
    DAQmx Error: Multiple Sample Clock pulses were detected within one period of the input signal. Use a Sample Clock rate that is slower than the input signal. If you are using an external Sample Clock, ensure that the clock signal is within the jitter and voltage level specifications and without glitches. Task Name: _unnamedTask<0>
    Status Code: -201314
    Setting the Rate to 1 also does not resolve the issue.
    OTHER DETAILS:
    * Running on 64 bit, Win7 platform.
    * NIDAQmx Driver Version: 14.5
    I had posted regarding this earlier and I was told that this might be because the counter is armed immediately before the first sample is taken. The recommendation was to add a hardware-timed delay using the DAQmxSetStartTrigDelay method to the AI task. I have added this delay but I still receive the same error message. The previous post I had mentioned can be found below:
    http://forums.ni.com/t5/Multifunction-DAQ/Cannot-measure-frequency-below-20-Hz-on-a-NIDAQ-9178-chassis/td-p/1537274
    I have also attached my current code which has the delay. Is this a bug in the driver? If yes, can we have a CAR# to track this?
    Thanks.
    Regards,
    Varun Hariharan
    The MathWorks

    Alright so I got everything working correctly in both C and LabVIEW code.
    The problem is in fact with the first sample, as John_P1 suggested. You simply need to delay that first sample from being requested. It is simpler to do this in software instead of hardware.
    If you are using CVI, just add #include <utility.h> to the top of your .c file and then add a delay before your DAQmxErrChk (DAQmxStartTask(AItaskHandle)); line.
    Comment out / remove the DAQmxSetStartTrigDelay(AItaskHandle,10); line, as it wasn't doing what we thought it would (a hardware delay).
    I added Delay (.05); to delay long enough for the full period of the input signal at 20 Hz.
    Depending on your frequency, this value may need to be adjusted. 100ms wouldn't be a bad idea.
    This is expected behavior, and I don't think we need a CAR.
    Let me know what you think!
    -CB
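The delay sizing described above can be expressed in a few lines. This is a sketch of the sizing rule only; the 2x margin reflects CB's suggestion that 100 ms "wouldn't be a bad idea" at 20 Hz.

```python
# Size the software delay from the lowest frequency you expect to measure:
# wait at least one full period of the input signal before starting the AI task.
def startup_delay_s(min_freq_hz, margin=2.0):
    """Delay (seconds) to insert before DAQmxStartTask for a given minimum frequency."""
    return margin * (1.0 / min_freq_hz)

# At 20 Hz, one period is 50 ms; with a 2x margin the delay is 100 ms.
print(startup_delay_s(20.0))  # → 0.1
```

For a different minimum frequency, the same rule applies: one full input period, plus headroom.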

  • Using cDAQ timers to repeat a hardware-timed finite-sample pulse train clock

    Hi all,
    As part of a complex project, I must implement an acquisition hardware interface for a linear motion sensor using the Synchronous Serial Interface (SSI) output common in industrial motion control. (I have a fair bit of experience in digital electronics, but I'm new to hardware timer-synchronized digital I/O in LabVIEW.)
    To do this, I need to create a hardware-timed bursted pulse train TTL clock signal. Each burst consists of 25 high-low transitions with a full-cycle period of 2.67 microseconds (375 kHz). The output must then be held high until the next burst; bursts occur at 1 ms intervals.
    Using cDAQ timers and a NI 9401 (based on the example at http://www.ni.com/example/30256/en/), I've been able to create the pulse train burst as described (see attached VI image). Next I need to configure another timer to trigger this burst to repeat at 1ms intervals.
    Does anyone have any pointers about the best way to accomplish the hardware timing for repetition of the pulse train?
    Any suggestions of alternative strategies or observations as to the ways my noobish code is stupid or inefficient are welcome as well!
    Thanks!
    Attachments:
    Hardware Timed Pulse Train Clock.jpg ‏102 KB

    Hey Ryan, 
    A picture of the behavior you are seeing would be helpful. The NI 9411 should only be reading 50 bits every 1 ms.  
    It may not be possible to read 25 bits (do you mean 50 bits? 25 high 25 low) and push it to a queue without encountering an overflow error. If you take a look at the above code the digital input will receive 50 sample clocks every 1 ms. This is equivalent to acquiring 50 points every 1 ms which is an acquisition speed of 50 samples/1 ms=50 kHz. The read loop must keep up with the 50 kHz rate otherwise the buffer will overflow. In the above example I’ve set the read to pull 5000 samples (x) with the assumption that the loop will take less than or equal to .1 seconds (y). This yields a software acquisition speed of 50 kHz (5000 samples/100 ms). If the loop speed is faster than 100ms then the 10 seconds timeout on the DAQmx read will allow for the read to block so the FIFO may be filled.
    The options available for question 2 are below. They may be used separately or in conjunction.
    1. Move the DAQmx Read for the NI 9411 to its own independent while loop, set the DAQmx Read to acquire 50 samples, do not graph the data, and pass the data to a queue for processing in a consumer loop. This will increase the loop speed, which may allow you to keep up with the 50 kHz acquisition speed. This may not work because the loop speed will need to be 1 ms or less.
    2. Increase the value of the Samples per Channel control that goes into the DAQmx Timing VI. This will increase the DAQmx software buffer size. This buys time until you receive an overflow error because the DAQmx software buffer is being filled faster than samples are removed.
    3. Read in 5000-sample chunks (producer loop), push them to the queue, and perform the analysis in 50-bit chunks (consumer loop). The additional queue created should allow the acquisition loop to keep up.
    Regards,
    Izzy O.
    Product Support Engineer
    ni.com/support
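The buffer arithmetic in the reply above can be sanity-checked with a few lines, using the numbers from the post:

```python
# 50 sample clocks arrive every 1 ms, so the acquisition rate is about 50 kHz.
samples_per_burst = 50
burst_interval_s = 1e-3
acq_rate_hz = samples_per_burst / burst_interval_s   # ≈ 50 kHz

# Reading 5000 samples per DAQmx Read call means the read loop must finish
# within 5000 / 50000 = 0.1 s to keep the software buffer from overflowing.
samples_per_read = 5000
max_loop_period_s = samples_per_read / acq_rate_hz   # ≈ 0.1 s
print(acq_rate_hz, max_loop_period_s)
```

The same arithmetic sizes option 2: a larger Samples per Channel value only buys extra time proportional to (buffer size) / (acquisition rate).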

  • How to increase scan rate of NI Switch SCXI 1130

    Hi,
    I have an NI PCI-4070 DMM used with an NI SCXI-1130 switch module. I have connected 10 thermocouples to the 1130 module. I am scanning the channels and reading the values in the program using niSwitch and niDMM VIs. I am using a software trigger in the program: I have configured Software Trigger in niDMM Configure Trigger and niDMM Configure Multipoint. I get the correct values when I scan using chi->com0, where i goes from 0 to 9. But the problem is that the rate of scanning is very slow.
    There is niSwitch Configure Scan Rate.vi; here I have given the scan delay as 0 seconds.
    It takes one second per channel when I run the program. Why is this? Is it because I used a software trigger for each channel scan? How can I improve the scan rate?

    Sorry for the confusion, I started writing a post and got interrupted and came back to it too late.  You can disregard the last post and here is my final answer:
    I would actually recommend that you use synchronous scanning if you want to maximize the speed of your scan, rather than using software triggers.  If you use synchronous scanning, the DMM will generate a digital pulse (Measurement Complete) each time it completes a measurement, allowing the switch to advance to the next entry in the scan list the instant the DMM has completed its measurement.  The DMM will then take the next measurement after a specified hardware-timed interval.  This will be much more efficient than sending software triggers back and forth to time the scanning.  To set up your application using synchronous scanning, follow these steps:
    1. Open the LabVIEW shipping example "niSwitch DMM Switch Synchronous Scanning.vi", found in the NI Example Finder in the folder Hardware Input and Output » Modular Instruments » NI-SWITCH (Switches).
    2. Physically connect the Measurement Complete output trigger from the DMM to the trigger input of the switch.  How you do this depends on what type of chassis you are using (PXI/SCXI combo chassis or separate chassis) and what switch terminal block you're using.  If you need assistance with this, please provide more details about your hardware setup and I'd be happy to help out.  The following resource may be helpful here: KnowledgeBase 3V07KP2W: Switch/DMM Hardware Configurations.
    3. Select valid values for all other front panel controls and run the VI.
    I hope this is helpful.  Please let me know if I have misunderstood your application, or if you would like me to go into more detail on any specific part of the solution provided above. 

  • Need urgent help with HSDIO hardware timing

    Hi everyone,
    I need urgent help regarding HSDIO hardware timing. I've been working on a project generating a serial ramp using an HSDIO PXIe device.
    I'm using a 40 MHz clock rate and generating 14 boolean bits for each step of the ramp, and I have to generate a simple 256-step ramp.
    That means 256 (steps) × 14 (boolean values) × 25 ns (period of one boolean value) = 89.6 µs.
    What I'm doing right now is using the index of a FOR loop as my input data (converting the index into a 14-bit boolean array), then writing to the PXIe device on every iteration,
    which means my data reaches the output only about every 1 ms, right? (I'm using Windows.)
    And I want to be able to generate faster than that.
    How can I prewrite my 256-step ramp, then write it all at once to the PXIe device? I'm really stuck here.
    In the picture you can see how I write to the device in every iteration of the FOR loop.
    Regards,
    Yan.
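A quick check of the timing arithmetic in the question (a sketch; 25 ns is one period of the 40 MHz sample clock):

```python
steps = 256            # ramp steps
bits_per_step = 14     # boolean values written per step
clock_hz = 40e6        # HSDIO sample clock rate
bit_period_s = 1.0 / clock_hz                     # 25 ns per boolean value
ramp_duration_s = steps * bits_per_step * bit_period_s
print(ramp_duration_s * 1e6, "µs")                # ≈ 89.6 µs, i.e. microseconds
```

So the full prewritten ramp occupies about 89.6 µs of hardware-timed generation, far shorter than the roughly 1 ms per software-timed write from the FOR loop.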

    hi, thanks for responding.
    Using the dynamic generation with script example, I managed to generate the ramp with a controllable delay (generating the whole waveform, including the delay, with a script command, then writing it to the card).
    But I still have one question: I tested the output of the generation with an oscilloscope and can't see the start delay (I'm writing the delay at the start, before generating the ramp). My signal is generated at 0 s.
    How can I check this start delay? Is there any good example delivered with LabVIEW to check this generation? Somehow I can't use the "dynamic generation and acquisition" example to see my generation (I can't figure out how to capture the generated signal).
    regards,
    Yan.

  • How to set the scan rate in the example "NI435x.vi" for the 435x DAQ device?

    I am using LabVIEW 6i and a 4351 Temperature/Voltage DAQ, and am working from the example VI "NI 435x thermocouple.vi".
    How do I set the scan rate in this VI? I added a counter just to see the time between acquisitions, and it is roughly 4 s for only 4 channels. The Notch Filter is set at 60, but is there a control for the scan rate in this example? I'd like to acquire at around 1 Hz.

    Using this DAQ example, it may be easiest to simply place a wait function in the loop and software-time when the data is extracted from the board. You can also take a look at the NI-435x palette and the examples it ships with, along with the timing specs on page 3-3 of the 435x manual, and set the filters to meet your needs:
    http://www.ni.com/pdf/manuals/321566c.pdf
    Regards,
    Chris

  • Keithley 2701 LV driver: Setting the scan rate?

    I downloaded the Keithley 2701 LV driver and the example programs work great. There is just one question that I have. I cannot find anywhere in the examples or in the LV drivers how to change the scan rate or scan interval. Everything else I'm able to change, such as number of points, channels, voltage levels, etc., but I can't figure out how to change my scan rate to once every 0.5 seconds or 1 second.
    I'm not sure, maybe this is a question for Keithley but since it is a LV driver I decided to post here first.
    I'm using LV8.2
    Jeff

    I am using the Keithley 2700 with the 7700 cards.  There is a VI named "ke27xx Configure Aperture Timing.vi" in "ke27xx.llb".
    The VI sets the integration time for the A/D converter.  I was measuring 500 µV and needed to improve my accuracy, so I increased my integration time.
    Brian
    LV 8.2

  • Non-time based scan rate

    How do I acquire analog data with a scan rate based on a digital trigger and not based on time. I have two signals one analog and one digital. I want to acquire an analog data point every time the digital signal goes from high to low and from low to high.
    Thank you.

    I don't think LabVIEW DAQ hardware supports triggering on both edges. It appears you have to trigger on one or the other.
    The only thing that I can think of to try is to take your analog signal and split it into two different analog inputs. Use the example "Acq&Graph Voltage-Int Clk-Dig Start.vi". Insert it into a new VI twice. You will have to open it and modify the connector pane so that the Physical Channel and Trigger Edge are inputs and the graph is the output. Then wire each one with a different analog input and trigger edge and run them sequentially. Hopefully it will run fast enough for you.
    Randall Pursley

  • During slow scan rates, are events captured between scans?

    We are using 32 thermocouples to measure the performance of a heated cabinet and are using the Continuous Acquisition to Spreadsheet VI. I am saving the data to a text file while plotting the data to a chart in LabVIEW 6.1. I only wanted to read temperature every 60 seconds (0.01667 scans/sec), but the scan rate control rounds off to 2 decimal places, so at a 0.02 scan rate we read data every 50 seconds. I set the buffer size to 1 and the number of scans to write at a time to 1. Does this mean I have a lag of one scan in the buffer, or am I seeing the latest scan written to the chart? Also, temperature events between writes to the screen sometimes show up on the chart, sometimes not. It seems to depend on how long the event takes place or how close it is to the next write. I would appreciate any insight on how the software handles the events between writes. Thanks.

    Willard,
    You can actually change the digits of precision in LabVIEW to display more than 2 decimal places. If you right-click on the scan rate control and select 'Format and Precision' you can alter the digits of precision to include 5 decimal places.
    By setting the buffer size to 1 you are setting a memory size to hold only 1 scan at a time. By setting the number of scans to write at a time to 1, every time the AI Read VI is called within the while loop, the VI will only read 1 scan from the buffer. Therefore, the hardware is set up to read 1 scan every 50-60 seconds, and the hardware places that scan in the buffer. Then every time the AI Read VI is called in software, it reads 1 scan from the buffer. So, if the AI Read VI already read the 1 scan from the buffer, it will wait until the buffer gets the next scan from hardware (50-60 seconds) and then plot it on the chart. So, with the setup you have, you will not see any lag of temperature data. Once the hardware scans, the software will almost immediately read that scan. I hope this helps.
    Regards,
    Todd D.
    Applications Engineer
    National Instruments
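The rounding effect described in this thread is easy to verify numerically (a sketch using the numbers from the question):

```python
# A 60-second scan interval corresponds to a rate of 1/60 ≈ 0.01667 scans/sec.
desired_interval_s = 60.0
desired_rate = 1.0 / desired_interval_s

# A control limited to 2 decimal places rounds that rate up to 0.02 scans/sec,
# which shortens the actual interval between scans to about 50 seconds.
rate_rounded_2dp = round(desired_rate, 2)
actual_interval_s = 1.0 / rate_rounded_2dp
print(desired_rate, rate_rounded_2dp, actual_interval_s)
```

Increasing the control's digits of precision, as Todd suggests, lets the entered rate stay close enough to 0.01667 that the interval returns to roughly 60 seconds.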

  • Scan rate too slow for I/O buses

    Hello,
    Could somebody please explain what the attached message means and how I can overcome it
    Thanks in advance 

    Thank you for the answer. I apologize for not attaching the image the first time. My hardware is a cRIO-9014 with a 9104 chassis and AO, DO, DI, and AI modules, and I am using LabVIEW 8.6.1. The concept is to control a unit either by a touch panel (TPC 2106T) or by my PC. For this reason, I created a project containing the cRIO controller, the touch panel, and two different VIs (for now I leave the touch panel out of the game; I mentioned it just because it is included in the project), one under the cRIO controller and one on the PC. I also made a library containing shared variables to obtain communication between the cRIO and the PC. (What I am planning to do in the future is to set the RT VI as the startup VI, in order to run always, and control the unit either by the PC VI or by the touch panel VI.) What I want to do now is run the two VIs (RT VI and PC VI) simultaneously. For the RT VI I used a timed loop with "synchronized to scan engine" selected in the input node. For the PC VI I use a while loop with a time delay. When I try to deploy the RT VI, I receive the message sent in my previous post. The scan engine settings are: scan period 100 sec and network publishing period 100 ms. I hope that my explanation is clear now. Please let me know if you need further information.
    Thanks in advance

  • Scan rate exceed the limit?

    The program runs fine when we use 600000 Hz as the scan rate, but it gives an error when we use 700000 Hz as the scan rate, saying that the scan rate exceeds the hardware limit. However, the manual of the hardware (DAQ board) gives 1250000 Hz as the upper limit of the scan rate. What is the possible true reason for this error?

    When entering the scan rate, you have to take into account how many channels you are scanning to determine the sampling rate. For instance, when scanning 3 channels at 700000 Hz, you are scanning through all 3 channels at 700000 Hz and so you're trying to make the board sample at 700000*3 Hz (2100000 Hz). This exceeds the limit of the hardware.
    Even at 600000 Hz (sample rate of 600000*3 = 1800000 Hz) you are above the maximum sampling rate. 1250000 Hz is the specified maximum sampling rate, but sometimes you can acquire above the maximum rate. I hope this helps.
    Regards,
    Todd D.
    Applications Engineer
    National Instruments
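The channel-count arithmetic above can be sketched as:

```python
def aggregate_rate_hz(per_channel_rate_hz, num_channels):
    """Total sampling rate the board must sustain when scanning multiple channels."""
    return per_channel_rate_hz * num_channels

BOARD_MAX_HZ = 1_250_000   # upper limit from the DAQ board's manual

# Both entered scan rates exceed the board's aggregate limit on 3 channels:
for rate in (600_000, 700_000):
    total = aggregate_rate_hz(rate, 3)
    print(rate, total, total <= BOARD_MAX_HZ)
```

The per-channel scan rate you enter must therefore be at most the board's maximum divided by the number of channels scanned.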

  • Use hardware to generate a certain number of pulses using PXI-5652

    Is it possible to generate a certain number of periods (turn on the output for a predetermined amount of time using hardware with micro-second resolution) using the PXI-5652.

    Hello Matt,
    If you had a Vector Signal Generator, this would be more controllable using something similar to this forum.
    Otherwise, the only way to control time on the 565x would be to use the OOK modulation scheme. This will allow you to turn the signal on and off for short intervals.
    There is a shipped example called "565x Digital Modulation.vi" (on versions 1.3 and later of the RFSG driver) that can perform this task. Basically, it uses the function named "niRFSG Configure Digital Modulation User Defined Waveform.vi" to write the user bit sequence, and with property nodes you configure the OOK scheme and the symbol rate (refer to the specifications of your card for the possible values).
    Hope this helps,
    Gerardo O.
    RF Systems Engineering
    National Instruments

  • I want to play video on my computer to analyze its frames; I can't change the video frame rate using LabVIEW, but I can change it outside LabVIEW using another program

    Hi All,
    I want to play video on my computer to analyze its frames. The problem I face is that I can't change the video frame rate using LabVIEW, but I can change the frame rate of the video outside LabVIEW using another program.
    I used the IMAQ AVI Read Frame VI.
    For example, I have an AVI video whose frame rate is 25 fps; my image processing code is very fast and can process more than 25 fps, so I want to accelerate the video acquisition.

    Hi abdelhady,
    I looked into this further, and reading an AVI file into LabVIEW faster than its frames per second won't be possible. LabVIEW could read in frames faster than 25fps, but because it will be pulling the available frame at that point in time this would just give you duplicate frames. If you want to be able to read in frames at faster than 25fps, you would need to speed up your AVI file before reading into LabVIEW.
    There's a good shipping example to show how to read in from an AVI file, "Read AVI File.vi". You'll notice that they add timing to make sure that the while loop runs at the right speed to match up with the frames per second of the file being read. This is to make sure you're not reading duplicate frames.
    Thank you,
    Emily C
    Applications Engineer
    National Instruments
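The duplicate-frame effect Emily describes can be quantified with a small sketch:

```python
def duplicates_per_frame(read_rate_fps, file_fps):
    """How many times each stored frame is returned when polling faster than the file's rate."""
    return read_rate_fps / file_fps

# Polling at 100 fps from a 25 fps AVI just returns each frame 4 times over;
# the file cannot be read "faster" than its own frame rate this way.
print(duplicates_per_frame(100, 25))  # → 4.0
```

This is why the "Read AVI File.vi" shipping example paces its while loop to the file's frames per second: any faster and the reads only repeat frames.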
