Timed DAQ input

Hello. I'm trying to program my DIO-96 board using the NI-DAQ software in
C++. I need to perform data transfers at 1 kHz, where each transfer consists
of one write and one read cycle. Does anybody know how this is accomplished?

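As far as I know, the DIO-96 family only does static (software-timed) digital I/O, so a 1 kHz write/read cycle has to be paced by the program itself. Below is a minimal, untested sketch against the Traditional NI-DAQ C API; the device number, port assignments, and Sleep-based pacing are all assumptions, and a plain Windows loop will show some jitter around the 1 ms period:

    #include <windows.h>   // Sleep, timeBeginPeriod (link winmm.lib)
    #include "nidaq.h"     // Traditional NI-DAQ: DIG_Prt_Config, DIG_Out_Prt, DIG_In_Prt

    int main()
    {
        const i16 dev     = 1;   // assumed device number from MAX
        const i16 outPort = 0;   // assumed: port 0 configured as output
        const i16 inPort  = 1;   // assumed: port 1 configured as input
        i32 pattern  = 0;
        i32 readback = 0;

        DIG_Prt_Config(dev, outPort, 0, 1);  // mode 0 = no handshaking, dir 1 = output
        DIG_Prt_Config(dev, inPort,  0, 0);  // dir 0 = input

        timeBeginPeriod(1);                  // ask Windows for ~1 ms timer resolution
        for (int i = 0; i < 10000; ++i)
        {
            DIG_Out_Prt(dev, outPort, pattern++);  // one write...
            DIG_In_Prt(dev, inPort, &readback);    // ...then one read
            Sleep(1);                              // ~1 kHz pacing, with OS jitter
        }
        timeEndPeriod(1);
        return 0;
    }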

Similar Messages

  • Can you perform simultaneous timed digital input and output using a PCI-6120?

    Is it possible to do simultaneous timed digital input and output using a PCI-6120? It seems that timed digital operations require using the group read/write commands, which utilize an entire port. Since the PCI-6120 has only 1 digital I/O port, I would like to be able to use 2 lines as outputs and 1 line as input, and do both in a deterministic fashion. i.e. create a digital signal with known pulse widths and read an input line at a known time after the outputs were set. Is this possible to do with only one digital I/O port?

    Hello,
    This can be done in LabVIEW. There is actually an example that installs with NI-DAQ.
    Below is a link to a Knowledge Base that explains how to find the correlated digital I/O examples.
    http://digital.ni.com/public.nsf/websearch/B849664604EB34B886256D12005B5520?OpenDocument
    Just take a look at the example titled "Continuous CDIO with external clock (E).vi".
    Best regards,
    Justin Tipton
    National Instruments
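    For reference, the same clocked ("correlated") digital read looks roughly like this in the modern DAQmx C API. This is a sketch only; "Dev1" and the PFI terminal are assumed names, and I haven't run it against a 6120:

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle di = 0;
            uInt32 data[1000];
            int32 read = 0;

            DAQmxCreateTask("", &di);
            DAQmxCreateDIChan(di, "Dev1/port0", "", DAQmx_Val_ChanForAllLines);
            // Pacing the reads from an external clock on PFI0 is what makes
            // the operation deterministic rather than software-timed.
            DAQmxCfgSampClkTiming(di, "/Dev1/PFI0", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 1000);
            DAQmxStartTask(di);
            DAQmxReadDigitalU32(di, 1000, 10.0, DAQmx_Val_GroupByChannel,
                                data, 1000, &read, NULL);
            DAQmxClearTask(di);
            return 0;
        }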

  • DAQ Input Signal kept rising and stop at Max Voltage when run with nothing attached

    I just installed a PCI-6229 card and a BNC-2110 (LabVIEW 8.6). When I started the DAQ input, the signal kept rising and stopped at the maximum voltage (10 V) with nothing attached, instead of fluctuating around 0. Just wondering what's wrong with it and how to solve it. Terminal configuration was "Differential".
    I just created an input DAQ task with a scope and a while loop to look at it. Please let me know if I did anything wrong.
    Thanks

    I just re-read your original post.  Why do you have the terminal setting on differential?  This sounds like a single-ended setup to me.  Change the terminal setting to RSE.  With a differential setting, you need to connect one source to AI1 and another source to AI9; the difference between the two will be reported.  With RSE, only AI1 matters, and the voltage with respect to ground will be reported.  Make sure you have your AO ground tied to your AI ground.
    Message Edited by tbob on 06-15-2010 05:44 PM
    - tbob
    Inventor of the WORM Global
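    In DAQmx C terms, tbob's fix is a one-argument change when the channel is created. A minimal sketch (untested; the device and channel names are assumptions):

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle ai = 0;
            float64 sample = 0.0;

            DAQmxCreateTask("", &ai);
            // DAQmx_Val_RSE measures AI1 against ground; DAQmx_Val_Diff would
            // measure AI1 against AI9 instead ("Dev1/ai1" is an assumed name).
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai1", "", DAQmx_Val_RSE,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxStartTask(ai);
            DAQmxReadAnalogScalarF64(ai, 10.0, &sample, NULL);
            DAQmxClearTask(ai);
            return 0;
        }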

  • How can I protect a DAQ-Input against over-voltage ?

    I have a NI 4472 DAQ together with NI 2501 multiplexer. How can I protect the input ports against over-voltage ?

    You could use some Zener diodes or varistors. Note that the 2501 and the 4472 only provide a voltage range of 10 volts. If over-voltage is a concern, the 445x provides a range of 42 volts and the 2503 a range of 30 volts, which would allow for more lenient voltage variations.

  • DAQ input protection (overvolta​ge & overcurren​t) question

    Hello Everyone
    I'm designing a DAQ board as my thesis work.
    I have several questions about input protection. I have searched and searched and found only 2 circuits: a crowbar, and another one using 2 Zener diodes with a PTC resettable fuse or an ordinary fuse.
    My question is whether there are any other circuits that can protect inputs.
    The measurement limits are +-20 V with a maximum frequency of 100 kHz. At the input there will be an INA163 instrumentation amplifier, and then the rest of the circuit. I have to protect these inputs against 230 Vac.
    If someone knows any other method of doing this, please reply. I'm leaning toward the PTC + diodes.
    Thank you

    That seems a good approach, to keep it simple. About the resistors, I don't know yet if I'm going to use SMD type or the normal type; I have to see if there is enough space on the PCB. If I choose the SMD type I will probably have to connect several in series as you said (230 V SMD resistors with those values are a bit hard to find, I've checked Farnell), but I will test it.
    About the Zener: when choosing it, it should also withstand 230 V of directly applied voltage, right?
    I've attached a file with the circuit. As I intend to measure differential signals with the INA163, I think this circuit is correct using your suggestion.
    Now about the PTC for the overcurrent protection. If I'm not wrong, there are 3 parameters that must be taken into account:
    - Holding current (I think this is the normal working current)
    - Tripping current (at this value the PTC cuts the current going into the circuit)
    - Operating voltage (I think I must choose the 230 V version)
    I've pointed the holding current at 50 mA with a tripping current of 100 mA (I have no idea if this is OK); the maximum input current of the INA163 is 10 mA, and the resistive divider will also limit the input current, but I would like to have some more protection.
    Should I put one PTC in each signal branch (Vin+ and Vin-)?
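    As a sanity check on those numbers, here is a small worked calculation with assumed component values (they are illustrations, not values from this thread):

        #include <cstdio>

        int main()
        {
            // Hypothetical values; adjust to the actual schematic.
            const double v_fault  = 230.0 * 1.414;  // ~325 V peak of 230 Vac
            const double r_series = 100e3;          // assumed series resistance, ohms
            const double v_zener  = 12.0;           // assumed Zener clamp voltage

            double i_fault = (v_fault - v_zener) / r_series;  // current into the clamp
            double p_zener = v_zener * i_fault;               // power the Zener absorbs
            double p_res   = (v_fault - v_zener) * i_fault;   // power in the resistor

            std::printf("fault current:  %.2f mA\n", i_fault * 1e3);  // ~3.1 mA
            std::printf("Zener power:    %.1f mW\n", p_zener * 1e3);  // ~38 mW
            std::printf("resistor power: %.2f W\n",  p_res);          // ~0.98 W
            return 0;
        }

    One consequence worth noticing: with that much series resistance, a fault only drives a few mA, so a PTC with a 50 mA hold current would never trip. In that case the PTC mainly protects against faults that bypass the series resistance (e.g. a short before the divider).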
    Attachments:
    circuit.png ‏17 KB

  • Daq input resolution

    I am using the 12-bit 6115 DAQ board for acquiring noisy, low-voltage signals (less than 100 mV). I have set up a voltage range in MAX of +/- .500 V. With 12 bits, in a perfect world this should give me about .25 mV quantization steps.
    My problem: when I measure a signal around 0 V with noise on it, it fluctuates between only 2 or 3 different voltages, all well beyond .25 mV apart (more like 2-5 mV). I understand there are most likely some other sources of error at work here, but it seems that with a noisy signal I should be able to see much more variation than this.
    I have set up voltage ranges in MAX. Is it necessary to do the same in the Traditional DAQ AI Config? Is there something else I'm missing? Or is this the best I can hope for?
    BC

    Hello BC. Thank you for contacting National Instruments. I took a look at the specifications for the PCI-6115. The accuracy is .71 mV at +/- .5 V input. At +/- .2 V, the accuracy improves to .39 mV, which might be better for resolving your noise.
    There are two things that I would like you to try to improve your signal. First, increase your scan rate; you might not be taking in enough points. Second, make sure that the Data Mode is Continuous. If it is in Strip Chart Mode, the data might look skewed. If these settings don't improve the data, please check your configuration.
    Once you get a good signal in MAX, you should use the same setup in LabVIEW. Unless you are scaling your data, you should set the input limits in AI Config.vi. This will improve the accuracy of the signal. I hope this answers all of your questions. Have a great day!
    Marni S.
    National Instruments
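    Incidentally, the observed step size is itself a clue; a quick sketch of the arithmetic (ranges taken from the posts above):

        #include <cstdio>
        #include <cmath>

        int main()
        {
            const int bits = 12;                         // PCI-6115 resolution
            double lsb_0v5 = 1.0  / std::pow(2.0, bits); // +/-0.5 V range: ~0.244 mV/code
            double lsb_10v = 20.0 / std::pow(2.0, bits); // +/-10 V range:  ~4.88 mV/code
            std::printf("%.3f mV vs %.3f mV\n", lsb_0v5 * 1e3, lsb_10v * 1e3);
            return 0;
        }

    Steps of 2-5 mV match the +/-10 V default range (~4.88 mV/code) far better than the +/-0.5 V range (~0.244 mV/code), which suggests the limits set in MAX never reached the driver. That is consistent with Marni's advice to set the input limits in AI Config.vi.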

  • DAQ Input

    Hi all,
    I am using a USB-6008 DAQ configured with digital input on port 0. I would like to monitor the input and generate an event when it arrives (user event / value signaling). How could I do it? Please give me any advice/reference.
    Sasi.
    Certified LabVIEW Associate Developer
    If you can DREAM it, You can DO it - Walt Disney

    Hi,
    The value signaling gives me erratic results, so I went to a producer/consumer design. Even so, I still have a small problem. How can I solve this?
    Sasi.
    Certified LabVIEW Associate Developer
    If you can DREAM it, You can DO it - Walt Disney
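    For what it's worth, the USB-6008 has no hardware change detection, so the usual approach is to poll the port and signal your event when the value changes. A rough sketch of that pattern with the DAQmx C API (device name assumed, untested):

        #include "NIDAQmx.h"
        #include <stdio.h>

        int main()
        {
            TaskHandle di = 0;
            uInt32 value = 0, last = 0;

            DAQmxCreateTask("", &di);
            DAQmxCreateDIChan(di, "Dev1/port0", "", DAQmx_Val_ChanForAllLines);
            DAQmxStartTask(di);
            for (;;)                      // producer loop: poll, detect, notify
            {
                DAQmxReadDigitalScalarU32(di, 1.0, &value, NULL);
                if (value != last)        // software edge detection
                {
                    printf("port0 changed: 0x%02X -> 0x%02X\n", last, value);
                    last = value;         // post the user event to the consumer here
                }
            }
        }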

  • DAQ Input Signal into Array

    Hi,
    I have an input (voltage) signal coming into my DAQ Assistant (6008). This information displays well in a chart, but I also want to convert the same data into array format so that I can do further calculations. So, basically, I would like to put a DAQ signal into an array. I'm having a lot of trouble doing so, please help,
    thank you
    Nick

    Hello Nick,
    You can use the function "From DDT" in the Functions palette under Express -> Signal Manipulation.
    When you place it on the block diagram, a window opens and you can configure it as: 1D Array of Scalars - Automatic.
    Best regards
    Nick_CH
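    (If you ever move from the DAQ Assistant to the plain DAQmx C API, note that the read call already fills an ordinary array you can compute on directly; a sketch with assumed names:)

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle ai = 0;
            float64 data[1000];   // the array, ready for further calculations
            int32 read = 0;

            DAQmxCreateTask("", &ai);
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ai, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 1000);
            DAQmxStartTask(ai);
            DAQmxReadAnalogF64(ai, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               data, 1000, &read, NULL);
            DAQmxClearTask(ai);
            return 0;
        }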

  • Inputs read from daq are overwritten

    Hey there
    I have a DAQ input reading into a spreadsheet file.
    The DAQ apparently requires a while loop around it; I can't get it to run without one, so okay.
    But my main problem is that this means it overwrites my written file each time the while loop repeats.
    It also asks me to choose the file to write to multiple times.
    How would I go about fixing this?
    Thank you

    Yes, you can convert numeric to string; check the attached VI. I would recommend you go through basic LabVIEW materials and also play with the NI examples that come with LabVIEW. Remember: do not run the attached example in the same loop as your data acquisition; always use separate loops.
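    Whatever the language, the underlying fix is the same: open the file once before the loop and append inside it. A minimal C++ sketch of the pattern (the file name is just an example):

        #include <fstream>

        int main()
        {
            // Open once, before the loop; std::ios::app appends instead of
            // truncating. This is what stops the overwriting and the
            // repeated file-selection prompts.
            std::ofstream log("daq_log.txt", std::ios::app);

            for (int i = 0; i < 1000; ++i)
            {
                double sample = 0.0;                  // read from the DAQ here
                log << i << '\t' << sample << '\n';   // append one row per iteration
            }
            return 0;   // the file closes automatically when 'log' goes out of scope
        }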
    The best solution is the one you find by yourself
    Attachments:
    Write2File.vi ‏19 KB

  • Stopping a currently running DAQ task for m-series

    I'm running a hardware-timed analog input data acquisition task on a PCI-6229 M-series DAQ card that takes 200 us.  Every 250 us the program reads the data and restarts the task.  The difficulty is that the program sometimes has a late start, and the next time the thread reads, the task is still in progress.  I'd like to guarantee the task is stopped every time the program reads the data.  I've tried the following three sets of commands when the thread wakes up:
    Attempt 1:
    if( board->Joint_Status_2.readAI_Scan_In_Progress_St() )
    {
         board->AI_Command_1.writeAI_Disarm(1);
         board->AI_Command_1.flush();
    }
    Attempt 2:
    if( board->Joint_Status_2.readAI_Scan_In_Progress_St() )
    {
         board->AI_Status_1.setAI_STOP_St(kTrue);
         board->AI_Status_1.flush();
    }
    Attempt 3:
    if( board->Joint_Status_2.readAI_Scan_In_Progress_St() )
    {
         board->AI_Mode_1.setAI_Start_Stop(kTrue);
         board->AI_Mode_1.flush();
    }
    They seem to work randomly.  Sometimes the task stops immediately, sometimes it reads a few more times, and sometimes it just keeps reading.  The positive part of these commands is that the task can be restarted by simply issuing the aiStart(board) command again -- most of the time.  Is there something that I can send to the card to reliably stop any currently running AI tasks and at the same time allow the aiStart(board) command to be used to start the next set of readings?
    You may ask why I'm doing this.  I've had a lot of problems losing track of the inputs after 13 hr to several days at 250 kHz.  By restarting the task every loop and clearing the DMA buffer, I can guarantee the first element in the buffer is the first input read.  I'm using DMA, so if the task is still running when I send the aiStart(board) command, it can screw up this balance.  You may argue that I should keep track of things more closely, but this system means that if the inputs somehow become switched, the next time the thread runs it will automatically correct the problem.  This self-correction is a critical feature.
    Thanks.
    Aaron

    Hi Aaron-
    The bitfields you attempt to write are problematic for a few reasons.  First, AI_Disarm is only safe to use for idle counters and may not work reliably if the acquisition is currently running (which it sounds like you have observed).  AI_STOP_St is a read-only bit, so writing it will have no effect.  Finally, AI_Start_Stop controls an unrelated piece of functionality: essentially, it decides whether an AI_Start -> AI_Stop cycle constitutes a "scan" (this is actually the only mode of the STC2 that makes much sense to use on M Series).
    There are a couple of bitfields in AI_Command_2 that might help.  AI_End_On_SC_TC is a strobe bit that disarms the AI_SC, AI_SI, AI_SI2, and AI_DIV counters when an SC_TC event occurs.  AI_End_On_End_Of_Scan provides the same functionality for when an AI_Stop occurs.  So basically, you could stop on a regular scan-count boundary (using AI_End_On_SC_TC) or just stop at the end of the "current" scan (using AI_End_On_End_Of_Scan).
    I haven't tested this, but it should work.  Let me know if you have problems using either of these methods.  Hopefully this helps- 
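    In the register style of the attempts above, Tom's suggestion would look something like this (the accessor names are my guess by analogy with the original attempts, so treat this as a sketch, not verified code):

    // Ask the STC2 to stop cleanly at the end of the scan in progress;
    // the task can then be restarted with aiStart(board) as before.
    if( board->Joint_Status_2.readAI_Scan_In_Progress_St() )
    {
         board->AI_Command_2.writeAI_End_On_End_Of_Scan(1);  // assumed accessor name
         board->AI_Command_2.flush();
    }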
    Message Edited by Tom W [DE] on 03-14-2008 03:21 PM
    Tom W
    National Instruments

  • Can I use a DAQ with my CCD?

    I have a multifunction DAQ card and LabView 6.1. I would like to capture images using a CCD camera and am hoping that all I need is software, is this available? Do I need a new card and if so which one?

    If your camera is analog, you can use your multifunction DAQ card to read in the analog signal. You will have to monitor the signal for the H-sync and V-sync pulses and then convert the meaningful analog picture signal into digital values. For more information on the analog video signal, please reference the following document:
    Anatomy of a Video Signal
    If the camera is digital, you will not be able to read in the signal with the multifunction DAQ board. Multifunction DAQ boards can only perform static (software-timed) digital input and output; reading in a digital video signal requires hardware-timed digital acquisition. Our DAQ boards in the 653x family can be used for hardware-timed digital input and output. In this situation, you would use the pixel clock of the camera to time the acquisition of the 653x device.
    If the camera is analog, you should consider purchasing one of National Instruments' framegrabber boards for your acquisition. These boards are designed specifically for this purpose. (NI 1409 or NI 1411)
    If the camera is digital, I would suggest visiting our online camera advisor to see if your camera is compatible with our digital framegrabbers (NI 1422, NI 1424, NI 1428)
    Industrial Camera Advisor
    Will Denman
    Application Engineering
    National Instruments

  • Engine Dynamometer Brake Control, DAQ & LabVIEW PID

    Hi all, I am in the middle of a project to design, build and test a controller for an eddy current engine dynamometer.  I have an idea of how the inputs, outputs and overall process go, but am not sure how to best implement the necessary features in LabVIEW.  I have access to a NI USB-6211 DAQ, and a PC with LabVIEW 8.6, DAQmx drivers and the PID toolkit installed.
    On the electrical and mechanical side, an SCR firing board takes a 0-5VDC analog control signal to vary the amount of current passed through 380V three-phase electrical lines hooked into a large field coil.  Varying the input 0-5VDC signal results in a directly related variation of input current into the field coil, which in turn affects the strength of the magnetic field generated by the coil.  A large ferromagnetic rotor spins concentrically within the coil.  If the magnetic field increases, eddy currents are induced in the rotor causing it to slow down and heat up, and vice versa.  The engine-under-test is bolted to the large rotor and is put under load by the effects of the induced magnetic field and eddy currents.
    The plan is to have LabVIEW manage the 0-5VDC SCR firing board control signal.  The dynamometer currently has manual rotary knob controls, but the types of tests that are currently possible are limited.  The goal of the overall project is to produce "dyno sheets," plots of engine torque and horsepower over the motor's usable RPM range.  The problem, and motivation for this project, is that the manual controls cannot provide repeatable, precise measurements necessary for "power sweep" tests used to produce dyno sheets.
    Power sweep tests are used by all engine and chassis dynamometers to gather an evenly distributed collection of data across the engine's usable RPM range.  The idea is that the engine should be forced to accelerate its RPM at the same rate from just off-idle to rev limit.  Bolted to a dyno and given its druthers, most engines will accelerate more slowly off-idle and more quickly in their upper RPM power bands.  Load must be controlled so that the engine can spin as freely as possible down low in the RPM range, and be forced to maintain constant acceleration as it tries to pull away in the upper RPM range.  Human, manual control of rotary knobs can provide a respectable effort in this situation, but the problem becomes very apparent when comparing back-to-back, "identical" tests on the same engine, with the same operator.  Repeatability of torque and power measurement tests is very important to understanding how distinct changes to the engine's mechanical and fluid systems affect its torque output, along with other symptoms.
    I hope the background is helpful.
    There are RPM and Torque inputs into LabVIEW for the engine under test.  In the design stage, I figured I would be able to implement a PID controller in LabVIEW to vary the SCR firing board's 0-5VDC control signal.  The PID loop would control the 0-5VDC signal so as to allow the RPM of the engine-under-test to accelerate as closely as possible to an operator-chosen rate.  The USB-6211 DAQ has two analog outputs, one of which can be used for the 0-5VDC control signal.  The DAQ also has two digital counter circuits.  One of them is used for counting and determining the continually changing frequency of a TTL pulse train driven by engine-under-test RPM.  Lastly, one of eight analog inputs is used to measure a 0-5VDC analog input signal from a strain gage signal conditioner indirectly measuring engine-under-test torque output.
    I worked with LabVIEW as a student in school, but never attempted to design VI's from scratch until now.  I spent the last week or so practicing with the DAQmx Assistant and later the broken-out DAQmx LabVIEW code for bench-testing the counter and analog inputs for RPM and Torque.  I plan to begin experimenting with PID controls this week, but I have stumbled into a few trouble spots with the DAQ input code already.
    As of right now, it seems that the PID control loop will only use RPM data, not engine torque data.  I would like to make sure that the sampling settings being used provide just the right amount of coverage, not using more DAQ or PC resources than necessary.  I figure this will assure the sampling process and controller will run as close to real-time as possible without relatively large-scale changes to the system.  Due to mechanical limitations of the dynamometer, the engines under test will never exceed 3600 RPM.  A variable reluctance sensor is positioned closely to a 60-toothed trigger wheel bolted to the dyno's rotating assembly.  The VR waveform is passed through an LM1815-based VR signal conditioning circuit to produce a TTL pulse train.  This digital signal is then piped into one of the counter inputs on the USB-6211 DAQ.
    (3600 Revolutions per Minute * 60 Teeth per Revolution) / 60 Seconds per Minute = 3600 Teeth per Second (Hz)
    The maximum frequency of the RPM signal will be 3600Hz.  I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples.  I have the book "LabVIEW for Everyone," and read Chapter 11 on LabVIEW with DAQmx, but I had trouble figuring out exactly how these three fields affect a Counter input, as compared to the examples in the book covering analog inputs.  If it's not too much trouble, could anyone shed any light on this?
    In case it's interesting, here are some pictures of the engine dynamometer and some of the older data equipment that is being replaced.
    Engine Dyno Pictures
    Thank you for any help!
    Rob

    CoastalMaineBird wrote:
    As it happens, I am involved with a large project that controls engines, as well.  I have two control loops, one controls dyno drive based on speed, and the other controls throttle based on torque.  Usually only one is in PID mode at a time, the other is open loop.  I have to run the engine thru a prescribed speed-torque cycle and measure how much exhaust pollution occurs.
    We do "Torque Maps" which are sweeps similar to what you describe. You put the throttle 100% ON, and ramp up the dyno speed, all the while capturing data.  You get a map of how much torque you can obtain at any given speed. 
    I do it on a PXI box to avoid timing issues.  When I last tried to use the host computer to do this, Windows was just not able to devote the time to getting the data right.  The PXI box guarantees that I can control the loop at 100 Hz (I have tested it up to 1000 Hz).
    Anyway, to your specific question: 
    I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples. 
    The counters are general-purpose, so they can be configured in many different ways.
    The TIMING SOURCE is where the basic timebase comes from.  If you're measuring a high-frequency signal, you want a higher-frequency timing source (so you have resolution to spare).  If you're measuring a low-frequency signal, you want a low-freq timebase (so you don't run out of counter bits).  If your max input is 3600 Hz (277 uSec), then a 1 MHz timebase (1 uSec) will give you  one part in 277 resolution WORST CASE.  A 10-MHz timebase will be 10 times as good.
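    For illustration, here is roughly where that choice surfaces in a DAQmx C counter task (a sketch with assumed names and values, untested; with the 1-counter low-frequency method the driver picks the timebase from the min/max you declare):

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle ctr = 0;
            float64 hz = 0.0;

            DAQmxCreateTask("", &ctr);
            // "Dev1/ctr0" is an assumed name for a USB-6211 counter. Min/max
            // in Hz tell the driver what range to optimize the timebase for;
            // the measurement-time and divisor arguments are only used by
            // the 2-counter measurement methods.
            DAQmxCreateCIFreqChan(ctr, "Dev1/ctr0", "", 10.0, 3600.0,
                                  DAQmx_Val_Hz, DAQmx_Val_Rising,
                                  DAQmx_Val_LowFreq1Ctr, 0.001, 4, "");
            DAQmxStartTask(ctr);
            DAQmxReadCounterScalarF64(ctr, 10.0, &hz, NULL);
            // With a 60-tooth wheel, teeth/second divided by 60 gives rev/s,
            // times 60 s/min gives RPM, so RPM is numerically equal to hz here.
            DAQmxClearTask(ctr);
            return 0;
        }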
    I don't know where you are getting the TIMING RATE and # SAMPLES inputs, can you explain? 
    That's a very interesting project you are working on, certainly a lot more involved and impressive.  I think I saw a few screenshots of the LabVIEW interface on your website and wow!...that is really nice.  The emissions analysis equipment can get very pricey, but I can't think of a more timely test to be directly involved in at this point in time.
    I briefly researched standalone LabVIEW system options and real-time capabilities.  I am hoping that the barebones nature and low operation count of the VI will allow a Windows host PC to be sufficient for controlling the process.  This is largely an economic motivation to keep the overall project cost down.  If it turns out that PXI hardware is necessary, it will also be possible to look down that path at that point in time.
    When I first looked at LabVIEW DAQmx reference materials, I suspected I would need to go beyond the DAQ Assistant and break down DAQmx tasks into their lower-level start, read, write, stop and clear components.  After reading more and working with additional LabVIEW PID examples, it looks like I may be able to stick with DAQ Assistant sub-VIs.  I was confused at the time by the different DAQmx timing components and chose to begin with a question on the counter input.  Today, a DAQ Assistant sub-VI seems to be handling the counter input well enough.
    I most likely need to focus on PID parameter tuning at this time.  If it turns out that the timing configuration of the counter RPM input is preventing the system from working as required to perform a power sweep, or follow a Torque Map like you mentioned, then I will need to revisit it.  For now I'm just going to be happy that its portion of the system seems to be working.  
    Message Edited by el bob on 03-24-2009 01:45 PM

  • DAQ, Hardware

    I am still learning LabVIEW 8.0. However, most of the hardware we use in the school's lab, like signal generators, is quite expensive.
    Is there any small or affordable hardware I can obtain, so that I can learn data acquisition on my own computer at home?

    Hi,
    NI definitely has some low-cost options that you could get started with.  For instance, the low-cost USB DAQ modules are a great place to get started with software-timed analog input, analog output and digital I/O.
    If you need hardware-timed analog I/O and digital I/O (for applications that require higher sample rates), then there are PCI options such as the PCI-6221, as well as USB options such as the USB-6211.
    Let me know if you have questions on these modules, and best of luck on your data acquisition.
    Have a great day!
    Travis W

  • Simultaneous Digital Input and Output from NI 9403

    Hi,
    I was wondering if it is possible to drive a digital output on one line of this module (NI 9403) while simultaneously reading from 10 other lines.
    Basically, what I need is for line0 to always be closed (true), and to read lines1:10 at, at the very least, 1 sample/second.
    I have attached a stripped-down version of my code illustrating my error. I thought I was getting the error because the two tasks were using different clock configurations, but using the same inputs doesn't work for me either.
    Any input is greatly appreciated.
    Thank you
    Attachments:
    Digital I-O.vi ‏22 KB

    Hi Matthew,
    The behavior you're seeing is expected for the 9403 module, since it is a serial module. The DAQmx help has the following restrictions in the Digital I/O Considerations for C Series Devices section.
    Timed digital input/output restrictions:
    You cannot use parallel and serial modules together on the same hardware timed task. 
    You cannot use serial modules for triggering. 
    You cannot do both static and timed tasks at the same time on a single serial module. 
    You can only do hardware timing in one direction at a time on a serial bidirectional module.
    So you can only do timed input or output at one time, but not both. However, based on the VI you attached and the description of what you want to do, you don't necessarily need both timed tasks in your loop. You can create a static output task to update that one line and after that, you can just start your timed input task on the rest of the lines. This should maintain the state on the DO line, since you're not acquiring from it.
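    In DAQmx C terms, Cristina's suggestion might look like the following sketch (the module and line names are assumptions; untested):

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle doTask = 0, diTask = 0;
            uInt8 high = 1;
            uInt32 data[100];
            int32 written = 0, read = 0;

            // Static (software-timed) output task just to hold line0 true...
            DAQmxCreateTask("", &doTask);
            DAQmxCreateDOChan(doTask, "cDAQ1Mod1/port0/line0", "",
                              DAQmx_Val_ChanForAllLines);
            DAQmxWriteDigitalLines(doTask, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                   &high, &written, NULL);

            // ...then a separate hardware-timed input task on the other lines.
            DAQmxCreateTask("", &diTask);
            DAQmxCreateDIChan(diTask, "cDAQ1Mod1/port0/line1:10", "",
                              DAQmx_Val_ChanForAllLines);
            DAQmxCfgSampClkTiming(diTask, "", 1.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 100);
            DAQmxStartTask(diTask);
            DAQmxReadDigitalU32(diTask, 10, 15.0, DAQmx_Val_GroupByChannel,
                                data, 100, &read, NULL);
            DAQmxClearTask(diTask);
            DAQmxClearTask(doTask);  // line0 holds its state until this point
            return 0;
        }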
    Cheers,
    Cristina

  • How to use voltage signals as input signals into USB-6211

    Hello everyone,
    I have a USB-6211 and a PDQ80A quadrant photodetector from Thorlabs. I am trying to take three voltage signals from the PDQ80A into my PC using the USB-6211 from NI. These voltage signals are X, Y and sum (X+Y).
    Looking forward to your reply.
    Thanks in advance,
    _Perseus 

    Technically, the question was in the title.  But of course, the answer to that is... Connect them using WIRES.   It's designed for voltage signals, just wire your cell outputs to whatever DAQ input channels you're going to read.
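    For completeness, reading those three voltages with the DAQmx C API could look like this (channel names, terminal configuration and ranges are assumptions):

        #include "NIDAQmx.h"

        int main()
        {
            TaskHandle ai = 0;
            float64 xyz[3];   // one sample each of X, Y and SUM
            int32 read = 0;

            DAQmxCreateTask("", &ai);
            // Three physical channels in one task ("Dev1/ai0:2" is assumed).
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai0:2", "", DAQmx_Val_RSE,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxStartTask(ai);
            DAQmxReadAnalogF64(ai, 1, 10.0, DAQmx_Val_GroupByChannel,
                               xyz, 3, &read, NULL);   // xyz = {X, Y, X+Y}
            DAQmxClearTask(ai);
            return 0;
        }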
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
