DAQ Input

Hi all,
I am using a USB-6008 DAQ configured with digital input on port 0. I would like to monitor the line and generate an event when an input arrives (user event / Value (Signaling) property). How could I do it? Please give me any advice/references.
Sasi.
Certified LabVIEW Associate Developer
If you can DREAM it, You can DO it - Walt Disney

Hi,
Value (Signaling) gives me ridiculous results, so I switched to a producer/consumer design. Even so, I still have a small problem. How can I solve it?
Sasi.
Certified LabVIEW Associate Developer
If you can DREAM it, You can DO it - Walt Disney
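
For anyone sketching this pattern outside LabVIEW, here is a minimal producer/consumer example in Python using the nidaqmx package; the device name Dev1 and line port0/line0 are assumptions. USB-6008 digital input is software-timed, so the producer polls the line and queues a message on each change, and the consumer reacts to it, the role a user event plays in LabVIEW.

```python
# Producer/consumer sketch: poll a digital line, signal each change.
import queue
import threading
import time
import nidaqmx

events = queue.Queue()
stop = threading.Event()

def producer():
    """Poll the DI line; enqueue a message on every value change."""
    with nidaqmx.Task() as task:
        task.di_channels.add_di_chan("Dev1/port0/line0")
        last = task.read()
        while not stop.is_set():
            value = task.read()   # on-demand, software-timed read
            if value != last:     # edge detected -> signal consumer
                events.put(value)
                last = value

def consumer():
    """React to queued changes (the 'user event' handler)."""
    while not stop.is_set():
        try:
            value = events.get(timeout=0.5)
        except queue.Empty:
            continue
        print("Digital input changed to:", value)

workers = [threading.Thread(target=producer),
           threading.Thread(target=consumer)]
for w in workers:
    w.start()
time.sleep(10.0)  # monitor for ten seconds, then shut down
stop.set()
for w in workers:
    w.join()
```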

Similar Messages

  • DAQ Input Signal kept rising and stopped at Max Voltage when run with nothing attached

    I just installed a PCI-6229 card and a BNC-2110 (LabVIEW 8.6). When I started the DAQ input, the signal kept rising and stopped at the maximum voltage (10 V) when run with nothing attached (instead of fluctuating around 0). I'm just wondering what's wrong and how to solve it. Terminal configuration was "Differential".
    I just created an input DAQ task with a scope and while loop to look at it. Please let me know if I'm doing anything wrong.
    Thanks
    Solved!
    Go to Solution.

    I just re-read your original post. Why do you have the terminal setting on differential? This sounds like a single-ended setup to me. Change the terminal setting to RSE. With a differential setting, you need to connect one source to AI1 and another source to AI9; the difference between the two will be reported. With RSE, only AI1 matters and the voltage with respect to ground will be reported. Make sure you have your AO ground tied to your AI ground.
    Message Edited by tbob on 06-15-2010 05:44 PM
    - tbob
    Inventor of the WORM Global
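
    For reference, a minimal sketch of the same fix using the nidaqmx Python API (device and channel names are assumptions); the terminal configuration is selected when the channel is created:

```python
# Read AI1 single-ended (RSE) instead of differential.
# Device name "Dev1" is an assumption; check yours in NI MAX.
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai1",
        terminal_config=TerminalConfiguration.RSE,  # referenced single-ended
        min_val=-10.0, max_val=10.0,
    )
    print(task.read())  # one on-demand sample, in volts
```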

  • How can I protect a DAQ-Input against over-voltage ?

    I have a NI 4472 DAQ together with NI 2501 multiplexer. How can I protect the input ports against over-voltage ?

    You could use some Zener diodes or varistors. The 2501 and the 4472 only provide a voltage range of 10 volts. If higher voltage is a concern, the 445x provides a voltage range of 42 volts and the 2503 has a voltage range of 30 volts, which would allow for more lenient voltage variations.

  • DAQ input protection (overvoltage & overcurrent) question

    Hello Everyone
    I'm designing a DAQ board as my thesis work.
    I have several questions about input protection. I have searched and searched, and I have found only two circuits: a crowbar, and another using two Zener diodes and a PTC resettable fuse or an ordinary fuse.
    My question is whether there are any other circuits that can protect the inputs.
    The measurement limits are ±20 V with a maximum frequency of 100 kHz. At the input there will be an INA163 instrumentation amplifier, and then the rest of the circuit. I have to protect these inputs against 230 VAC.
    If someone knows any other method of doing this, please reply. I'm leaning toward the PTC + diodes.
    Thank you

    That seems a good approach, to keep it simple. About the resistors, I don't know yet if I'm going to use the SMD type or the normal type; I have to see if there is enough space on the PCB. If I choose the SMD type I will probably have to connect several in series as you said (230 V SMD resistors with those values are a bit hard to find, I've checked Farnell), but I will test it.
    About the Zener: when choosing it, it should also withstand 230 V of directly applied voltage, right?
    I've attached a file with the circuit. As I intend to measure differential signals with the INA163, I think this circuit is correct using your suggestion.
    Now, about the PTC for the overcurrent protection, if I'm not wrong there are three parameters that must be taken into account:
    - Holding current (I think this is the normal working current)
    - Tripping current (at this value the PTC cuts the current going into the circuit)
    - Operating voltage (I think I must choose the 230 V version)
    I've set the holding current at 50 mA with a tripping current of 100 mA (I have no idea if this is OK). The maximum input current of the INA163 is 10 mA, and the resistive divider will also limit the input current, but I would like to have some more protection.
    Should I put one PTC in each signal branch (Vin+ and Vin-)?
    Attachments:
    circuit.png 17 KB

  • Daq input resolution

    I am using the 12-bit 6115 DAQ board for acquiring noisy, low-voltage signals (less than 100 mV). I have set up a voltage range in MAX of +/-0.5 V. With 12 bits, in a perfect world this should give me about 0.25 mV quantization steps.
    My problem: when I measure a signal around 0 V with noise on it, it fluctuates between only 2 or 3 different voltages, all well beyond 0.25 mV apart (more like 2-5 mV). I understand there are most likely some other sources of error at work here, but it seems that with a noisy signal I should be able to see much more variation than this.
    I have set up voltage ranges in MAX. Is it necessary to do the same in the Traditional DAQ AI Config? Is there something else I'm missing? Or is this the best I can hope for?
    BC

    Hello BC. Thank you for contacting National Instruments. I took a look at the specifications for the PCI-6115. The accuracy is 0.71 mV at +/-0.5 V input. At +/-0.2 V, the accuracy improves to 0.39 mV, which might be better for your noise.
    There are two things I would like you to try to improve your signal. First, increase your scan rate; you might not be taking in enough points. Second, make sure that the Data Mode is Continuous; if it is in Strip Chart Mode, the data might look skewed. If these settings don't improve the data, please check your configuration.
    Once you get a good signal in MAX, you should use the same setup in LabVIEW. Unless you are scaling your data, you should set the input limits in AI Config.vi. This will improve the accuracy of the signal. I hope this answers all of your questions. Have a great day!
    Marni S.
    National Instruments
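
    As a side note, the ideal-step arithmetic and the input-limits suggestion look like this in the modern nidaqmx Python API (device/channel names are assumptions; Traditional DAQ itself is a separate legacy driver):

```python
# Ideal 12-bit step over a +/-0.5 V range: 1.0 V span / 2**12 codes.
print(1.0 / 2**12 * 1e3, "mV per code")  # ~0.244 mV

# Passing tighter input limits when creating the channel lets the
# driver pick the smallest gain range that covers the signal.
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0",
                                         min_val=-0.2, max_val=0.2)
    print(task.read())  # one on-demand sample, in volts
```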

  • DAQ Input Signal into Array

    Hi,
    I have an input (voltage) signal coming into my DAQ Assistant (6008). This information displays well in a chart, but I also want to convert the same data into array format so that I can do further calculations. So, basically, I would like to put a DAQ signal into an array. I'm having a lot of trouble doing so; please help.
    thank you
    Nick

    Hello Nick,
    You can use the function "From DDT" in the Functions palette under Express -> Signal Manipulation.
    When you place it on the block diagram, a window opens where you can configure it: 1D Array of Scalars - Automatic.
    Best regards
    Nick_CH
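
    Outside LabVIEW, the same idea (a block of samples read directly into an array for further calculations) might look like this with the nidaqmx Python API; device, channel, and rate are assumptions:

```python
# Read a finite block of samples straight into a NumPy array.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(1000.0,  # 1 kS/s sample clock
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=100)
    data = np.asarray(task.read(number_of_samples_per_channel=100))
    print(data.mean(), data.max() - data.min())  # math on the array
```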

  • Timed DAQ input

    Hello. I'm trying to program my DIO-96 board using the NI-DAQ software in C++. I need to perform data transfers at 1 kHz, each consisting of one write and one read cycle. Does anybody know how this is accomplished?

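    The legacy NI-DAQ C API this question refers to is long retired; as a hedged modern sketch, a software-timed 1 kHz write-then-read cycle with the nidaqmx Python API could look like the following (board and line names are assumptions). At 1 ms per iteration, OS scheduling jitter is significant, so a hardware-timed task is preferable on devices that support it.

```python
# Software-timed 1 kHz write/read cycle (names are assumptions).
import time
import nidaqmx

with nidaqmx.Task() as write_task, nidaqmx.Task() as read_task:
    write_task.do_channels.add_do_chan("Dev1/port0/line0")
    read_task.di_channels.add_di_chan("Dev1/port1/line0")
    state = False
    next_tick = time.perf_counter()
    for _ in range(1000):             # one second of 1 kHz cycles
        write_task.write(state)       # one write...
        value = read_task.read()      # ...then one read per cycle
        state = not state
        next_tick += 0.001            # schedule the next 1 ms tick
        time.sleep(max(0.0, next_tick - time.perf_counter()))
```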

  • Inputs read from daq are overwritten

    Hey there
    I have a DAQ input being written to a spreadsheet file.
    The DAQ tells me it's supposed to have a while loop around it, and I can't get it to run without one, so okay.
    But my main problem is that this means it overwrites my written file each time the while loop repeats.
    It also asks me to choose the file to write to multiple times.
    How would I go about fixing this?
    Thank you
    Solved!
    Go to Solution.

    Yes, you can convert numeric to string; check the attached VI. I would recommend going through basic LabVIEW materials and also playing with the NI examples that come with LabVIEW. Remember: do not put the attached example in the same loop as the data acquisition; always use separate loops.
    The best solution is the one you find it by yourself
    Attachments:
    Write2File.vi 19 KB
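
    The underlying fix, in any language, is to open the file once before the loop and append inside it. A minimal sketch with the nidaqmx Python API (file path and channel name are assumptions):

```python
# Open once before the loop; append one row per iteration, so the
# file is never overwritten and never re-prompted for.
import nidaqmx

with nidaqmx.Task() as task, open("log.csv", "w") as f:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    for i in range(100):            # stands in for the while loop
        sample = task.read()
        f.write(f"{i},{sample}\n")  # numeric converted to string
```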

  • How to use voltage signals as input signals into USB-6211

    Hello everyone,
    I have a USB-6211 and a PDQ80A quadrant photodetector from Thorlabs. I am trying to take three voltage signals from the PDQ80A into my PC using the USB-6211 from NI. These voltage signals are X, Y, and SUM (X+Y).
    Looking forward to your reply.
    Thanks in advance,
    _Perseus 

    Technically, the question was in the title. But of course, the answer to that is... connect them using WIRES. It's designed for voltage signals; just wire your cell outputs to whatever DAQ input channels you're going to read.
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
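
    For completeness, a minimal sketch reading the three detector outputs on three analog input channels of the USB-6211 with the nidaqmx Python API (device name and channel assignments are assumptions):

```python
# One task, three AI channels: X, Y, and SUM from the PDQ80A.
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # X
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")  # Y
    task.ai_channels.add_ai_voltage_chan("Dev1/ai2")  # SUM (X+Y)
    x, y, xy_sum = task.read()  # one sample per channel, in volts
    print(x, y, xy_sum)
```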

  • Count the Pass/ Fail in a DAQ reading testing ?

    I want to write a while loop to read the DAQ signal. Each time, the DAQ input signal is compared with the spec limit. A signal within the limit will be recorded as "Pass", while a signal out of spec will be recorded as "Fail". At the end of the test (when I stop the program), the results will be displayed on the test panel.
    I believe my question is similar to programming "A = A + 1". Can anybody help me resolve this?

    Tian,
    Are you looking for just the number of passed and failed measurements, or a record of each individual test? If you are only concerned with the number of passed/failed tests, then use a shift register. Right-click on the border of the loop and choose Add Shift Register. Then use a case structure: if the reading passed the test (true case), increment the value; if it did not pass (false case), just wire the value straight through the case structure. To get the number that failed, simply subtract the number that passed from the iteration count.
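
    The shift-register pattern translates directly into textual code. A minimal sketch with the nidaqmx Python API, where the spec limits and channel name are assumptions:

```python
# Carry the pass count across iterations ("A = A + 1") and derive
# the fail count from the iteration count at the end.
import nidaqmx

LOW, HIGH = 0.5, 4.5   # spec limits in volts (assumed)
passes = 0             # shift-register initial value

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    iterations = 1000
    for _ in range(iterations):
        v = task.read()
        if LOW <= v <= HIGH:  # case structure: true case increments
            passes += 1
    print("Pass:", passes, "Fail:", iterations - passes)
```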

  • Why is the DAQ outputs distorted?

    I am trying to output two voltage sine waves (90 degrees out of phase) from my DAQ card. I use a for loop to create two arrays of 1000 samples each. I then send each of these arrays to my DAQ card for output at 5000 S/s. One of the outputs (measured on an oscilloscope) is perfect, while the other clips and is distorted. Why am I not getting two perfectly good sine waves? Is this a hardware issue?
    Please find attached my LabView file.
    Thanks,
    Gilly
    Attachments:
    Sine_forloop_nearlydone.vi 91 KB

    I haven't looked at your VI yet, as I don't have 8.6 on this machine (please identify the version when attaching VIs), but I do have some questions. Are they both set to the same amplitude? Are you looking at them individually (i.e., single-channel scope) or both channels at the same time? Are they both connected to the DAQ inputs in the same way? Usually, if there isn't an actual hardware problem with the outputs, distortion and clipping are caused by: 1) trying to output a waveform of an amplitude greater than allowed, or 2) trying to drive a seriously mismatched load. In the latter category, hooking up to the DAQ incorrectly might cause a problem.
    Putnam
    Certified LabVIEW Developer
    Senior Test Engineer
    Currently using LV 6.1-LabVIEW 2012, RT8.5
    LabVIEW Champion
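
    A minimal sketch of the intended output (two sine arrays 90 degrees apart, written to two AO channels) using the nidaqmx Python API; device names, the 50 Hz tone, and the 5 V amplitude are assumptions. As Putnam notes, clipping usually means the requested amplitude exceeds the allowed output range:

```python
# Two-channel buffered AO: 1000-sample sine waves at 5 kS/s.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

n, rate, amp = 1000, 5000.0, 5.0
t = np.arange(n) / rate
wave0 = amp * np.sin(2 * np.pi * 50.0 * t)              # 0 degrees
wave1 = amp * np.sin(2 * np.pi * 50.0 * t + np.pi / 2)  # +90 degrees

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0", min_val=-10, max_val=10)
    task.ao_channels.add_ao_voltage_chan("Dev1/ao1", min_val=-10, max_val=10)
    task.timing.cfg_samp_clk_timing(rate,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.write(np.vstack([wave0, wave1]))  # one row per channel
    task.start()
    input("Outputting... press Enter to stop.")
```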

  • How do I set the input range of the PCI-MIO-16E-1 (6070E) to 0 to +5 V?

    How do I set the input range of the PCI-MIO-16E-1 (6070E) to 0 to +5 V? Thank you very much.

    Hi x2am,
    Here is a link to a document about setting input range limits.
    DAQ Input Limits
    Hope this helps!
    Jeremy L.
    National Instruments
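
    In the DAQmx era the equivalent is just the minimum/maximum values passed when the channel is created; the driver coerces them to the nearest supported range. A minimal sketch with the nidaqmx Python API (device/channel names are assumptions):

```python
# Request a 0 to +5 V input range at channel creation.
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0",
                                         min_val=0.0, max_val=5.0)
    print(task.read())  # one on-demand sample, in volts
```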

  • I still can't get my picture box to flicker between two grey levels WHILE the DAQ takes a measurement.

    I received a suggestion that I put the two parts (grey-level flicker and signal capture) in separate while loops and use a local variable to pass a control from the flicker loop to the measurement part, with the control triggering the true case of a case structure inside the measurement while loop. In each of the cases (both VIs are attached below), the picture box stops switching between the two grey levels while the measurement is being taken by the DAQ. In my application, I have a photometer taking continuous measurements and sending a proportional voltage signal to the DAQ input. If the picture box (which is what the photometer is measuring the brightness of) does not switch back and forth between the grey levels while the DAQ is sampling the incoming signal from the photometer, then the photometer puts out no signal. No grey-level switching, no signal. So the DAQ ends up measuring a flat line. Does anyone have any ideas on how I can fix this? Please try to run my VIs before you make suggestions, just to see what is happening. Also, if you have a solution, give it to me in remedial form if you can; I am new to LabVIEW.
    Thanks for all your help.
    Attachments:
    Acq_N_Scans_to_File_(wdt)_with_flicker_2while.vi 127 KB
    Acq_N_Scans_to_File_(wdt)_with_flicker.vi 127 KB

    I apologize, one of the VIs that I sent was incorrect. I did more or less the same as the example you gave me. Like your example, it stops switching between grey levels momentarily while the DAQ captures the signal. Is it possible to have the two run simultaneously? I will attach the VI that I intended to send, and would appreciate any further ideas you may have on how to fix the problem.
    Attachments:
    Acq_N_Scans_to_File_(wdt)_with_flicker_2while.vi 135 KB
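
    Conceptually the fix is two truly independent loops (stimulus and acquisition) that never wait on each other. A textual analogue of two parallel while loops, sketched in Python with nidaqmx; the channel name and timings are assumptions:

```python
# Flicker loop and acquisition loop run concurrently; neither blocks.
import threading
import time
import nidaqmx

stop = threading.Event()

def flicker():
    grey = False
    while not stop.is_set():
        grey = not grey    # toggle the displayed grey level
        time.sleep(0.05)   # 10 Hz flicker, independent of the DAQ

def acquire():
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # photometer
        while not stop.is_set():
            print(task.read())  # sampling never pauses the flicker

for fn in (flicker, acquire):
    threading.Thread(target=fn).start()
time.sleep(5.0)  # let both loops run together for five seconds
stop.set()
```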

  • Daq acquisition at specified not-equally-spaced times

    Hi,
    I think I already asked this question, but I think I was in a different section or discussion group... sorry if I repeat myself.
    I have a fast DAQ, a PCI-6120. It collects a lot of data very fast, and because of this the data amounts are pretty large and the PC sometimes chokes up. What I really need, though, is to acquire data at exponentially growing time intervals, for instance (times after multiple-acquisition start): 1e-5 s, 3e-5 s, 1e-4 s, 3e-4 s, 1e-3 s, 3e-3 s, 1e-2 s, 3e-2 s, 0.1 s, 0.3 s, 1 s, 3 s, 10 s. That way I only have 10-20 data points, instead of collecting a million data points (as I would with a fixed 1e-5 s interval, i.e., a 100000 S/s sampling rate).
    So, in principle, can one set a DAQ input to collect multiple samples after a start trigger, where the times are set through a user-specified array rather than a sampling frequency?
    thank you in advance,
    Alex

    Hi Alex,
    Can you provide more information about your application?  Does the user specify the sampling rate to change?  If not, then what determines how the frequency is set?  Is this between different channels or all one channel?   Why not take 10-20 data points at a slower sampling rate?
    Regards,
    h_baker
    National Instruments
    Applications Engineer
    Digital Multimeter Resources
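
    Hardware sample clocks run at a fixed rate, so one common workaround (my assumption, not an NI-endorsed answer) is to acquire at the fast fixed rate and keep only the samples nearest the requested times; the full buffer exists briefly in memory, but only the 13 points are stored:

```python
# Fixed-rate finite acquisition, decimated to exponential times.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

times = np.array([1e-5, 3e-5, 1e-4, 3e-4, 1e-3, 3e-3,
                  1e-2, 3e-2, 0.1, 0.3, 1.0, 3.0, 10.0])
rate = 100_000.0               # fast enough to resolve the 1e-5 s point
n = int(times[-1] * rate) + 1  # samples spanning the full 10 s

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # name assumed
    task.timing.cfg_samp_clk_timing(rate,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=n)
    data = np.asarray(task.read(number_of_samples_per_channel=n,
                                timeout=times[-1] + 5.0))
    picks = data[(times * rate).astype(int)]  # nearest-sample indices
    print(picks)  # 13 points kept instead of a million
```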

  • Engine Dynamometer Brake Control, DAQ & LabVIEW PID

    Hi all, I am in the middle of a project to design, build and test a controller for an eddy current engine dynamometer.  I have an idea of how the inputs, outputs and overall process go, but am not sure how to best implement the necessary features in LabVIEW.  I have access to a NI USB-6211 DAQ, and a PC with LabVIEW 8.6, DAQmx drivers and the PID toolkit installed.
    On the electrical and mechanical side, an SCR firing board takes a 0-5VDC analog control signal to vary the amount of current passed through 380V three-phase electrical lines hooked into a large field coil.  Varying the input 0-5VDC signal results in a directly related variation of input current into the field coil, which in turn affects the strength of the magnetic field generated by the coil.  A large ferromagnetic rotor spins concentrically within the coil.  If the magnetic field increases, eddy currents are induced in the rotor causing it to slow down and heat up, and vice versa.  The engine-under-test is bolted to the large rotor and is put under load by the effects of the induced magnetic field and eddy currents.
    The plan is to have LabVIEW manage the 0-5VDC SCR firing board control signal.  The dynamometer currently has manual rotary knob controls, but the types of tests that are currently possible are limited.  The goal of the overall project is to produce "dyno sheets," plots of engine torque and horsepower over the motor's usable RPM range.  The problem, and motivation for this project, is that the manual controls cannot provide repeatable, precise measurements necessary for "power sweep" tests used to produce dyno sheets.
    Power sweep tests are used by all engine and chassis dynamometers to gather an evenly distributed collection of data across the engine's usable RPM range.  The idea is that the engine should be forced to accelerate its RPM at the same rate from just off-idle to rev limit.  Bolted to a dyno and given its druthers, most engines will accelerate more slowly off-idle and more quickly in their upper RPM power bands.  Load must be controlled so that the engine can spin as freely as possible down low in the RPM range, and be forced to maintain constant acceleration as it tries to pull away in the upper RPM range.  Human, manual control of rotary knobs can provide a respectable effort in this situation, but the problem becomes very apparent when comparing back-to-back, "identical" tests on the same engine with the same operator.  Repeatability of torque and power measurement tests is very important to understanding how distinct changes to the engine's mechanical and fluid systems affect its torque output, along with other symptoms.
    I hope the background is helpful.
    There are RPM and Torque inputs into LabVIEW for the engine under test.  In the design stage, I figured I would be able to implement a PID controller in LabVIEW to vary the SCR firing board's 0-5VDC control signal.  The PID loop would control the 0-5VDC signal so as to allow the RPM of the engine-under-test to accelerate as closely as possible to an operator-chosen rate.  The USB-6211 DAQ has two analog outputs, one of which can be used for the 0-5VDC control signal.  The DAQ also has two digital counter circuits.  One of them is used for counting and determining the continually changing frequency of a TTL pulse train driven by engine-under-test RPM.  Lastly, one of eight analog inputs is used to measure a 0-5VDC analog input signal from a strain gage signal conditioner indirectly measuring engine-under-test torque output.
    I worked with LabVIEW as a student in school, but never attempted to design VI's from scratch until now.  I spent the last week or so practicing with the DAQmx Assistant and later the broken-out DAQmx LabVIEW code for bench-testing the counter and analog inputs for RPM and Torque.  I plan to begin experimenting with PID controls this week, but I have stumbled into a few trouble spots with the DAQ input code already.
    As of right now, it seems that the PID control loop will only use RPM data, not engine torque data.  I would like to make sure that the sampling settings being used provide just the right amount of coverage, not using more DAQ or PC resources than necessary.  I figure this will assure the sampling process and controller will run as close to real-time as possible without relatively large-scale changes to the system.  Due to mechanical limitations of the dynamometer, the engines under test will never exceed 3600 RPM.  A variable reluctance sensor is positioned close to a 60-tooth trigger wheel bolted to the dyno's rotating assembly.  The VR waveform is passed through an LM1815-based VR signal conditioning circuit to produce a TTL pulse train.  This digital signal is then piped into one of the counter inputs on the USB-6211 DAQ.
    (3600 Revolutions per Minute * 60 Teeth per Revolution) / 60 Seconds per Minute = 3600 Teeth per Second (Hz)
    The maximum frequency of the RPM signal will be 3600Hz.  I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples.  I have the book "LabVIEW for Everyone," and read Chapter 11 on LabVIEW with DAQmx, but I had trouble figuring out exactly how these three fields affect a Counter input, as compared to the examples in the book covering analog inputs.  If it's not too much trouble, could anyone shed any light on this?
    In case it's interesting, here are some pictures of the engine dynamometer and some of the older data equipment that is being replaced.
    Engine Dyno Pictures
    Thank you for any help!
    Rob
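
    Since the post is about implementing this with LabVIEW's PID toolkit, the following is only a hedged textual sketch of the loop for orientation; the gains, the 100 Hz rate, the setpoint ramp, and all channel names are hypothetical placeholders:

```python
# Hypothetical PID sketch for the 0-5 V SCR control signal.
import time
import nidaqmx

KP, KI, KD = 0.002, 0.001, 0.0  # placeholder gains; must be tuned
DT = 0.01                       # 100 Hz software-timed loop (assumed)

with nidaqmx.Task() as rpm_task, nidaqmx.Task() as ao_task:
    # Counter measures tooth frequency; for a 60-tooth wheel,
    # RPM = Hz / 60 teeth * 60 s/min, so RPM equals Hz numerically.
    rpm_task.ci_channels.add_ci_freq_chan("Dev1/ctr0",
                                          min_val=1.0, max_val=5000.0)
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0",
                                            min_val=0.0, max_val=5.0)
    integral, prev_err = 0.0, 0.0
    setpoint = 1000.0           # starting target RPM (assumed)
    for _ in range(3000):       # a 30 s sweep at 100 Hz
        rpm = rpm_task.read()   # tooth Hz == RPM for 60 teeth
        err = setpoint - rpm
        integral += err * DT
        derivative = (err - prev_err) / DT
        out = KP * err + KI * integral + KD * derivative
        ao_task.write(min(5.0, max(0.0, out)))  # clamp to 0-5 V
        prev_err = err
        setpoint += 100.0 * DT  # ramp the target at 100 RPM/s
        time.sleep(DT)
```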

    CoastalMaineBird wrote:
    As it happens, I am involved with a large project that controls engines, as well.  I have two control loops, one controls dyno drive based on speed, and the other controls throttle based on torque.  Usually only one is in PID mode at a time, the other is open loop.  I have to run the engine thru a prescribed speed-torque cycle and measure how much exhaust pollution occurs.
    We do "Torque Maps" which are sweeps similar to what you describe. You put the throttle 100% ON, and ramp up the dyno speed, all the while capturing data.  You get a map of how much torque you can obtain at any given speed. 
    I do it on a PXI box to avoid timing issues.  When I last tried to use the host computer to do this, Windows was just not able to devote the time to getting the data right.  The PXI box guarantees that I can control the loop at 100 Hz (I have tested it up to 1000 Hz).
    Anyway, to your specific question: 
    I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples. 
    The counters are general-purpose, so they can be configured in many different ways.
    The TIMING SOURCE is where the basic timebase comes from.  If you're measuring a high-frequency signal, you want a higher-frequency timing source (so you have resolution to spare).  If you're measuring a low-frequency signal, you want a low-freq timebase (so you don't run out of counter bits).  If your max input is 3600 Hz (277 uSec), then a 1 MHz timebase (1 uSec) will give you  one part in 277 resolution WORST CASE.  A 10-MHz timebase will be 10 times as good.
    I don't know where you are getting the TIMING RATE and # SAMPLES inputs, can you explain? 
    That's a very interesting project you are working on, certainly a lot more involved and impressive.  I think I saw a few screenshots of the LabVIEW interface on your website and wow!...that is really nice.  The emissions analysis equipment can get very pricey, but I can't think of a more timely test to be directly involved in at this point in time.
    I briefly researched standalone LabVIEW system options and real-time capabilities.  I am hoping that the barebones nature and low operation count of the VI will allow a Windows host PC to be sufficient for controlling the process.  This is largely an economic motivation to keep the overall project cost down.  If it turns out that PXI hardware is necessary, it will also be possible to look down that path at that point in time.
    When I first looked at LabVIEW DAQmx reference materials, I suspected I would need to go beyond the DAQ Assistant and break DAQmx tasks down into their lower-level start, read, write, stop, and clear components.  After reading more and working with additional LabVIEW PID examples, it looks like I may be able to stick with DAQ Assistant sub-VIs.  I was confused at the time by the different DAQmx timing components and chose to begin with a question on the counter input.  Today, a DAQ Assistant sub-VI seems to be handling the counter input well enough.
    I most likely need to focus on PID parameter tuning at this time.  If it turns out that the timing configuration of the counter RPM input is preventing the system from working as required to perform a power sweep, or follow a Torque Map like you mentioned, then I will need to revisit it.  For now I'm just going to be happy that its portion of the system seems to be working.  
    Message Edited by el bob on 03-24-2009 01:45 PM
