DAQ input resolution

I am using the 12-bit 6115 DAQ board to acquire noisy, low-voltage signals (less than 100 mV). I have set up a voltage range in MAX of +/-0.5 V. With 12 bits, in a perfect world this should give me quantization steps of about 0.25 mV.
My problem: when I measure a signal around 0 V with noise on it, it fluctuates between only 2 or 3 different voltages, all well more than 0.25 mV apart (more like 2-5 mV). I understand there are most likely other sources of error at work here, but with a noisy signal I would expect to see much more variation than this.
I have set up voltage ranges in MAX. Is it necessary to do the same in the Traditional DAQ AI Config? Is there something else I'm missing? Or is this the best I can hope for?
BC

Hello BC. Thank you for contacting National Instruments. I took a look at the specifications for the PCI-6115. The accuracy is 0.71 mV at the +/-0.5 V input range. At +/-0.2 V, the accuracy improves to 0.39 mV, which might work better for your low-level signal.
There are two things I would like you to try to improve your signal. First, increase your scan rate; you might not be taking in enough points. Second, make sure the Data Mode is Continuous: if it is in Strip Chart Mode, the data can look skewed. If these settings don't improve the data, please check your configuration.
Once you get a good signal in MAX, use the same setup in LabVIEW. Unless you are scaling your data, you should set the same input limits in AI Config.vi; this will improve the accuracy of the signal. I hope this answers all of your questions. Have a great day!
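If you want to see the same idea in a text-based API, here is a minimal sketch using the nidaqmx Python package (the device/channel name "Dev1/ai0" is a placeholder; in your Traditional DAQ setup, the input limits input of AI Config.vi plays the same role):

    import nidaqmx

    with nidaqmx.Task() as task:
        # Requesting +/-0.2 V instead of +/-0.5 V lets the driver pick the
        # tightest matching hardware range, improving absolute accuracy
        # (0.39 mV vs 0.71 mV per the PCI-6115 specs quoted above).
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0",
                                             min_val=-0.2, max_val=0.2)
        data = task.read(number_of_samples_per_channel=1000)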
Marni S.
National Instruments

Similar Messages

  • DAQ Input Signal kept rising and stopped at Max Voltage when run with nothing attached

    I just installed a PCI-6229 card and a BNC-2110 (LabVIEW 8.6). When I start the DAQ input, the signal keeps rising and stops at the max voltage (10 V) when run with nothing attached (instead of fluctuating around 0). I'm just wondering what's wrong with it and how to solve it. The terminal configuration was "Differential".
    I just created an input DAQ task with a scope and a while loop to look at it. Please let me know if I did anything wrong.
    Thanks

    I just re-read your original post. Why do you have the terminal setting on differential? This sounds like a single-ended setup to me. Change the terminal setting to RSE. With a differential setting, you need to connect one source to AI1 and another source to AI9; the difference between the two will be reported. With RSE, only AI1 matters, and the voltage with respect to ground will be reported. Make sure you have your AO ground tied to your AI ground.
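    The same distinction in a text-based API, as a minimal sketch with the nidaqmx Python package (device and channel names are placeholders): the terminal configuration is chosen when the AI channel is created.

        import nidaqmx
        from nidaqmx.constants import TerminalConfiguration

        with nidaqmx.Task() as task:
            # RSE: AI1 is measured with respect to AI GND, one wire per signal.
            # For differential you would wire the AI1/AI9 pair and pass the
            # differential member of TerminalConfiguration instead.
            task.ai_channels.add_ai_voltage_chan(
                "Dev1/ai1",
                terminal_config=TerminalConfiguration.RSE,
                min_val=-10.0, max_val=10.0)
            print(task.read())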
    - tbob
    Inventor of the WORM Global

  • 55TL515U Keeps Hunting for Input Resolution

    I am trying to plug a Microsoft Surface into my TV to use it as a computer monitor. When I do this, the TV continuously hunts for the resolution and refresh rate. It shows the resolution at the top of the screen (like when you first plug something in), but it keeps changing and never settles on one or shows the picture.
    I have connected the Surface to a different TV with the same HDMI cable and it worked just fine.  I have also used the TV as a monitor for an older computer with the VGA input and had no problems.
    Any suggestions?

    If you can't solve your issue and no one in the community can help, contact support to speak with a tech support agent at (800) 631-3811.
    - Peter

  • How can I protect a DAQ input against overvoltage?

    I have an NI 4472 DAQ together with an NI 2501 multiplexer. How can I protect the input ports against overvoltage?

    You could use some Zener diodes or varistors. The 2501 and the 4472 only provide a voltage range of 10 volts. If overvoltage is a concern, the 445x provides a voltage range of 42 volts and the 2503 a range of 30 volts, which would allow for more lenient voltage variations.
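    As a rough worked example (component values assumed for illustration, not taken from this thread): with a 10 kOhm series resistor ahead of a 10 V Zener clamp, a 50 V fault drives (50 V - 10 V) / 10 kOhm = 4 mA through the Zener, about 40 mW of dissipation, well within a small Zener's rating, while in-range signals see only the resistor's tiny drop into the board's high-impedance input.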

  • DAQ input protection (overvoltage & overcurrent) question

    Hello everyone,
    I'm designing a DAQ board as my thesis work.
    I have several questions about input protection. I have searched and searched and found only two circuits: a crowbar, and one using two Zeners with a PTC resettable fuse or an ordinary fuse.
    My question is whether there are any other circuits that can protect inputs.
    The measurement limits are +/-20 V with a maximum frequency of 100 kHz. At the input there will be an INA163 instrumentation amplifier, and then the rest of the circuit. I have to protect these inputs against 230 Vac.
    If someone knows any other method of doing this, please reply. I'm leaning toward the PTC + diodes.
    Thank you

    That seems a good approach, to keep it simple. About the resistors, I don't know yet whether I'm going to use SMD parts or through-hole; I have to see if there is enough space on the PCB. If I choose SMD, I will probably have to connect several in series as you said (230 V SMD resistors with those values are a bit hard to find; I've checked Farnell), but I will test it.
    About the Zener: when choosing it, it should also withstand 230 V of directly applied voltage, right?
    I've attached a file with the circuit. As I intend to measure differential signals with the INA163, I think this circuit is correct using your suggestion.
    Now about the PTC for the overcurrent protection. If I'm not wrong, there are three parameters that must be taken into account:
    - Holding current (I think this is the normal working current)
    - Tripping current (at this value the PTC cuts the current going into the circuit)
    - Operating voltage (I think I must choose the 230 V version)
    I've set the holding current to 50 mA with a tripping current of 100 mA (I have no idea if this is OK). The maximum input current of the INA163 is 10 mA, and the resistive divider will also limit the input current, but I would like some more protection.
    Should I put one PTC in each signal branch (Vin+ and Vin-)?
    Attachments:
    circuit.png (17 KB)

  • DAQ Input

    Hi all,
    I am using a USB-6008 DAQ configured for digital input on port 0. I would like to monitor the input and generate an event when it changes (user event / value signaling). How can I do this? Please give me any advice or references.
    Sasi.
    Certified LabVIEW Associate Developer
    If you can DREAM it, You can DO it - Walt Disney

    Hi,
    Value signaling gives me ridiculous results, so I went to a producer/consumer design. Even so, I still have a small problem. How can I solve this?
    Sasi.
    Certified LabVIEW Associate Developer
    If you can DREAM it, You can DO it - Walt Disney
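    For reference, the USB-6008 has no hardware change detection, so a common workaround is a software-timed poll that fires your handler when the line changes. A minimal sketch with the nidaqmx Python package (Dev1/port0/line0 is a placeholder; in LabVIEW the equivalent is a polling loop that generates a user event):

        import time
        import nidaqmx

        def on_change(new_value):
            print("input changed to", new_value)   # fire your user event here

        with nidaqmx.Task() as task:
            task.di_channels.add_di_chan("Dev1/port0/line0")
            last = task.read()                     # single on-demand read
            while True:
                value = task.read()
                if value != last:                  # software edge detection
                    on_change(value)
                    last = value
                time.sleep(0.01)                   # ~100 Hz poll; tune as needed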

  • DAQ Input Signal into Array

    Hi,
    I have an input (voltage) signal coming into my DAQ Assistant (6008). The data displays well in a chart, but I also want to convert it into array format so that I can do further calculations. So basically, I would like to put a DAQ signal into an array. I'm having a lot of trouble doing so; please help.
    thank you
    Nick

    Hello Nick,
    You can use the function "From DDT" in the Functions palette, under Express -> Signal Manipulation.
    When you place it on the block diagram, a window opens where you can configure it: 1D Array of Scalars - Automatic.
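    For comparison, in a text-based API the read already arrives as an array. A minimal sketch with the nidaqmx Python package (placeholder channel name); the result can feed further calculations directly:

        import numpy as np
        import nidaqmx

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
            samples = np.array(task.read(number_of_samples_per_channel=100))
            print(samples.mean(), samples.max())   # downstream calculations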
    Best regards
    Nick_CH

  • Feature Request: More input resolution choices.

    It really would be a tremendous help to be able to select more 16:9 resolutions between 1920x1080 and 1280x720, as that is an absolutely enormous jump, particularly in terms of performance.
    Supporting 16:10 resolutions would be great too, since half of the people I work with have monitors in this aspect ratio, many at 1920x1200.

    Hmm,
    FMLE shows the resolutions of the detected capture device.
    If you need other resolutions or aspect ratios, you could use resizing and cropping.

  • Timed DAQ input

    Hello. I'm trying to program my DIO-96 board using the NI-DAQ software in C++. I need to perform data transfers at 1 kHz, each consisting of one write and one read cycle. Does anybody know how this is accomplished?


  • Inputs read from DAQ are overwritten

    Hey there
    I have a DAQ input being logged to a spreadsheet file.
    The DAQ examples tell me the read is supposed to have a while loop around it, and I can't get it to run without one, so okay.
    But my main problem is that this means it overwrites my written file each time the while loop repeats.
    It also asks me to choose the file to write to multiple times.
    How would I go about fixing this?
    Thank you

    Yes, you can convert a numeric to a string; check the attached VI. I recommend you go through basic LabVIEW materials and also play with the NI examples that come with LabVIEW. Remember not to put file I/O like the attached example in the same loop as your data acquisition; always use separate loops.
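    The overwrite fix has the same shape in any language: open the file once before the acquisition loop, write inside it, and close afterwards (in LabVIEW, put Open/Create/Replace File before the while loop and wire the reference through). A minimal Python nidaqmx sketch, with placeholder channel and file names:

        import nidaqmx

        with nidaqmx.Task() as task, open("log.csv", "w") as f:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
            for _ in range(1000):            # stands in for the while loop
                value = task.read()          # one on-demand sample
                f.write(f"{value}\n")        # append; the file is never reopened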
    The best solution is the one you find it by yourself
    Attachments:
    Write2File.vi (19 KB)

  • I can't trigger my E Series board with PFI0 because its resolution is too low

    Hi All,
    I am using LabVIEW 7.0 and DAQmx to configure a 6062E DAQ card. I am using the DAC output to apply small voltage steps (as low as 10 mV) to a test system. I want to trigger the acquisition of a separate voltage signal on an AI channel whenever these small voltage steps are applied.
    To do this, I have sent my DAC output to the PFI0 trigger channel and am acquiring my measured signal on AI1. My VI is essentially the Acq&Graph Voltage-Int Clk-HW Trig Restarts.vi, only with the DAQmx Start Trigger changed from Analog Edge to Reference Analog Edge so I can measure 200 pre-samples.
    The VI works beautifully when my DAC output changes are quite large (say 100 mV). If I go to smaller pulses, the trigger never fires. I understand that PFI0 has lower input resolution (usually 4 bits lower than the AI channels), so I think this is my problem.
    I understand that I can use an AI channel as my trigger channel, but in that case, because I am using an E Series board, I can only measure one channel (my trigger channel) and still have a Reference trigger. Is there a way to collect my data with pre-samples using these small triggering steps?
    Ian
    Ian

    Hi Ian,
    I just came across this thread. A while back I wrote an application (Utilizing a Software Circular Buffer for Data Acquisition - LV 7.1) to do exactly what Laura was talking about, but without taking too much processor time.
    The key to everything is to store the most recent data in a software circular buffer. It works well even when running an application for hours at a time, because it only retains the last N seconds' worth of data, and no arrays are being built, so memory usage stays fairly low. You can then look at the data within the circular buffer, determine some sort of condition, and then view the last Y seconds' worth of data on whatever row of data you want.
    It's slightly complicated, but with a little work you can get it to do what you are trying to do. This may help you find a solution between now and when you can get your M Series board.
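    The circular-buffer idea, sketched in Python for illustration (the attached LLB implements it in LabVIEW): keep only the most recent N samples so memory stays bounded no matter how long the acquisition runs.

        from collections import deque

        N = 10_000                     # retain the last N samples
        buffer = deque(maxlen=N)       # oldest samples fall off automatically

        def on_new_samples(samples):
            buffer.extend(samples)     # no array growth, bounded memory
            # test a trigger condition against the recent history:
            if buffer and max(buffer) > 0.1:
                recent = list(buffer)  # snapshot of the last N samples
                # ... analyze or display `recent` ...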
    Regards,
    Otis
    Training and Certification
    Product Support Engineer
    National Instruments
    Attachments:
    DAQmx-ReadFromCircularBuffer.llb (209 KB)

  • DAQ questions – Urgent!

    Hi everybody,
    I'm a beginner in the field, so I need some advice from experienced DAQ users:
    1. What is the definition of 'kilo-Symbols/s', and why is the rate unit not the hertz?
    2. What NI board should I choose if I have these main requirements:
    - at least 4 channels (simultaneous acquisition) - voltage
    - at least 2000 readings per second
    - as flexible as possible for future use (something like output channels, internal timers, etc.)
    - to be used with normal thermocouples (K type) and accelerometers
    3. What is (are) the most important thing(s) related to a DAQ board? (generally speaking)
    Thank you in advance.
    Ela

    My $.02:
    1. Data acquisition boards are specified in samples per second. Nyquist says that your sample rate must be at least twice the highest frequency you are measuring to accurately reproduce the signal; for example, capturing a 500 Hz component faithfully requires at least 1000 samples/s.
    2. If you truly need simultaneous sampling on 4 channels, NI only has the 6110. Unless the phase difference is really that critical, go with one of the E Series boards. Some other things you want to look at are input resolution (number of bits) and input range. My recommendation would be to contact your local NI sales engineer or call NI directly.

  • Multiple USB DAQ cards for strain gauge calibration

    Hi guys,
    I am new to LabVIEW and trying to build a DAQ system using multiple USB cards. My application is to calibrate an array of 36 strain gauges. I know there are cards with 40 analog inputs (I use a single-ended input from each amplifier), but the budget is tight and my boss wants to use multiple cheap cards. I already have an NI 6009 and need 4 more NI 6008s to get a total of 40 inputs (resolution is not a problem). I've been told that the problem with many of these USB cards is that they can't synchronize, but I think my application doesn't need the cards synchronized. I only want to read all the ports at approximately the same time, so software timing will do. Is that correct? Can I use one VI to read all these cards?
    Another thing is that my computer has only 4 USB ports. Can I use a USB hub, or should I use an adapter?
    Is there any other solution? Any suggestions are welcome.

    Hi Xiao,
    "Synchronization" can mean anything from devices on opposite sides of the world operating on a clock disciplined to GPS, to starting two devices at roughly the same time (i.e., within a few milliseconds). The USB-6008 can accept a digital trigger, so you could make all of the USB-6008s start on the same trigger. The lack of synchronizing ability "the guy" mentioned most likely refers to this device's inability to share a sample clock: although each device starts at the same time, each device's clock will be slightly off from the others, and with time this error accumulates and can become significant. If that doesn't sound like a problem, and starting all the devices at about the same time is fine, then your approach seems reasonable.
    You absolutely can use one VI to read all of these cards; you would have to create a separate task for each, but they can all live in one VI.
    In answer to your last question: if you do use a hub, a powered USB hub would be better; however, I would recommend a true PCI or PCIe USB card in lieu of a hub. USB hub quality varies greatly between manufacturers, and unfortunately I don't have a specific hub to recommend.
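    A sketch of the shared-trigger idea with the nidaqmx Python package (device names and the PFI0 wiring are placeholders; check trigger support against your device's documentation): one task per USB-6008, each armed to start on the same external digital edge.

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        tasks = []
        for dev in ("Dev1", "Dev2", "Dev3", "Dev4", "Dev5"):
            t = nidaqmx.Task()
            t.ai_channels.add_ai_voltage_chan(f"{dev}/ai0")
            t.timing.cfg_samp_clk_timing(
                1000, sample_mode=AcquisitionType.CONTINUOUS)
            t.triggers.start_trigger.cfg_dig_edge_start_trig(f"/{dev}/PFI0")
            t.start()              # armed; acquisition begins on the trigger edge
            tasks.append(t)
        # pulse the common line wired to every PFI0, read from each task,
        # then t.close() each task when done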
    Matt
    Applications Engineer
    National Instruments

  • Engine Dynamometer Brake Control, DAQ & LabVIEW PID

    Hi all, I am in the middle of a project to design, build and test a controller for an eddy current engine dynamometer.  I have an idea of how the inputs, outputs and overall process go, but am not sure how to best implement the necessary features in LabVIEW.  I have access to a NI USB-6211 DAQ, and a PC with LabVIEW 8.6, DAQmx drivers and the PID toolkit installed.
    On the electrical and mechanical side, an SCR firing board takes a 0-5VDC analog control signal to vary the amount of current passed through 380V three-phase electrical lines hooked into a large field coil.  Varying the input 0-5VDC signal results in a directly related variation of input current into the field coil, which in turn affects the strength of the magnetic field generated by the coil.  A large ferromagnetic rotor spins concentrically within the coil.  If the magnetic field increases, eddy currents are induced in the rotor causing it to slow down and heat up, and vice versa.  The engine-under-test is bolted to the large rotor and is put under load by the effects of the induced magnetic field and eddy currents.
    The plan is to have LabVIEW manage the 0-5VDC SCR firing board control signal.  The dynamometer currently has manual rotary knob controls, but the types of tests that are currently possible are limited.  The goal of the overall project is to produce "dyno sheets," plots of engine torque and horsepower over the motor's usable RPM range.  The problem, and motivation for this project, is that the manual controls cannot provide repeatable, precise measurements necessary for "power sweep" tests used to produce dyno sheets.
    Power sweep tests are used by all engine and chassis dynamometers to gather an evenly distributed collection of data across the engine's usable RPM range. The idea is that the engine should be forced to accelerate its RPM at the same rate from just off-idle to the rev limit. Bolted to a dyno and given its druthers, most engines will accelerate more slowly off-idle and more quickly in their upper RPM power bands. Load must be controlled so that the engine can spin as freely as possible low in the RPM range, and be forced to maintain constant acceleration as it tries to pull away in the upper RPM range. Human, manual control of rotary knobs can provide a respectable effort in this situation, but the problem becomes very apparent when comparing back-to-back, "identical" tests on the same engine with the same operator. Repeatability of torque and power measurements is very important to understanding how distinct changes to the engine's mechanical and fluid systems affect its torque output, along with other symptoms.
    I hope the background is helpful.
    There are RPM and Torque inputs into LabVIEW for the engine under test.  In the design stage, I figured I would be able to implement a PID controller in LabVIEW to vary the SCR firing board's 0-5VDC control signal.  The PID loop would control the 0-5VDC signal so as to allow the RPM of the engine-under-test to accelerate as closely as possible to an operator-chosen rate.  The USB-6211 DAQ has two analog outputs, one of which can be used for the 0-5VDC control signal.  The DAQ also has two digital counter circuits.  One of them is used for counting and determining the continually changing frequency of a TTL pulse train driven by engine-under-test RPM.  Lastly, one of eight analog inputs is used to measure a 0-5VDC analog input signal from a strain gage signal conditioner indirectly measuring engine-under-test torque output.
    I worked with LabVIEW as a student in school, but never attempted to design VI's from scratch until now.  I spent the last week or so practicing with the DAQmx Assistant and later the broken-out DAQmx LabVIEW code for bench-testing the counter and analog inputs for RPM and Torque.  I plan to begin experimenting with PID controls this week, but I have stumbled into a few trouble spots with the DAQ input code already.
    As of right now, it seems that the PID control loop will only use RPM data, not engine torque data. I would like to make sure that the sampling settings provide just the right amount of coverage, without using more DAQ or PC resources than necessary. I figure this will help ensure the sampling process and controller run as close to real time as possible without relatively large-scale changes to the system. Due to mechanical limitations of the dynamometer, the engines under test will never exceed 3600 RPM. A variable reluctance sensor is positioned close to a 60-tooth trigger wheel bolted to the dyno's rotating assembly. The VR waveform is passed through an LM1815-based signal conditioning circuit to produce a TTL pulse train. This digital signal is then piped into one of the counter inputs on the USB-6211 DAQ.
    (3600 Revolutions per Minute * 60 Teeth per Revolution) / 60 Seconds per Minute = 3600 Teeth per Second (Hz)
    The maximum frequency of the RPM signal will be 3600Hz.  I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples.  I have the book "LabVIEW for Everyone," and read Chapter 11 on LabVIEW with DAQmx, but I had trouble figuring out exactly how these three fields affect a Counter input, as compared to the examples in the book covering analog inputs.  If it's not too much trouble, could anyone shed any light on this?
    In case it's interesting, here are some pictures of the engine dynamometer and some of the older data equipment that is being replaced.
    Engine Dyno Pictures
    Thank you for any help!
    Rob

    CoastalMaineBird wrote:
    As it happens, I am involved with a large project that controls engines, as well.  I have two control loops, one controls dyno drive based on speed, and the other controls throttle based on torque.  Usually only one is in PID mode at a time, the other is open loop.  I have to run the engine thru a prescribed speed-torque cycle and measure how much exhaust pollution occurs.
    We do "Torque Maps" which are sweeps similar to what you describe. You put the throttle 100% ON, and ramp up the dyno speed, all the while capturing data.  You get a map of how much torque you can obtain at any given speed. 
    I do it on a PXI box to avoid timing issues.  When I last tried to use the host computer to do this, Windows was just not able to devote the time to getting the data right.  The PXI box guarantees that I can control the loop at 100 Hz (I have tested it up to 1000 Hz).
    Anyway, to your specific question: 
    I started to try different DAQmx Timing settings. It seems the three main options are Timing Source, Timing Rate and Number of Samples. 
    The counters are general-purpose, so they can be configured in many different ways.
    The TIMING SOURCE is where the basic timebase comes from.  If you're measuring a high-frequency signal, you want a higher-frequency timing source (so you have resolution to spare).  If you're measuring a low-frequency signal, you want a low-freq timebase (so you don't run out of counter bits).  If your max input is 3600 Hz (277 uSec), then a 1 MHz timebase (1 uSec) will give you  one part in 277 resolution WORST CASE.  A 10-MHz timebase will be 10 times as good.
    I don't know where you are getting the TIMING RATE and # SAMPLES inputs, can you explain? 
    That's a very interesting project you are working on, certainly a lot more involved and impressive.  I think I saw a few screenshots of the LabVIEW interface on your website and wow!...that is really nice.  The emissions analysis equipment can get very pricey, but I can't think of a more timely test to be directly involved in at this point in time.
    I briefly researched standalone LabVIEW system options and real-time capabilities.  I am hoping that the barebones nature and low operation count of the VI will allow a Windows host PC to be sufficient for controlling the process.  This is largely an economic motivation to keep the overall project cost down.  If it turns out that PXI hardware is necessary, it will also be possible to look down that path at that point in time.
    When I first looked at LabVIEW DAQmx reference materials, I suspected I would need to go beyond the DAQ Assistant and break DAQmx tasks down into their lower-level start, read, write, stop, and clear components. After reading more and working with additional LabVIEW PID examples, it looks like I may be able to stick with DAQ Assistant sub-VIs. I was confused at the time by the different DAQmx timing components and chose to begin with a question on the counter input. Today, a DAQ Assistant sub-VI seems to be handling the counter input well enough.
    I most likely need to focus on PID parameter tuning at this time.  If it turns out that the timing configuration of the counter RPM input is preventing the system from working as required to perform a power sweep, or follow a Torque Map like you mentioned, then I will need to revisit it.  For now I'm just going to be happy that its portion of the system seems to be working.  
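    To make the counter-timebase discussion above concrete, here is a minimal buffered frequency measurement with the nidaqmx Python package (device/counter names are placeholders; min_val/max_val bound the expected frequency, which lets the driver pick a suitable internal timebase along the lines CoastalMaineBird describes):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        with nidaqmx.Task() as task:
            task.ci_channels.add_ci_freq_chan(
                "Dev1/ctr0", min_val=10.0, max_val=4000.0)  # tach tops out at 3600 Hz
            task.timing.cfg_implicit_timing(
                sample_mode=AcquisitionType.CONTINUOUS)
            task.start()
            freqs = task.read(number_of_samples_per_channel=10)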

  • Image Resolution Question 150dpi vs 300dpi in Final Layout

    In the final offset color printing of a file, does it make a difference whether the images in the document are 150 dpi or 300 dpi? I know they shouldn't be less than 150, but does making them higher dpi make them really better? Or does it just make the file larger? If you think it does make the image better, please let me know why. I'd be particularly interested to hear from someone who works with a printing house. Or if there is something on Adobe's website addressing this issue, please add the link.
    Thank you,
    Marilyn

    Yes, it does make a difference.
    These numbers do not come out of the blue. A typical laser photosetter has an output resolution of 2400 dpi [*]. Each of the halftone dots that make up halftone images can be one of 256 values of gray (a hardcoded PostScript limitation). That would make the absolute minimum size of each dot 16 x 16 output pixels. 2400 divided by 16 is 150, hence, you can fit 150 complete halftone dots across an inch.
    However, the calculated position of the halftone dots does not take the photosetter's actual halftone positions into account (your image may not start exactly at the internal start of a halftone dot). Each of your single dots may be broken in two because it is not "aligned" with the dots the output device produces. So on every relatively sharp color boundary between two input pixels (which are output as exactly two halftone dots), the color of each dot is rounded toward both the left and right input pixels (and the upper and lower ones).
    Doubling the input resolution means that more than a single input pixel is taken into account per halftone dot (of which at least one will be entirely inside the dot), smoothing things out.
    There is also an upper limit beyond which it no longer matters whether you throw in more input pixels per halftone dot -- the typical upper useful value used by InDesign and Distiller is 450 dpi; images above this are downsampled to 300 dpi, so (apparently) the difference between 300 and 450 dpi is negligible.
    [*] Modern machines may advertise a higher resolution, even upwards of 3000 dpi, but it's at the operator's discretion to switch back to a lower resolution to save time and memory -- and no skilled operator will (or should) go below 2400 dpi for halftone images.
