Voltage Range E Series DAQ

I have a PCI-MIO-16E DAQ. My input voltage range is 0.7 - 1.1 volts. The manual seems to suggest that I can only choose voltage ranges with discrete values, e.g. 0-1, 0-2, 0-5, etc.
The NI Measurement & Automation Explorer allows me to enter 0.7 - 1.1 volts, however. So my question is: will the resolution be based on dividing the range 0.7 - 1.1 volts into 4096 parts, or on dividing 0 - 2 volts into 4096 parts?
Thanks,
Shawn Clark

The actual input ranges of the DAQ board are fixed. I don't know the specifics of your board, but the ranges might be +/-10, +/-5, +/-1, 0-1, 0-5, 0-10, etc. When you specify a minimum and maximum value, NI-DAQ uses that information to select the most appropriate hardware range. So, in your case, if the board supports 0-2 volts, that is the range that will be selected, and the resolution is that 2 V span divided into 4096 parts. If the DAQ board does not have a 0-2 range and only has a 0-5 range, then the resolution is based on the 0-5 V span instead; the limits you enter never change the step size, only which fixed range the driver picks.
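As an illustration of that coercion, here is a minimal NI-DAQmx C sketch (the device name "Dev1" and a 12-bit board are assumptions, and the PCI-MIO-16E itself shipped with the legacy NI-DAQ driver, so treat this as the modern equivalent): you request 0.7-1.1 V, then read back the range the driver actually selected and derive the resolution from that span.

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64 rngLow = 0.0, rngHigh = 0.0;

    DAQmxCreateTask("", &task);
    /* Ask for 0.7-1.1 V; the driver coerces this to the nearest fixed hardware range. */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                             0.7, 1.1, DAQmx_Val_Volts, NULL);

    /* Read back the hardware range the driver actually picked. */
    DAQmxGetAIRngLow(task, "Dev1/ai0", &rngLow);
    DAQmxGetAIRngHigh(task, "Dev1/ai0", &rngHigh);

    /* On a 12-bit board the step size is the full selected range over 4096 codes. */
    printf("Selected range: %g V to %g V, resolution ~ %g V\n",
           rngLow, rngHigh, (rngHigh - rngLow) / 4096.0);

    DAQmxClearTask(task);
    return 0;
}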

Similar Messages

  • How do i read out the actual voltage range of the daq board as calculated by labview based on my input voltage range

    I am somewhat new to DAQ, so I hope I describe this well enough, but I have encountered a requirement for my VI that I am not sure how to satisfy from LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and the software then calculates the actual voltage range it will be measuring to ensure that you get the best resolution your range allows, given the voltage step size, etc. of the equipment. I know how to extract the voltage limits that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution entitled "Programmatically Determining the Gain Used by a DAQ Device". Here's the URL:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments
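    Following up on that KnowledgeBase entry, a rough text-language equivalent (NI-DAQmx C API; the device name is a placeholder and the AI.Gain query is not supported on every board) looks like this:

    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 rngLow = 0.0, rngHigh = 0.0, gain = 0.0;

        DAQmxCreateTask("", &task);
        /* Request tight limits; the driver picks the closest hardware range/gain. */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -0.1, 0.1, DAQmx_Val_Volts, NULL);

        /* Usable input range the driver actually configured for this channel. */
        DAQmxGetAIRngLow(task, "Dev1/ai0", &rngLow);
        DAQmxGetAIRngHigh(task, "Dev1/ai0", &rngHigh);
        /* Gain applied by the device, where the AI.Gain property is supported. */
        DAQmxGetAIGain(task, "Dev1/ai0", &gain);

        printf("Actual range: %g V to %g V, gain: %g\n", rngLow, rngHigh, gain);
        DAQmxClearTask(task);
        return 0;
    }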

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input.  My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time because the DAQ assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section set to run only if the while loop iteration is 0. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and all subsequent) iterations of the loop I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    BTW, this is continuous acquisition and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at another API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG ‏14 KB
    Sample_AIV.vi ‏91 KB

    First, if you want to get beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs. There are hundreds of examples in the Example Finder. You can even right-click on the DAQ Assistant and select "Open Front Panel". That will create a subVI that you can open to see what is going on behind the scenes. Do it. I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure how the settings are established or maintained in each section of that subVI).
    The second problem is that you have a bit of a race condition on iteration 0. Those two DAQ property nodes run at the same time, so when you read AI.Max, it may happen before or after AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.
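    For reference, here is what "configure the range before the task starts" looks like with plain DAQmx calls instead of the DAQ Assistant. This is only a sketch in the C API (the SCXI channel name "SC1Mod1/ai0" and the rates are placeholders); the same set-before-start ordering applies to the G property nodes:

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "SC1Mod1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -5.0, 5.0, DAQmx_Val_Volts, NULL);

        /* Apply the user-selected range BEFORE the task starts; the 1102C
           cannot change it on a running task. */
        DAQmxSetAIMin(task, "SC1Mod1/ai0", -1.0);
        DAQmxSetAIMax(task, "SC1Mod1/ai0",  1.0);

        DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);
        DAQmxStartTask(task);
        /* ... read loop here; do not touch AI.Min/AI.Max while running ... */
        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }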

  • PCMCIA E series card Voltage Range

    In one of our applications we want to use an E-Series DAQ card for a single-channel analog input.
    The voltage range is from 0.01 mV to 2 V.
    Please let me know whether I can measure voltages over this range, and how.
    Please also let me know the minimum voltage we can measure using a PCMCIA E Series card.

    Vishal,
    The range on our current PCMCIA E Series boards is +/-10V. There are multiple example programs to measure analog voltages for common programming environments that will work with our PCMCIA boards. Hope this helps. Have a great day!

  • Using an MIO-16XE-10, I am trying to relate each gain setting to the resulting actual voltage range.

    I have tried to find the conversion table relating gain setting to resultant voltage range. I am a programmer, and am having trouble understanding the relationship, or better yet, finding an algorithm, so that I can programmatically set a gain based on the voltage returned by the previous measurement. The idea is to get the most reliable resistance measurement by ensuring that a measurement taken with a specific gain setting does not fall in the lower 1% of that gain's range.
    bookieb

    Hi Bill:
    Thanks for contacting National Instruments.
    The best way to use the most appropriate gain is to set the voltage range of the incoming signal and let the NI-DAQ driver choose the best gain to apply for that range.
    The DAQ card has some predefined gain settings which the NI-DAQ driver selects from depending on the input limits.
    To find out what input ranges correspond to what gain, please refer to Appendix A, Specifications, of the E Series User Manual. The table on page A-2 shows the relationship between the input signal ranges and the channel gain that the NI-DAQ driver chooses.
    I hope this helps. Please do not hesitate to contact me back if you need additional information.
    Regards,
    Bharat S
    Applications Engineer
    National Instruments
    Penny
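    Bharat's advice can also be turned into code. The sketch below implements the poster's idea with the NI-DAQmx C API (the MIO-16XE-10 itself is a legacy NI-DAQ board, and "Dev1" plus the 2x headroom factor are assumptions): take one reading, then re-specify the input limits just above that reading so the driver selects a tighter range/higher gain for the next measurement, keeping the signal out of the bottom of the range.

    #include <math.h>
    #include <NIDAQmx.h>

    /* One on-demand measurement with the given limits; returns volts. */
    static float64 read_once(float64 minV, float64 maxV)
    {
        TaskHandle task = 0;
        float64 value = 0.0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Diff,
                                 minV, maxV, DAQmx_Val_Volts, NULL);
        DAQmxReadAnalogScalarF64(task, 10.0, &value, NULL);
        DAQmxClearTask(task);
        return value;
    }

    int main(void)
    {
        /* First pass on the widest range, then tighten the limits so the
           driver picks a higher gain for the second pass. */
        float64 coarse = read_once(-10.0, 10.0);
        float64 limit  = fabs(coarse) * 2.0;   /* 2x headroom, arbitrary choice */
        float64 fine   = read_once(-limit, limit);
        (void)fine;
        return 0;
    }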

  • WLS 9205 setting input voltage range

    Hi folks. I need some assistance in setting the input voltage range for my WLS-9205. From the documents supplied with the DAQ device there appear to be three settings: +/-200 mV, +/-1 V, and +/-10 V. How can I create a dynamic menu in standard LabVIEW to alter the setting within a VI? I am not using FieldPoint or anything of the sort. DAQmx does not give me the option to select the range either.
    On the old devices you could alter settings using DIP switches, but there are no DIP switches on this device. Any advice would be greatly appreciated.
    Thanks,
    Daragh

    Here is the equivalent information for NI-DAQmx (since the WLS-9205 does not use LabVIEW FPGA):
    How Do I Set the Analog Input Gain using DAQmx?
    Using Min/Max or High/Low to Set NI-DAQmx Channel Gain
    Brad
    Brad Keryan
    NI R&D
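    Building on the linked documents, one way to drive those three ranges from a menu is simply to map each menu entry to min/max limits and (re)create the task with them, since the gain follows the limits you pass in. A minimal C-API sketch (the device name "Dev1" and the menu handling are placeholders):

    #include <NIDAQmx.h>

    /* Create an AI task whose limits select one of the 9205 ranges:
       0 -> +/-200 mV, 1 -> +/-1 V, 2 -> +/-10 V (menu index from the UI). */
    static TaskHandle make_task(int menuIndex)
    {
        static const float64 limits[3] = { 0.2, 1.0, 10.0 };
        TaskHandle task = 0;
        float64 lim = limits[menuIndex];

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                                 -lim, lim, DAQmx_Val_Volts, NULL);
        return task;
    }

    int main(void)
    {
        TaskHandle task = make_task(1);   /* e.g. the user picked +/-1 V */
        /* ... start, read, stop; recreate the task when the menu changes ... */
        DAQmxClearTask(task);
        return 0;
    }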

  • How to output sample and convert clocks to PFI lines of E-Series DAQ (DAQPad-6015)

    Hi,
    Can someone tell me how to output sample and convert clocks to PFI lines of E-Series DAQ (DAQPad-6015)?
    Thank you very much.
    Jack

    John --
    Windows is not an option for me. I like your idea of using a counter output -- it may be helpful as I am getting ramped up, but my application will eventually require both timer outputs.
    I have a legacy C application written for Macintosh, and I am in the process of moving it onto OS X. So my options are to use DAQmx Base, or write an in-kernel driver. I actually have already done the latter for 6024/6025 E-series boards (for another company); for this client I was hoping to use the DAQmx Base to allow an easy transition to M-series boards, without the cost of writing and supporting a low-level driver.
    The specific task I am doing is relatively straightforward. I record 2 channels of AI for a short period (usually about 250 ms.) and during this time I drive 2 external digital signals. Right now, I use the 2 timer outputs, which allows precise synchronization with the output and AI sampling.
    I appreciate your comments, and thanks in advance for any additional suggestions you can lob my way.
    --spg
    Scott Gillespie
    Applied Brain, Inc.
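    For anyone landing here with full NI-DAQmx (DAQmx Base may not expose this call), exporting the AI sample and convert clocks to PFI terminals of a board like the DAQPad-6015 can be sketched like this; the device name and PFI terminals are placeholders:

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0:1", "", DAQmx_Val_RSE,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);

        /* Route the sample clock and the AI convert clock to PFI pins. */
        DAQmxExportSignal(task, DAQmx_Val_SampleClock,    "/Dev1/PFI7");
        DAQmxExportSignal(task, DAQmx_Val_AIConvertClock, "/Dev1/PFI2");

        DAQmxStartTask(task);
        /* ... acquire ... */
        DAQmxClearTask(task);
        return 0;
    }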

  • Which type of ADC is used in M Series, S Series and C Series DAQ Cards?

    Hi
    I want to know which type of analog-to-digital converter (ADC) is used in M Series and S Series DAQ cards. I know that DSA cards and some C Series modules use delta-sigma ADCs. What about the ADC used in the non-delta-sigma C Series modules? Why is a delta-sigma ADC not used in M Series and S Series DAQ cards?
    Regards
    Samuel J

    Hi Samuel,
    The ADC type used is a successive approximation (SAR) ADC. You can refer to the following link for reference.
    http://digital.ni.com/public.nsf/websearch/32FD9AA817D0EBE68625708C005E1B26?OpenDocument
    Hope this helps.
    Thank You.

  • NI 9237 maximum voltage range accuracy

    Hello,
    I'd like to know how to read the NI 9237 datasheet for accuracy. On the web page I can read 0.038 mV/V, but from the datasheet it is more difficult to work out.
    I'm working with the MT 1241 load cell (OIML R60 C3) with a rated capacity of 150 kg. When I use an external excitation (10 V) I am not able to get a fixed value: I read 0.000xxxy V, where the x digits are stable and the y digit is not.

    Hi Fabio,
    you could try to calculate the correct maximum voltage range accuracy using the document attached below.
    Regards.
    Sabrina
    Attachments:
    DOCUMENT.jpg ‏274 KB
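    The usual form of that calculation is absolute accuracy = reading x gain error + range x offset error + noise, then converted to load via the cell's rated output. The small sketch below shows the arithmetic only; every numeric value is a placeholder you must replace with the figures from the attached document or the NI 9237 datasheet, and the 2 mV/V rated output is an assumed typical C3 cell value.

    #include <stdio.h>

    int main(void)
    {
        /* Placeholder spec values -- take the real ones from the datasheet. */
        const double gain_error   = 0.0005;   /* fraction of reading          */
        const double offset_error = 0.0005;   /* fraction of range            */
        const double range_mVV    = 25.0;     /* module range, mV/V           */
        const double noise_mVV    = 0.001;    /* rms noise, mV/V              */

        const double rated_output_mVV = 2.0;  /* assumed cell output at capacity */
        const double capacity_kg      = 150.0;
        const double reading_mVV      = 1.0;  /* example reading (half load)  */

        double acc_mVV = reading_mVV * gain_error
                       + range_mVV  * offset_error
                       + noise_mVV;
        /* Convert mV/V of error into kilograms via the cell's rated output. */
        double acc_kg = acc_mVV * capacity_kg / rated_output_mVV;

        printf("Accuracy at this reading: %.4f mV/V = %.3f kg\n", acc_mVV, acc_kg);
        return 0;
    }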

  • How can we change output pin for counter in X series daq

    Hello everyone,
    If this question has been asked before, forgive me and point me to that thread.
    I am using an X Series DAQ, the USB-6343. At one moment I want the counter output on a specific pin, and then, for another requirement, I want it on a different pin. Please show me a way to do this; I have tried to find something but had no luck.
    thanks & regards,

    Attached is my example (it provides both methods discussed). I used a test panel with the same device to count edges on PFI0. Methodology:
    1) Run Example and export to PFI0.
    2) Stop Example
    3) Change output terminal to PFI1
    4) Run Example
    Notes:
    For Method 1, I would continue to see counts until I stopped and started the test panel. Without digging into it too deeply, it appears as though the route is not released until the edge-count task is stopped. Stopping and starting this counter stopped it from seeing edges on PFI0.
    For Method 2, I stopped seeing edges immediately after stopping the CO Pulse Freq task.
    Hope that helps,
    Dan
    Attachments:
    DiffCounterOPModified.vi ‏23 KB
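    In text form, the "change the output terminal between runs" idea can be sketched with the C API like this (device and PFI names are placeholders; as noted above, the route is only released once the counter task is stopped and cleared):

    #include <NIDAQmx.h>

    /* Generate a continuous pulse train on ctr0, routed to the given PFI pin. */
    static void run_counter_on(const char *terminal)
    {
        TaskHandle task = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateCOPulseChanFreq(task, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                   DAQmx_Val_Low, 0.0, 1000.0, 0.5);
        /* Pick where the pulse train physically appears. */
        DAQmxSetCOPulseTerm(task, "Dev1/ctr0", terminal);
        DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);
        DAQmxStartTask(task);
        /* ... run for as long as this requirement needs ... */
        DAQmxStopTask(task);
        DAQmxClearTask(task);   /* releases the route to the terminal */
    }

    int main(void)
    {
        run_counter_on("/Dev1/PFI0");   /* first requirement  */
        run_counter_on("/Dev1/PFI1");   /* second requirement */
        return 0;
    }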

  • Temperatures corresponding to voltage ranges with the PXI-4351/TC-2190

    Using a PXI-4351 and TC-2190 with a K-type thermocouple, what temperature ranges do the voltage ranges (+/- 625 mV, 1.25 V, 2.5 V, 3.75 V, 7.5 V, and 15 V) correspond to? How much accuracy does each voltage range have? All I could find was that to get improved accuracy, use a range of 2.5 volts or less (document #1X0F7L3E). Also, will using several channels at vastly different temperatures (e.g. room temperature and several hundred degrees) affect which range I should select?

    The full temperature range of a K type thermocouple from -270C to 1370C represents a voltage range from -6.45mV to 54.8mV. If the only sensors you are using are thermocouples then you should choose the smallest range. The 4351 applies one gain setting to all channels, so the only reason you would want to use anything but the smallest range would be if you have other types of sensors with larger voltage swings. You can find voltages for any thermocouple at any temperature at the link below.
    Regards,
    Brent R.
    Applications Engineer
    National Instruments
    http://srdata.nist.gov/its90/menu/menu.html

  • Can we lock 80 MHz timebase to PXI_CLK10 on PXI-6220 M Series DAQ?

    I am using a PXI-6220 to measure the frequency/period of a 32768 Hz clock signal; accuracy is very important. I have provided the signal to measure on the gate input of Ctr 0 (PFI 9), and I am using the 80 MHz timebase. I want to phase-lock to the PXI backplane 10 MHz reference (PXI_CLK10), which in turn is locked to the 10 MHz reference via a PXI-5600 in Slot 2 (an external 10 MHz standard is connected). So far I don't see a way to lock the PLL to PXI_CLK10. Is that possible at all? See the image attached to this message.
    Attachments:
    M Series DAQ Question.png ‏62 KB

    Hello Abhatti,
    Based on the diagram that you have attached, the M Series card can phase-lock to a higher-accuracy clock such as PXI_Clk10.  The way to configure this with the DAQmx driver is to route the signal using the RefClk.Src attribute of the DAQmx Timing property node.  Once you place down this property node and select the RefClk.Src attribute, set PXI_Clk10 as your reference clock source.  This will discipline the 80 MHz timebase of the DAQ card to the 10 MHz reference clock of the PXI chassis.  Which chassis are you using?  Also, how have you PLL'ed the 10 MHz backplane clock to the PXI-5600 downconverter?
    Michael L.
    Applications Engineer
    National Instruments
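    In the C API the same RefClk.Src routing looks roughly like this (the slot/counter names, limits, and measurement method are assumptions; in LabVIEW it is the DAQmx Timing property node mentioned above):

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;

        DAQmxCreateTask("", &task);
        /* Measure the period of the 32768 Hz signal on ctr0 (gate on PFI9). */
        DAQmxCreateCIPeriodChan(task, "PXI1Slot3/ctr0", "", 1.0e-6, 1.0,
                                DAQmx_Val_Seconds, DAQmx_Val_Rising,
                                DAQmx_Val_LowFreq1Ctr, 0.0, 0, NULL);

        /* Discipline the board's 80 MHz timebase to the PXI backplane 10 MHz clock. */
        DAQmxSetRefClkSrc(task, "PXI_Clk10");
        DAQmxSetRefClkRate(task, 10000000.0);

        DAQmxStartTask(task);
        /* ... read period values ... */
        DAQmxClearTask(task);
        return 0;
    }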

  • Need to use quadrature encoder to trigger (RTSI) single point DAQ on 2 channels of E-Series DAQ, using 6602 NI-TIO for counting encoder pulses.

    This is for LV6i, W2000, all PCI equipment.
    Using a quadrature-measure position-VI, I get 7200 edges/rev from the encoder of my physical system. This equates to 0.05 degrees of angular displacement. This amounts to an angle stamp as opposed to a time stamp.
    I need each of these 7200 edges (source: 6602 NI-TIO) to trigger (using RTSI) the acquisition of a single sample from each of 2 channels on an E-Series DAQ board (maybe more channels later). I only need/want one rev (7200 samples per channel) of data for each run of the test. As I write this, I think I want pre-triggering and a little more than a rev of data, so there is a buffering step. Anyway, you get the idea.
    I need this angle stamp and the DAQ samples to be placed in an array and on the hard drive for graphing and other mathematical treatment, analysis, etc.
    I think there must be a way to use the quadrature output of the counter/timer as a scan clock for the DAQ board, but I haven't seen an example to guide me.
    It seems like all of the RTSI or other triggering examples I have seen trigger once to start a continuous scan, not a series of discrete samples repeated quickly. I am not sure how to fill an array with this data. Again, examples are for continuous sampling, not a series of discrete readings.
    Any hints on any part of this task will greatly appreciated. This is my first LV project.

    Sounds like a fairly ambitious first project!
    I assume your 7200 edges/rev come from an encoder with 2 channels in quadrature which each provide 1800 cycles/rev. You can clock in analog data at 1800 scans/rev with either of the two encoder channels, but will probably need an external quadrature decoder circuit to produce 7200 scans/rev. Either method can be done with screwdriver and wire or else by using another counter from the 6602 and the RTSI bus. Here are two approaches in detail, but you could mix-and-match as needed.
    Note also that if you can be sure that your reference encoder will be uni-directional, you wouldn't need to measure position -- position could be determined by the array index of the analog scan data. This would simplify things greatly.
    1800 scans/rev, screwdriver & wire
    Wire both encoder channels to your 6602 breakout box and configure your counter for the 4x quadrature option. Send a wire from one of the encoder channel connections at your 6602 breakout box to a PFI pin at your E-series board breakout box. Config the analog acquisition to use an external scan clock and specify the correct PFI pin -- there are built-in examples that will guide you. Now one edge of one encoder channel acts as a scan clock for your analog acquisition. Inside the 6602 breakout box, route the same signal to one of the default gate pins and configure your encoder counter gate to use that pin as its gate signal. Note that there will be a race condition governing whether the encoder value updates from the encoder inputs before or after the value is latched by the gate.
    7200 scans/rev, extra counter & RTSI
    Make sure you have a RTSI connector between your two acquisition boards inside your PC. Build a quadrature decoder circuit that will convert your two encoder channels into a clock and direction output. (Consider the LSI 7084 decoder chip or similar). Setup your "encoder" counter for buffered position measurement. Use "Counter Set Attribute" to define "up down" as "digital" (don't use it to define "encoder type"). The clock output goes to the counter SOURCE and the direction output goes to the counter UP_DOWN pin.
    Use "Adjacent Counters.vi" to identify the counter considered adjacent to your encoder counter. Configure it for "retriggerable pulse generation". Use "Counter Gate (NI-TIO).vi" to specify "other counter source" as the gating signal. Configure the output pulse specs to be short duration (make sure total of delay + pulse width is less than the minimum period of the incoming encoder clock signals). Use "Route Signal.vi" to send this counter's output onto the RTSI bus, say RTSI 0.
    Now configure the analog acq. to use RTSI 0 as its external scan clock. Also configure the encoder counter to use RTSI 0 as its gate signal. Voila! Now your quadrature decoder clock output acts as a scan clock for analog acquisition and a "gate" to buffer your encoder measurement. The short delay helps ensure that the clock updates the position measurement before the gate fires to latch the value.
    Respond if you need clearer explanation. There's a fair amount of decent info "out there" if you scour the online help and this website. Good luck!
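    The thread above predates DAQmx (LV6i / Traditional NI-DAQ), but the core of the first approach -- an encoder edge arriving on a PFI pin acting as the AI scan clock -- translates to a short DAQmx sketch like this (the device name, PFI pin, channel list, and expected rate are placeholders):

    #include <NIDAQmx.h>

    #define SAMPLES_PER_REV 7200

    int main(void)
    {
        TaskHandle ai = 0;
        float64 data[2 * SAMPLES_PER_REV];
        int32 read = 0;

        DAQmxCreateTask("", &ai);
        DAQmxCreateAIVoltageChan(ai, "Dev1/ai0:1", "", DAQmx_Val_Diff,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);

        /* Use the encoder/decoder clock arriving on PFI0 as the sample clock:
           one AI scan (both channels) per encoder edge, one rev = 7200 scans. */
        DAQmxCfgSampClkTiming(ai, "/Dev1/PFI0", 10000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, SAMPLES_PER_REV);

        DAQmxStartTask(ai);
        DAQmxReadAnalogF64(ai, SAMPLES_PER_REV, 60.0, DAQmx_Val_GroupByChannel,
                           data, 2 * SAMPLES_PER_REV, &read, NULL);
        DAQmxClearTask(ai);
        return 0;
    }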

  • Trouble with digital PWM using an M-series Daq

    Hello,
    I am trying to control 8 independent, digitally pulse-width-modulated thermoelectric Peltier modules using the 6221 DAQ. I have tried using the digital PWM for M Series DAQ example that is available, but I can only get the first line to output a signal. When different lines are selected using the drop-down menu, they produce no output signal. What am I doing wrong?
    Would this program (with some modifications) be capable of running 8 different lines? All 8 signals will run at the same frequency, with only the duty cycle needing to be changed independently, so I assume I will only need to use one counter?
    I have struggled with the example program for several hours and cannot figure out why only the first line will output a signal. It says it was posted only a few days ago, so I wonder if there is something wrong with it.
    Any help will be GREATLY appreciated because I am at a standstill until I get this resolved 
    Thank You
    -Gabe 

    Hey,
    I found an example which you could use as a starting point. It works with 8 lines.
    Christian
    Attachments:
    pwmwithmseries[1].vi ‏61 KB
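    As a point of comparison for the attached VI, a single hardware-timed PWM line on an M Series counter, with the duty cycle updated on the fly, can be sketched in the C API as below. Note the 6221 has only two counters, so 8 independent duty cycles cannot be produced this way with counters alone, which is presumably why the example drives digital lines instead. Device and counter names are placeholders:

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle pwm = 0;

        DAQmxCreateTask("", &pwm);
        /* 1 kHz pulse train on counter 0, starting at 50% duty cycle. */
        DAQmxCreateCOPulseChanFreq(pwm, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                   DAQmx_Val_Low, 0.0, 1000.0, 0.5);
        DAQmxCfgImplicitTiming(pwm, DAQmx_Val_ContSamps, 1000);
        DAQmxStartTask(pwm);

        /* Step the duty cycle while the pulse train keeps running;
           in a real program, pause between updates. */
        for (int i = 1; i <= 9; i++) {
            DAQmxWriteCtrFreqScalar(pwm, 0, 10.0, 1000.0, i * 0.1, NULL);
        }

        DAQmxClearTask(pwm);
        return 0;
    }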

  • Voltage range reached - but no error

    Hello!
    I have a problem. I need to do a Bode plot very fast, and I just want to
    autoset my NI-SCOPE device only when the maximum voltage of the range is
    reached. Or is there a possibility to set the voltage range manually?
    When the voltage reaches the maximum, the returned value is always the same.
    I use C++ with NI-SCOPE, no Measurement Studio!
    Thanks, Nick & Hannes

    Hello,
    you can configure your voltage range with the niScope_ConfigureVertical()
    function. This is described in this help file:
    http://digital.ni.com/manuals.nsf/websearch/998C51E969F4900486256D9F007103F2?OpenDocument&node=132090_US
    regards
    P. Kaltenstadler
    NI Germany
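    A minimal C/C++ fragment around that call (the resource name "Dev1" and the vertical settings are placeholders) could look like this:

    #include <niScope.h>

    int main(void)
    {
        ViSession vi = VI_NULL;

        niScope_init("Dev1", VI_TRUE, VI_TRUE, &vi);
        /* Fix the vertical range manually: 2 V span, 0 V offset, DC coupling,
           1x probe attenuation, channel enabled. */
        niScope_ConfigureVertical(vi, "0", 2.0, 0.0, NISCOPE_VAL_DC, 1.0, VI_TRUE);
        /* ... configure horizontal, initiate, fetch ... */
        niScope_close(vi);
        return 0;
    }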
