WLS 9205 setting input voltage range

Hi folks. I need some assistance in setting the input voltage range for my WLS 9205. From the documents supplied with the DAQ device, there appear to be three settings: +/-200 mV, +/-1 V, and +/-10 V. How can I create a dynamic menu in standard LabVIEW to alter these settings within a VI? I am not using FieldPoint or anything of the sort. DAQmx does not give me the option to select the range either.
On the old devices, you could alter setting using dip switches but there are no dip switches on this device. Any advice would be greatly appreciated.
Thanks,
Daragh

Here is the equivalent information for NI-DAQmx (since the WLS-9205 does not use LabVIEW FPGA):
How Do I Set the Analog Input Gain using DAQmx?
Using Min/Max or High/Low to Set NI-DAQmx Channel Gain
Brad
Brad Keryan
NI R&D
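The linked documents describe setting the gain indirectly: you pass minimum and maximum expected values, and the driver coerces them to the smallest supported hardware range. As a rough illustration only (not the actual driver code), the coercion logic looks something like this, using the three ranges from the question:

```python
# Illustrative sketch of how a DAQmx-style driver coerces a requested
# min/max to the smallest supported hardware range. The function name
# and structure are hypothetical; only the range values come from the
# WLS-9205 question above.

SUPPORTED_RANGES = [0.2, 1.0, 10.0]  # +/- ranges in volts

def coerce_range(requested_min, requested_max, ranges=SUPPORTED_RANGES):
    """Return the smallest symmetric range covering [requested_min, requested_max]."""
    needed = max(abs(requested_min), abs(requested_max))
    for r in sorted(ranges):
        if needed <= r:
            return (-r, r)
    raise ValueError(f"Requested limits exceed the largest range (+/-{max(ranges)} V)")

# Asking for +/-0.5 V selects the +/-1 V range; +/-0.2 V would be too small.
print(coerce_range(-0.5, 0.5))   # -> (-1.0, 1.0)
```

So rather than a dynamic menu of hardware settings, the usual approach is to expose min/max controls in the VI and let the driver pick the range.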

Similar Messages

  • How do i read out the actual voltage range of the daq board as calculated by labview based on my input voltage range

I am somewhat new to DAQ, so I hope I describe this well enough, but I have encountered a requirement for my VI that I am not sure how to obtain from LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and the software then calculates the actual voltage range it will be measuring, to ensure that you get the best resolution your range allows given the voltage step size, etc. of the equipment. I know how to extract the voltage input that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution entitled "Programmatically Determining the Gain Used by a DAQ Device". Here's the URL:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments

  • NI 5761 input voltage range

    Hello,
    What is the simplest way to change the input range for the NI 5761?  I've tried reading the CLIP reference under the help and looked at the Texas Instruments data sheet for the ADC, but it's not clear to me how to piece everything together.
    Thanks.

    Hi hht3lasers,
The NI 5761 does not have an adjustable input range. From the specifications ( http://www.ni.com/pdf/manuals/375509a.pdf ), you can see that the voltage range is specified only as 2.07 Vpk-pk in AC-coupled mode (page 13) and 1.23 Vpk-pk in DC-coupled mode (page 16). You can, however, switch the device between AC and DC coupling. Check out the NI 5761 folder in the LabVIEW Example Finder, as this is implemented in one of the shipping examples. From the Example Finder, navigate to Hardware Input and Output >> FlexRIO >> IO Modules >> NI 5761 >> NI 5761 Getting Started.lvproj.
    Regards,
    Jason L.
    Product Support Engineer
    National Instruments

  • PXI-6115 input voltage range configuration help

    I need to set the analogue inputs on my PXI-6115 to +/-42V, however using Measurement & Automation Explorer, under Traditional NIDAQ Devices - PXI-6115- Configuring Device 1 : PXI-6115, Analogue Input Tab, the Polarity / Range pull down menu only has -10.0V - + 10.0V available for selection.  Any help gratefully received.

    Hi,
    The range inside of MAX should read +/- 10 V, the range of the card with a gain of one.
    To see a larger range in your application, you must configure the gain to be less than one. For example, the NI 6110 supports gains of 0.5 and 0.2, which allow ranges of +/- 20 V and +/- 50 V. (Remember that the card supports a maximum of 42 V.)
    Depending on the range of your input signal, you would want to use the following gain setting:
Range                                          Gain Setting
-50 to 50 V (not to exceed +/-42 V maximum)    0.2
-20 to 20 V                                    0.5
-10 to 10 V                                    1
-5 to 5 V                                      2
-2 to 2 V                                      5
-1 to 1 V                                      10
-500 to 500 mV                                 20
-200 to 200 mV                                 50
    To set the gain in LabVIEW, use the Hardware Configure VI and set the input range of the signal you want to measure, and the VI will select the best gain. Refer to the link below for an example.
    To set the gain using NI-DAQ Function Calls, pass in the gain as an integer. If the gain is 0.5 or 0.2, pass in a -1 or -2 respectively.
    http://sine.ni.com/apps/we/niepd_web_display.display_epd4?p_guid=B45EACE3E8A156A4E034080020E74861&p_...
    Hope that helps
    Sacha Emery
    National Instruments (UK)
    // it takes almost no time to rate an answer
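The gain table and the integer encoding described above can be sketched as follows. This is an illustrative helper, not NI-DAQ API code; the function names are hypothetical, but the gain/range pairs and the -1/-2 encoding for the fractional gains come from the answer above:

```python
# Sketch of the gain selection and encoding described above for
# Traditional NI-DAQ function calls: integer gains pass through
# unchanged, while the fractional gains 0.5 and 0.2 are encoded
# as -1 and -2 respectively.

GAIN_TO_RANGE_V = {0.2: 50.0, 0.5: 20.0, 1: 10.0, 2: 5.0,
                   5: 2.0, 10: 1.0, 20: 0.5, 50: 0.2}

def gain_for_limits(vmin, vmax):
    """Pick the largest gain whose +/- range still covers the requested limits."""
    needed = max(abs(vmin), abs(vmax))
    candidates = [(g, r) for g, r in GAIN_TO_RANGE_V.items() if r >= needed]
    return max(candidates)[0]  # largest gain = tightest usable range

def encode_gain(gain):
    """Encode a gain as the integer expected by Traditional NI-DAQ calls."""
    if gain == 0.5:
        return -1
    if gain == 0.2:
        return -2
    return int(gain)

# A +/-42 V signal needs the +/-50 V range, i.e. gain 0.2, passed as -2.
print(gain_for_limits(-42, 42), encode_gain(gain_for_limits(-42, 42)))   # -> 0.2 -2
```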

  • Using MIO-16XE-10, I am trying to relate per gain setting the resultant actual voltage range.

I have tried to find the conversion table relating gain setting and resultant voltage range. I am a programmer and am having trouble understanding the relationship; better yet, I would like an algorithm, so I can programmatically set a gain based on the voltage returned by the previous measurement. The idea is to get the most reliable resistance measurement by ensuring that a measurement taken with a specific gain setting does not fall in the lower 1% of that gain's range.
    bookieb

    Hi Bill:
    Thanks for contacting National Instruments.
    The best way to use the most appropriate gain is to set the voltage range of the incoming signal and let the NI-DAQ driver choose the best gain to apply for that range.
    The DAQ card has some predefined gain settings which the NI-DAQ driver selects from depending on the input limits.
To find out which input ranges correspond to which gains, please refer to Appendix A (Specifications) of the E Series User Manual. The table on page A-2 shows the relationship between the input signal ranges and the channel gain that the NI-DAQ driver chooses.
I hope this helps. Please do not hesitate to contact me again if you need additional information.
Regards,
    Bharat S
    Applications Engineer
    National Instruments

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input. My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time because the DAQ assistant creates the task when it runs, and there is no task before that.
In the attached picture, I have a conditional section set to run only on iteration 0 of the while loop. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and every subsequent) iteration of the loop, I read out AI.Max and it seems the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    BTW, this is continuous acquisition and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at a Labview API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG ‏14 KB
    Sample_AIV.vi ‏91 KB

First, if you want to get beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and use the lower-level DAQmx VIs. There are hundreds of examples in the Example Finder. You can even right-click on the DAQ Assistant and select Open Front Panel. That will create a subVI you can open to see what is going on behind the scenes. Do it. I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure of how the settings are established or maintained in each section of that subVI).
The second problem is that you have a bit of a race condition on iteration 0. Those two DAQ property nodes run at the same time, so when you read AI.Max, it may happen before or after AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.

  • Voltage range reached - but no error

    Hello!
I have a problem. I need to produce a Bode plot very quickly, and I want to auto-set my NI-SCOPE range just when the maximum voltage of the range is reached. Or is there a way to set the voltage range manually? When the voltage reaches the maximum, the value returned is always the same.
I am using C++ with NI-SCOPE, not Measurement Studio!
Thanks, Nick, Hannes

    Hello,
you can configure your voltage range with the niScope_ConfigureVertical() function. This is described in this help file:
    http://digital.ni.com/manuals.nsf/websearch/998C51E969F4900486256D9F007103F2?OpenDocument&node=132090_US
    regards
    P. Kaltenstadler
    NI Germany

  • Voltage Range E Series DAQ

    I have a PCI-MIO-16E DAQ.  My input voltage range is 0.7 - 1.1 volts.  The manual seems to suggest that I can only choose voltage ranges that have discrete values, ie. 0-1; 0-2; 0-5 etc. 
    The NI Measurement & Automation Explorer allows me to select 0.7 - 1.1 volts, however.  So my question is:  will the resolution be based on dividing the range 0.7 - 1.1 volts into 4096 parts, or by dividing 0 - 2 volts into 4096 parts?
    Thanks,
    Shawn Clark

    The actual input ranges of the DAQ board are fixed. I don't know the specifics of your board but the ranges might be +/-10, +/-5, +/-1, 0-1, 0-5, 0-10, etc. When you specify a minimum and maximum value, NI-DAQ uses that information to select the most appropriate range. So, in your case, if the board supports 0-2 volts, that is the range that will be selected and that's the value to use for resolution. If the DAQ board does not have a 0-2 range and only has a 0-5 range, then you need to use this number.
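The resolution arithmetic in the answer above is easy to check: a 12-bit ADC divides the coerced hardware range into 2^12 steps, not the requested 0.7-1.1 V window. A quick sketch (assuming, as the answer does, that the board coerces to a 0-2 V range):

```python
# Check of the resolution question above: the step size (1 LSB) depends
# on the hardware range actually selected, not the min/max you request.
# The 0-2 V range is an assumption carried over from the answer above.

BITS = 12  # the board in question has a 12-bit ADC (4096 codes)

def lsb_volts(range_min, range_max, bits=BITS):
    """Voltage step size (1 LSB) for an ADC spanning the given range."""
    return (range_max - range_min) / 2**bits

requested = lsb_volts(0.7, 1.1)   # what dividing 0.7-1.1 V would give
actual = lsb_volts(0.0, 2.0)      # what the fixed 0-2 V range delivers
print(f"hoped-for LSB: {requested * 1e6:.1f} uV, actual LSB: {actual * 1e6:.1f} uV")
```

So the 0.7-1.1 V signal spans only about a fifth of the available codes on the 0-2 V range.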

  • I'm not able to set Analog Input voltages, etc

OK,
so I have an NI DAQ I/O card (NI PCI-6723, or something to that effect) and I'm trying to configure the channels for analog input signals.
When I try to choose any kind of setting for an analog signal by:
>>dropping a DAQ Assistant VI onto the block diagram
>>>>choosing Analog Input
>>>>>>>Voltage
I get the message "No supported devices found".
I thought that since I have the I/O card, I should be able to configure channels for both input and output.
I also have an ADAC card (from IOtech) installed, but when I look in MAX, I only see the DAQ card.
I'm really confused at this point. Can anyone break things down for me ... what am I doing wrong?
    -r

    stuartG,
you were right. So what I'm using for the input is an ADAC 5501MF from IOtech.
Following their manual (http://www.iotech.com/productmanuals/adac_lvi.pdf), it says that the ADAC-LVi libraries can be found in the "Functions->User Libraries" palette of LabVIEW.
Does this mean that I have to install the ADAC-LVi driver in the LabVIEW->User Libraries directory?
If that's not the case, then I don't see why ADAC-LVi is not showing up in the LabVIEW User Libraries, since I've installed the driver correctly (in C\Program File\ADAC ).
Do you have any idea why?
Thanks
    -r

  • How to set input range to 100mv

I use AI_Configure(....) to set the input range, and I want to set the input range to 100 mV. How should I do this?
Please tell me. My DAQ is a PCI-MIO-16E-4.

You will need to set the gain of each channel using the AI_Read function. The input range parameter of AI_Configure is actually ignored for E Series devices. The polarity parameter will allow you to set the range to unipolar (0 to 10 V) or bipolar (-5 to +5 V).
You use the gain parameter of AI_Read to further specify the input range. The valid gain settings for the 16E-4 can be found on page A-2 of the appendix in the user manual linked below. For a +/-100 mV range, you would set the polarity of AI_Configure to bipolar and the gain parameter of AI_Read to 50. I hope this helps.
    PCI E Series User Manual
    http://digital.ni.com/manuals.nsf/webAdvsearch/06F1C9FB0D0BA5C286256C010057461B?OpenDocument&vid=niwc&node=132100_US
    Regards,
    Todd D.
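The arithmetic behind that answer can be sketched as follows: the effective input range is the base range divided by the channel gain, so the bipolar +/-5 V base range with a gain of 50 yields +/-100 mV. The helper below is illustrative, not part of the NI-DAQ API:

```python
# Sketch of the gain arithmetic in the answer above: on E Series boards
# the effective input range equals the base range divided by the gain.

BIPOLAR_BASE_V = 5.0   # E Series bipolar base range is +/-5 V

def effective_range(gain, base=BIPOLAR_BASE_V):
    """Effective +/- input range in volts for a given gain setting."""
    return base / gain

print(effective_range(50))   # -> 0.1, i.e. +/-100 mV
```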

  • Does the iPad mini have a setting so it can be used with either 120 or 240 electrical current? If yes, how do you go about setting the desired input voltage? I will travel to Northern Europe and want to know if I will need a voltage converter.


    The iPad charger is dual voltage. You need only a plug adapter to be able to plug your US charger into a European wall outlet. A basic one costs about $2.00. - $6.00. Like this...
    http://www.amazon.com/CVID-Grounded-European-Adapter-Adaptor/dp/B0098B5GN8/

  • PXI-5105 Minimum Voltage Range

    Hi,
I searched the forums for this question but did not get a concrete answer. I'm using a Kulite ITQ-1000 pressure transducer attached to an RDP signal conditioner/amplifier, which is then fed into the PXI-5105. The signal has been set so that the maximum rated pressure outputs +10 V from the RDP unit. My question is regarding the minimum voltage level: I noticed in the specifications for the 5105 that there is a "Minimum Voltage Range" of -25 mV to 25 mV. Does this mean that the minimum voltage fed into the 5105 should be no less than, in my case of only positive voltage, 25 mV?
    Thanks,
    Ryan 

    Nevermind... I figured it out

  • NI 9201 voltage range

    Dear All,
Can I program the voltage range of the NI 9201/NI 9221 even though there is only one nominal voltage range?
In any case, there is the C function DAQmxCreateAIVoltageChan.
    I am using cDAQ-9172 & Visual C++.
    Thanks and regards,
    A

    Hi angelmcdoggie,
    You are correct that there is only one input range for the NI-9201/9221, according to the specifications. Unfortunately, this device does not have a programmable-gain amplifier, and thus you will only have the single range of ±10V for the 9201 or ±60V for the 9221. Setting the maximum and minimum input value parameters of the DAQmxCreateAIVoltageChan function to anything less than 10V and greater than -10V, respectively, will still yield a range of ±10V. Hope this helps,
    Daniel S.
    National Instruments
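The coercion behaviour described above can be sketched as a small illustration (hypothetical helper, not DAQmx API code): on a single-range device like the 9201, any requested limits inside +/-10 V still coerce up to the device's only range.

```python
# Sketch of the single-range coercion described above: the 9201 has no
# programmable-gain amplifier, so requesting tighter min/max limits via
# DAQmxCreateAIVoltageChan still yields the one +/-10 V range.

def coerced_range_9201(vmin, vmax, only_range=10.0):
    """The 9201 has one range; anything within it coerces to +/-10 V."""
    if max(abs(vmin), abs(vmax)) > only_range:
        raise ValueError("limits exceed the device's only range")
    return (-only_range, only_range)

print(coerced_range_9201(-1.0, 1.0))   # -> (-10.0, 10.0)
```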

  • How to measure CompactRIO system input voltage

    Hello,
    I would like to measure the voltage of the two 9-35V DC inputs going to the CompactRIO RT controller. I am using the 4-slot chassis due to space considerations and do not have a 24V analog input module on-board to be able to do this via a module. On the 9205 analog input module there is a hook for "chassis temperature" that works perfectly. I am trying to find a similar hook to read the voltage that the CompactRIO is running on.
    I will be running this CompactRIO on two 24V batteries and wanted to be able to detect when the battery levels were too low to continue operation. When this state is reached, or hopefully prior to, I want to offload the data I have collected before running out of power to do so.
    Any ideas?
    Thanks,
    Ryan
    Senior Systems Manager, CIMIT
    Massachusetts General Hospital
    Cambridge, MA
    www.cimit.org

    Hi Eric,
    I am not an electrical engineer, so trying to figure out a clever way to split the voltage and measure it on three different inputs would be a job for one of the other researchers here. This seems like a straightforward feature though. We are using the CompactRIO as an onboard controller for a robot, but I would imagine that other more "mission-critical" applications would benefit from knowing the voltage that is powering the C-RIO. Much like the on-board chassis temperature reading, it seems like an important piece that is missing from the basic feature list. I don't know if we even have the 3 inputs free on the AI module as they are used to control everything else.
    Hmm.
    Thanks,
    Ryan
    Senior Systems Manager, CIMIT
    Massachusetts General Hospital
    Cambridge, MA
    www.cimit.org

  • Temperatures corresponding to voltage ranges with the PXI-4351/TC-2190

Using a PXI-4351 and TC-2190 with a K-type thermocouple, what temperature range do the voltages (+/-625 mV, 1.25 V, 2.5 V, 3.75 V, 7.5 V, and 15 V) correspond to? How much accuracy does each voltage range have? All I could find was that to get improved accuracy, you should use a range of 2.5 V or less (document #1X0F7L3E). Also, will using several channels at vastly different temperatures (i.e. room temperature and several hundred degrees) affect which range I should select?

The full temperature range of a K-type thermocouple, from -270 C to 1370 C, represents a voltage range from -6.45 mV to 54.8 mV. If the only sensors you are using are thermocouples, then you should choose the smallest range. The 4351 applies one gain setting to all channels, so the only reason to use anything but the smallest range would be if you have other types of sensors with larger voltage swings. You can find voltages for any thermocouple at any temperature at the link below.
    Regards,
    Brent R.
    Applications Engineer
    National Instruments
    http://srdata.nist.gov/its90/menu/menu.html
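The range-selection advice above reduces to a one-liner: pick the smallest selectable range that covers the sensor's full voltage swing. A sketch using the 4351 ranges listed in the question and the K-type span from the answer:

```python
# Sketch of the range-selection advice above: given the 4351's selectable
# ranges (from the question) and a K-type thermocouple span of roughly
# -6.45 mV to 54.8 mV, even the smallest range (+/-625 mV) covers it easily.

RANGES_V = [0.625, 1.25, 2.5, 3.75, 7.5, 15.0]   # PXI-4351 +/- ranges

def smallest_covering_range(vmin, vmax, ranges=RANGES_V):
    """Smallest +/- range that covers the sensor's full voltage swing."""
    needed = max(abs(vmin), abs(vmax))
    return min(r for r in ranges if r >= needed)

print(smallest_covering_range(-0.00645, 0.0548))   # -> 0.625
```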
