PXI-5105 Minimum Voltage Range

Hi,
I searched the forums for this question but did not get a concrete answer. I'm using a Kulite ITQ-1000 pressure transducer attached to an RDP signal conditioner/amplifier, which is then fed into the PXI-5105. The signal has been set up so that the maximum rated pressure outputs +10 V from the RDP unit. My question is about the minimum voltage level... I noticed in the specifications for the 5105 that there is a "Minimum Voltage Range" of -25 mV to +25 mV. Does this mean that the minimum voltage fed into the 5105 should be no less than 25 mV (in my case, since the signal is positive only)?
Thanks,
Ryan 

Nevermind... I figured it out

Similar Messages

  • PXI-5105 clipping voltage at ~19V

    The PXI-5105 should be capable of measuring peak-to-peak voltages of 30 V. I have a set of three 9 V batteries to give me a total of 27 volts across a resistor. However, the PXI clips this voltage at around 19 V. I confirmed with a multimeter that I have 27 volts. I have the input impedance set to 1 MΩ.
    Thanks for any help.

    Hi cuxcrider,
    The 30 V max range on the 5105 is a peak-to-peak voltage, which means that the highest voltage it can measure with respect to ground (0 V) is +15 V, unless you apply a DC offset, which unfortunately cannot be done on the NI 5105. You will usually see the rails come in a little higher than the specified value, but generally you should not operate in that region, as you risk damaging the board and taking measurements that are out of spec. Hope this helps,
    Daniel S.
    National Instruments
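
    As a rough illustration of that limit, here is a minimal NI-SCOPE C sketch (the niScope.h header and the resource name "PXI1Slot3" are assumptions, not from this thread); with the 30 Vpp range and zero offset, the usable window is roughly -15 V to +15 V about ground:

        #include "niScope.h"

        int main(void)
        {
            ViSession vi;

            /* Open a session to the digitizer (resource name is hypothetical). */
            niScope_init("PXI1Slot3", VI_FALSE, VI_FALSE, &vi);

            /* 30 Vpp range, 0 V offset: the instrument digitizes about -15 V to +15 V.
               A 27 V battery stack referenced to ground sits outside this window. */
            niScope_ConfigureVertical(vi, "0", 30.0, 0.0, NISCOPE_VAL_DC, 1.0, VI_TRUE);

            niScope_close(vi);
            return 0;
        }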

  • PXI-6115 input voltage range configuration help

    I need to set the analogue inputs on my PXI-6115 to +/- 42 V. However, in Measurement & Automation Explorer, under Traditional NI-DAQ Devices>>PXI-6115>>Configuring Device 1: PXI-6115, on the Analogue Input tab, the Polarity/Range pull-down menu only offers -10.0 V to +10.0 V. Any help gratefully received.

    Hi,
    The range inside of MAX should read +/- 10 V, which is the range of the card with a gain of one.
    To see a larger range in your application, you must configure the gain to be less than one. For example, the NI 6110 supports gains of 0.5 and 0.2, which allow ranges of +/- 20 V and +/- 50 V. (Remember that the card supports a maximum of 42 V.)
    Depending on the range of your input signal, you would want to use the following gain setting:
    Range                                           Gain setting
    -50 to 50 V (not to exceed +/- 42 V maximum)    0.2
    -20 to 20 V                                     0.5
    -10 to 10 V                                     1
    -5 to 5 V                                       2
    -2 to 2 V                                       5
    -1 to 1 V                                       10
    -500 to 500 mV                                  20
    -200 to 200 mV                                  50
    To set the gain in LabVIEW, use the Hardware Configure VI and set the input range of the signal you want to measure, and the VI will select the best gain. Refer to the link below for an example.
    To set the gain using NI-DAQ Function Calls, pass in the gain as an integer. If the gain is 0.5 or 0.2, pass in a -1 or -2 respectively.
    http://sine.ni.com/apps/we/niepd_web_display.display_epd4?p_guid=B45EACE3E8A156A4E034080020E74861&p_...
    Hope that helps
    Sacha Emery
    National Instruments (UK)
    // it takes almost no time to rate an answer
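
    A minimal sketch of that function-call approach, assuming the Traditional NI-DAQ C header nidaq.h and a hypothetical device number of 1 (the gain codes follow the convention above: -1 for 0.5, -2 for 0.2):

        #include "nidaq.h"

        int main(void)
        {
            i16 device   = 1;    /* hypothetical Traditional NI-DAQ device number */
            i16 channel  = 0;
            i16 gainCode = -2;   /* gain 0.2 -> +/- 50 V range (keep inputs within +/- 42 V) */
            f64 volts;

            /* Single-point read; the driver applies the requested gain. */
            AI_VRead(device, channel, gainCode, &volts);
            return 0;
        }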

  • Temperatures corresponding to voltage ranges with the PXI-4351/TC-2190

    Using a PXI-4351 and TC-2190 with a K-type thermocouple, what temperature ranges do the voltage ranges (+/- 625 mV, 1.25 V, 2.5 V, 3.75 V, 7.5 V, and 15 V) correspond to? How much accuracy does each voltage range have? All I could find was that, to get improved accuracy, you should use a range of 2.5 volts or less (document #1X0F7L3E). Also, will using several channels at vastly different temperatures (i.e. room temperature and several hundred degrees) affect which range I should select?

    The full temperature range of a K-type thermocouple, -270 °C to 1370 °C, represents a voltage range from -6.45 mV to 54.8 mV. If the only sensors you are using are thermocouples then you should choose the smallest range. The 4351 applies one gain setting to all channels, so the only reason you would want to use anything but the smallest range would be if you have other types of sensors with larger voltage swings. You can find voltages for any thermocouple at any temperature at the link below.
    Regards,
    Brent R.
    Applications Engineer
    National Instruments
    http://srdata.nist.gov/its90/menu/menu.html
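
    As a rough check on that recommendation: the full K-type span quoted above is only about 61 mV (-6.45 mV to 54.8 mV), which uses roughly 5% of the +/- 625 mV range and well under 1% of the +/- 15 V range, so the smallest range preserves far more of the converter's resolution for a thermocouple-only setup.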

  • PXI-4496 voltage range

    Can I increase the voltage range of individual channels on the PXI-4496? I'm guessing not, but I'm hesitant to apply a large attenuation to my signal (a factor of 4 is likely) for fear of losing significant accuracy.
    Currently I have single IEPE sensors going to individual channels, but I'm looking to sum several IEPE signals. This means that the incoming voltage will be multiplied (by the number of summed channels), so I'll need more than +/- 10V or will need to attenuate the signal.
    Thanks

    ap8888,
    Unfortunately, the input range for this device is +/- 10 V, and it cannot be increased. Rather than summing the channels before the input, have you considered summing them (or performing other operations as needed) in software after acquisition? That way you wouldn't have to worry about attenuating the signal before it reaches the 4496.
    Katie
    Katie Collette
    National Instruments
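
    A minimal NI-DAQmx C sketch of the software-summing idea (the NIDAQmx.h header and the device name "PXI1Slot4" are assumptions, and IEPE excitation setup is omitted); each channel stays within the fixed +/- 10 V range and the addition happens after the read:

        #include "NIDAQmx.h"

        #define SAMPS_PER_CHAN 1000
        #define NUM_CHANS      4

        int main(void)
        {
            TaskHandle task = 0;
            float64    data[NUM_CHANS * SAMPS_PER_CHAN];
            float64    summed[SAMPS_PER_CHAN];
            int32      read = 0;

            DAQmxCreateTask("", &task);
            /* Four channels, each acquired individually within +/- 10 V. */
            DAQmxCreateAIVoltageChan(task, "PXI1Slot4/ai0:3", "",
                                     DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                     DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, SAMPS_PER_CHAN);
            DAQmxStartTask(task);
            DAQmxReadAnalogF64(task, SAMPS_PER_CHAN, 10.0,
                               DAQmx_Val_GroupByChannel, data,
                               NUM_CHANS * SAMPS_PER_CHAN, &read, NULL);

            /* Sum the channels sample by sample in software instead of in hardware. */
            for (int32 i = 0; i < read; i++) {
                summed[i] = 0.0;
                for (int32 ch = 0; ch < NUM_CHANS; ch++)
                    summed[i] += data[ch * SAMPS_PER_CHAN + i];
            }

            DAQmxClearTask(task);
            return 0;
        }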

  • How to acquire short-circuit current from the digitizer PXI-5105?

    I was trying to acquire short-circuit current with the PXI-5105 digitizer (it only gives me voltage). Any suggestions as to how I can make these measurements? I have gotten a hint from someone that we should use a Keithley (I have an old Keithley 237). Is this the case?
    Thanks.

    Using a digitizer for current measurements is not generally recommended, as digitizers are made for high-frequency voltage measurements. Why are you using a digitizer instead of a DAQ card that's made to measure current, such as the PXI-6236? What maximum current do you expect? 
    Regards,
    Nathan S.
    Applications Engineer
    National Instruments

  • PXI-5105 Re-Arm Time

    I have a LabVIEW GUI which fetches data each time a trigger level set in the user interface is exceeded; the datasheet specifies a minimum re-arm time of 2.6 µs.
    Is this re-arm time always achieved, or does software add more delay to it?
    How does the internal memory on the card itself work? Does a circular buffer keep saving data after the re-arm happens, even while the software is still processing the previous data?

    Hi Joel_Cruz,
    You can find information about the onboard memory of the PXI-5105 in the NI High-Speed Digitizers Help. This help file installs with the NI-SCOPE driver.
    The specific information on the NI 5105 Onboard Memory can be found under NI High-Speed Digitizers Help>>Devices>>NI 5105>>Onboard Memory. 
    Regards
    Carli S.
    Applications Engineer
    National Instruments
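
    As a hedged illustration of how the onboard memory usually comes into play around re-arming (an NI-SCOPE C sketch with a hypothetical resource name "PXI1Slot3", not taken from the help file): when you configure multiple records, the digitizer re-arms in hardware between records and stores each one in onboard memory, independently of when the software gets around to fetching it.

        #include "niScope.h"

        int main(void)
        {
            ViSession vi;

            niScope_init("PXI1Slot3", VI_FALSE, VI_FALSE, &vi);
            niScope_ConfigureVertical(vi, "0", 10.0, 0.0, NISCOPE_VAL_DC, 1.0, VI_TRUE);

            /* 100 records of 1000 points each: the board re-arms in hardware between
               records and buffers them onboard until the application fetches them. */
            niScope_ConfigureHorizontalTiming(vi, 60.0e6, 1000, 50.0, 100, VI_TRUE);
            niScope_ConfigureTriggerEdge(vi, "0", 0.5, NISCOPE_VAL_POSITIVE,
                                         NISCOPE_VAL_DC, 0.0, 0.0);

            niScope_InitiateAcquisition(vi);
            /* ... fetch the stored records later, while new triggers keep re-arming ... */

            niScope_close(vi);
            return 0;
        }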

  • PXI-5105 rms accuracy

    What is the accuracy of an RMS value measured on a PXI-5105? The only specifications given are the AC amplitude accuracy and the RMS noise.

    Hey bartvsc,
    If you look at page 5 on the PXI/PCI-5105 Specifications Document:
    http://www.ni.com/pdf/manuals/374403g.pdf
    You will see that, depending on the Vpk-pk range and the input impedance, you get a different AC amplitude accuracy. You can derive the RMS accuracy by calculating it from the AC amplitude accuracy.
    Regards,
    Efrain G.
    National Instruments
    Visit http://www.ni.com/gettingstarted/ for step-by-step help in setting up your system.
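
    As a rough, hypothetical example of that calculation (the numbers are illustrative, not values from the specification): for a sine wave, Vrms = Vpk-pk / (2*sqrt(2)), so a 10 Vpk-pk signal is about 3.54 Vrms; if the AC amplitude accuracy at that range were +/- 1%, the computed RMS value would carry roughly the same +/- 1% relative error, with the specified RMS noise adding on top of that.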

  • PCMCIA E series card Voltage Range

    In one of our applications we want to use an E Series DAQ card for a single-channel analog input.
    The voltage range is from 0.01 mV to 2 V.
    Please let me know whether I can measure voltages over this range, and how.
    Please also let me know the minimum voltage we can measure using a PCMCIA E Series card.

    Vishal,
    The range on our current PCMCIA E Series boards is +/-10V. There are multiple example programs to measure analog voltages for common programming environments that will work with our PCMCIA boards. Hope this helps. Have a great day!

  • NI 9201 voltage range

    Dear All,
    Can I program the voltage range of the NI 9201/NI 9221 even though there is only one nominal voltage range?
    In any case, there is the C function DAQmxCreateAIVoltageChan.
    I am using cDAQ-9172 & Visual C++.
    Thanks and regards,
    A

    Hi angelmcdoggie,
    You are correct that there is only one input range for the NI-9201/9221, according to the specifications. Unfortunately, this device does not have a programmable-gain amplifier, and thus you will only have the single range of ±10V for the 9201 or ±60V for the 9221. Setting the maximum and minimum input value parameters of the DAQmxCreateAIVoltageChan function to anything less than 10V and greater than -10V, respectively, will still yield a range of ±10V. Hope this helps,
    Daniel S.
    National Instruments
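
    A small NI-DAQmx C sketch of what that means in practice (the NIDAQmx.h header and the channel name "cDAQ1Mod1/ai0" are assumptions): you can pass any limits to the function, but the hardware range stays the same.

        #include "NIDAQmx.h"

        int main(void)
        {
            TaskHandle task = 0;

            DAQmxCreateTask("", &task);
            /* Requesting +/- 2 V here does not change the hardware range: the
               NI 9201 has no programmable-gain amplifier, so the module still
               digitizes over its single +/- 10 V input range. */
            DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0", "",
                                     DAQmx_Val_Cfg_Default, -2.0, 2.0,
                                     DAQmx_Val_Volts, NULL);

            /* ... configure timing, start, and read as usual ... */
            DAQmxClearTask(task);
            return 0;
        }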

  • Voltage Range E Series DAQ

    I have a PCI-MIO-16E DAQ. My input voltage range is 0.7 - 1.1 volts. The manual seems to suggest that I can only choose voltage ranges with discrete values, i.e. 0-1, 0-2, 0-5, etc.
    The NI Measurement & Automation Explorer allows me to select 0.7 - 1.1 volts, however.  So my question is:  will the resolution be based on dividing the range 0.7 - 1.1 volts into 4096 parts, or by dividing 0 - 2 volts into 4096 parts?
    Thanks,
    Shawn Clark

    The actual input ranges of the DAQ board are fixed. I don't know the specifics of your board, but the ranges might be +/-10, +/-5, +/-1, 0-1, 0-5, 0-10, etc. When you specify a minimum and maximum value, NI-DAQ uses that information to select the most appropriate range. So, in your case, if the board supports 0-2 volts, that is the range that will be selected, and that is the value to use for resolution. If the DAQ board does not have a 0-2 range and only has a 0-5 range, then that is the number you need to use.
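
    To make that concrete with illustrative numbers: if the board coerces your 0.7 - 1.1 V request up to a 0 - 2 V hardware range, a 12-bit converter divides that 0 - 2 V span into 4096 steps of roughly 0.49 mV each; the tighter 0.7 - 1.1 V software limits do not, by themselves, buy any extra resolution.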

  • Cisco1522 minimum voltage using 12vdc option

    I cannot find an answer for this question anywhere:
    What’s the minimum DC input voltage allowed when using the 12vdc option on the cisco 1522 AP? Is 10.5vdc or 11vdc going to be enough to power the AP?
    Cisco has voltage ranges listed for AC and POE power options but none for 12vdc.

    Hi Chris,
    I took a look at your VI and converted it to use the SignalExpress User Step template. Because SignalExpress uses the LabVIEW Run-Time Engine to run the VIs, some extra steps have to be taken when the VI you are calling includes subVIs. These extra steps/instructions can be found at: Creating SignalExpress Plug-Ins with LabVIEW. In overview: 1) you need to create an LLB which contains ALL the subVIs that you are using, 2) the main VI should be re-entrant, and 3) all controls and indicators should be on the connector pane. There is also a section that talks about being careful with While Loops and other things.
    For now, I went ahead and did all that for you, and I'm including the LLB (written in LabVIEW 8.0) that should hopefully do the job. Feel free to modify it, but if you add more subVIs, you'll have to rebuild the LLB (just a warning). So when in SignalExpress, select "Run LabVIEW VI 8.0" and select the top-level VI in the LLB, which I called "PulseWidth.vi". I didn't have a chance to try it (sorry).
    By the way, the User Step template can usually be found at: C:\Program Files\National Instruments\SignalExpress\User Step Templates
    I hope this helps... Good luck.
    Phil
    Attachments:
    PulseWidth.llb ‏3222 KB

  • How can I display an analog input to PXI-5105 out on LabVIEW?

    Hi ALL,
    I am very new to LabVIEW and have just started to fiddle around with it. I am running LabVIEW 2010 SP1 on Windows 7. I also have an NI PXIe-1073 chassis with PXIe-6361 and PXI-5105 modules, and the chassis is connected to my PC via PCI. I was getting myself acquainted with the devices and was trying to view an analog signal fed into one of the channels on the PXI-5105 module on a graph in LabVIEW.
    I would appreciate your help.

    Hello Henokview!
    I would like you to read through these tutorials to understand the programming steps of NI-SCOPE and NI-DAQmx. After reading the links below you will be able to understand how to connect the output from a read function to a graph or chart.
    DAQmx
    http://www.ni.com/white-paper/5434/en
    NI-SCOPE
    http://www.ni.com/white-paper/3382/en
    Best Regards
    Jonas Mäki
    Applications Engineering
    National Instruments
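
    Broadly, the flow those tutorials walk through is the same in both drivers: open a session or task, configure the channel (vertical range for NI-SCOPE, minimum/maximum for DAQmx) and the timing, start or initiate the acquisition, read the samples, and wire the returned waveform straight into a graph or chart indicator on the front panel.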

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input. My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time because the DAQ assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section set to run only if the while loop iteration is 0. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and every subsequent) iteration of the loop, I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    BTW, this is continuous acquisition and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at a Labview API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG ‏14 KB
    Sample_AIV.vi ‏91 KB

    First, if you want to get beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs. There are hundreds of examples in the Example Finder. You can even right-click on the DAQ Assistant and select Open Front Panel. That will create a subVI that you can open to see what is going on behind the scenes. Do it. I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure of how the settings are established or maintained in each section of that subVI).
    The second problem is you have a bit of a race condition on iteration 0.  Those two DAQ property nodes are running at the same time.  So when you read the AI.Max, it may be happening before or after the AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.
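
    As a rough text-language analogue of the "set the range before the task starts" approach (a DAQmx C sketch; the channel name "SC1Mod1/ai0" is a placeholder for your SCXI-1102C channel):

        #include "NIDAQmx.h"

        int main(void)
        {
            TaskHandle task = 0;

            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "SC1Mod1/ai0", "",
                                     DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                     DAQmx_Val_Volts, NULL);

            /* Change the range while the task has not been started yet; hardware
               that cannot re-range a running task accepts the change here. */
            DAQmxSetAIMax(task, "SC1Mod1/ai0", 5.0);
            DAQmxSetAIMin(task, "SC1Mod1/ai0", -5.0);

            DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, 1000);
            DAQmxStartTask(task);
            /* ... DAQmxReadAnalogF64 in a loop ... */
            DAQmxClearTask(task);
            return 0;
        }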

  • How do I read out the actual voltage range of the DAQ board as calculated by LabVIEW based on my input voltage range

    I am somewhat new to DAQ, so I hope I describe this well enough, but I have encountered a requirement for my VI that I am not sure how to satisfy in LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and then the software calculates the actual voltage range it will be measuring, to ensure that you get the best resolution in your range given the voltage step size, etc. of the equipment. I know how to extract the voltage input that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution entitled "Programmatically Determining the Gain Used by a DAQ Device". Here's the URL:
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments
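
    As a hedged DAQmx-era sketch of the same idea (the device name "Dev1" and the requested limits are hypothetical, and not every board supports every property): create the channel with the limits you expect, then read back the range and gain the driver actually selected.

        #include <stdio.h>
        #include "NIDAQmx.h"

        int main(void)
        {
            TaskHandle task = 0;
            float64    rngLow = 0.0, rngHigh = 0.0, gain = 0.0;

            DAQmxCreateTask("", &task);
            /* Ask for roughly 0 to 1 V; the driver picks the closest supported range. */
            DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                                     0.0, 1.0, DAQmx_Val_Volts, NULL);

            /* Range limits actually used by the hardware for this channel. */
            DAQmxGetAIRngLow(task, "Dev1/ai0", &rngLow);
            DAQmxGetAIRngHigh(task, "Dev1/ai0", &rngHigh);
            /* Gain actually applied (not supported on every device). */
            DAQmxGetAIGain(task, "Dev1/ai0", &gain);

            printf("Range: %g V to %g V, gain: %g\n", rngLow, rngHigh, gain);
            DAQmxClearTask(task);
            return 0;
        }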
