NI 9237 maximum voltage range accuracy

Hello,
I'd like to know how to read the accuracy from the datasheet of the NI 9237. On the web page I can read 0.038 mV/V, but the datasheet is more difficult to interpret.
I'm working with the MT 1241 load cell (OIML R60 C3), which has a rated capacity of 150 kg. When I use external excitation (10 V) I'm not able to get a stable value: I read 0.000xxxy V, where the x digits are fixed but the y digit is not.

Hi Fabio,
you could try to calculate the accuracy for the maximum voltage range using this document (see attachment). A rough worked example follows below.
Regards.
Sabrina
Attachments:
DOCUMENT.jpg 274 KB
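
For a sense of scale, here is a minimal sketch of the kind of calculation the attached document walks through. The 2.0 mV/V rated output is a typical value for an OIML C3 load cell and is an assumption; the MT 1241's actual sensitivity is not given in this thread.

```c
/* Sketch only: converting the NI 9237's published accuracy figure
   (0.038 mV/V, from the web page) into load-cell units.
   ASSUMPTION: 2.0 mV/V rated output; check the MT 1241 datasheet. */
#include <stdio.h>

int main(void)
{
    double capacity_kg = 150.0;  /* rated capacity of the MT 1241     */
    double sensitivity = 2.0;    /* assumed rated output, mV/V        */
    double error_mV_V  = 0.038;  /* accuracy figure from the web page */

    /* Error as a fraction of full-scale output, expressed in kg */
    double error_kg = capacity_kg * (error_mV_V / sensitivity);
    printf("Worst-case error: %.2f kg of %.0f kg full scale\n",
           error_kg, capacity_kg);  /* prints 2.85 kg */
    return 0;
}
```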

Similar Messages

  • Temperatures corresponding to voltage ranges with the PXI-4351/TC-2190

    Using a PXI-4351 and TC-2190 with a K-type thermocouple, what temperature range do the voltages (+/- 625 mV, 1.25 V, 2.5 V, 3.75 V, 7.5 V, and 15 V) correspond to? How much accuracy does each voltage range offer? All I could find was that to get improved accuracy, use a range of 2.5 volts or less (document #1X0F7L3E). Also, will using several channels at vastly different temperatures (i.e., room temperature and several hundred degrees) affect which range I should select?

    The full temperature range of a K-type thermocouple, from -270 °C to 1370 °C, corresponds to a voltage range of -6.45 mV to 54.8 mV. If the only sensors you are using are thermocouples, then you should choose the smallest range. The 4351 applies one gain setting to all channels, so the only reason to use anything but the smallest range would be if you have other types of sensors with larger voltage swings. You can find voltages for any thermocouple at any temperature at the link below, and a small sketch of the range-selection logic follows after it.
    Regards,
    Brent R.
    Applications Engineer
    National Instruments
    http://srdata.nist.gov/its90/menu/menu.html
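
    To make the selection rule concrete, here is a small C sketch (illustrative only) that picks the smallest of the ranges listed in the question that still covers the worst-case thermocouple voltage:

    ```c
    /* Pick the smallest PXI-4351 range covering the K-type span quoted
       above (-6.45 mV to 54.8 mV). Range list taken from the question. */
    #include <stdio.h>

    int main(void)
    {
        double ranges_V[] = { 0.625, 1.25, 2.5, 3.75, 7.5, 15.0 };
        double worst_case_V = 0.0548;  /* largest magnitude, K type */

        for (int i = 0; i < 6; i++) {
            if (ranges_V[i] >= worst_case_V) {
                printf("Smallest usable range: +/- %g V\n", ranges_V[i]);
                break;  /* prints +/- 0.625 V, matching Brent's advice */
            }
        }
        return 0;
    }
    ```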

  • Voltage range reached - but no error

    Hello!
    I have a problem. I need to do a Bode plot very quickly, and I only want to
    autoset my NI-SCOPE device when the maximum voltage of the range is
    reached. Or is there a way to set the voltage range manually?
    When the voltage reaches the maximum, the reported value is always the same.
    I use C++ with NI-SCOPE, no Measurement Studio!
    Thanks, Nick and Hannes

    Hello,
    you can configure your voltage range with the niScope_ConfigureVertical()
    function; a short usage sketch follows below. The function is described in this help file:
    http://digital.ni.com/manuals.nsf/websearch/998C51E969F4900486256D9F007103F2?OpenDocument&node=132090_US
    regards
    P. Kaltenstadler
    NI Germany
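
    A minimal sketch of the call, assuming an already-opened NI-SCOPE session (error checking omitted; the channel name "0" and the 10 V range are placeholders):

    ```c
    /* Sketch: set a fixed vertical range instead of autosetting. */
    #include "niScope.h"

    ViStatus configureFixedRange(ViSession vi)
    {
        /* channel "0", 10 V peak-to-peak range, 0 V offset,
           DC coupling, 1x probe attenuation, channel enabled */
        return niScope_ConfigureVertical(vi, "0", 10.0, 0.0,
                                         NISCOPE_VAL_DC, 1.0, VI_TRUE);
    }
    ```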

  • PXI-5105 Minimum Voltage Range

    Hi,
    I searched the forums for this question but did not get a concrete answer. I'm using a Kulite ITQ-1000 pressure transducer attached to an RDP signal conditioner/amplifier, which is then fed into the PXI-5105. The signal has been set so that the maximum rated pressure outputs +10 V from the RDP unit. My question is about the minimum voltage level: I noticed in the specifications for the 5105 that there is a "Minimum Voltage Range" of -25 mV to 25 mV. Does this mean that the minimum voltage input fed into the 5105 should be no less than 25 mV (in my case of only positive voltages)?
    Thanks,
    Ryan 

    Nevermind... I figured it out
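
    (For future readers: the "Minimum Voltage Range" spec is the smallest selectable full-scale input range of the digitizer, ±25 mV here, not a lower limit on the input signal; smaller signals simply use the smallest range.)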

  • NI 9201 voltage range

    Dear All,
    Can I program the voltage range of the NI 9201/NI 9221 even though there is only one nominal voltage range?
    I ask because there is the C function DAQmxCreateAIVoltageChan.
    I am using cDAQ-9172 & Visual C++.
    Thanks and regards,
    A

    Hi angelmcdoggie,
    You are correct that there is only one input range each for the NI 9201 and NI 9221, according to the specifications. These devices do not have a programmable-gain amplifier, so you will only have the single range of ±10 V for the 9201 or ±60 V for the 9221. Setting the maximum and minimum input value parameters of the DAQmxCreateAIVoltageChan function to anything less than 10 V and greater than -10 V, respectively, will still yield a range of ±10 V (see the short illustration below).
    Hope this helps,
    Daniel S.
    National Instruments
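
    As a short illustration, a C sketch of the call in question (error handling omitted; "cDAQ1Mod1/ai0" is a placeholder channel name):

    ```c
    /* Even though -5/+5 V is requested here, a fixed-range device such
       as the NI 9201 still acquires over its single +/-10 V range. */
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -5.0, 5.0,
                                 DAQmx_Val_Volts, NULL);
        /* ... configure timing, read samples ... */
        DAQmxClearTask(task);
        return 0;
    }
    ```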

  • PXI-4496 voltage range

    Can I increase the voltage range of individual channels on the PXI-4496? I'm guessing not, but I'm hesitant to apply a large attenuation to my signal (a factor of 4 is likely) for fear of losing significant accuracy.
    Currently I have single IEPE sensors going to individual channels, but I'm looking to sum several IEPE signals. This means that the incoming voltage will be multiplied (by the number of summed channels), so I'll need more than +/- 10V or will need to attenuate the signal.
    Thanks

    ap8888,
    Unfortunately, the input range for this device is +/- 10V, and that cannot be increased. Rather than summing the channels prior to input, have you considered summing them (or performing other operations as needed) in software after acquisition (see the sketch below)? This way, you wouldn't have to worry about attenuating the signal before it reaches the 4496.
    Katie Collette
    National Instruments
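
    A minimal sketch of the software-summing approach in C, assuming the samples arrive non-interleaved (all of channel 0, then all of channel 1, as with DAQmx's GroupByChannel layout):

    ```c
    /* Sum N channels sample-by-sample after acquisition. */
    #include <stddef.h>

    void sum_channels(const double *data, size_t n_channels,
                      size_t n_samples, double *summed)
    {
        for (size_t s = 0; s < n_samples; s++) {
            summed[s] = 0.0;
            for (size_t ch = 0; ch < n_channels; ch++)
                summed[s] += data[ch * n_samples + s];
        }
    }
    ```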

  • Voltage Range E Series DAQ

    I have a PCI-MIO-16E DAQ. My input voltage range is 0.7 - 1.1 volts. The manual seems to suggest that I can only choose voltage ranges with discrete values, i.e., 0-1, 0-2, 0-5, etc.
    The NI Measurement & Automation Explorer allows me to select 0.7 - 1.1 volts, however. So my question is: will the resolution be based on dividing the range 0.7 - 1.1 volts into 4096 parts, or on dividing 0 - 2 volts into 4096 parts?
    Thanks,
    Shawn Clark

    The actual input ranges of the DAQ board are fixed. I don't know the specifics of your board, but the ranges might be +/-10, +/-5, +/-1, 0-1, 0-5, 0-10, etc. When you specify a minimum and maximum value, NI-DAQ uses that information to select the most appropriate range. So, in your case, if the board supports 0-2 volts, that is the range that will be selected and the one to use for the resolution calculation. If the DAQ board does not have a 0-2 range and only has a 0-5 range, then 0-5 is the number to use.
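
    To put numbers on that: with the request coerced to a 0-2 V range, a 12-bit converter gives steps of 2 V / 4096 ≈ 0.49 mV; it does not give steps of (1.1 - 0.7) V / 4096 ≈ 0.1 mV over just the requested window.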

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input. My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time because the DAQ assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section set to run only if the while loop iteration is 0. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and every subsequent) iteration of the loop, I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    BTW, this is continuous acquisition and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at a Labview API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG 14 KB
    Sample_AIV.vi 91 KB

    First, if you want to get beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs. There are hundreds of examples in the Example Finder. You can even right-click on the DAQ Assistant and select Open Front Panel. That will create a subVI that you can open to see what is going on behind the scenes. Do it. I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure how the settings are established or maintained in each section of that subVI).
    The second problem is that you have a bit of a race condition on iteration 0. Those two DAQ property nodes run at the same time, so when you read AI.Max, it may happen before or after AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.
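
    For illustration, here is the equivalent ordering in the DAQmx C API (the thread is about LabVIEW, but the stop/set/start sequence is the same; "SC1Mod1/ai0" is a placeholder channel name):

    ```c
    /* Set the range while the task is stopped, then start it. */
    #include <NIDAQmx.h>

    void set_range_then_start(TaskHandle task, double user_max)
    {
        DAQmxStopTask(task);   /* property cannot change while running */
        DAQmxSetAIMax(task, "SC1Mod1/ai0", user_max);
        DAQmxSetAIMin(task, "SC1Mod1/ai0", -user_max);
        DAQmxStartTask(task);  /* new range takes effect here */
    }
    ```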

  • How do i read out the actual voltage range of the daq board as calculated by labview based on my input voltage range

    I am somewhat new to DAQ, so I hope I describe this well enough, but I have encountered a requirement for my VI that I am not sure how to satisfy in LabVIEW. It is my understanding that when using a DAQ board (I'm using a PCI-6034E) you can specify the voltage range you expect from your measurements, and the software then calculates the actual voltage range it will measure, to ensure that you get the best resolution for your range given the voltage step size, etc. of the equipment. I know how to extract the voltage input that I enter into the virtual channel, but I was wondering how I can extract the actual voltage range that the hardware/software is using based on its own calculations from my input. Can anyone help me with this?

    If I understand correctly, you want to retrieve the actual gain value (and, therefore, usable input range) used by your DAQ card for a particular measurement.
    I think there's a good KnowledgeBase entry with your solution, entitled "Programmatically Determining the Gain Used by a DAQ Device". Here's the URL (a DAQmx read-back sketch also follows below):
    http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/c93d3954ec02393c86256afe00057bb0?OpenDocument
    Regards,
    John Lum
    National Instruments
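
    In the newer DAQmx C API there are also channel properties for reading back the device range the driver actually selected. A sketch under that assumption ("Dev1/ai0" is a placeholder; error handling omitted):

    ```c
    /* Request custom limits, then read back the coerced device range. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 lo = 0.0, hi = 0.0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -1.0, 1.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxGetAIRngLow(task, "Dev1/ai0", &lo);
        DAQmxGetAIRngHigh(task, "Dev1/ai0", &hi);
        printf("Driver-selected range: %g V to %g V\n", lo, hi);
        DAQmxClearTask(task);
        return 0;
    }
    ```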

  • Y/t chart stops showing data depending on Maximum Display Range setting

    The y/t chart will not display any data if I increase the Maximum Display Range setting to 0.8000 s or higher. The data in the file is at 5000 Hz; is this part of the problem? I have played around with the Time Base and Measurement settings to no avail. The data is arbitrary in length, but will not be longer than 3 seconds.

    Here's the scoop...
    You have 145 blocks of data in the file. Fewer than that are sent through the Relay00.
    When you configure the Y/t Chart for more blocks than you have, then it waits for the rest of the blocks to come in. And they are not going to come in.
    One workaround is to use the "Fast Recorder" mode of the Y/t Chart, but your block size is too small (50 vs 128 minimum).  
    I'd suggest switching to the Chart Recorder module - configure it like this... and then, because you want the trigger to reset the time to 0, use an Action module. 
    Configure the Action like this... it will reset the Chart to start at 0 each time the trigger opens the Relay.
    - cj
    Measurement Computing (MCC) has free technical support. Visit www.mccdaq.com and click on the "Support" tab for all support options, including DASYLab.

  • What is acceptable Satellite M35 Motherboard CMOS battery Voltage range?

    Part Number: P71035016113
    I pulled this part out of my Toshiba and it's reading 2.75 volts.
    Is that the proper voltage?
    If not, is it in an acceptable range to keep and not have to replace?

    Contact www.toshiba.com customer service directly for a specific answer to your concern.

  • N275GTX Lightning Core Voltage Range

    Hi, I am using the MSI Afterburner utility to overclock my N275GTX Lightning 1792 MB card.
    One of the options is to increase the core gpu voltage by an offset of 0 - 200 mV.
    GPU-Z says that the VDDC voltage of the card is 1.0500 V, so a maximum overclock would take that up to 1.250 V.
    What I cannot find anywhere is the documented voltage range for my video card. I do not want to increase the voltage beyond what it can handle.
    Can anyone please tell me the voltage range of my card?

    http://forums.vr-zone.com/overclockers-hideout/478135-lightning-strikes-again-msi-n275gtx-1792mb-ddr3-lightning-edition-d.html
    "I raised the vGPU to 1.1415v max on the s/w."

  • PCMCIA E series card Voltage Range

    In one of our applications we want to use an E Series DAQ card for a single-channel analog input.
    The voltage range is from 0.01 mV to 2 V.
    Please let me know whether I can measure voltages in this range, and how.
    Also, what is the minimum voltage we can measure using a PCMCIA E Series card?

    Vishal,
    The range on our current PCMCIA E Series boards is +/-10V. There are multiple example programs for measuring analog voltages in common programming environments that will work with our PCMCIA boards. Hope this helps. Have a great day!
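
    For a rough feel for feasibility (the figures below are assumptions; check your card's specifications): on the ±10 V range a 12-bit converter resolves 20 V / 4096 ≈ 4.9 mV per code, and even at the highest gain (a ±50 mV range on many E Series cards) one code is about 100 mV / 4096 ≈ 24 µV, so a 0.01 mV (10 µV) signal would sit below one code width without external amplification.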

  • Digital voltage range limit (less V)

    Hi
    My first post here.
    In the datasheet and in the examples I was not able to find a solution for my problem:
    - output voltage digital 0-5 V <-> input FPGA (3.3 V)
    Because I am communicating with an FPGA, I have a problem with the output voltage of the USB-6221 to the FPGA.
    Is it possible to change the digital output voltage range from 0-5 V? The other direction is no problem, because the USB-6221 accepts 2.4 V as a high (3.3 V output of the FPGA).
    I am looking for a solution via a CVI property to limit the voltage range.
    -> The last resort would be a very simple voltage divider, but I need this for more than 2 lines.
    Thanks for your help!

    Hello floxe,
    we also have devices that support LVTTL, but as you already mentioned, the USB-6221 doesn't have that feature.
    So you either have the option of purchasing another device or using an external circuit (a worked divider example follows below).
    Best Regards!
    Moritz M.
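
    If the divider route is taken, a quick worked example (values are illustrative): with R1 = 1 kΩ from the 5 V output and R2 = 2 kΩ to ground, the FPGA input sees 5 V × R2 / (R1 + R2) ≈ 3.33 V, close to the 3.3 V rail. One divider is needed per line, which is why floxe calls it a last resort.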

    Using the MIO-16XE-10, I am trying to relate each gain setting to the resultant actual voltage range.

    I have tried to find the conversion table between gain settings and resultant voltage ranges. I am a programmer, and am having trouble understanding the relationship; better yet, I'd like an algorithm, so I can programmatically set a gain based on the voltage returned by the previous measurement. The idea is to get the most reliable resistance measurement by ensuring that a measurement taken with a specific gain setting does not fall in the lower 1% of that gain's range.
    bookieb

    Hi Bill:
    Thanks for contacting National Instruments.
    The best way to use the most appropriate gain is to set the voltage range of the incoming signal and let the NI-DAQ driver choose the best gain to apply for that range.
    The DAQ card has some predefined gain settings which the NI-DAQ driver selects from depending on the input limits.
    To find out which input ranges correspond to which gain, please refer to Appendix A, Specifications, of the E Series User Manual. The table on page A-2 shows the relationship between the input signal ranges and the channel gain that the NI-DAQ driver chooses.
    I hope this helps; a small sketch of the auto-ranging idea follows below. Please do not hesitate to contact me back if you need additional information.
    Regards,
    Bharat S
    Applications Engineer
    National Instruments
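
    An illustrative C sketch of bookieb's idea: pick the smallest range that still leaves headroom above the last reading, then pass those limits to NI-DAQ so the driver selects the matching gain. The range list here is an example only; the authoritative table is the one in Appendix A.

    ```c
    /* Choose input limits from the previous reading, with headroom. */
    #include <math.h>
    #include <stddef.h>

    double pick_range(double last_reading_V)
    {
        /* Example bipolar ranges, smallest (highest gain) first. */
        static const double ranges_V[] =
            { 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0 };
        const double headroom = 1.10;  /* stay ~10% below full scale */

        for (size_t i = 0; i < sizeof ranges_V / sizeof ranges_V[0]; i++)
            if (fabs(last_reading_V) * headroom <= ranges_V[i])
                return ranges_V[i];  /* request +/- this as the limits */

        return 10.0;  /* fall back to the full +/-10 V range */
    }
    ```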
