Simultaneous temperature and voltage measurement

We have an NI 4351 to measure both temperature and a voltage across a resistor. We used two successive Flat Sequence Structure frames, each containing an AI Single Scan subVI: the first scans the voltage from the thermocouple, and the second scans the voltage across the resistor. Our voltage readings across the resistor are roughly 3 V.
When we run our VI without scanning the resistor voltage, our temperature readings are correct. However, once we connect the voltage supply to a channel on the terminal board, we notice two things:
1) A constant error of 2500
2) The thermocouple voltage readings jump by roughly three orders of magnitude, to approximately 0.08 volts
Would we need to separate the two scans into two separate buffers? If so, how would we do this?
Much Appreciated!!

Coolest,
At first glance, this issue may be related to the fact that the 4351 can only be configured with one gain setting across all of its channels. I recommend that you connect both signals and run a program which performs only one measurement at a time, reconfiguring the board between measurements. If you still have the problem, disconnect both signals and then measure each one at a time. Make sure that you are configuring the channels as ground-referenced. If you've tried all of this and are still seeing the same problem, there might be some cross-talk between the channels, potentially due to a grounding issue.
I hope that this is helpful to you.
Thanks,
Alan L
Applications Engineer
National Instruments

Similar Messages

  • Temperature, strain and voltage

    I need assistance adding Temperature, Strain and Voltage separately to my input code at FBG Lambda 1, so that the waveform of any of the inputs can vary; I just want to observe the variation in the input waveform.

    "Dear Young,
    I want to see how to add Strain and Voltage to the input waveform of Lambda1; my simulation is just to study or observe a change in any of the waveforms, preferably the FBG waveform Lambda1.
    Here are the details of what I want to achieve thereafter:
    Development of a simulation program to assess the capability of the proposed all-optical protection system to detect faults.
    Test the model while investigating different fault scenarios inside and outside the protection zone.
    Investigate the influence of temperature changes on the ability of the scheme to provide correct fault identification.
    Investigate the influence of leakage-current capacitance on the power transmission cable.
    Development of an algorithm to provide fault status information, with a binary output indicating fault duration or the absence of a fault within the protection zone.
    Received from olusegunalfred"
    Thanks for your prompt reply. If it's okay, I have put your response here to keep it public and help people in the future.
    Joshua Young
    Applications Engineer 
    National Instruments UK

  • Are These i7 4770K Temperature and Voltage Values Normal With a Z87-G43?

    Hello everyone,
    I recently upgraded my good old Core2Duo rig and bought a new CPU-motherboard-RAM trio. My new specs are as follows:
    - i7 4770K @ stock speed + Coolermaster 212Evo cpu cooler
    - MSI z87 g43 motherboard
    - 2x4 GB G-Skill Ripjaws X 1866Mhz
    - Case: Corsair Carbide 400R
    The problem is that I wasn't aware that the new Haswell CPUs run slightly hotter, and now I'm a little worried about my temperatures. Since I was also planning to overclock my CPU a bit and to find a point that doesn't need a big voltage increase, I'm losing sleep over this at the moment.
    Anyway, I'm using software like HWMonitor, CoreTemp, RealTemp and Intel Extreme Tuning Utility. Turbo Boost is also active, so the CPU goes up to 3.9 GHz. Other than that I'm on stock speeds and the motherboard's default values. Here are my temperatures:
    Ambient room temp: Varies between 23-25 °C
    Idle: 28-30 °C
    While playing demanding games like Battlefield 4: Max 58-65°C
    With Intel Extreme Tuning's stress test for 15 mins: max 65-70 °C
    With Prime 95 Blend and OCCT burn tests for 15 mins: max 78-82 °C
    I also ran RealTemp's sensor test and the values are identical, since it uses Prime95 too.
    I also noticed that Prime95 and OCCT increase my CPU voltage from 1.156 to 1.21, while Intel Extreme Tuning's stress test and BurnInTest use 1.156 V. All these tests load 100% of the cores. I can't understand why there's a voltage increase on certain tests, which makes my temps go even higher. Will I encounter these kinds of random voltage increases during normal tasks, like playing games or rendering?
    On the other hand, I tried the motherboard's OC Genie feature to see what happens. It automatically overclocked the CPU to 4.0 GHz @ 1.10 V. With this setting I've seen a max of 70 °C for a second and mostly 65-68 °C under the OCCT stress test, and my voltage didn't increase at all, sitting at 1.10. I'm a bit confused by these values, since with default settings I get hotter readings and my voltage goes up to 1.21 with Turbo Boost under Prime95/OCCT burn tests. I also found out that my BIOS is v1.0. I don't really have a performance or stability issue for now except this voltage thing. Would a BIOS update help here? I don't like touching something that's already working OK and ending up with a dead board.
    Also, I'm wondering if my temperature values are normal with the CPU cooler I have (Cooler Master 212 Evo)?
    I could also buy some extra fans for my case (one exhaust on top and one intake on the side), and maybe a second fan for the CPU cooler, if you think these would help a bit.
    Sorry for my English by the way. I'm not a native speaker.
    Thanks for all your comments and suggestions already.

    Thanks for the reply Nichrome,
    I will follow your suggestions for the fans. Currently I don't have any fans on top, but I'm considering buying some for the top and side. So you got better results with the top fans as exhaust, right?
    Also, which fan are you using as the second fan on the CPU heatsink? I'd buy one of those as well, since we have the same heatsink.
    I'm also using the default/auto voltages and settings at the moment. Only Turbo Boost is enabled, and when it kicks in the voltage goes up to 1.156, which seems normal and doesn't produce a dangerous level of heat. The thing is, if I run Prime95 or OCCT, the voltage goes up to 1.21+ at the same Turbo Boost speed (3.9 GHz), and that produces a lot more heat than usual. But if I use BurnInTest or the Intel Extreme Tuning Utility stress test, the voltage sits at 1.156 under full load on all cores. I'm wondering what causes this difference and whether it is software- or motherboard-related. Even using OC Genie @ 4.1 GHz, the temperatures and voltages seem lower than at stock/auto settings (idle 35-38 °C, OCCT stress test 70 °C max, gaming 60-65 °C max). I'm not sure a BIOS update would fix this, and the whole BIOS flashing process creeps me out. I don't like messing with something that's already working OK; I don't want to end up with a dead board in the end. :P Maybe I'm becoming a bit paranoid, though, since this is a really hard-earned upgrade after 6 years. :P

  • Cannot measure temperature and voltage simultaneously

    I am a beginner with LabVIEW. I want to measure temperature and voltage simultaneously. When I run
    the VI, I can get temperature or voltage, but not together. I have attached my VI; please give me
    suggestions on how to make it work. Thanks!
    Attachments:
    heatflux.vi ‏1069 KB

    Since I don't know your exact configuration I will make some basic assumptions based on how it appears that you have configured the DAQ Assistant Express VIs.
    Assumptions:
    1. You have only one DAQ board in your system.
    2. You want to scan continuously.
    3. You want to acquire 3 temperature channels at a rate of 1000 S/s and take 100 readings at a time.
    4. You want to acquire 2 voltage channels at a rate of 1000 S/s and take 1000 readings at a time.
    Based on this configuration, your first problem is that you have configured the DAQ board to acquire continuously in the first call to the DAQ Assistant (the first frame of your sequence structure). This ties up all the analog acquisition resources without releasing them. When you make your second call to the DAQ Assistant (the second frame of your sequence structure), you create a conflict because the DAQ board is already busy with your first request. At this point you are probably receiving an error, but you might not see it since you are not doing error checking in your code. This is also why you are only getting one set of data: on the next iteration of the while loop, the first call to the DAQ Assistant reconfigures the board and executes again, and so the cycle repeats.
    I don't have a DAQ board installed, so I can't confirm this with certainty, but you can test it by simply changing the DAQ Assistant properties: in the 'Task Timing' tab, change 'Acquire Continuously' to 'Acquire N Samples'.
    Assuming this works, all you have done is confirm that my assumptions are correct, and technically your program should work. So now, some programming advice.
    It's OK to scan all channels at once even though they might not be of the same type, so go ahead and configure all your channels in one DAQ Assistant call and get rid of the sequence structure. Decide on one set of parameters for Scan Rate and Samples to Read; in your case I doubt this will be a problem. Since you are performing the same analysis on all channels, you don't need to parse your data: simply pass the 'data' from your DAQ Assistant into a single 'Amplitude and Level Measurements' Express VI. You will then have a single array with all your mean values, in the order the channels are configured. If you want to plot the data in different graphs, all you need to do is split your channels using the 'Split Signals' or 'Select Signals' Express VI.
    Hope this makes some sense.
    -Christer
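Christer's channel-ordering advice can be sketched in ordinary code. Below is a hypothetical Python illustration; the interleaved sample layout and the 3-temperature + 2-voltage channel counts are assumptions for the example, not taken from the attached VI.

```python
# Split one interleaved scan into per-channel lists and take per-channel
# means, mirroring 'Split Signals' plus 'Amplitude and Level Measurements'.
# The data values and channel layout below are made up.

def split_by_channel(scan, n_channels):
    """Return one list per channel from [ch0, ch1, ..., ch0, ch1, ...]."""
    return [scan[i::n_channels] for i in range(n_channels)]

def channel_means(scan, n_channels):
    """Mean of each channel, in configuration order."""
    return [sum(ch) / len(ch) for ch in split_by_channel(scan, n_channels)]

# Two scans of 5 channels: 3 temperatures (deg C) then 2 voltages (V)
scan = [25.0, 25.2, 24.8, 3.01, 2.99,
        25.1, 25.3, 24.9, 3.02, 2.98]
means = channel_means(scan, 5)
```

The means come out in the order the channels were configured, which is exactly the guarantee the Express VI gives.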

  • Recording Temperature and Voltage measurements using Keithley 2182 Nanovoltmeter

    Hello all,
    I am relatively new to LabVIEW and am looking to extend a VI I am currently using.
    I am trying to record voltage and temperature measurements from a Keithley 2182 nanovoltmeter using a GPIB cable. I have a VI that can do this for either voltage or temperature, but not both. At the moment I only record what is shown on the display of the nanovoltmeter.
    Could somebody explain how I could get LabVIEW either to switch between voltage and temperature on the nanovoltmeter, or whether it is possible to take two simultaneous measurements of temperature and voltage, and how I would achieve this?
    Thanks
    Mike

    Hi,
    For each read, whether temperature or voltage, a specific command is sent to the voltmeter.
    I'm pretty sure you cannot read both in parallel, but you can do it successively: one voltage read, one temperature read, and so on.
    There should be something like:
    while not STOP do
      1. send GPIB command for changing Keithley to Voltage Measurement
      2. send GPIB command for Voltage Read
      3. read GPIB -> Voltage
      4. send GPIB command for changing Keithley to Temperature Measurement
      5. send GPIB command for Temperature Read
      6. read GPIB -> Temperature
    end
    You can take a look in the VI to see which commands are sent for the voltage and temperature reads, and mix them as I described above.
    If you don't manage it, share your VIs (for temperature and voltage); maybe that will make it easier for me (or someone else) to give you some additional advice.
    Paul
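Paul's alternating-read loop can be sketched like this in Python. The mock instrument and the SCPI command strings are illustrative stand-ins, not verified Keithley 2182 commands; a real setup would use a VISA/GPIB session in place of the mock.

```python
# Alternate voltage and temperature reads over one instrument session,
# as in the pseudocode above. MockKeithley fakes the GPIB instrument so
# the logic can run without hardware; the command strings are assumed.

class MockKeithley:
    def __init__(self):
        self._mode = None

    def write(self, cmd):
        # Remember which function the last configuration command selected.
        if "VOLT" in cmd:
            self._mode = "volt"
        elif "TEMP" in cmd:
            self._mode = "temp"

    def read(self):
        # Canned readings: 1.234 mV in voltage mode, 23.5 degC otherwise.
        return "1.234E-3" if self._mode == "volt" else "23.5"

def read_pair(inst):
    """One voltage read followed by one temperature read."""
    inst.write(":SENS:FUNC 'VOLT'")   # assumed mode-switch command
    voltage = float(inst.read())
    inst.write(":SENS:FUNC 'TEMP'")   # assumed mode-switch command
    temperature = float(inst.read())
    return voltage, temperature

v, t = read_pair(MockKeithley())
```

In LabVIEW terms, `read_pair` is one iteration of the while loop: two writes and two reads per pass, with the instrument reconfigured between them.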

  • How to measure current and voltage and rpm with daq device

    I am measuring current and voltage and am wondering whether I should use shunt resistors or Hall-effect current sensors. I have a PCI-6221 and an SCC-68 breakout box. What specification or size should the shunt resistor/current sensor be, or should I use another device to measure the current and voltage? Do I need any other safety device between the resistors/current sensors and the SCC-68 breakout box when measuring? Will the PCI-6221 DAQ card pick up a signal from the resistor? Which terminals in the SCC-68 should the wires from the resistors and current sensors be connected to? I am also using a proximity switch to measure the RPM of a motor. Should the proximity switch have a 2- or 3-wire connection? Should it have an analog or digital connection, is a power supply required to power it, and should it have an NPN or PNP connection?

    hello,
    I was going to use a 20 A / 50 mV or a 20 A / 100 mV current shunt and connect wires from the shunt directly into the AI input terminals of the SCC-68. Would these be suitable? For example: http://uk.farnell.com/elc/sh10020/shunt-sh10020-20a-100mv-1-class/dp/1319576 or http://uk.farnell.com/datel/3020-01098-0/shunt-50mv-20a/dp/1339338
    Is it OK to use a current shunt, or should a Hall-effect sensor such as this be used: http://ie.farnell.com/honeywell-s-c/csla2cd/sensor-hall-effect/dp/1082269 ? Which of them would be more accurate, or are both fairly accurate?
    When measuring voltage, can I connect two resistors between the positive and negative wires going to the battery, run two wires from either side of one resistor directly into the analog inputs of the SCC-68, and measure the 12-15 volts directly? Would the PCI-6221 and SCC-68 be able to measure the voltage drop across the resistor?
    I also want to measure RPM. Does it matter whether the proximity switch has 2 or 3 wires, and should it have an analog or digital/frequency output for connecting to the SCC-68?
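As a back-of-envelope check on the numbers in this thread, here is a hypothetical Python sketch: converting a shunt's voltage drop to current with Ohm's law, and sizing a resistor divider so a 12-15 V battery stays inside a DAQ input range. The ±10 V range and the resistor values are assumptions for illustration, not device specifications.

```python
# Current from a rated shunt's measured drop, and a resistor divider's
# output voltage. All numbers are illustrative, not from any datasheet.

def shunt_current(v_drop, rated_amps=20.0, rated_drop=0.100):
    """Current through e.g. a 20 A / 100 mV shunt given its measured drop."""
    return v_drop * rated_amps / rated_drop

def divider_output(v_in, r_top, r_bottom):
    """Voltage the DAQ sees across the bottom resistor of a divider."""
    return v_in * r_bottom / (r_top + r_bottom)

i_amps = shunt_current(0.050)             # 50 mV drop on a 20 A / 100 mV shunt
v_daq = divider_output(15.0, 10e3, 10e3)  # halve 15 V into an assumed +/-10 V input
```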

  • Write time stamp and Voltage to text file

    I am a novice LabVIEW 2011 user and am trying to build a program that will write TIME in one column and VOLTAGE in another to a text file, for later interpretation with MATLAB. I started adding elements to an existing piece of code, downloaded from the examples forum, because it works well for my purpose of sending a finite square signal. The code I started modifying is attached to this thread. If somebody wants to take the time to provide me with an example of how I can do this with my existing code, it would be greatly appreciated. I learn better from examples.
    Regards,
    Sean. 
    Attachments:
    Voltage - Generate and Write.vi ‏99 KB

    This is a pretty simple piece of code that writes to a CSV file every 5 seconds, stored at the location shown in the code. To apply something similar to your code, simply feed the data you wish to store into Concatenate Strings as a string, and it should be fine.
    Although, after looking over your code a second time, you should probably take a look at the 'Array to Spreadsheet String' function: all you would have to feed it is an array of times and measurements at the completion of your program, write the result to a file, and it should do everything for you.
    Attachments:
    Write to File.vi ‏20 KB
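For readers who end up post-processing outside LabVIEW anyway, the same time-and-voltage layout is easy to produce in plain Python; this sketch writes comma-separated columns that MATLAB can load back. The sample period, file handling, and data values are placeholders.

```python
# Write 'time,voltage' rows to a text stream, one sample per line.
import csv
import io

def write_samples(fh, samples, t0=0.0, dt=0.005):
    """samples: iterable of voltages; dt: assumed sample period in seconds."""
    writer = csv.writer(fh, lineterminator="\n")
    writer.writerow(["time_s", "voltage_V"])
    for i, v in enumerate(samples):
        writer.writerow([f"{t0 + i * dt:.3f}", f"{v:.4f}"])

buf = io.StringIO()                        # stand-in for open("data.csv", "w")
write_samples(buf, [0.0, 1.0, 0.0, 1.0])   # a short square wave
text = buf.getvalue()
```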

  • Using XY Graph to plot current and voltage

    Hi,
    I'm making an I-V curve tracer and hoping to plot the current and voltage measurements I'm acquiring onto an XY Graph in real time. I'm using LabVIEW 2010 on Windows Vista with the VISA drivers installed; my acquisition hardware is an Arduino Uno communicating over USB-to-serial via the VISA drivers.
    My data is coming in over the serial port formatted like "voltage,current":
    237,521
    320,402
    I've read through the relevant documentation for the graph builder, the 4 samples included with LabVIEW, and quite a few posts on this forum. I modified Jazlan's sample VI to read the current and voltage and display the values on the front panel; it works just fine. However, when I wire those values to an Express XY Graph builder (with the 'clear data on each call' property set to false) and run the software, it just sort of freezes... I try to stop it, but it keeps running for about 10-20 seconds. The current/voltage values are not updated, nor is anything displayed in the XY graph.
    Am I not sending the correct input to the graph builder?
    Should I just wire up my values directly to the XY graph by concatenating values to an array, and then clustering it?
    Also, on the right border of my case structure, one of the orange squares is not solid - how do I fix that?
    any help much appreciated!
    imran
    Attachments:
    block.jpg ‏140 KB
    project.vi ‏76 KB
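For reference, the "voltage,current" lines quoted above look like raw 10-bit ADC counts; here is a sketch of parsing them into X and Y arrays for the plot. The 5 V / 1023-count scaling is an assumption about the Uno's ADC, not something stated in the post.

```python
# Parse serial lines of the form "voltage,current" (integer ADC counts)
# into two float lists suitable for an XY plot.

def parse_iv_lines(lines, vref=5.0, full_scale=1023):
    """Return (voltages, currents) scaled from raw counts."""
    volts, amps = [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue                     # skip blank lines in the stream
        v_raw, i_raw = line.split(",")
        volts.append(int(v_raw) * vref / full_scale)
        amps.append(int(i_raw) * vref / full_scale)
    return volts, amps

xs, ys = parse_iv_lines(["237,521", "320,402"])
```

Building the two arrays first and handing them to the graph in one go is also a reasonable answer to the poster's question about concatenating values into arrays before bundling them for the XY graph.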

    thanks for the tip Tim,
        I wired up the stop button to the VISA close block, and now I'm able to run and stop it multiple times without freezing.
    1)  I know how to add shift registers, but why do I need one?  It doesn't seem like I need to pass values from one iteration of the while loop to another...
    regards,
    imran
    Attachments:
    project.vi ‏43 KB

  • Measure current and voltage in Elvis and labview

    Hi! We're setting up a control-system experiment for class where we need to monitor the current and voltage we are supplying to and drawing from the ELVIS hardware within a LabVIEW setup. So far, we can accurately display voltage inputs and outputs from the AI and AO terminals respectively. However, we can't seem to find a direct way to measure current draw. I believe we are supposed to be able to get DMM readings from the AI5 channel, but either we're not doing it correctly, or we still need to place a known resistance across it (which we cannot do, due to variable loads). Any advice, please?
    Cheers,
    Gerald

    Are you using Elvis I or II?  From your post, I will assume it is Elvis I but please correct me if this is not the case.  Page 38 of the manual shows that ACH5 is used for capacitor or diode measurements for the DMM.  To measure current, use the connectors on the front of the Elvis unit and then route the current HI/LO on the protoboard to either the oscilloscope or an analog input channel.  You can use the DMM Soft Front Panel to monitor current as well.
    Message Edited by h_baker on 07-09-2009 04:54 PM
    Regards,
    h_baker
    National Instruments
    Applications Engineer
    Digital Multimeter Resources

  • When simulating a circuit, how do I display current and voltage on the circuit?

    Hello, how do I get Current and Voltage to display on the circuit design without using meters?

    saxyman123,
    You can use probes (yellow instrument towards the end of the instruments toolbar) and then set the properties.  You can choose to display just current and voltage.
    Regards,
    Pat Noonan
    National Instruments

  • Please help me with my electrical engineering homework: temperature control and watering system for a greenhouse using LabVIEW and Arduino

    temperature control and watering system for greenhouse using labview and arduino
    Specifications:
    1. Max temp: 28 °C (when the temperature is above 28 °C, the fan turns ON)
    2. Min temp: 20 °C (when the temperature is below 20 °C, the heater turns ON)
    3. Watering system: aquaponics (plants and fish grow in separate tanks connected to each other). The plant roots help filter the water for the fish, and the fish waste fertilizes the plants, so I need a pump to distribute the water.
    Please help me create the VI simulation. I'm sorry I'm not fluent in English. May God bless you all.
    Attachments:
    YOOOSHH.vi ‏88 KB
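The two temperature rules in the spec reduce to a pair of comparisons. A plain-Python sketch of that logic follows; in LabVIEW these would simply be two comparison nodes driving Boolean outputs to the Arduino.

```python
# Greenhouse control per the spec above: fan ON above 28 C, heater ON
# below 20 C, both OFF in between. Thresholds come from the post.

def control(temp_c, max_temp=28.0, min_temp=20.0):
    """Return (fan_on, heater_on) for the measured temperature."""
    return (temp_c > max_temp, temp_c < min_temp)

states = [control(t) for t in (15.0, 24.0, 30.0)]
```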

    Duplicate thread. Please keep the discussion in that thread, where you already have a response. It is also the more appropriate thread for your question.
    Lynn

  • Recording of temperature, power and speed at the time of confirmation

    Hi experts,
    My client requires recording the temperature, power and speed of the equipment while doing phase-wise confirmation. I'm planning to use inspection characteristics for the operations. I'm not going to use a PI sheet.
    Is this assignment of inspection characteristics the correct way to record the parameters?
    Is there any other option apart from a PI sheet?
    Can anybody help me?
    Naren

    Dear Naren,
    It would be better to capture those parameters as inspection characteristics before posting the confirmation.
    However, if you want to capture them as an activity via a certain formula key and activity price, you can define them as standard values, like Labour, Machine and Setup:
    Go to OP7B "Parameters Overview".
    Define your requirements, e.g. for Power, say ZELECT
    (short keyword "Electricity", dimension "Energy", standard value unit "KWH").
    Then go to OP19 "Standard Value Key Formula: Overview".
    Select SAP1 "Normal Production", Copy As (F6), and create a new activity type, say ZELE.
    Then assign this new parameter ZELECT and save (handle the transport request).
    For the formula definition:
    Go to OP54 "Formula Definition".
    Select SAP002 "Prod.: Machine Time", Copy As (F6), and create your new formula, say ZELECT,
    and replace SAP_02 with your parameter ZELECT.
    It will then look something like "ZELECT * SAP_09 / SAP_08 / SAP_11".
    Assign the standard value key ZELE on your work center's Basic Data screen.
    Now go to the Costing tab page and assign your formula to your predefined activity type for Power.
    (Otherwise: first create a secondary cost element (T-code KA06), assign this secondary cost element to your new activity type, then use KP26 "Change Activity/Price Planning";
    assign the activity type in the work center Costing tab and assign your formula.)
    You can assign a maximum of 6 standard values to your work center, i.e. if you want to capture both Power and Fuel consumption for an operation, you can set up both parameters as per the above procedure.
    Regards
    JH

  • [Solved] lm-sensors fan speeds and voltage.

    After installing lm-sensors and running sensors-detect I have the following readings from sensors;
    ~$ sensors
    acpitz-virtual-0
    Adapter: Virtual device
    temp1: +42.0°C (crit = +80.0°C)
    coretemp-isa-0000
    Adapter: ISA adapter
    Core 0: +47.0°C (high = +78.0°C, crit = +100.0°C)
    coretemp-isa-0001
    Adapter: ISA adapter
    Core 1: +42.0°C (high = +78.0°C, crit = +100.0°C)
    I am sure in the recent past I was able to retrieve fan speed and voltage readings from this machine;
    |Abit FP-IN9 SLI | Intel(R) Core(TM)2 Duo CPU [email protected] | GeForce 9800 GT |
    I suspect this is something that I am doing wrong (or is it a kernel issue??) as I have the same problem on a similar machine running Ubuntu 10.04.
    Any ideas appreciated.
    Last edited by ancleessen4 (2010-03-09 08:31:15)

    Hi graysky,
    You were spot on!
    I followed the instructions in the wiki article to add
    acpi_enforce_resources=lax
    to /boot/grub/menu.lst
    Reboot...
    Voila!
    [neil@archbox ~]$ sensors
    acpitz-virtual-0
    Adapter: Virtual device
    temp1: +30.0°C (crit = +80.0°C)
    coretemp-isa-0000
    Adapter: ISA adapter
    Core 0: +36.0°C (high = +78.0°C, crit = +100.0°C)
    coretemp-isa-0001
    Adapter: ISA adapter
    Core 1: +34.0°C (high = +78.0°C, crit = +100.0°C)
    w83627dhg-isa-0290
    Adapter: ISA adapter
    Vcore: +1.12 V (min = +0.00 V, max = +1.74 V)
    in1: +0.91 V (min = +0.51 V, max = +0.21 V) ALARM
    AVCC: +3.30 V (min = +2.98 V, max = +3.63 V)
    VCC: +3.30 V (min = +2.98 V, max = +3.63 V)
    in4: +1.10 V (min = +0.00 V, max = +0.14 V) ALARM
    in5: +1.20 V (min = +0.13 V, max = +0.00 V) ALARM
    in6: +1.58 V (min = +0.02 V, max = +0.00 V) ALARM
    3VSB: +3.23 V (min = +2.98 V, max = +3.63 V)
    Vbat: +3.01 V (min = +2.70 V, max = +3.30 V)
    fan1: 0 RPM (min = 340 RPM, div = 128) ALARM
    fan2: 811 RPM (min = 332 RPM, div = 16)
    fan3: 0 RPM (min = 340 RPM, div = 128) ALARM
    fan4: 0 RPM (min = 340 RPM, div = 128) ALARM
    fan5: 0 RPM (min = 527 RPM, div = 128) ALARM
    temp1: +28.0°C (high = +72.0°C, hyst = +0.0°C) sensor = thermistor
    temp2: +29.0°C (high = +70.0°C, hyst = +65.0°C) sensor = diode
    temp3: +38.5°C (high = +70.0°C, hyst = +65.0°C) sensor = thermistor
    cpu0_vid: +0.000 V
    [neil@archbox ~]$
    :D

  • What are the safe limits of current and voltages

    What are the safe limits of current and voltage for an AC signal into the NI USB-6008?

    Hi hanan,
    Current is voltage/resistance, and surely there is a resistance in your circuit too...
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
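GerdW's point in code form: with a known series resistance you can estimate the current a signal would drive into the input before connecting anything. The ±10 V figure and the resistor value below are assumptions for illustration, not USB-6008 specifications; always check the device datasheet for the real limits.

```python
# Estimate peak current through a series resistor and check a signal's
# amplitude against an assumed input range.

def peak_current_ma(v_peak, r_ohms):
    """Peak current in mA for a signal of amplitude v_peak (Ohm's law)."""
    return v_peak / r_ohms * 1000.0

def within_range(v_peak, v_max=10.0):
    """True if the peak stays inside an assumed +/-10 V input range."""
    return abs(v_peak) <= v_max

ok = within_range(8.0)               # 8 V peak: inside the assumed range
i_ma = peak_current_ma(8.0, 1000.0)  # 8 mA through a 1 kOhm series resistor
```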

  • I have a problem. I've been my HD but I lost the temperature probe and I can't find it anyway. Can someone help me? Thanks

    Can someone help me find a temperature probe for the iMac's hard disk? I'm tired of looking for one but I can't find it.

    What you are asking isn't clear. Are you saying that you are not getting any temperature readout for the drive? What are you using to see the internal temperatures? If you aren't getting any output from the drive's temperature sensor, the drive fan should be revving up to full speed. Is this happening?
    Get Temperature Monitor and post the results
    http://www.bresink.com/osx/TemperatureMonitor.html
