TBX-68T temperature measurement errors

I am using a TBX-68T with a PXI-4351 card to measure the temperature of a steel plate while the plate is undergoing a TIG (electric arc) welding process. To my knowledge (and based on past experience), the device is connected properly in MAX, with the TBX-68T selected as the accessory. It also passes the tests in MAX. Using the built-in CJC and specifying a K-type thermocouple at a range of +/-0.625 V, MAX reports temperatures of -80 C. The temperature remains nearly the same as the range is increased, but if I change the range to +/-3.75 V, the temperature suddenly jumps to 100+ C. I used the Measure Thermocouple (with accessory).vi that comes with the driver, and it gives me the same readings. Furthermore, when I start the TIG welder, the card reads extremely negative temperatures (on the order of -10e9).
What is odd is that I have used these devices for this exact process in the past and they produced believable results.
How can I fix the current temperature offset problem? Could I have possibly damaged the unit? Is it possible to use the TBX-68T to reliably measure temperature in an arc welding process?
Thanks,
William

Hello,
Because TIG welding creates a large electrical discharge, I am concerned that you may have damaged the card. What kind of isolation did you use to separate the TIG process from the temperature measurement? Because the error appears in both your program and MAX, it must be either the temperature configuration or the card. Could you try a simple voltage measurement with the 4351, say of an AA battery? You should be able to read 1.5-1.6 V. If this works, then the card should be fine (make sure to test all channels that are failing).
If you cannot read the proper voltage on the card, then it is possible that the card will need to be repaired. Please update us so we can help you further.
Thank you 
Regards,
CharlesD
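For reference, here is a minimal sketch of the kind of single-channel voltage check Charles suggests, written against the Python nidaqmx API. The device and channel names are placeholders, and the PXI-4351 itself is programmed through the Traditional NI-DAQ / NI-435x driver rather than NI-DAQmx, so on that card the equivalent check is normally done from a MAX test panel or the shipping 435x examples; the snippet only illustrates the procedure.

    import nidaqmx

    # Placeholder physical channel; substitute the channel wired to the AA battery.
    CHANNEL = "Dev1/ai0"

    with nidaqmx.Task() as task:
        # One voltage input with a range wide enough for a ~1.6 V battery.
        task.ai_channels.add_ai_voltage_chan(CHANNEL, min_val=-2.5, max_val=2.5)
        samples = task.read(number_of_samples_per_channel=10)
        average = sum(samples) / len(samples)
        print(f"Average voltage on {CHANNEL}: {average:.3f} V")
        # A healthy channel reading a fresh AA battery should report roughly 1.5-1.6 V.

Repeating the check on every misbehaving channel helps distinguish hardware damage from a configuration problem, since a damaged channel cannot read even a simple DC source correctly.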

Similar Messages

  • Traditional NI-DAQ using TBX-68T with error 10685

    Hi, I'm new here. I have tried to create a program that will allow me to accept temperature readings. I have a TBX-68T device and a PCI-4351 card from which the readings are taken.
    I have uploaded the VI.
    I keep getting error 10685 - the clock rate exceeds the board's recommended maximum rate - even when I set the clock rate as low as 10.
    Any help is much appreciated; I am fairly new to DAQ with LabVIEW. I apologise that I have not yet annotated the program.
    Thanks
    Notay
    Attachments:
    thermocouple example.vi ‏37 KB

    Hi again Notay,
    The 4351 device has specific rates at which you can sample for single- or multi-channel acquisition. There are six possible reading
    rates – 10, 50, and 60 readings/s in single-channel acquisition mode and 2.8, 8.8, and 9.7 total readings/s in multiple-channel acquisition mode.
    See the data sheet and manual linked below for more information (a small helper illustrating these rate limits follows this message).
    http://www.ni.com/pdf/products/us/2mhw296-297e.pdf
    http://digital.ni.com/manuals.nsf/websearch/9249394DFC45FA3D86256FD20078D836
    Jon B
    Applications Engineer
    NI Uk & Ireland
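    As an aside on those fixed rates: the helper below is not an NI API, just plain arithmetic, but it shows how to snap a requested reading rate to the nearest rate the 4351 actually supports in each mode, which avoids error 10685.

        # Supported NI 4351 reading rates, per the data sheet and manual above.
        SINGLE_CHANNEL_RATES = (10.0, 50.0, 60.0)   # readings/s, single-channel mode
        MULTI_CHANNEL_RATES = (2.8, 8.8, 9.7)       # total readings/s, multi-channel mode

        def nearest_supported_rate(requested, multi_channel=False):
            """Return the supported 4351 rate closest to the requested rate."""
            rates = MULTI_CHANNEL_RATES if multi_channel else SINGLE_CHANNEL_RATES
            return min(rates, key=lambda r: abs(r - requested))

        # Asking for 10 readings/s while scanning several channels has to fall
        # back to one of the multi-channel rates.
        print(nearest_supported_rate(10, multi_channel=True))   # -> 9.7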

  • Temperature Measurement error

    I am using a DAQ card 6052E, an SCXI-1121, an SCXI-1000 chassis and a TBX-1328 for thermocouple and thermistor measurements.
    I have gone through the process of configuring everything in MAX. However, my temperature measurements using a thermocouple gave me a wrong figure of around 70 C. To debug this situation I measured the CJC temperature (using a LabVIEW example), which is around 56 C, when in fact the ambient temperature is around 27 C.
    Why is my CJC temperature wrong? Is the CJC broken?
    Also, the SH32-32 cable assembly (for the connection between the 1328 and the 1121) came without the cable adaptor. Hence I directly connected the TBX-1328 to the SCXI-1121; it fits perfectly well without the adaptor. (You can see the December 2000 manual for the TBX-1328:
    http://digital.ni.com/manuals.nsf/websearch/22B262582673F346862569B300742094?OpenDocument&node=132100_US)
    Could that possibly cause any error?
    Can someone give me advice as to how to debug this error? Thanks!

    I am referring to the TBX cable adaptor on page 8 of the December 2000 manual of the TBX-1328.
    I don't seem to have any bent pins.
    I tried your suggestion. The voltage doesn't change when the temperature changes (in fact it is pretty constant).
    Also, in MAX, when I try to set up the global channel/task, I keep getting an error saying
    the minimum possible voltage value is -5 mV and the maximum is 5 mV. However, I know that the SCXI has a maximum output of +5 V (and I have set the gain to 1000).
    I also tested the cable for connections. It seems fine.
    I then tried to use a thermistor for temperature measurements. On the TBX-1328 I connected the two ends of the thermistor to the + and - of the channel 0 input.
    Also I connected
    EX0+ to CH0+
    EX0- to CH0-
    (I don't know if this qualifies as a 2- or 4-wire configuration.)
    Then when I tried to set up a global channel, it told me that the minimum value should be 198 C and the maximum 300 C.
    I double-checked my jumper settings on the 1121:
    gain = 1000
    filter = 4 Hz
    current excitation = 0.15 mA.
    It is configured for "MTEMP" readings (I am using it in multiplexed mode).
    I have tried several different sensors but to no avail.
    I am at my wits' end and don't know what to do. I called NI but their suggestions did not work. (A short worked example based on these numbers follows this message.)
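    A quick worked example based only on the numbers quoted above (gain 1000, 0.15 mA excitation, roughly +/-5 V of module output swing), showing why MAX limits the input range to +/-5 mV and why that range is very tight for a typical thermistor; the 2.5 V figure at the end is only an illustration.

        # Numbers taken from the post above; adjust for your actual configuration.
        GAIN = 1000.0     # SCXI-1121 channel gain
        I_EXC = 0.15e-3   # excitation current in amperes (0.15 mA)
        V_OUT_MAX = 5.0   # module output swing in volts

        # With gain 1000, the usable input range is only +/-5 mV.
        v_in_max = V_OUT_MAX / GAIN
        print(f"Input range: +/-{v_in_max * 1e3:.1f} mV")

        # The excitation current maps that input range to a maximum resistance.
        r_max = v_in_max / I_EXC
        print(f"Largest measurable resistance: {r_max:.1f} ohms")

        # Backing a sensor resistance out of an amplified output voltage:
        def sensor_resistance(v_at_output):
            return v_at_output / (GAIN * I_EXC)

        print(sensor_resistance(2.5))   # 2.5 V at the output -> about 16.7 ohms

    Most thermistors sit in the kilo-ohm range, so with these settings the channel saturates almost immediately; a lower gain or a lower excitation current may be worth trying so that the amplified signal stays inside the module's output swing.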

  • Accurate differential temperature measurements

    May I connect my RTDs in series to improve differential temperature measurement accuracy? I currently use an SCC-RTD01 together with a DAQCard. My differential temperatures are on the order of 0.5 - 1.0 C.

    Tom,
    After reading your second post, I realize that you want to connect the RTDs in series. Originally, I thought you wanted to connect the modules in series.
    OK, you can place the RTDs in series, but this will not increase your measurement accuracy. If you decide to experiment with this setup, make sure you provide sufficient current excitation for both RTDs. What will happen is that the experimental error you see at the first RTD will also be seen by the next RTD in series, so your error will not decrease.
    I advise you to try the oversampling idea to increase your accuracy. Here is information regarding this form of accuracy improvement.
    You can use this accuracy calculator to calculate the accuracy of a given SCXI module and compare this value to the measurement accuracy you're seeing with the SCC module.
    I hope this helps! Please let me know if the averaging technique helps to increase your accuracy (a brief sketch of the averaging idea follows this message).
    Best of Luck,
    Joe Des Rosier
    National Instruments
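    The averaging idea Joe recommends is simple to sketch. Assuming the noise on successive readings is independent, averaging N readings reduces the random noise by roughly the square root of N (it does not remove systematic offsets such as sensor calibration error); the simulated channel below is only a stand-in for a real acquisition call.

        import math
        import random

        def averaged_reading(read_sample, n=100):
            """Average n individual readings to reduce random noise by ~sqrt(n)."""
            return sum(read_sample() for _ in range(n)) / n

        # Demonstration with a simulated noisy RTD channel (true value 100.0 ohms,
        # 0.05 ohm rms noise); replace read_sample with your real acquisition call.
        simulated = lambda: random.gauss(100.0, 0.05)
        print(averaged_reading(simulated, n=100))           # noise reduced ~10x
        print("expected noise reduction:", math.sqrt(100))  # -> 10.0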

  • Huey Pro calibrator - error message: display measurement error

    I am using this huey Pro calibrator. It worked for a year without problems, but now I am trying to calibrate my 30-inch Apple Cinema Display LCD and it keeps giving the error message "display measurement error", and huey support never answers my e-mails. Any idea how I can fix it? I have reinstalled the software many times but the same thing happens, so right now I can't calibrate my monitor. Any help please?

    >a nice NVidia card... Please help Adobe!
    1a - which model nVidia card do you have, and what is your driver version?
    1b - for instance, I have a GTX 285 and driver 296.10
    2a - aside from the occasional Adobe employee, this is a user-to-user forum, not Adobe support
    2b - how to contact Adobe...
    Adobe contact information
    http://www.adobe.com/support/contact
    In the US - Adobe General support 800-833-6687 M-F 5am-7pm Pacific
    In the US - Adobe Install Problems 800-642-3623
    In the US - Adobe Activation 866-772-3623 Open 24/7

  • Huey Pro Calibration error message - measurement error

    I have been using this huey Pro calibrator for a year, but now I am trying to calibrate my 30-inch Apple Cinema Display LCD and it keeps giving the error message "display measurement error", and huey support never answers my e-mails. Any idea how I can fix it? I have reinstalled the software many times but the same thing happens, so right now I can't calibrate my monitor. Any help please?

    Hi,
    Which wireless card are you using?
    It is best to use the Cisco CB21AG PCMCIA card. I had similar problems with Intel wireless chipsets.
    Sincerely,
    Frederik

  • RTD Temperature Measurements using LabView 2013 and MyRio

    Hey everyone. I am VERY new to LabVIEW programming and to working with a myRIO. I need to figure out how to measure the resistance of a 2-wire RTD to find a temperature, utilizing the myRIO and LabVIEW. I am pretty lost on how to do this. Does anyone know some good resources for making the LabVIEW program off the top of their heads? I've figured out how to measure from specific pins, but I am not sure how to get it to constantly output a voltage from the output pins.
    Eventually, I would like to have it display the temperature as well as turn a heater on or off depending on that temperature, but that will come far later in this process. First things first: how do I take temperature measurements utilizing LabVIEW 2013 and a myRIO with a 2-wire RTD?
    Thanks so much!

    Hi JoshEpstein87,
    The myRIO can't acquire a change in resistance directly, so you'll need to somehow convert the change in resistance to a change in voltage. There are multiple ways to do this, but you'll need to build an external circuit and then read the voltage output with the myRIO. One example of a circuit that allows you to do this can be found here. To output a voltage on the analog output pins, you should just need to set the output voltage and then it will remain at that voltage until you change it or power cycle the myRIO.
    To get started with LabVIEW and myRIO programming, see the following page:
    http://www.ni.com/myrio/setup/getting-started/
    There are some links to LabVIEW training as well as resources about RIO programming. I also highly recommend you check out the myRIO Community as there are example programs on there that you can take a look at to see how they are designed.
    Best Regards,
    Matthew B.
    Applications Engineer
    National Instruments
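    To make the external-circuit idea above concrete, here is a hedged sketch assuming the simplest option, a voltage divider: a known reference resistor in series with the RTD, driven from a known supply, with the myRIO analog input reading the voltage across the RTD. The resistor value, supply voltage, and the linear PT100 approximation are illustrative assumptions, not myRIO specifics.

        # Illustrative divider values; use your actual reference resistor and supply.
        V_SUPPLY = 5.0    # volts driving the divider
        R_REF = 1000.0    # ohms, known series resistor
        R0 = 100.0        # PT100 resistance at 0 degC
        ALPHA = 0.00385   # approximate PT100 temperature coefficient (1/degC)

        def rtd_resistance(v_rtd):
            """Voltage measured across the RTD -> RTD resistance (series divider)."""
            return R_REF * v_rtd / (V_SUPPLY - v_rtd)

        def rtd_temperature(r_rtd):
            """Linear PT100 approximation: R = R0 * (1 + ALPHA * T)."""
            return (r_rtd / R0 - 1.0) / ALPHA

        v_measured = 0.52   # volts read by the myRIO analog input (example value)
        r = rtd_resistance(v_measured)
        print(f"R = {r:.1f} ohms, T = {rtd_temperature(r):.1f} degC")

    A bridge circuit or a dedicated RTD signal conditioner typically gives better accuracy than a bare divider, but the resistance-to-temperature math stays the same.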

  • Help needed with temperature measurement

    hi guys
    I need some help with temperature measurement. I've got an NI PCI 5229 card and a power transmitter. The power transmitter is programmed to output a 4-20 mA (0°C - 100°C) signal. How can I get LabVIEW to convert this mA signal and display it as a temperature? Please include a lot of detail as I am a first-time LabVIEW user.
    Thanx

    Hi Jaco,
    Calculate the gain and offset for your values. Once you have those values, you can calculate your temperature. See the attached picture (and the worked example after this message).
    Hope it helps.
    Mike
    Message Edited by MikeS81 on 05-18-2008 02:44 PM
    Attachments:
    Unbenannt1.PNG ‏30 KB
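    To make the gain/offset suggestion concrete: a 4-20 mA loop is usually read as a voltage across a known shunt resistor (250 ohms is a common choice, giving 1-5 V), and that voltage is then mapped linearly onto 0-100 degC. The shunt value below is an assumption, not something stated in the thread.

        R_SHUNT = 250.0               # ohms, assumed shunt converting loop current to voltage
        I_MIN, I_MAX = 0.004, 0.020   # 4-20 mA span in amperes
        T_MIN, T_MAX = 0.0, 100.0     # corresponding temperature span in degC

        def temperature_from_voltage(v_measured):
            """Linear scaling: shunt voltage -> loop current -> temperature."""
            current = v_measured / R_SHUNT
            gain = (T_MAX - T_MIN) / (I_MAX - I_MIN)
            offset = T_MIN - gain * I_MIN
            return gain * current + offset

        print(temperature_from_voltage(1.0))   # 4 mA  -> 0 degC
        print(temperature_from_voltage(3.0))   # 12 mA -> 50 degC
        print(temperature_from_voltage(5.0))   # 20 mA -> 100 degC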

  • Temperature measurements display incorrectly after opening a project.

    Dear All,
    I am having some difficulty setting up LabVIEW SignalExpress version 5.0.0 to display RTD temperature measurements from the NI 9217 module and the NI cDAQ-9178.
    I have two problems:
    1. I cannot set the waveform chart to show the desired time interval; I need to set the time interval displayed on the chart to 30 minutes. This means that before 30 minutes of continuous measurement have been taken, the user would see a fixed grid with fixed time boundaries and the plot moving from the left-hand side of the screen to the right-hand side of the screen. After thirty minutes of measurements have been taken, the chart would begin to "scroll" so that the earliest measurements move off the screen and the chart still displays a time interval of 30 minutes.
    2. I need the chart to display correctly immediately after the project has been opened; When I open the project and click the 'run' button, the windows are all displayed correctly but the readings are not displayed correctly, the chart only displays one short line to the right of the screen and a very short time interval of less than a second. I can correct this by changing to a Time XY Graph and then back to a waveform chart but need this to display correctly as soon as the software is opened. The incorrect and the corrected chart are shown below:
    The chart below is displayed when the software is opened and the run button is clicked. This chart style is incorrect because the temperature line is displayed in the bottom right of the chart and only for a short time interval.
    The chart below is closer to the required graph described in point 1 but I cannot set the time interval to stay fixed at 30 minutes, nor can I make the software display the graph like this on opening, I always have to switch to a different chart type and then switch back to a Waveform chart to replace the format above with the format below:
    Any help would be appreciated,
    Thanks,
    Matthew

    Hi Matthew,
    We should be able to use this function to save the graph settings as well. I have just tried it on my system and got it to work. The procedure I undertook was:
    Open my Signal Express Project
    Turn off x-axis auto scale (Right click on graph >> X Scale >>Auto scale)
    Enter Properties (Right click on graph >> Properties)
    Select the scales tab
    Select 'Relative To' under the Timestamp Type.
    Set the min to 0 and the max to 1800 (30 minutes in seconds), then press OK.
    View >> Layouts >> Make Current Project Layout Default
    Save
    In this case I used the Relative timestamp type, but you can use whatever timestamp type you wish. After doing this, the graph is formatted as required as soon as the project is opened.
    Let me know how you get on with this.
    Regards,
    Aaron. E
    Applications Engineer Team Lead
    National Instruments
    ni.com/support

  • SE30 - Unable to end the measurement (error number 5, Time limit reached)

    Hello there,
    I wish to perform a complete ABAP trace using SE30 for a program running for 4 hours. The system is R/3 4.6C.
    However, I got the error message below once I clicked the "Back" button or pressed F3 when the program finished after 4 hours.
    Please note that ST12 is not available in the system I logged on.
    =============================================================
    Error message:
    "Unable to end the measurement (error number 5, Time limit reached)
    Message no. S7 068"
    Meas. type           Fully aggregated
    Session type         In current session
    Status               Time limit reached
    Error message text   Time limit exceeded. LIMIT: 1800000000, ELAPSED TIME: 1800957266
    File user            XXX
    User                 XXX
    File ID              ATRAFILE
    Release              46C
    Version              6
    Operating system     HP-UX
    Number of processors ???
    =============================================================
    The ABAP trace results only managed to capture the first 45 minutes of the ABAP calls (due to the timeout, I think). I need the whole and complete ABAP trace result for all 4 hours, NOT just the first 45 minutes. Is there any setting in SE30 that can be set for this purpose? Note that there is no ABAP error in the program I traced.
    The following link didn't answer my question either. I hope you can provide a clue in this case.
    http://help.sap.com/saphelp_nw70/helpdata/en/c6/617cafe68c11d2b2ab080009b43351/content.htm
    Runtime analysis, SE30, ERROR
    Thanks,
    KP

    Hi Kim,
    The limit is explained in the documentation.
    http://help.sap.com/saphelp_nw70/helpdata/en/4d/4e2f37d7e21274e10000009b38f839/frameset.htm
    It is actually 4293, and you set it in SE30 by changing the Measurement Restrictions, on the Duration/Type tab.
    Regards.

  • SCXI-1328 temperature measurement accuracy

    The catalog says the temperature accuracy of the SCXI-1328 is 0.5 C. If I use three thermocouples, does the 0.5 C uncertainty exist for all results from each thermocouple? Or does only the cold junction have a 0.5 C uncertainty, so that all results have the same offset from the real value? In the latter case, what is the accuracy of the temperature measurement?

    Hello efour,
    0.5 ºC is the accuracy for the Cold Junction Compensation (CJC) for the SCXI 1328 terminal block for temperatures ranging from 15-35 ºC. Each thermocouple measurement would be subject to this factor. That is the only factor from the 1328 that will affect your temperature measurements.
    The overall measurement accuracy of your system would be dependent upon what module you have the 1328 block attached to and what data acquisition (DAQ) card you are using. There is an accuracy calculator available on our website. You can reach it at
    Accuracy Calculator
    http://www.ni.com/advisor/accuracy/
    With this tool, you can enter your DAQ and SCXI card information. It will return the accuracy of each and then the overall system accuracy. It doesn't include the terminal block, though (a short sketch showing one common way to combine these error terms follows this message).
    General information about taking thermocouple measurements can be found at:
    Thermocouple Temperature Measurements
    http://zone.ni.com/devzone/conceptd.nsf/webmain/4C54819521D3503786256D73006E8BE9?opendocument
    Let me know if you have any further questions on this issue.
    Scott Romine,
    National Instruments
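    As a rough illustration of how independent error terms are often combined (the NI accuracy calculator is the authoritative source; the root-sum-square combination and the module/DAQ numbers below are placeholders, not SCXI specifications):

        import math

        # Placeholder error terms in degC; take real values from the accuracy calculator.
        CJC_ACCURACY = 0.5      # SCXI-1328 cold-junction sensor, 15-35 degC ambient
        MODULE_ACCURACY = 0.4   # hypothetical SCXI module contribution
        DAQ_ACCURACY = 0.2      # hypothetical DAQ card contribution

        # Independent error sources are commonly combined root-sum-square.
        total = math.sqrt(CJC_ACCURACY**2 + MODULE_ACCURACY**2 + DAQ_ACCURACY**2)
        print(f"Approximate overall accuracy: +/-{total:.2f} degC per channel")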

  • NI 4350 S1/S2 Switches (TBX-68T)

    I'm using the TBX-68T terminal block with an old NI-4350 USB device for analog input from a pair of thermistors. The TBX-68T has a pair of switches labeled "S1 S2", which are for the current source(s) (the 4350 has a 25uA source and the 4351 has an additional current source).
    My question is, how are these switches wired? (i.e., why do I need to throw both switches ON to get the 25uA current source on the 4350?)
    It works, but I may also have to use the 4351 and I want to know the details of how S1/S2 function. Originally I thought S1 would be for the 25uA current source and S2 would be for the 2nd current source in the 4351, but I had to throw both S1 & S2 to get the 25uA supply.
    I've checked the online NI PDF documents for the 435x and 4351 and I've not been able to find this detail.

    Hello PointOnePa,
    I will look into this to see if I can find out why both switches have to be used to access the current excitation. However, I believe that on the 4351 you will still only be able to access one of the current excitation values. Whichever excitation value you select in software will be available if S1 and S2 are switched on. I do not believe there is a way to access both excitation levels at the same time. I will do a bit of research and see if I can find any evidence to the contrary.
    Are you trying to access both excitation levels at the same time or just trying to figure out how to access the 1mA level externally?  If you just want to get the 1mA level then you just need to select this in software and turn S1 and S2 to on.
    Let me know what you're looking for and I'll let you know what I find.
    Cheers,
    Brooks

  • NI 434X and TBX-68T Data Acquisition

    Hi guys,
    I am using an NI 435X and a TBX-68T to read in voltages, using an excitation current of 1 mA passed through resistors.
    I use the NI 435X VIs as given in the examples for the NI 435X.
    However, there are some strange readings obtained:
    (1) When I used NI 435X Check.vi and wired it to the "number of scans" input of NI 435X Read.vi, I obtained correct voltage readings for the first few readings; thereafter the readings (voltage) become zero until, after a certain time, they return to the correct values and then go back to zero again. Why is this happening? The connections are correct.
    Based on the readings, it seems as if at certain times the circuit is open, which physically should not be the case.
    (2) For one attempt, I did not use NI 435X Check.vi but instead wired a constant value of 5 to the "number of scans" input of NI 435X Read.vi. The data obtained are correct, but the number of data points read is very much lower than if I use 1,
    so I am not sure why.
    I have searched the internet but it is not helping.
    Anyone, please help.
    I have been trying to figure this out for a few days
    but have no answers.
    Thanks

    Hello yongster. Please refer to the thread at http://forums.ni.com/ni/board/message?board.id=170&message.id=222104, which you created first for this question. All of your questions will be addressed there. I understand that due to the holiday and the weekend, your question went unanswered for a couple of days, and I am sorry for that. However, in the future, please refrain from posting the same question in numerous places, as it causes confusion and clutter in the discussion forums.
    Brian F
    Applications Engineer
    National Instruments

  • Which board is good to produce 5v TTL and temperature measurement?

    I want to use LabVIEW to control a valve controller, which connects to 6 solenoid valves. This instrument accepts 0-5 V TTL signals. I need to switch the valves on/off in under 1 second. Which NI board is good for this application, as well as for the temperature measurement I also want?
    Thanks a lot for your suggestion.

    Sheng,
    One of National Instruments' E Series multifunction DAQ devices should provide all of the functionality you require. You may also want to consider using SCC signal conditioning for your temperature measurements. With regard to the digital output, you will want to verify that the current drive provided by an E Series device is high enough to control the valve. If it is not, the NI 6509 digital I/O device provides high current drive. Below, I have included links to the appropriate product pages (and, after them, a short sketch of toggling a TTL line in software):
    E Series Multifunction DAQ
    SCC Signal Conditioning
    NI 6509 Product Page
    Good luck with your application.
    Spencer S.
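    For the TTL side, here is a hedged sketch of toggling one digital line with the Python nidaqmx API; the device and line names are placeholders, and an E Series card's digital lines are software-timed, which is adequate for switching valves at a sub-second rate.

        import time
        import nidaqmx

        VALVE_LINE = "Dev1/port0/line0"   # placeholder digital line wired to one valve

        with nidaqmx.Task() as task:
            task.do_channels.add_do_chan(VALVE_LINE)
            task.write(True)    # drive the line high (5 V TTL) -> valve on
            time.sleep(0.5)     # hold for half a second
            task.write(False)   # drive the line low -> valve off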

  • Thermocouple Measurements on TBX-68T with PCI-4351

    I have set up Traditional NI-DAQ virtual channels for 2 different thermocouples. In a simple program, I wire each channel to AI Sample Channel.vi and get the temperature reading. This works fine. However, when I do this inside a larger program where I am also reading voltage measurements on my PCI-4351 (using 435xFast.vi, which I downloaded), the first thermocouple I read always gives an erroneous reading. Do I have to reset the board in some way before I take the temperature readings?

    I've got just a few questions for you:
    1. Are you seeing a large negative value anywhere else other than when you run your particular vi as a subvi? Test panels in MAX? Other example programs?
    2. Do you have the latest drivers for your application? NI-DMM 2.3.1? NI-DAQ 7.3.1? NI-435x 1.1?
    3. Have you tried resetting the board in MAX and then running your application? If this takes care of that first bad sample, you could try calling Reset Device in the Traditional DAQ function palette before starting your acquisition.
    Please let me know what you can. Have a great Thanksgiving!
    Logan S.
