How to use LabVIEW to measure temperature using a thermistor

I'm trying to create a LabVIEW VI that will give a temperature reading based on a circuit using a thermistor. Any ideas on the VI or how to construct the circuit would be helpful!

Well, you need a circuit that converts the thermistor's resistance into a voltage (a simple voltage divider with a fixed resistor works), and you need a way to read that voltage. Do you have a meter, or a DAQ device?
Temperature Measurements with Thermistors: How-To Guide
As for other circuits, the internet is a good place to search. Have you tried? Here are a couple of places (a rough conversion sketch follows the links):
http://www.discovercircuits.com/T/therm.htm
http://www.facstaff.bucknell.edu/mastascu/elessonsHTML/Sensors/TempR.html
http://www.ecircuitcenter.com/Circuits/therm_ckt1/therm_ckt1.htm
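For reference, here is a minimal sketch in C++ of the arithmetic the VI would implement for a voltage-divider circuit, using the Beta form of the Steinhart-Hart equation. The component values (10 kΩ series resistor, 10 kΩ @ 25 °C thermistor, Beta = 3950 K) and the 5 V excitation are assumptions for illustration; substitute your own parts and wire the same math in LabVIEW once you have a voltage reading from your meter or DAQ device.

    #include <cmath>
    #include <cstdio>

    // Divider: Vsupply -- Rfixed --+-- thermistor -- GND, Vout measured at the junction.
    // Assumed component values; replace them with your own.
    const double VSUPPLY = 5.0;      // excitation voltage [V]
    const double RFIXED  = 10000.0;  // series resistor [ohm]
    const double R0      = 10000.0;  // thermistor resistance at 25 C [ohm]
    const double BETA    = 3950.0;   // thermistor Beta constant [K]
    const double T0      = 298.15;   // 25 C in kelvin

    // Convert the measured divider voltage to temperature in degrees C.
    double thermistorTempC(double vout)
    {
        // Thermistor resistance from the divider equation.
        double rTherm = RFIXED * vout / (VSUPPLY - vout);
        // Beta equation: 1/T = 1/T0 + (1/B) * ln(R/R0)
        double tKelvin = 1.0 / (1.0 / T0 + std::log(rTherm / R0) / BETA);
        return tKelvin - 273.15;
    }

    int main()
    {
        double vout = 2.5;  // e.g. a value read from the meter or DAQ channel
        std::printf("%.2f V -> %.2f C\n", vout, thermistorTempC(vout));
        return 0;
    }

The same two steps (divider voltage to resistance, then resistance to temperature) map directly onto a handful of numeric functions on a LabVIEW block diagram.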

Similar Messages

  • How do I connect a J-type thermocouple to the USB-6363 DAQ and program LabVIEW to measure temperature?

    I have a USB-6363 DAQ and a J-type non-contact thermocouple that I am looking to connect and measure temperature with. However, the DAQ does not have any thermocouple (T/C) inputs, which are needed to measure the thermocouple temperature directly. I am connecting the thermocouple to an analog input (+/-), and I am not looking to buy an amplifier, converter, or any other hardware. I believe there is a way to program LabVIEW to read in the voltages of the thermocouple and convert them into accurate temperature readings. Any help/ideas?

    Hello George,
    This tutorial should step you through the basic process of configuring the device and connecting the thermocouple:
    Tutorial: Connect Thermocouples to a Data Acquisition (DAQ) Device
    http://www.ni.com/gettingstarted/setuphardware/dataacquisition/thermocouples.htm#Connecting a Thermocouple to Your Device
    From there, there are a number of things you can do- I'd recommend taking a look at the LabVIEW shipping examples (Help>>Find Examples...) as well as the DAQmx getting started tutorials:
    Getting Started with NI-DAQmx: Main Page
    http://www.ni.com/white-paper/5434/en
    At first glance, the 6363 you're using should have enough resolution to acquire usable data from a thermocouple. If you attempt to read raw voltages, though, be sure that the acquisition range is configured for +/-0.1 V (a rough voltage-to-temperature sketch follows this reply).
    Regards,
    Tom L.
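    If you do end up reading raw voltages, the VI still has to convert microvolts to temperature, including cold-junction compensation. The sketch below is only a first-order illustration: it assumes a nominal Seebeck coefficient of roughly 52 µV/°C for type J near room temperature and a separately measured cold-junction temperature. A real conversion should use the NIST ITS-90 polynomials, which a DAQmx thermocouple channel applies for you.

    #include <cstdio>

    // First-order type-J approximation (assumed ~52 uV/C near room temperature).
    // Use the NIST ITS-90 polynomials or a DAQmx thermocouple channel for real work.
    const double SEEBECK_J = 52.0e-6;   // volts per degree C, approximate

    double thermocoupleTempC(double measuredVolts, double coldJunctionC)
    {
        // The thermocouple voltage reflects the difference between the hot
        // junction and the cold (terminal) junction temperatures.
        return coldJunctionC + measuredVolts / SEEBECK_J;
    }

    int main()
    {
        double v  = 2.6e-3;   // 2.6 mV read on the +/-0.1 V range
        double cj = 25.0;     // cold-junction temperature from a separate sensor
        std::printf("approx. hot-junction temperature: %.1f C\n", thermocoupleTempC(v, cj));
        return 0;
    }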

  • How can I collect and record data in LabVIEW from Sensirion Temperature/Humidity Sensor connected to myRIO?

    The product documentation of the sensor and the datasheet is attached. Also Digilent MXP Breadboard's image is attached.
    I've connected the sensor to myRIO using the starter kit's MXP breadboard: http://www.digilentinc.com/Products/Detail.cfm?NavPath=2,842,1216&Prod=MXP-BB
    The sensor's output is digital. My initial goal is to get the digital output as 8-, 12-, or 14-bit values, then convert them to decimal numbers and record them.
    Pin 1 of sensor is Data, which is connected to myRIO's connector A, DIO1 pin 13.
    Pin 3 of sensor is Clock, which is connected to myRIO's connector B DIO5/SPI.CLK pin 21. According to the datasheet of the sensor, if VDD < 4.5V then Clock should be set to 1MHz.
    Pin 4 of sensor is Ground, which is connected to Digilent MXP Breadboard GND pin.
    Pin 8 of sensor is VDD, which is connected to Digilent MXP Breadboard +3.3V pin. Supply voltage range is 2.4-5.5V and recommended voltage is 3.3V.
    I need to be able to send the following commands to the sensor from LabVIEW.
    Measure temperature: 00000011
    Measure relative humidity: 00000101
    Then, myRIO has to wait for maximum 20 milliseconds (ms) for 8-bit measurement, 80 ms for 12-bit measurement, and 320 ms for 14-bit measurement.
    Two bytes of measurement data will then be transmitted. The myRIO, acting as the controller, must acknowledge each byte by pulling the DATA line (pin 13) low. All values are sent Most Significant Bit (MSB) first. Note that for an 8-bit result, the first byte is not used. Finally, the myRIO has to terminate the communication after the Least Significant Bit (LSB) of the measurement data has been received, by keeping the ACK bit high. (A pseudocode sketch of this sequence follows the reply below.)
    Attachments:
    Digilent MXP Breadboard.jpg 96 KB
    Sensirion-Temerature-Humidity-Sensor-Documentation.pdf 136 KB
    Temperature-Humidity-Manufacturer-Datasheet.pdf 323 KB

    Have you searched the forum?
    http://forums.ni.com/t5/LabVIEW/Sensirion-sht7x-labview/m-p/350958/highlight/true#M179022
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'
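    For reference, here is a rough C++ pseudocode sketch of the bit-banged sequence described in the question above. The setData/setClock/readData/waitMs helpers are hypothetical stand-ins for the myRIO DIO accesses (in LabVIEW these would be the myRIO Digital I/O nodes or FPGA I/O); verify the transmission-start sequence and the timing against the attached datasheet before relying on it.

    #include <cstdint>

    void setData(bool level)  { /* drive or release the DATA line (DIO1)  */ }
    void setClock(bool level) { /* drive the SCK line (DIO5)              */ }
    bool readData()           { /* sample the DATA line */ return true; }
    void waitMs(int ms)       { /* delay */ }

    // Shift a command byte out, MSB first; the sensor acknowledges on the 9th clock.
    void writeByte(uint8_t value)
    {
        for (int bit = 7; bit >= 0; --bit) {
            setData((value >> bit) & 1);
            setClock(true);
            setClock(false);
        }
        setData(true);            // release DATA so the sensor can pull it low (ACK)
        setClock(true);
        setClock(false);
    }

    // Read a byte MSB first; ack=true pulls DATA low on the 9th clock.
    uint8_t readByte(bool ack)
    {
        uint8_t value = 0;
        for (int bit = 7; bit >= 0; --bit) {
            setClock(true);
            if (readData()) value |= (uint8_t)(1u << bit);
            setClock(false);
        }
        setData(!ack);            // after the last byte: keep DATA high to end the transfer
        setClock(true);
        setClock(false);
        setData(true);            // release DATA
        return value;
    }

    uint16_t measure(uint8_t command)   // 0b00000011 = temperature, 0b00000101 = humidity
    {
        // Transmission start: DATA falls while SCK is high, SCK pulses low, DATA rises.
        setData(true);  setClock(true);
        setData(false); setClock(false);
        setClock(true); setData(true);  setClock(false);

        writeByte(command);
        waitMs(320);                       // worst case for a 14-bit measurement
        uint8_t msb = readByte(true);      // acknowledge the first byte
        uint8_t lsb = readByte(false);     // no ACK after the last byte
        return (uint16_t)((msb << 8) | lsb);
    }

    The raw word returned by measure() still needs the datasheet's conversion formula applied to get degrees Celsius or %RH.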

  • How can I measure temperature using a thermocouple in LabVIEW?

    Please list the procedure to be followed to measure temperature using a thermocouple in LabVIEW.

    Do a search in the LabVIEW Example Finder (Help->Find Examples).  There are a ton of simple Analog Input examples in there using DAQmx.  The one you might be most interested in is "Thermocouple - Continuous Input.vi" (a rough equivalent in the NI-DAQmx C API follows this reply, for reference).
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
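    The shipping example is built from the same DAQmx steps that the NI-DAQmx C API exposes; a rough equivalent in C looks like the sketch below. The device/channel name "Dev1/ai0", the 0-100 °C range, and the built-in cold-junction compensation source are assumptions to adapt to your hardware.

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 data[10];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        // J-type thermocouple channel, 0-100 C, built-in cold-junction compensation.
        // "Dev1/ai0" is an assumed device/channel name.
        DAQmxCreateAIThrmcplChan(task, "Dev1/ai0", "", 0.0, 100.0,
                                 DAQmx_Val_DegC, DAQmx_Val_J_Type_TC,
                                 DAQmx_Val_BuiltIn, 25.0, "");
        DAQmxCfgSampClkTiming(task, "", 10.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 10);
        DAQmxStartTask(task);

        DAQmxReadAnalogF64(task, 10, 10.0, DAQmx_Val_GroupByChannel,
                           data, 10, &read, NULL);
        printf("first sample: %.2f C\n", data[0]);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }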

  • How can I use LabVIEW to acquire temperature from optical fibre sensors

    I am trying to develop an application that would enable a user to acquire temperature data from optical fibre sensors and display the profile on a PC screen, but I have no clue how to go about it.

    To provide some assistance, I will need to know:
    1) What type of output does the device produce? (volts, current, frequency ...)
    2) What type of conditioning does the signal require? (gain, filter, excitation, bias resistors ...)
    3) What type of calibration does the vendor provide? (how the output signal relates to temperature)
    These characteristics have already been determined by the manufacturer (or the engineer who developed the interface electronics).
    You have made no mention of the computer, OS, software, or hardware you will use. I could try to walk you through the selection process; however, it would probably take up too much space to discuss all of the possible combinations.
    Could you provide some additional information about the hardware and software you will use? I will look for a response.

  • How to use the Control lines of a parallel port as input lines to be read using LabVIEW?

    The details are :
    NI Software : LabVIEW
    Version : 5.0
    OS : Windows 95
    NI Hardware : N/A
    Drivers : N/A
    CPU : Pentium
    RAM : 48
    Vendor : darcom
    Customer Information :
    SPEL TESTING
    SPEL, INDIA
    [email protected]
    Ph: (91) 4114 53818
    We do not have any DAQ cards in the PC. We have a parallel port that is EPP and ECP compatible, with address 278h on LPT2, and we are trying to use this port to read 8-bit data from an external circuit. We developed a VI in LabVIEW 5.0 to control the parallel port. We tried using the Data lines (address 278h) to send signals from the PC to the external device through this parallel port, and that works fine. We also tried reading 4-bit data from the external device to the PC through the Status lines (address 279h), and that also works fine.
    But we have not been able to read through the Control lines, whose address is 27Ah. Yet when a line printer (dot-matrix printer) was connected, we were able to print, so the printer was working. This tells us that the Control lines themselves are OK!
    Can you please clarify how to use both the Control lines and the Status lines to read 8-bit data through this parallel port using LabVIEW?

    There are several Knowledge Base entries about this on the NI site, but probably the most detailed document is on the Advanced Measurements (www.advmeas.com) website. Try looking at this page; I think you will find it useful (a register-level sketch follows below).
    http://www.advmeas.com/goodies/parallelport.html
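    One common register-level approach, sketched below, is to read the upper four bits from the Status register (base+1) and the lower four bits from the Control register (base+2), un-inverting the bits that a standard parallel port inverts in hardware (Control bits 0, 1 and 3, and Status bit 7). The _inp/_outp calls shown rely on direct port access, which works on Windows 9x (your Windows 95 setup) but not on NT-based Windows; in LabVIEW you would perform the same reads and bit masks with the port I/O mechanism you are already using for the Data and Status registers.

    #include <conio.h>   // _inp / _outp (MSVC, Windows 9x direct port access)
    #include <stdio.h>

    // Assumed base address from the question: LPT2 at 0x278.
    // Status register  = base+1 (bits 3..7 are inputs, bit 7 hardware-inverted).
    // Control register = base+2 (bits 0..3 usable as inputs when written high;
    //                            bits 0, 1 and 3 are hardware-inverted).
    const unsigned short BASE = 0x278;

    unsigned char readEightBits(void)
    {
        // Release the four open-collector control lines so external logic can drive them:
        // writing 0x04 puts C0..C3 high at the connector (inverted bits are written 0).
        _outp(BASE + 2, 0x04);

        unsigned char status  = (unsigned char)_inp(BASE + 1);
        unsigned char control = (unsigned char)_inp(BASE + 2);

        unsigned char high = (status ^ 0x80) & 0xF0;    // un-invert Busy (bit 7)
        unsigned char low  = (control ^ 0x0B) & 0x0F;   // un-invert C0, C1, C3

        return high | low;   // S7..S4 in the high nibble, C3..C0 in the low nibble
    }

    int main(void)
    {
        printf("value: 0x%02X\n", readEightBits());
        return 0;
    }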

  • Can I measure temperature using 6013 & 68LPR?

    Hello, for hardware I've got a Multifunction DAQ (6013) and a Terminal Block (68LPR). For software I've got VI Logger (1.0.1) and NI-DAQ (6.9.3). Using these, can I measure temperature with K-type thermocouples? (Everything is installed and ready to go.)
    If yes, how can I connect a K-type thermocouple to the terminal block? Are there any instructions on how to use VI Logger to perform a task similar to this one?
    Also, what I want is for the temperature readings (taken every 5 minutes for an hour) to be written to a file. Is it possible to write to a file using VI Logger?
    Thank you!
    Regards Slav.

    Hello:
    I answered this question over here:
    http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HOID=50650000000800000041CA0000&UCATEGORY_0=_30_%24_12_&UCATEGORY_S=0&USEARCHCONTEXT_QUESTION_0=%2B6013+%2Btemperature&USEARCHCONTEXT_QUESTION_S=0
    Sincerely,
    Brooks B
    Applications Engineer
    National Instruments

  • Measuring temperature using thermocouple and SCB-68 box

    I used a thermocouple and an SCB-68 box to measure temperature, but the readings fluctuated around the expected value. So I removed the thermocouple and short-circuited the differential input channels. As expected, the voltage reading was not zero and still fluctuated. It seems that there is something wrong with the SCB-68 box. What might cause this, and how can I solve the problem? Thanks.

    The problem could be either the SCB-68 or the DAQ card you are using to make the measurement. You should first ensure that the DIP switches (S1 - S5) of your SCB-68 are in the correct position for your application. Please refer to the SCB-68 User Manual for more information on this subject.
    SCB-68 68-Pin Shielded Connector Block User Manual
    http://digital.ni.com/manuals.nsf/websearch/74C86ADEF0E4813F86256C84007CB3AB?OpenDocument&node=1180_US
    What do you mean by fluctuating? Full swings from -10 V to +10 V? Hitting one rail? Close to 0 V but looking like noise in mV or uV? If your fluctuation is very slight, for example off by a few millivolts, then you could be picking up noise or your board could need to be calibrated. You can read more about accuracy and calibration from the following web page:
    Calibration Solutions from National Instruments
    http://www.ni.com/support/calibrat/
    If you have an E-Series Card you may be interested in running our Online E-Series Diagnostic Utility. This utility takes about 3 minutes to run, will perform a calibration, and will generate a report of anything that may be wrong with your board.
    Online E Series Diagnostic Utility
    http://www.ni.com/support/selftest/
    Regards,
    Justin Britten
    Applications Engineer
    National Instruments

  • How to use dll in Labview ?

    I compiled this code to a DLL with VC++ 2010; the file name is test_dll.dll.
    #include "stdafx.h"
    #include <iostream>
    #include <Windows.h>
    using namespace std;
    int main(int a){
        cout << "Test dll...............\n";
        return a;
    }
    After that, I put a Call Library Function Node on the diagram and double-clicked it. I browsed to test_dll.dll in "Library name or path" and set the function prototype to int32_t main(int32_t a); but it shows the error "Call Library Function Node 'test_dll.dll:main': function not found in library". How do I use a DLL in LabVIEW? And one more question: what is the difference between Tools -> Import -> Shared Library (.dll) and using the Call Library Function Node?

    The issue you are having is that LabVIEW cannot use C++ DLLs directly; it only handles C-style exports. That does not mean you cannot use a DLL built with the C++ compiler rather than the C compiler; it means that you must take extra steps in order to use it from LabVIEW. The primary issue is that of name mangling, or adornment. This is discussed here: http://zone.ni.com/devzone/cda/tut/p/id/4877. Basically, you need to declare your prototypes extern "C" in your header files. I would also suggest reviewing this article: https://decibel.ni.com/content/docs/DOC-14564. A corrected sketch of the posted code follows.
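    As a minimal sketch (assuming you rename the routine, since main is reserved for executables, and build the project as a DLL rather than a console application), the posted code might become the following; the exported name test_func is an arbitrary choice, and a 32-bit LabVIEW needs a 32-bit build of the DLL.

    #include <iostream>

    // extern "C" disables C++ name mangling; __declspec(dllexport) puts the function
    // in the DLL's export table so the Call Library Function Node can find it by name.
    extern "C" __declspec(dllexport) int test_func(int a)
    {
        std::cout << "Test dll...............\n";
        return a;
    }

    In the Call Library Function Node you would then configure the prototype as int32_t test_func(int32_t a) with the C calling convention.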

  • Req any examples of how to use a USB MIDI controller/keyboard with LabVIEW (TIA)

    Requesting any examples of how to use a USB MIDI controller/keyboard with LabVIEW. Thanks in advance.

    Hi,
    To access the MIDI ports you will need to call the Windows SDK. Sending MIDI commands is relatively easy; here is an example that shows you how to send data to a MIDI controller or keyboard.
    As far as input goes, this is the hard part. There is a series of functions that you need to call to open the device, set up some buffers, and possibly register a callback to get notifications of the incoming data.
    Reading MIDI data will not be an easy task; your best bet will be to implement this in a DLL and call that DLL from LabVIEW. There should be some code available on the web (a minimal sketch using the winmm API follows this reply).
    Here is a link to the Windows multimedia functions that you could use for MIDI input: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/multimed/htm/_win32_multimedia_...
    Let me know if you have any further questions.
    Regards,
    Juan Carlos
    N.I.
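    For reference, here is a minimal sketch of the winmm calls involved; the device index 0 and the note values are arbitrary assumptions. Output calls like midiOutOpen/midiOutShortMsg can also be made directly through the Call Library Function Node, while the input side with its callback is where wrapping the code in a DLL pays off.

    #include <windows.h>
    #include <mmsystem.h>
    #pragma comment(lib, "winmm.lib")

    // Called by Windows for each incoming short MIDI message.
    void CALLBACK MidiInProc(HMIDIIN hIn, UINT msg, DWORD_PTR instance,
                             DWORD_PTR param1, DWORD_PTR param2)
    {
        if (msg == MIM_DATA) {
            // param1 packs a short MIDI message: status byte, then two data bytes.
            unsigned char status = (unsigned char)(param1 & 0xFF);
            unsigned char data1  = (unsigned char)((param1 >> 8) & 0xFF);
            unsigned char data2  = (unsigned char)((param1 >> 16) & 0xFF);
            (void)status; (void)data1; (void)data2;  // hand these off to LabVIEW here
        }
    }

    int main()
    {
        // Output: open the first MIDI output device and send Note On / Note Off.
        HMIDIOUT hOut = NULL;
        if (midiOutOpen(&hOut, 0, 0, 0, CALLBACK_NULL) == MMSYSERR_NOERROR) {
            midiOutShortMsg(hOut, 0x00643C90);  // Note On, channel 1, middle C, velocity 100
            Sleep(500);
            midiOutShortMsg(hOut, 0x00003C80);  // Note Off
            midiOutClose(hOut);
        }

        // Input: open the first MIDI input device with a callback and listen briefly.
        HMIDIIN hIn = NULL;
        if (midiInOpen(&hIn, 0, (DWORD_PTR)MidiInProc, 0, CALLBACK_FUNCTION) == MMSYSERR_NOERROR) {
            midiInStart(hIn);
            Sleep(5000);                        // collect messages for five seconds
            midiInStop(hIn);
            midiInClose(hIn);
        }
        return 0;
    }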

  • I need to program a Hittite Fractional-N Synthesizer Evaluation Kit with a HMC702LP6CE and an external YIG oscillator. Not sure how to use LabVIEW to control the PLL.

    I need to program a Hittite Fractional-N Synthesizer Evaluation Kit with a HMC702LP6CE and an external YIG oscillator. Not sure how to use LabVIEW to control the PLL.

    Here is an introduction to using a PLL, though I don't know how to interact with that particular device:
    http://zone.ni.com/devzone/cda/tut/p/id/3781
    And for those who don't know what a PLL is, a free bonus link is here:
    http://digital.ni.com/public.nsf/allkb/07BC8D77D4E9AE258625708B007CE74F?OpenDocument
    and a second one on what that device is: http://www.hittite.com/products/view.html/view/HMC702LP6CE
    Now we are all up to speed.
    Sam S
    Applications Engineer
    National Instruments

  • How to use third-party DLLs in LabVIEW

    Hi all,
    I am using a maxon EPOS 2 (Easy Positioning System) to control an EC motor. It comes with a 32-bit Windows DLL for LabVIEW. How do I use these library functions in LabVIEW properly? When I try to call those library functions in a LabVIEW program, it shows a warning affecting one caller, and so the subVI is not executable. Do I have to change the location of the library files or make any other changes?

    Have you seen the KB: Program a Maxon EPOS Controller using LabVIEW?
    Christian

  • Need Expert's Advice - How to use LabVIEW Efficiently and Increase Readability

    My application is fairly complex. It is a real-world testing application that simultaneously controls 16 servo motors running various stress-testing routines, asynchronously and all at the same time. The application includes queues, state machines, subVIs, dynamically launched VIs, subpanels, semaphores, XML files, INI files, global variables, shared variables, physical analog and digital interfaces, and industrial networking. Just about every technique and trick that LabVIEW 2010 has to offer, and the kitchen sink as well.
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires. Many of my state machines have a dozen or more wires just going from input to output, doing nothing, simply because one or two states in the machine need that variable. Yes, I could spend a lot of time bundling, unbundling, and rebundling those values, but I don't think that would improve things much.
    We have had a long discussion about the use or misuse of local variables in this forum, and I don't want to repeat that here. I use them sparingly, where I think it is relatively safe to do so. I also have a bug whenever I try to copy some code that contains one or more local variables: on pasting the code, the result is something other than what I expected (I am not sure what exactly), and I have to undo the paste and rebuild the code one object at a time.
    I am also having trouble using Variable Property Nodes. When I cut and paste them, they often lose their reference object and I have to go back into the code and redo the Link To on each one. That wastes a lot of time and effort.
    Creating subVIs is often not appropriate when the code makes many references to objects on the front panel. Some simple code will turn into a bunch of object references and dereferences, which also tends to take a lot of work to clean up and often does not help overall readability. I use subVIs when appropriate, but because of the interface overhead, not as often as I would like. My application already has over 150 subVIs.
    The LabVIEW Clean Up Diagram function often works poorly. It leaves way too much empty space between objects, making my diagrams three to four 24" screens wide. That is far too much and difficult to navigate effectively. The Clean Up function puts objects in strange places relative to other objects used nearby. It does a poor job of routing wires and often makes deciphering diagrams more difficult rather than easier.
    My troubleshooting strategies don't work well for large diagrams and complex applications. The diagrams are so complex that execution highlighting may take 20 minutes for a single pass. Probes help, but breakpoints aren't of much use, because single-stepping afterwards often takes you somewhere else in the same diagram. I can't follow the logic well doing this.
    Using structures, I may have Case structures nested 5 to 10 levels deep inside some Event Structure inside a While Loop. Difficult to work with and not very readable.
    All in all, I can make it work, but I am not happy about the end result.
    I am hoping to benefit from some expert advice from those that are experienced in producing large complex applications efficiently, debugging efficiently and producing readable diagrams that they are proud of.
    Can anyone offer advice on how best to use the LabVIEW features to achieve these results in complex applications? I hope that you can help show me the light.

    I'm not an expert but I'm charged out as one at work.
    I am off today, so I'll share some thoughts that may help or possibly inspire others to chime in. I have tried to continually improve my code in those areas and would greatly welcome others sharing their approaches and insights.
    Note:
    I do refactoring services to help customers in this situation. What I will write does not represent what we do in a code review, since our final deliverable is a complete final design, and that is beyond the scope of this reply.
    I'll comment on your points.
    dbaechtel wrote:
    My application is fairly complex. ...
    While watching slow-motion replays of Olympic figure skating, I learned how subtleties in the way the launching skate is planted when entering a jump can make the difference between a good jump and a bad one.
    In software, we plant our foot when we turn from design to development. I have to admit that there were a couple of times when I moved from design to development too early and found myself in a situation like the one you have described.
    How to know when design is done?
    Waterfall says "cross every 't' and dot every 'i'", while Agile says "code now, worry about design later", and Bottom-up says "the demo is working, why bother designing?" (Please feel free to comment on these over-simplifications, gang.)
    My answer is not much more helpful for those new to LabVIEW. 
    My design work is done when my design diagrams are more complicated than the LabVIEW diagram they describe.
    dbaechtel wrote:
     simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes ...and the kitchen sink as well.
    Have you posted any design documents you have? These will help greatly in letting us understand your application. More on diagrams later.
    Anytime I see multiple "variations on a theme", I think LVOOP (LabVIEW OOP). I'll spare you the LVOOP sales pitch but will testify that once you get your first class cloned off and running as a sibling (or child), you'll appreciate how nice it is to be able to use LVOOP.
    Disclaimer:
    If you don't already have an OOP frame of mind, the learning curve will be steep.
    dbaechtel wrote:
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires....going from input to output, doing nothing,... spend a lot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
     Full disclaimer:
    I used to be of the same opinion and even used performance arguments to make my point. I have since changed my mind.
    Let me illustrate (hopefully). This link (if it works for you, use the left-hand pane to navigate the hierarchy) shows an app I wrote about 10 years ago, in my early days of routing wires. Even the "Main" VI started to suffer from too many wires, as the preview from that link shows.
    Clustering related data values using Type Definitions is the first method I would urge. This makes it easier to find the VIs that use the Type Def via Browse Relationships>>Callers. If I implement my code correctly, any problem I believe is associated with a particular piece of data that is a Type Def has to be in one of the VIs that use that Type Def, and is therefore easier to maintain.
    When I wrote "related data" I am referring to data normalization rules (which my wife knows and I picked up from her; I claim no expertise in this area), where only values that are used together are grouped. E.g., a cluster named File contains "Path" and "Refnum" but not "PhaseOfMoon". This works out nicely when first creating sub-VIs, since all of the data related to file operations is right there when I need it, and it leads into the next concept ...
    When I look at a value in a shift register on the diagram taking up space but used in only a small subset of states, I consider using an Action Engine (a loose text-language analogy of an AE appears at the end of this reply). This moves the wire from the current diagram into the Action Engine (AE) and cleans up the diagram. The AE brings with it built-in protection, so provided I keep all of the operations related to the Type Def inside the AE, I am protected when I start using multiple threads that need that data (trust me, it may not make a difference now, but end users are clever). So that extra wire is effectively encapsulated and abstracted away from the diagram you are looking at.
    But I said earlier that I would not sell LVOOP so I'll show you what LVOOP based LV apps look like to contrast what I was doing ten years ago in that earlier link. This is what the top level VI looks like.
     And this is the Analysis mode of that app.
    I suppose I should not mention that LVOOP has wizards that automatically create the sub-VIs (accessors) that bundle/unbundle the clusters, should I?
    Continuing...
    dbaechtel wrote:
    We have had a long discussion about the use or misuse of Local variables...I also have a bug whenever I try and copy some code...
    If you can simplify the code and duplicate the bug, please do so. We can get it logged and fixed.
    dbaechtel wrote:
    I am also having trouble using Variable Property Nodes....
    That sounds like a usage issue. Posting code to illustrate the process will let us take a shot at figuring out what is happening. 
     dbaechtel wrote:
    Creating subVIs is often not appropriate... My application already has over 150 sub VIs.
    "Back in the day..." LV would not even try to create a sub-VI that involved controls/indicators. I use sub-VIs to maintain a common GUI often but I do it on purpose and when I find myself creating a sub-VI that involves a control/indicator, I hit ctrl-z immediately! 
    I figure out a way around it (an AE?) and then do the sub-VI.
    Judging by your brief explanation, and assuming you do an LVOOP implementation, I would estimate that the app needs 750-1500 VIs. 
     dbaechtel wrote:
    The LabView Clean Up Diagram function often works poorly.... 
    The clean-up works fine for how I use it. After throwing together "scratch code" and debugging the "rats' nest", I'll hit clean-up as a first step. It guesses well enough on simple diagrams and in some cases inspires me to structure the diagram in a different way that I may not have thought about. If I don't like it, ctrl-z.
    Good design and modular implementation lead to smaller diagrams that just don't need three screens.
     dbaechtel wrote:
    My troubleshooting strategies don't work well for large diagrams and complex applications....Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.
    Smaller diagrams single-step faster, since the sub-VIs run at full speed. I cringe thinking about a three-screen diagram with multiple probes open (shiver!).
    Re: Nested structures
    Sub-VIs (wink, wink, nudge, nudge)
    If it works, you have proven the concept is possible. That is the first step in an application.
    I hope that gives you some ideas.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction
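    For readers coming from text-based languages, an Action Engine is loosely analogous to a function that owns static state and dispatches on a command enum, as in the invented counter sketch below; the real thing also serializes access because the VI is non-reentrant, which in C++ would additionally call for a mutex.

    #include <cstdio>

    // Loose text-language analogy of a LabVIEW Action Engine (functional global):
    // the static variable plays the role of the uninitialized shift register and
    // the enum plays the role of the action selector.
    enum class Action { Initialize, Increment, Read };

    int counterEngine(Action action, int value = 0)
    {
        static int state = 0;          // "shift register" - persists between calls
        switch (action) {
            case Action::Initialize: state = value;  break;
            case Action::Increment:  state += value; break;
            case Action::Read:       break;          // just return the current state
        }
        return state;
    }

    int main()
    {
        counterEngine(Action::Initialize, 0);
        counterEngine(Action::Increment, 5);
        std::printf("count = %d\n", counterEngine(Action::Read));
        return 0;
    }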

  • How to use crystal reports in LabVIEW

    Hi, can anybody tell me how to use Crystal Reports in LabVIEW? Any material you can point me to would be very helpful.
    Regards,
    Rajashekar

    ActiveX

  • How to use LabVIEW with the Handy Board

    Hi,
    How can I use LabVIEW with the Handy Board?
    Thx...

    I'm assuming you're talking about this, since you didn't provide a link for those of us who don't know what you're talking about.
    As the other poster said, you didn't say how you want to use LabVIEW with it.  If you want to write LabVIEW programs that run on its microprocessor, then you're out of luck.  If you want LabVIEW to interact with it, then you've got a couple of options, SPI probably being the best, but it also has DI and AI that you could use to communicate with it; the DIs could be used as a parallel interface.
    Message Edited by Matthew Kelton on 12-17-2007 02:21 PM
