Calibration & LabVIEW

Hello,
Need to calibrate an instrument that will later be used with LabVIEW.
Does anyone know of a measurement instrument that meets the following requirements?
1) Resolution: 0.5 mm (no coarser than this, but finer is fine)
2) Range: 0-34 cm (longer than this is fine)
3) Must be a non-contact measurement (e.g. ultrasonic, laser, etc.)
Thanks for any help. Please answer quickly, I need this answer yesterday.

Hello,
Thanks for asking about my progress. I got caught up in the next phase of my project and did not update this thread promptly. Apologies.
The calibration of the instrument went very well. The resolution and range requirements were quickly met by buying an instrument online. I was able to write a VI that computes the slope and intercept from the data gathered during calibration.
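For anyone following along, the slope/intercept step is a plain least-squares linear fit. Here is a minimal sketch in Python of the same computation the VI performs (the calibration points below are made up for illustration):

```python
def linear_fit(xs, ys):
    """Least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical calibration data: sensor output (V) vs. known distance (cm)
volts = [0.5, 1.0, 1.5, 2.0, 2.5]
dist_cm = [5.1, 10.0, 14.9, 20.1, 24.9]
slope, intercept = linear_fit(volts, dist_cm)
```

Once slope and intercept are known, any later reading converts as `distance = slope * voltage + intercept`.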
Now I am trying to get that data into a database (MS Access). Because my data is three-dimensional but stored in a flat text file (two-dimensional), I am having problems inserting it into Access using the Database Connectivity Toolset VIs.
If I can solve this problem, it will be smooth sailing. Until then, I will keep working at it.
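One common way around the 2D/3D mismatch is to flatten each 3D point into one row per record before the insert, so the table itself stays two-dimensional. A sketch of that idea in Python, using SQLite as a stand-in for Access (the table and column names are made up):

```python
import sqlite3

# Hypothetical 3D calibration data: (x, y, z) triples
points = [(0.0, 1.2, 3.4), (1.0, 1.5, 3.1), (2.0, 1.9, 2.8)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE calibration (x REAL, y REAL, z REAL)")
# Each 3D point becomes one flat row, so the table stays 2D
con.executemany("INSERT INTO calibration VALUES (?, ?, ?)", points)
con.commit()

rows = con.execute("SELECT x, y, z FROM calibration").fetchall()
```

The same row-per-point layout works through the Database Connectivity Toolset: build one 2D array of rows and insert it in a loop.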
Thanks again for inquiring about the project's progress.

Similar Messages

  • Rtd calibration labview labjack

Hello, my name is Mike and I'm a student in London, Ontario. I'm taking an instrumentation class and was recently given an assignment to complete:
    Objectives:
The instrumentation project consists of taking a sensor and implementing a computer interface. It should incorporate the following:
    · Sensor signal conditioning
    · A microcontroller (HC11/PIC) or data acquisition unit
    · LabVIEW (and/or Visual Basic, Visual C, Visual C#)
    · System calibration and data reduction
    · Documentation
    Description
The students will typically work in groups of 2 (unless discussed in advance with the lab instructor). The project consists of sensor signal conditioning plus microcontroller software and hardware development. The microcontroller (or data acquisition unit) performs the A/D conversion and transmits the raw data to a PC (running LabVIEW, Visual Basic, or Visual C). The data is processed on the PC and displayed in real time as a graph/chart.
    Calibration is an essential part of this instrumentation project. The students have to use proper calibration procedure, similar to the procedures followed in industry.
The user is offered the following choices (through the visual interface):
    a) Time interval between successive measurements (units of 0.01 second)
    b) Total number of measurements to be taken.
A calibration should be carried out over a suitable range that is both functional and easily attainable in the laboratory; e.g. a temperature range of 0-100 (≈90) °C is quite acceptable for temperature sensors.
    Submission
    · Demonstrate the program during any scheduled laboratory time.
· Submit the following both as hardcopy and on a floppy disk:
    · Project description and operation
    · All design calculations related to circuit design.
· A brief user guide (1-2 pages)
    · Commented source files
    · Schematic diagrams of signal conditioning circuit
    · Clearly identify the names of individuals in the group, on the diskette and all associated documentation
    Evaluation Criterion:
    (30%) Documentation
    (20%) Calibration Procedure
    (50%) Overall project
Submission: April 8, 2005.
    Last date for submission (without penalty): April 20, 2005
I have chosen to do an RTD using a LabJack with LabVIEW... If anyone can help me, even with where to start, it would be greatly appreciated.
    thank you
    Mike

That's a pretty open-ended question, but I'll suggest some directions.
    It sounds like your requirement is to demonstrate in the lab, so you
    can probably get away with assuming 1) fixed wire length and 2) fixed
    environmental temperature. If so, the easiest measurement will be a
    two-wire approach. 3 and 4 wire measurements provide better accuracy
    but may not be necessary.
    Some background on RTD measurements:
    http://www.omega.com/temperature/Z/pdf/z054-056.pdf
    Typical use of an RTD is to supply a reference current and measure the
    voltage across the RTD. Keep the current small to avoid self-heating
    in the element. A pulsed current source can reduce effective heating
    while allowing a larger peak current. Use the voltage and currents
    measured to calculate the resistance of the RTD which will be
    approximately linear with temperature over a given range.
    Once you know the resistance, calculate the temperature. Find a table
    for your RTD and pick two values, one each near the upper and lower
    ends of your desired range. Do a linear interpolation between the
    known values to determine your unknown temperature from the resistance.
    Calibration can be accomplished by adjusting the known points chosen
    for the interpolation or by a distinct calibration process, depending
    on how accurate you are trying to get and over what range.
    Hope this is useful.
    Matt
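Matt's Ohm's-law and interpolation steps can be sketched numerically. The 100 Ω at 0 °C and 138.5 Ω at 100 °C values below are the standard nominal figures for a Pt100; check your own RTD's table for the actual points:

```python
def rtd_temperature(r_measured, r_low=100.0, t_low=0.0, r_high=138.5, t_high=100.0):
    """Linear interpolation between two known (resistance, temperature) points."""
    return t_low + (r_measured - r_low) * (t_high - t_low) / (r_high - r_low)

# Ohm's law first: R = V / I from the measured voltage and the reference current
v_measured = 0.1193   # volts (hypothetical reading)
i_ref = 0.001         # 1 mA reference current, kept small to limit self-heating
r = v_measured / i_ref
temp_c = rtd_temperature(r)
```

Calibration then amounts to adjusting the two (resistance, temperature) anchor points until readings match a trusted reference.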

  • Binary to Voltage Conversion of encoder data on cRio 9073 using FPGA

    I am using FPGA with a cRio 9073 to acquire torque and absolute quadrature encoder values. It says in the FPGA instructions that the documentation for the 9073 should include the binary to voltage conversion, but when I looked at the documentation, it wasn't there. Where can I find the conversion value or function to convert binary encoder data back to voltage? The encoder is hooked up to an analog converter and is acquired with a 9215 AI (+-10V differential). Thanks

There are individual formulas for each module or group of modules.
LabVIEW examples path:
LabVIEW 2010\examples\CompactRIO\Basic IO\Analog Raw Host Calibration\AI Raw Host Calibration
LabVIEW help topic:
Converting and Calibrating CompactRIO Analog Input Values (FPGA Interface)
    Best regards
    Christian
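The example Christian points to applies a per-channel linear conversion of the form voltage = raw × LSB weight − offset, with the LSB weight and offset read from the module's calibration properties in nanovolts. A sketch of that arithmetic in Python, with made-up constants (use the values the FPGA interface reports for your own 9215):

```python
def raw_to_volts(raw, lsb_weight_nv, offset_nv):
    """Convert a raw binary reading to volts using the module's LSB weight
    and offset calibration constants (both given in nanovolts)."""
    return (raw * lsb_weight_nv - offset_nv) / 1e9

# Hypothetical constants for illustration only:
# roughly a 20 V span over 2**16 codes, expressed in nV per code
lsb_weight_nv = 305_176
offset_nv = 0
volts = raw_to_volts(16_384, lsb_weight_nv, offset_nv)
```

With these illustrative constants, a raw code of 16384 lands near 5 V; the real constants vary per channel and module.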

I have LabVIEW but I do not have the Calibration and Configuration palette, and I could not download it. How can I download it, or if I cannot, can I work with NI-DAQ Calibration_1200?

I have read in a tutorial for the 1200 board that I can calibrate it with the Calibration and Configuration palette in LabVIEW, but I do not have it and could not download it to access its libraries. I can only download the NI-DAQ software. What is my best choice? And if it is to download the palette with its libraries, how can I download it with them? I'd appreciate your answers.

If you wish to use your 1200 device in LabVIEW, you must download and install NI-DAQ. When you install NI-DAQ, it will ask you if you would like to install support for LabVIEW. By installing this support, you will then have access to the DAQ palette in LabVIEW. The DAQ palette requires that you have NI-DAQ installed.
    For more information on installing and using your device, you can refer to the DAQ Quick Start Guide. You can download it from:
    http://digital.ni.com/manuals.nsf/14807683e3b2dd8f8625677b006643f0/0eca53fe80911b1f862568560068295d
    Regards,
    Erin

Calibration with LabVIEW IMAQ Vision

Hello,
We are first-year students at Polytech' Clermont-Ferrand, Université Blaise Pascal.
We have been using the image processing module of LabVIEW for several months with M. Lafon, in order to locate the coordinates, in centimeters, of the centers of gravity of particles.
For this we use a calibration grid made up of 4 points arranged in a square, spaced 13 cm apart. However, when we run the calibration program (the "Learn Calibration Template" block), the origin of the coordinate system chosen by LabVIEW changes regularly: sometimes LabVIEW picks the top-left point as the origin, and other times it is the top-right point. We have noticed, though, that the orientation of the axes does not change.
We have run several tests to understand the process used to choose the origin, without results; we still do not understand how the software chooses it.
Can you help us answer this question, or tell us how to force it to always choose the top-left point as the origin?
Thanks in advance.
Regards.

Hello,
Thank you for posting on the NI forum. You have already posted your question here, and people have already started answering you. I therefore invite you to post your problem only once in the future.
Regards,
Mathieu B
National Instruments France

  • LabVIEW drivers for I2C card (PCI93LV) from Calibre

Hi, I have a PCI93LV card from Calibre (UK) that interfaces with the I2C bus. I'm trying to control that card from LabVIEW, but I don't have any drivers. Can anyone please advise or suggest any solutions? My operating system is Win98. Thank you in advance.

I developed a set of VIs a couple of years ago to handle the interface with those boards.
I just called the functions that were available in the supplied DLL.
Watch your calling convention and it should be straightforward.
    Sorry, I am not permitted to give away code (gotta feed the family).
    Ben
    Ben Rayner

LabVIEW driver for the ATE-100 pressure calibrator from Ashcroft Inc

    Guys,
    I need a subVI to communicate a pressure calibrator model ATE-100 from
    Ashcroft (http://www.ashcroft.com/products.cfm?doc_id=348) with a
    Labview application. The equipment has been originally shipped with and
    old exe file supposed to do the job, nut I could not manage to make it
    run. So I was searching for a way to exchange data it reads with the
    Labview application I am writing. It seems to be standard RS-232
    protocol.
    Can anyone help or give me some hints?
    Thnaks a lot,
    Pixinguinha

    The NI Instrument Driver Network doesn't have an instrument driver for this unit, so it looks like you're going to have to write your own drivers. You will, of course, need the programming manual from the manufacturer. This isn't all that difficult as there are tons of examples. You can just pull down, say, an instrument driver for a multimeter with a serial interface to see how it's done. To just try out serial communication you can use the "Basic Serial Write and Read" available in the LabVIEW examples. From the LabVIEW menu select Help->Find Examples. Change to the "Search" tab and enter "serial" in the search box. You'll get several examples.
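Writing such a driver mostly comes down to framing command strings and parsing the replies exactly as the programming manual specifies. A sketch of that pattern in Python; the `READ?` command, terminator, and reply format here are hypothetical stand-ins, so substitute whatever the ATE-100 manual actually defines:

```python
def build_command(cmd, terminator="\r\n"):
    """Frame a command string for the serial link (hypothetical protocol)."""
    return (cmd + terminator).encode("ascii")

def parse_pressure_reply(raw):
    """Parse a hypothetical reply like b'12.345 PSI\\r\\n' into a float."""
    text = raw.decode("ascii").strip()
    value, _unit = text.split()
    return float(value)

packet = build_command("READ?")
pressure = parse_pressure_reply(b"12.345 PSI\r\n")
```

In LabVIEW the same two steps sit around VISA Write and VISA Read, which is exactly what the "Basic Serial Write and Read" example demonstrates.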

I need a GPIB driver for a Keithley 263 Calibrator/Source Meter using LabVIEW. Can anybody help? Thank you.

I need a GPIB driver for a Keithley 263 Calibrator/Source Meter using LabVIEW. Can anybody help?
    Thank you in advance.

    Hi,
Keithley's website explains:
"The Model 263 offers a more economical, more precise alternative to the Model 6430 or the Model 236 when used as a low-current source. It is used almost exclusively as a current calibrator."
This leads to the Model 236... A LabVIEW driver is available at: http://search.ni.com/query.html?lk=1&col=alldocs&nh=500&rf=3&ql=&pw=595&qp=%2BContentType%3AInstrumentDriver+%2BIDNetManufacturer%3A%22Keithley%22&qt=&layout=IDNet
    Maybe you have to "reconstruct" some parts - but it will help.
    regards
    wha

Correct Calibrated Image crashes LabVIEW

I'm trying to run a calibration to find the four corners of an LCD and remove any perspective or rotation. I'm pretty sure I've got it all set up correctly. The only thing I want to fix is that Correct Calibrated Image creates some jagged edges when the interpolation is set to Zero Order; but when I set it to Bi-Linear (the only other non-grayed-out option) and then execute the VI, LabVIEW crashes every time.
    Any help would be appreciated.
    Thanks,
    Kevin

Here's the project I'm working from. If you execute test.vi with File Path = test.jpg, Width = 400, Height = 240, and File Path 2 = test2.jpg, it should work and you should end up with test2.jpg's rotation removed. Then if you go into Correct Image.vi under the Display class, change the interpolation from Zero Order to Bi-Linear, and run test.vi, LabVIEW should crash.
    Thanks,
    Kevin
    Attachments:
    vision.zip ‏1424 KB

Writing sensor calibration data to my virtual channels in MAX using LabVIEW

The system consists of a few PXI-6713 cards, a 6527 digital input card, and some SCXI cards (1520, 1104C, 1125).
Every 3 months or so we run a calibration on the test stand that uses these cards, and the sensor ranges and physical ranges listed in MAX have to be manually entered each time.
I am looking for a way to do this automatically. This may be a problem, but we are using NI-DAQ legacy (perhaps, if absolutely necessary, we could go to DAQmx).
I have figured out how to read virtual channel information from MAX correctly, but I have no understanding of how to write calibration information to the virtual channels.
    Any help is greatly appreciated!
    Thank you!
    jacob

    Hi mak90,
There is no way to change the sensor range and physical ranges programmatically in Traditional DAQ. Under DAQmx, you have Scaling VIs, Calibration VIs, and Task VIs that allow you to change various settings in MAX programmatically.
    Please let us know if you have any further questions.
    Best regards,
    Nathan Yang
    Applications Engineer
    National Instruments

Pressure sensor calibration in LabVIEW

    Hi all,
I have a problem with the calibration of a pressure sensor. In the attachments I am sending the datasheet of the sensor and the block diagram I made. I used a linear fit to calibrate. Y-axis range: 0-100 bar; X-axis range: 0-10 VDC.
    Are there more accurate solutions existing?
    Regards,
    Attachments:
    H72300.pdf ‏380 KB
    pressure read 2.vi ‏33 KB

That linear fit makes the VI more complicated than it needs to be. Since 0-10 V equals 0-100 psi, it is simple enough to just multiply the DAQ read by 10 and you're done.
Back to your other thread: now that you've told us the sensor outputs 0-10 V, you have the right DAQ card, but using Pressure (Bridge) in the DAQ Assistant was just wrong. You should have used an analog input. You can set a custom scale there to convert 0-10 V to 0-100 psi.
At the end of the day, your current VI is better because you are using the real DAQmx VIs rather than the DAQ Assistant.
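The custom scale suggested above is just a linear map. The same arithmetic, sketched in Python (assuming the stated 0-10 V input range maps onto a 0-100 output range):

```python
def volts_to_pressure(v, v_min=0.0, v_max=10.0, p_min=0.0, p_max=100.0):
    """Linear scale: map a voltage reading onto the pressure range."""
    return p_min + (v - v_min) * (p_max - p_min) / (v_max - v_min)

# With 0-10 V mapping to 0-100, the scale factor reduces to simply 10
reading = volts_to_pressure(7.25)
```

A DAQmx linear custom scale encodes exactly this slope-and-offset pair, so the conversion happens inside the driver instead of on the diagram.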

Has anybody got a LabVIEW GPIB driver for a Time Electronics 5021 calibrator?

Windows NT and 2000 platforms.
Time Electronics Ltd multifunction calibrator, model no. 5021 or model no. 9821.

    Hi,
    National Instruments' Instrument driver library is at:
    www.ni.com/idnet.
    I did a search and couldn't find an instrument driver for this instrument.
    You can contact the instrument manufacturer to see if they have a LabVIEW instrument driver.
    Other options are contacting an alliance member to develop the instrument driver or you can do the development yourself. Here's a tutorial on writing instrument drivers in LabVIEW:
    http://zone.ni.com/devzone/conceptd.nsf/2d17d611efb58b22862567a9006ffe76/117f9eaedfd2c9e58625680a005acd06?OpenDocument
    Hope this helps.
    DiegoF.

Creating a bridge scale in LabVIEW - no calibration certificate with pressure sensor

    Hello everyone,
I watched a LabVIEW video online on how to measure a pressure transducer and applied the steps to my laboratory work (https://www.youtube.com/watch?v=spTHTfjVlo8), but I couldn't go further when I got to the section on creating a bridge scale for the pressure sensor model I bought from Measurement Specialties (EPX-N02-1B-/Z2/L1M/25M), because the calibration certificate did not come with the device.
    I proceeded to run the test but the pressure value I got was way beyond the capacity of the device (1 bar).
My question is: is there a way to sensibly adjust the values of the physical and electrical units without having the calibration certificate of the device?
    Thanks in advance for your response.
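While waiting for a proper certificate, one hedged fallback is to build a scale from the sensor's nominal datasheet figures: a bridge sensor's output is usually specified as a full-scale sensitivity in mV per volt of excitation. A sketch of that arithmetic in Python; the 2 mV/V sensitivity and 1 bar full scale below are assumptions for illustration, so use your own datasheet's nominal values and treat the result as uncalibrated:

```python
def bridge_to_pressure(v_out, v_excitation, sensitivity_mv_per_v=2.0, full_scale_bar=1.0):
    """Convert a bridge output voltage to pressure using the nominal
    full-scale sensitivity (mV per V of excitation) from the datasheet."""
    mv_per_v = (v_out / v_excitation) * 1000.0   # ratiometric output in mV/V
    return (mv_per_v / sensitivity_mv_per_v) * full_scale_bar

# Hypothetical reading: 5 mV bridge output with 10 V excitation
pressure_bar = bridge_to_pressure(0.005, 10.0)
```

If readings come out far beyond 1 bar, as described above, the electrical range (mV/V) in the scale is likely set orders of magnitude off from the nominal sensitivity.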

    Hi Wapz,
    My name is Mitchell Goon, from National Instruments Technical Support.
    You will be able to download a detailed calibration certificate for your device from the NI website. You can refer to the link below for more information on how to obtain the Calibration Certificate:
    http://digital.ni.com/public.nsf/allkb/7695C93D283BC1A086256B1800631DCF?OpenDocument
    With regards to your query, you will need to have external signal conditioning before acquiring any data from the pressure transducer. These may include:
    Bridge completion
    Excitation
    Remote sensing
    Amplification
    Filtering
    Offset
    Shunt Calibration
Connecting the pressure transducer directly to the DAQ device may yield unexpected results. You may wish to refer to the following links if you need more information on acquiring pressure with a DAQ device:
    http://www.ni.com/white-paper/3816/en/
    http://www.ni.com/white-paper/3639/en/#toc4
Additionally, you may wish to refer to this tutorial on how to set up your DAQ device for pressure measurements:
    http://www.ni.com/tutorial/7138/en/
    I hope that this information will be useful for you. If you require any further assistance, please do not hesitate to contact us and we would be more than happy to assist you.
    Kind Regards,
    Mitchell Goon
    National Instruments
    Applications Engineering
    www.ni.com/support

  • Problem -10401(LabView-calibration)

I have a PCI-MIO-16E-4 DAQ with a CB-68LP. My problem is that I can't calibrate the card because error -10401 appears. To the first person who tells me the correct answer, I promise to send a high-Tc bismuth superconductor from Colorado Superconductors Inc. with thermocouples.
    Thank you
    e-mail: [email protected]

Lucas Foussekis-Evageliou wrote:
> My problem is that I can't calibrate the card because error -10401 appears.
    You didn't mention if you had tried anything in the way of troubleshooting, or
    checked the NI KnowledgeBase. A quick search suggests that perhaps you need to
    wire the device number of your E Series DAQ device into the task ID input,
    assuming you're using the Calibrate VI all by its lonesome.
    This advice probably isn't worth the superconductor, but I appreciate the
    incentive!
    Regards,
    John Lum
    National Instruments

Can you calibrate an SMU with a calibrated SMU? How? Does someone have an example of a LabVIEW program that can assist me in calibration?

    this is for a project.

    Hello,
    I'm sorry, I don't understand your question. What are you referring to by the term "SMU"? Can you give us more information?
    Scott B.
    Applications Engineer
    National Instruments
