DAQ Assist Reading Wrong Voltage

Hello,
I'm using the DAQ Assistant VI to read an analog input voltage from a National Instruments PCI-6221 card. I'm reading the voltage from pin AI0. I supply a voltage directly to this pin from a DC power supply, but the voltage measurement obtained from the DAQ Assistant is incorrect - it seems to be scaled by a factor of about 1/3. For example, if I supply 4 Volts to pin AI0, the DAQ Assist reads 1.43 Volts. I used a multimeter to confirm that the voltage at pin AI0 is in fact 4 Volts, and so I know the problem is with my LabVIEW program and not my power supply.
Here are the steps that lead to my problem:
1. In the block diagram, I insert a DAQ Assistant block.
2. In the Properties of the DAQ Assistant, I select Analog Input->Voltage
3. I select channel ai0
4. I click "test" in order to test the channel
5. The voltage is shown to be 1.43 Volts, even though 4 Volts is being supplied to the pin (this is confirmed with a multimeter).
6. I click OK to finish configuring the DAQ Assistant. I run the program and plot the voltage. The plot also shows 1.43 Volts.
Does anyone have an idea why this may be occurring? I've spent a good 4 hours trying to diagnose this and haven't found anything.
Thanks,
Abed Alnaif

Thanks so much for your help. I used MAX and found that the issue is that I had specified a differential input configuration, but I should have specified RSE (referenced single-ended).
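In case it helps others: the same setting can be made explicitly when the channel is created in code. A minimal sketch with the NI-DAQmx Python API (nidaqmx); the device name Dev1 is an assumption, use the name MAX shows for your board:

```python
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

# Read one on-demand sample from ai0 as referenced single-ended (RSE).
# In differential mode, ai0 on 622x boards is measured against ai8,
# which is why a ground-referenced supply reads the wrong value.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",                                 # assumed device name
        terminal_config=TerminalConfiguration.RSE,  # measure against AI GND
        min_val=-10.0,
        max_val=10.0,
    )
    print(task.read())
```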
However, now I have a different issue: When I apply a voltage to one pin, MAX also shows a voltage on other pins.
For example:
1. I apply a DC voltage of 4V to analog input pin ai0.
2. In MAX->Test Panels..., I select Channel Name = Dev1/ai0 and Input Configuration = RSE
3. I click Start, and the chart shows the correct voltage (4V)
4. I change the Channel Name to Dev1/ai1 and Input Configuration = RSE
5. The chart shows a noisy voltage reading between 1.45 and 1.5 V, even though no voltage is applied to pin ai1
6. When I change the voltage on pin ai0 to 2V, the voltage reading on pin ai1 changes to 0.63V
7. Using my multimeter, I confirm that there is in fact 2V on pin ai0 and 0V on pin ai1
Does anyone know why applying a voltage to pin ai0 causes a voltage reading on pin ai1?

Similar Messages

  • Scaling problems with DAQ Assist

    I'm a new user and am trying to do some simple scaling of my voltage inputs using the DAQ Assistant. For example: one channel inputs around 8 V on a 0 to 10 V input selection. I am trying to scale it (linear) to show me around 28 V by using the y = mx + b formula. My m value is 3.2 and b is 0. What the DAQ Assistant reads is around 16 V instead of 25 V (3.2 * 8). I custom-made several scales, basically multiplying the input by 2, 3, 4, and 5, but none of them outputs what I would expect according to the formula, and even the 5x scale causes the displayed value to decrease. If I go to "no scale" I read 8 V, which is what is actually coming into the 6255 card. Any thoughts?

    Hello DB66,
    Keep in mind that the Signal Input Range should be set post-scale. What do you have defined as your signal input range? Your readings may be getting coerced to the signal input range. With a coefficient value of 3.2, your range should be Max = 32, Min = -32, since your device likely has a +/-10 V range.
    Hope that helps.
    Regards,
    Glenn
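    For reference, the same post-scale idea expressed with the NI-DAQmx Python API rather than the DAQ Assistant dialog (the channel, the scale name and the values are assumptions):

    ```python
    import nidaqmx
    from nidaqmx.constants import VoltageUnits
    from nidaqmx.scale import Scale

    # Hypothetical linear scale: y = 3.2 * x + 0.
    Scale.create_lin_scale("gain_3p2", slope=3.2, y_intercept=0.0)

    with nidaqmx.Task() as task:
        # min_val/max_val are POST-scale values: a +/-10 V physical input
        # becomes +/-32 after the 3.2x scale, so the range must be +/-32.
        task.ai_channels.add_ai_voltage_chan(
            "Dev1/ai0",                     # assumed channel
            min_val=-32.0,
            max_val=32.0,
            units=VoltageUnits.FROM_CUSTOM_SCALE,
            custom_scale_name="gain_3p2",
        )
        print(task.read())                  # ~25.6 for an 8 V input
    ```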

  • DAQ Assistant Tasks

    How do I create a task in the LabVIEW DAQ Assistant for one of our cDAQ modules without actually being connected to the cDAQ module?

    You can simulate a wide range of the instruments supported by DAQmx with Measurement & Automation Explorer.
    Right-click on NI-DAQmx Peripherals, then Create New, then Simulate, and choose from the list of supported devices.
    For cDAQ, create your chassis first, then your module.
    Excuse me, but I don't have an English version of MAX, so I don't have the exact translation of the commands...
    Once your simulated device is configured, you can use it with the LabVIEW DAQ Assistant.

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input. My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.
    I don't know how to edit the task ahead of time because the DAQ Assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section set to run only if the while loop iteration is 0. I take the task from the DAQ Assistant, send it to a Stop Task VI, set the property, and then send the task on to the Start Task VI. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and all subsequent) iterations of the loop, I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V. Can anyone see what is going wrong here?
    BTW, this is continuous acquisition, and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at a LabVIEW API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Attachments:
    Labview_question.PNG ‏14 KB
    Sample_AIV.vi ‏91 KB

    First, if you want to start getting beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs.  There are hundreds of examples in the Example Finder.  You can even right-click on the DAQ Assistant and select Open Front Panel.  That will create a subVI that you can open and see what is going on behind the scenes.  Do it.  I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure how the settings are established or maintained in each section of that subVI).
    The second problem is that you have a bit of a race condition on iteration 0.  Those two DAQ property nodes run at the same time, so when you read AI.Max, it may happen before or after AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.
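    To make that concrete in text form (using the NI-DAQmx Python API as a stand-in for the lower-level DAQmx VIs): create the task once, set the range before starting it, and never let an Express VI re-create the task. The device name and range values are assumptions.

    ```python
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    task = nidaqmx.Task()
    chan = task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed channel

    # Set the range while the task is still unstarted; hardware like the
    # SCXI-1102C does not allow changing it on a running task.
    chan.ai_min = -1.0        # assumed user-selected range
    chan.ai_max = 1.0

    task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    try:
        data = task.read(number_of_samples_per_channel=100)   # repeat in a loop
    finally:
        task.close()
    ```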

  • Using DAQ Assist to input a waveform; need help building a counter to count voltage "spikes"

    Hey all! I'm pretty new to LabVIEW and even newer to this forum, but it's nice to meet you all... I hope that perhaps someone can help me with my problem.
    Allow me to begin by detailing the specifications of the problem.  I am an undergraduate student and have a job doing research in a MEMS (micro/nanotech) lab.  The graduate student I am making this program for is working on biomedical applications; eventually, the program will be connected to a microdevice that has a tiny channel in it, cut through a wee little capacitor, which blood will run through.  As red blood cells pass this capacitor, the voltage will spike, meaning that for each voltage spike, we can (and are trying to) count the number of red blood cells.
    However, I am still in the early development of the program, so the specific info above is not that important.  Basically, I am using a function generator to input a waveform of, say, 500 mV to the DAQ Assistant.  I am trying to write a program that increments a counter every time the voltage goes above, say, 550 mV (peak-to-peak), counting the number of simulated "spikes."  I have tried quite a lot to write a working program, and although I have not gotten it to work yet, I will post a screenshot of my most recent attempt HERE:
    I thank you in advance for any helpful tips, advice, or straight up assistance you may be able to give me.  Please ask me any clarifying questions about the program I wrote or the application, or anything.  Happy Friday! 
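    Not a G solution, but the counting logic itself is small enough to state in Python, which may help when wiring it up later; the threshold and the test signal below are made up:

    ```python
    import numpy as np

    def count_spikes(samples, threshold=0.55):
        """Count rising crossings of `threshold` (in volts).

        A spike is counted once, on the sample where the signal first
        rises above the threshold, not on every sample above it.
        """
        above = samples > threshold
        # rising edge: below threshold on sample i-1, above on sample i
        crossings = np.logical_and(above[1:], np.logical_not(above[:-1]))
        return int(np.count_nonzero(crossings))

    # toy example: a small 5 Hz sine with two artificial 600 mV spikes
    t = np.linspace(0.0, 1.0, 1000)
    signal = 0.25 * np.sin(2 * np.pi * 5 * t)
    signal[200] = signal[700] = 0.6
    print(count_spikes(signal))   # -> 2
    ```

    In a LabVIEW implementation, the "was the previous sample below the threshold?" state is exactly what a shift register would carry between loop iterations.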

    Hey guys, it's been a while!  A lot of stuff has been happening in my life and I have had virtually no time to work on my LabVIEW project.
    I did create a LabVIEW program based on IanW's recommendation.  I am unsure of what exactly is going wrong, but when I run it, only a simple "snapshot" of a waveform from the DAQ shows up in the graph.  Even when I put the DAQ Assistant in a separate while loop, the same thing happens.  I am including a screenshot of the project in case I am messing something entirely different up.  If you happen to read this, I really appreciate your help. Thank you, Ian!
    I am also having a random issue with the Filter Signal VI.  So that background signals in the actual experiment do not read as "spikes", I have been instructed to include a high-pass filter in the VI.  However, every time I use the high-pass filter VI, it botches my signal and turns it into a bunch of noise!  Neither I nor my graduate mentor (who isn't too well-versed in LabVIEW) has any idea why this is; we've tried using different types of filters to no avail.
    Lastly, I would like to ask Peter a few questions I had about LabVIEW design.  In case you're still around, I will write another post later today with more detail.  In the meantime, I will try to find some of the example VIs about shift registers.  All who read this have a great day!
    Attachments:
    count spikes pic.png ‏29 KB

  • DAQ Assistant: Clock Settings (Samples To Read, Rate) can affect signal readings?

    Dear all,
    I'm totally new to LabVIEW, and recently I've become confused by the equipment I'm dealing with.
    Technical details:
    The LabVIEW version is 7.1; the computer operating system is Windows XP;
    The equipment has an NI PCI-6220 and a 68-pin connector block to read signals from the equipment;
    There are 4 channels in the DAQ Assistant (2 pressure readings, 2 temperature readings);
    For the first pressure reading, the Signal Input Range is from 4 mA to 20 mA;
    Clock Settings are Samples To Read = 5, Rate (Hz) = 20.
    Description of the problem:
    I use LabVIEW to monitor and record pressure readings and temperature readings. The LabVIEW configuration was set up by my advisor several years ago. Recently, I found the pressure reading fluctuated a lot; for example, from 5.01 to 5.05 bar within a second. In order to get a stable pressure reading, my advisor suggested that I change the "Clock Settings" in the DAQ Assistant from Samples To Read = 5, Rate (Hz) = 20, to Samples To Read = 250, Rate (Hz) = 1000. She believed that since we increase the number of samples and the sampling rate, we would have more data, and thus more stable pressure readings.
    At first I had a very stable pressure reading. The last digit (0.01) did not change within 20 seconds. However, somehow after a day the pressure reading became unstable and even worse than before (the pressure reading fluctuates from 5.01 to 5.30 within a second).
    This is not the worst case. We found that when we set Clock Settings: Samples To Read = 5, Rate (Hz) = 20, the pressure reading is about 8 bar. However, when we set Clock Settings: Samples To Read = 250, Rate (Hz) = 1000, the pressure reading is about 5 bar. In this case, we don't even know which pressure reading is correct.
    LabVIEW records current and transforms it into a pressure reading. Thus my advisor tried to monitor the current reading in LabVIEW, and she found the current reading changed when she changed the Clock Settings (0.004 Amps (5 bar) with Samples To Read = 5, Rate (Hz) = 20; 0.005 Amps (8 bar) with Samples To Read = 250, Rate (Hz) = 1000).
    Since we only changed the number of samples and the sampling rate, the average readings should still be similar. However, the readings are not similar. That is what confuses me.
    My questions are: can the Clock Settings in the DAQ Assistant affect signal readings? If so, how? What is the effect of "Samples To Read" and "Rate (Hz)"? How do I determine these parameters to get the true pressure readings?
    Thank you very much for your help. I hope to have some feedback from you.
    Best regards,
    Cheng-Yu
    Energy and Mineral Engineering
    the Pennsylvania State University

    A 6220 cannot read a current; it can only read a voltage, so you probably have (or should have) a resistor across the voltage input (normally 50 Ohm for a 0-20 mA signal).
    My first step would be to measure this voltage with a multimeter so you know what the actual voltage should be.
    Then I would read that same voltage with MAX (Measurement & Automation Explorer) to make sure you get the right value.
    Now, about the changing of the voltage/current/pressure: how have you terminated the other signals? Have you provided good earthing?
    If you sample at a high frequency (1 kHz) and perform an FFT on the acquired data, I can imagine a dominant 50 or 60 Hz component (depending on where you live) in the signal that might cause your problem.
    Ton
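    To make the FFT suggestion concrete, here is a minimal NumPy sketch; the sample rate and the synthetic test signal are assumptions standing in for the acquired data:

    ```python
    import numpy as np

    rate = 1000.0                    # assumed sample rate in Hz
    t = np.arange(1000) / rate       # 1 s of data
    # stand-in for the acquired pressure-channel voltage:
    # a DC level plus 50 Hz mains pickup plus a little noise
    data = 1.0 + 0.05 * np.sin(2 * np.pi * 50 * t) \
               + 0.005 * np.random.randn(t.size)

    # FFT of the AC part of the signal
    spectrum = np.abs(np.fft.rfft(data - data.mean()))
    freqs = np.fft.rfftfreq(data.size, d=1.0 / rate)

    # A peak at 50 or 60 Hz points to mains pickup from grounding/termination.
    print("dominant component: %.1f Hz" % freqs[spectrum.argmax()])
    ```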

  • I have a DAQ Assistant configured to read multiple channels at the same time. When I wire a graph indicator to the output, I see all of my signals jumbled together. How do I split them up into separate signals?

    I have a DAQ Assistant configured to read 2 channels at the same time. When I wire a graph indicator to the output, I see the 2 signals jumbled together. How do I split them up into separate signals?
    When I wire any type of indicator, it shows just one output of a single channel.
    I want 2 indicators showing 2 different signals, as expected from the 2 channels configured. How do I do this?
    I have tried using Split Signals, but it ends up showing only 1 output from 1 signal in both indicators.
    thanks in advance.

    Yes, you are right. I tried that, but I did not get the result.
    I just found the way: when you drop Split Signals, you should expand it (the Split Signals icon) from above, not from below. It took me a while to figure this out. 
    thanks 
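    For comparison, the split never has to happen if you read through the DAQmx API directly: a multi-channel read returns one array per channel. A minimal sketch with the NI-DAQmx Python API (device and channel names are assumptions):

    ```python
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed channels
        task.ai_channels.add_ai_voltage_chan("Dev1/ai1")
        task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=100)

        # a 2-channel read returns one list per channel, never interleaved
        ai0_data, ai1_data = task.read(number_of_samples_per_channel=100)
        print(len(ai0_data), len(ai1_data))   # 100 100
    ```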

  • I'm trying to set up a DAQ Assist just to measure some voltage; how do I get the graph to start from 0 (time) every time I press run

    Hi all,
    I am trying to set up a simple DAQ Assistant task to measure some voltages (currently a 9 volt battery to aid setup). When I choose a waveform chart to log the voltages, the chart doesn't start from 0 (time in seconds). How do I do this, and get it to reset every time I press run, or even stop?
    What I want to see at the end is a chart for the full length of the test showing voltage against time in seconds.
    Any ideas, peeps?
    many thanks
    Shane

    Hi Shane,
    Look at this VI
    Here, I clear the chart before running the VI using a 'History Data' property node (I pass an empty array to clear it).
    In effect, each time you run the VI, the chart will begin at 0:00.
    Hope this helps
    Regards
    Dev
    Attachments:
    chart_start.vi ‏20 KB
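    In text-API form, the desired behaviour is just a matter of rebuilding the time axis from the sample rate on every run, so the plot always starts at t = 0. A sketch using nidaqmx with matplotlib (device name, rate and sample count are assumptions):

    ```python
    import matplotlib.pyplot as plt
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE, N = 1000.0, 2000          # assumed: 1 kHz for 2 s

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed channel
        task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=N)
        volts = task.read(number_of_samples_per_channel=N)

    t = [i / RATE for i in range(N)]   # time axis rebuilt each run: starts at 0
    plt.plot(t, volts)
    plt.xlabel("time (s)")
    plt.ylabel("voltage (V)")
    plt.show()
    ```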

  • How to properly read data from one DAQ Assistant and write simultaneously with another DAQ Assistant (which is inside a loop)

    Hello.
    I'm a newbie working on my Master's thesis concerning a project that is based on old G code made by another newbie, so bear with me.
    I need to create a sequence of output controls. For this I'm using a for loop that eventually creates two triangular ramps over a period of 90 seconds. I've confirmed that this function works properly by measuring the actual output of the DAQ device (NI USB-6353).
    The problem is the following: during this control cycle I need to simultaneously collect data from the same DAQ device. At this point there is only one DAQ Assistant output block in the main loop of the program, and all the signals are derived from it to where they are needed. There is a case structure (the bottom case structure in the picture) that contains the functions needed to collect the data during the test cycle. However, these two actions, outputting data and inputting data, are not synchronized in any way, which may be the reason why I get the 200279 error, or alternatively the 200284 error, during the test cycle. I've tried changing the sample rate, buffer size and timeout time as advised, but nothing seems to help.
    What would be the simplest way to solve this problem?
    Help is greatly appreciated!
    Attachments:
    problem.jpg ‏206 KB

    Thanks for the quick reply.
    However, I did try it (see the picture), but I still have a problem: I only get 100 samples per channel during the test sequence (all from the first seconds of the sequence) in total, even though I've set the data-acquiring DAQ Assistant to "continuous" with "samples to read = 95k" and a rate of 1000 Hz.
    Edit:
    Also, I have trouble adding this "extra" DAQ Assistant to the VI, because I get an error about a resource (the 6353) being reserved, even though I connected a false constant to the "STOP" input of the main DAQ Assistant.
    Attachments:
    is_this_what_you_meant.jpg ‏212 KB
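    For what it's worth, here is a rough sketch of the two-task structure being discussed, in NI-DAQmx Python terms: one hardware-timed AO task for the ramp and one continuous AI task, with the AI buffer drained inside the loop (letting it fill up is what raises error -200279). Device names, rates and the ramp shape are assumptions.

    ```python
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 1000.0                                        # assumed sample rate (Hz)
    ramp = np.concatenate([np.linspace(0, 5, 45000),
                           np.linspace(5, 0, 45000)])    # assumed 90 s ramp

    ao = nidaqmx.Task()
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")       # assumed channel
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=ramp.size)
    ao.write(ramp, auto_start=False)

    ai = nidaqmx.Task()
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # assumed channel
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)

    ai.start()
    ao.start()
    try:
        for _ in range(90):
            # keep draining the AI buffer, or DAQmx raises -200279
            chunk = ai.read(number_of_samples_per_channel=int(RATE))
    finally:
        ao.close()
        ai.close()
    ```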

  • When using DAQ assistant to read frequency

    When using DAQ assistant to read frequency and Task timing is set to:
    N Samples, Clock settings to read 26,
    Frequency setup to rising edge,
    1 counter with 10 kHz to 1 kHz range.
    I get back a single number.
    Can I assume this is an average reading of 25 samples with the first sample unused?
    What is the base clock used?
    Is the “26” 26 cycles of the frequency to be measured?

    Hello,
    If you choose to acquire N samples from the DAQ Assistant then it will acquire all of these samples and return them to LabVIEW as an array.  However, when you use the DAQ Assistant it outputs the data in the dynamic data type first.  This data type makes it easy to graph and run the data through other express VIs.  If you were to create a Numeric Indicator from this data type it would just display the last element from the dynamic data array.  To display this data properly in a numeric format convert the dynamic data to an array of doubles by using the Convert From Dynamic Data function in LabVIEW.  Then you can select to convert it to a 1D array of scalars and when you create an indicator off of the output of this function all of the data should be displayed.
    The timebase that is used for lower frequency measurements is the onboard clock, which is internally connected to the Source.  Then you connect your signal to be measured to the Gate of the counter.  Since the frequency of the onboard clock is known it can be used to calculate the frequency of an unknown source based on when the counter is on and off (determined by the Gate). 
    Have a good day,
    Brian P.
    Applications Engineer
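    For reference, the same configuration expressed with the NI-DAQmx Python API (counter name, range and sample count are assumptions); reading all N samples back as an array also sidesteps the dynamic-data single-number display issue:

    ```python
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Edge

    with nidaqmx.Task() as task:
        task.ci_channels.add_ci_freq_chan(
            "Dev1/ctr0",            # assumed counter
            min_val=1000.0,         # expected range: 1 kHz ...
            max_val=10000.0,        # ... to 10 kHz
            edge=Edge.RISING,
        )
        # counters use implicit timing: one sample per period of the input
        task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=26)
        samples = task.read(number_of_samples_per_channel=26)
        print(sum(samples) / len(samples))   # average of the returned readings
    ```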

  • How can I read the active (plugged in) DAQs and then send that to the device name input on DAQ assist?

    I have a System property node for DAQmx, but it does not let me change it to read when I right-click on it. I am trying to have my program detect the name of the DAQ that is plugged into the PC and then send that to the DAQ Assistant so that it will run properly without me manually having to change the device name every time I switch hardware.

    labview12110 wrote:
    I'm just frustrated that the only function I have is to get a list of things that I can't do anything with. MAX knows which is active; can I call it up somehow?
    You have to do programming.  That is what LabVIEW is.  MAX gives you all the tools to do everything you want and much more; just program it to do what you want.
    Attached is a VI that I think does what you want.  It looks at all of your devices and returns the first non-simulated one.  Apparently this list already excludes devices not connected to the system.
    Attachments:
    Find Non Simulated Device.vi ‏6 KB
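    The same device list is reachable from the text APIs as well; a minimal sketch with the NI-DAQmx Python API that mirrors what the attached VI does (returning the first non-simulated device):

    ```python
    import nidaqmx.system

    system = nidaqmx.system.System.local()

    # First device that is real hardware rather than a simulated one.
    name = next((dev.name for dev in system.devices if not dev.is_simulated),
                None)
    print(name)   # e.g. "Dev1" -- feed this to the rest of the program
    ```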

  • How to read multiple voltages from DAQ at same time?

      Hello all,
      I am attempting to read multiple voltage inputs on my USB-6008 DAQ, but when I try to read a second voltage I get an error saying "Error 50103: The resource is reserved." Someone in another thread said:
    "The reserved resource is your DAQ board. You should not be using two separate simultaneous tasks for the same DAQ board. You should use a single task with two channels configured."
    So I made multiple channels in one task, but I don't want to read all the channels at once. Is there a way to specify which channel of the task to read? Or is it not possible to acquire multiple voltages from the DAQ at all?
    So the question boils down to: if you have 3 different voltage inputs into your DAQ, all corresponding to different components, is there a way to be tracking and reading all of them at the same time?
    Thanks a lot

    shahidi124 wrote:
    So the question boils down to: if you have 3 different voltage inputs into your DAQ, all corresponding to different components, is there a way to be tracking and reading all of them at the same time?
    I don't understand your question.  You made a task to read all the channels at once, but you do not want to read all the channels at once?  And you want to track and read all the channels at the same time?
    You need to explain it a little better and add some code showing what you have done so far.  A task will read all the channels in the task and give you the results so you can track them.  So I am confused.
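    To make the single-task advice concrete, here is a minimal sketch with the NI-DAQmx Python API (channel names are assumptions): all three voltages live in one task, each read returns one value per channel, and "which channel" is just an index into the result.

    ```python
    import nidaqmx

    with nidaqmx.Task() as task:
        for ch in ("Dev1/ai0", "Dev1/ai1", "Dev1/ai2"):   # assumed channels
            task.ai_channels.add_ai_voltage_chan(ch)

        # One scan reads all three channels; no second task, so no error 50103.
        ai0, ai1, ai2 = task.read()
        print(ai0, ai1, ai2)
    ```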

  • DAQ Assist crashes LabVIEW...

    Hi,
    I'm running LabVIEW 8.0.1.
    I use the DAQ Assist to read info from some channels.
    I drag the icon onto my block diagram, which automatically opens the wizard. I set up channels and the type of signal (voltage, current, etc.), hit OK, and after ~30 seconds (where LV first says "Verifying task" and then "Building VI, please wait") it creates an Express VI without any problems.
    The problem is whenever I try to edit/change/add settings. Say that later I want to change the voltage range on one of the channels, or add another channel. I go in (double-click on the icon), make the changes, and when I hit OK, LV crashes after about 15 or 20 seconds of displaying the "Building VI, please wait" message.
    Usually I have to re-create the whole thing (drag a second icon and delete the first one) to avoid the problem.
    I've seen this happen even with new VIs that I create from scratch. It happens 95% of the time; other times it works flawlessly.
    What's wrong?
    Attachments:
    DAQ-Assist-error.JPG ‏187 KB

    Hello Emerino,
    Thanks for contacting National Instruments. 
    It looks like there is an issue with the installation of either LabVIEW or your DAQ driver.  The knowledgebase LabVIEW Application Error: The instruction at address x referenced memory at address y. The memory c... goes over possible causes of the error you are seeing.  Most likely a file in your LabVIEW or DAQmx installation has become corrupted and is causing LabVIEW to crash. 
    First, attempt to repair all of your NI software.  This can be done by going to the Add or Remove Programs screen and clicking the change/remove button next to the National Instruments Software selection.  This will bring up a list of all of the installed NI software that is on your computer.  Highlight all of this software and choose to repair.  During this process the installer will ask for the LabVIEW CDs as well as locations of any installers that were downloaded to your hard drive. 
    If you are still having trouble running your DAQ assistant, update to the latest version of DAQmx.  This will have updates to numerous known issues as well as increase the stability of the DAQmx driver. 
    The Investigate Previous Internal Error window you are seeing after a crash is a way for you to report any unexpected errors in LabVIEW that caused it to shut down.  If you choose to report the error, National Instruments can use the information to improve its software products.  This window should only appear when opening LabVIEW after a crash occurs. 
    Please post back if you continue to have issues or if you have any questions. 
    Regards,
    Browning G
    FlexRIO R&D

  • How do I use DAQ Assistant to output DC on an AO?

    My background includes 15 years of programming PLCs. AOs were simple: you passed the AO a value between preset ranges, and it output a proportional voltage signal based on the hardware capabilities.
    I foolishly thought LabVIEW and DAQ AOs would be about the same. I built a VI to generate a DC signal whose amplitude I could control. I also tied it to a chart to monitor how it worked. I managed to draw some very pretty ramping pictures on my chart, but when I put a DMM across the physical terminals I see no voltage.
    Then I used MAX to test the channel and measured it again. Now I see voltage on my DMM. I was attempting to use the DAQ Assistant to do this. Do I need to force it to re-read the changing value I am sending it? All of the examples I find show me more primitive methods, and I could not seem to get any of them to do what I want either. I need to send a speed reference to a DC drive.
    My incorrect method was to repeatedly add a value to ramp the signal up, or subtract to ramp it down. This method works perfectly on an industrial PLC. I connected a chart to the signal just as it entered the DAQ Assistant, but the only way I could check the actual output was with the DMM.
    Are there any examples that use the Assistant? If not, then I guess I will have to figure out how to use the primitives anyway.
    technomage
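    For reference, the text-API equivalent of "pass the AO a value" is a single on-demand write, and updating the level means writing again; this matches the symptom that the Express VI must be re-invoked for every new value. A sketch with the NI-DAQmx Python API (channel and step values are assumptions):

    ```python
    import time
    import nidaqmx

    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan("Dev1/ao0",     # assumed channel
                                             min_val=0.0, max_val=10.0)
        # software-timed ramp: 0 -> 5 V in 0.1 V steps
        for i in range(51):
            task.write(i * 0.1)   # each write updates the physical output
            time.sleep(0.05)
    ```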

    Hello technomage,
    It looks like we have discussed this problem in another thread (found here).  If you have any other questions about it, please post there.
    Regards,
    Jesse O.
    Applications Engineering
    National Instruments

  • DAQmx driver for DAQ 6009? Need DAQ Assistant in the block diagram palette.

    Do I understand things right?  Is the NI USB DAQ 6009 supported in DAQmx Base and not DAQmx, or is this only true for the 6008? I need the DAQ Assistant in my LV 2009 block diagram palette.  Do I go to my NI-DAQmx 8.8 CDs, or do I download something?  Maybe I have DAQmx Base already?  I'm going to search my palette and NI MAX.  I am also going to look at http://www.ni.com/support/daq/version_portable.htm#du.   If I'm going about this all wrong, please help.
    Thanks
    Norm

    Hi Dennis
    My DAQmx CDs are 8.8.  I don't have a selection for the DAQ Assistant on my block diagram palette, so I'm looking to get an update of my DAQmx driver for the USB-6009 DAQ that I'll be using with my student software for LabVIEW 2009 (version 9.0).  I've looked at an NI document, "NI-DAQmx and NI-DAQ Driver Support: Portable Devices", published on Jan 17, 2013.  It says for Windows 7 I should be using DAQmx 9.6, but this document makes no reference to a version of LabVIEW.  The date of the document is 1/17/13; maybe it is referring only to LabVIEW 2013.  Can you help?  
    From what I've read I believe I need to use DAQmx and not DAQmx Base, but I can't help wondering if DAQmx Base has something to do with the "NI DAQmx Device Basics" that I see on the right side of the screen when I'm in NI MAX getting ready to run the test panels.    ????? 
    Thank you for your time
    N.D.
