Simple DAQ assistant problem

This is probably a simple fix that I'm screwing up, but I can't get changes that I make in the DAQ Assistant to apply. Primarily I'm changing the channels that the assistant reads: I make the change and exit the assistant, but when I open the assistant again the only channel listed is ai0. Is there something I can do to fix this?
Chris O.

Also, when I run the program I receive Error -50103, which is not in the error guide. I have attached my program to this message; any help would be greatly appreciated, as I am still trying to figure LabVIEW out.  The program is used to measure a voltage from four wave gages and create an 11-point calibration curve for each gage.  The user should be able to select a position, define the water depth, and click Measure to measure the voltage.  Once 11 measurements have been taken and the user is satisfied with the results, the user will click a "calibration complete" button, which will fit linear curves to the data and export the slope and intercept of the lines (a sketch of that fitting step follows the attachment below).  The secondary calibration listed assumes that the slope has already been measured, and calculates the intercept based on one iteration of data.  I am using a USB-6008 right now to figure the program out, and will eventually change to a higher-quality DAQ board once the programs have been written successfully.
Attachments:
Linear Wave Gage Calibration.vi ‏917 KB
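
The fitting step itself is ordinary linear regression. A minimal Python sketch (assuming numpy; the depth and voltage arrays are hypothetical stand-ins for the 11 measured points per gage):

```python
import numpy as np

# Eleven (water depth, measured voltage) pairs for one gage -- stand-in data.
depths = np.linspace(0.0, 1.0, 11)       # depths defined by the user
volts = 2.0 * depths + 0.1               # what Measure would have recorded

# "Calibration complete": fit depth as a linear function of voltage
# and export the coefficients.
slope, intercept = np.polyfit(volts, depths, 1)
print(slope, intercept)                  # -> 0.5, -0.05

# Secondary calibration: slope already known, one (depth, voltage) pair.
known_slope = 0.5
intercept_only = depths[3] - known_slope * volts[3]
```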

Similar Messages

  • I'm trying to set up a DAQ assist just to measure some voltage, how do I get the graph to start from 0 (time) every time I press run

    Hi all,
    I am trying to set up a simple DAQ Assistant task to measure some voltages (currently a 9-volt battery to aid setup). When using a waveform chart to log the voltages, the graph doesn't start from 0 (time in seconds). How do I do this and get it to reset every time I press Run, or even Stop?
    What I want to see at the end is a chart for the full length of the test showing voltage against time in seconds.
    Any ideas peeps
    many thanks
    Shane

    Hi Shane,
    Look at this VI.
    Here, I clear the chart before running the VI using a 'History Data' property node (I pass an empty array to clear it).
    In effect, each time you run the VI, the chart will begin at 0:00.
    Hope this helps
    Regards
    Dev
    Attachments:
    chart_start.vi ‏20 KB

  • Scaling problems with DAQ Assist

    I'm a new user and am trying to do some simple scaling of my voltage inputs using the DAQ Assistant. For example: one channel inputs around 8V on a 0 to 10V input selection. I am trying to scale it (linear) to show me around 28V by using the y=mx+b formula. My m value is 3.2 and b is 0. What the DAQ Assistant reads is around 16V instead of 25V (3.2 × 8). I custom-made several scales, basically multiplying the input by 2, 3, 4, and 5, but none of them outputs what I would expect from the formula, and even the 5× scale causes the displayed value to decrease. If I go to "no scale" I read 8V, which is what is actually coming into the 6255 card. Any thoughts?

    Hello DB66,
    Keep in mind that the Signal Input Range should be set post-scale. What do you have defined as your signal input range? Your readings may be getting coerced to the signal input range. With a coefficient of 3.2, your range should be Max = 32, Min = -32, since your device likely has a ±10V range.
    Hope that helps.
    Regards,
    Glenn
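
    To make the arithmetic concrete, a small Python sketch of the post-scale range rule Glenn describes (the ±10V device range is an assumption taken from the thread):

    ```python
    def post_scale_range(m, b, device_min=-10.0, device_max=10.0):
        """Signal input range to enter in the DAQ Assistant when a custom
        linear scale y = m*x + b is applied: the range must be given in
        *scaled* units, i.e. the device's physical range pushed through
        the scale."""
        return m * device_min + b, m * device_max + b

    print(post_scale_range(3.2, 0.0))   # -> (-32.0, 32.0): use Min=-32, Max=32
    print(3.2 * 8.0)                    # expected scaled reading: 25.6 V
    ```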

  • How do I use DAQ Assistant to output DC on an AO?

    My background includes 15 years of programming PLCs. AOs were simple: you passed the AO a value between preset ranges and it output a proportional voltage signal based on the hardware capabilities.
    I foolishly thought LabVIEW and DAQ AOs would be about the same. I built a VI to generate a DC signal whose amplitude I could control. I also tied it to a chart to monitor how it worked. I managed to draw some very pretty ramping pictures on my chart, but when I put a DMM across the physical terminals I see no voltage.
    Then I used MAX to test the channel and measured it again. Now I see voltage on my DMM. I was attempting to use the DAQ Assistant to do this. Do I need to force it to re-read the changing value I am sending it? All of the examples I find show me more primitive methods, and I could not seem to get any of them to do what I want either. I need to send a speed reference to a DC drive.
    My incorrect method was to recursively add a value to ramp the signal up, or subtract to ramp it down. This method works perfectly on an industrial PLC. I connected a chart to the signal just as it entered the DAQ Assistant, but the only way I could check the actual output was with the DMM.
    Are there any examples that use the Assistant? If not, then I guess I will have to figure out how to use the primitives anyway.
    technomage

    Hello technomage,
    It looks like we have discussed this problem in another thread (found here).  If you have any other questions about it, please post there.
    Regards,
    Jesse O.
    Applications Engineering
    National Instruments
    Jesse O. | National Instruments R&D
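
    The linked thread holds the actual resolution; for readers who end up on the lower-level driver instead of the Assistant, here is a minimal software-timed DC output sketch using the nidaqmx Python API (the device name Dev1 and the 0-5V range are assumptions):

    ```python
    import nidaqmx

    # Software-timed (on-demand) analog output: each write updates the
    # physical terminal immediately, much like writing a PLC AO register.
    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan("Dev1/ao0", min_val=0.0, max_val=5.0)
        task.write(2.5)   # terminal sits at 2.5 V DC until the next write
        task.write(3.0)   # ramping = issuing successive writes like this
    ```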

  • Using DAQ-assist to input a waveform; need help building a counter to count voltage "spikes"

    Hey all! I'm pretty new to LabVIEW and even newer to this forum, but it's nice to meet you all... I hope that perhaps someone can help me with my problem.
    Allow me to begin by detailing the specifications of the problem.  I am an undergraduate student, and I have a job doing research in a MEMS (micro/nanotech) lab.  The graduate student I am making this program for is working on biomedical applications; eventually, the program will be connected to a microdevice that has a tiny channel in it, cut through a wee little capacitor, which blood will run through.  As red blood cells pass this capacitor, the voltage will spike, meaning that by counting voltage spikes we can (and are trying to) count red blood cells.
    However, I am still in the early development of the program, so the specific info above is not that important.  Basically, I am using a function generator to input a waveform of, say, 500 mV to the DAQ Assistant.  I am trying to write a program that increments a counter every time I turn the voltage above, say, 550 mV (peak-to-peak), counting the number of simulated "spikes."  I have tried quite a lot to write a working program, and although I have not gotten it to work yet, I will post a screenshot of my most recent attempt HERE:
    I thank you in advance for any helpful tips, advice, or straight-up assistance you may be able to give me.  Please ask me any clarifying questions about the program I wrote, or the application, or anything.  Happy Friday! 
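
    For reference, the counting logic itself reduces to detecting rising threshold crossings. A minimal Python sketch (numpy assumed; the sample values are made up):

    ```python
    import numpy as np

    def count_spikes(samples, threshold):
        """Count rising-edge crossings of `threshold` in a 1-D signal.
        Each spike is counted once, on the sample where the signal
        first rises above the threshold."""
        above = np.asarray(samples) > threshold
        # A spike starts wherever the signal goes from below to above.
        rising = np.logical_and(above[1:], np.logical_not(above[:-1]))
        return int(np.count_nonzero(rising))

    # e.g. a 500 mV baseline with two excursions past 550 mV:
    print(count_spikes([0.50, 0.56, 0.58, 0.51, 0.49, 0.57, 0.50], 0.55))  # -> 2
    ```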

    Hey guys, it's been a while!  A lot of stuff has been happening in my life and I have had virtually no time to work on my LabVIEW project.  
    I did create a LabVIEW program based on IanW's recommendation.  I am unsure of what exactly is going wrong, but when I run it, only a single "snapshot" of a waveform from the DAQ shows up in the graph.  Even when I put the DAQ Assistant in a separate while loop, the same thing happens.  I am including a screenshot of the project in case I am messing something entirely different up.  If you happen to read this, I really appreciate your help, and thank you Ian! 
    I am also having a random issue with the filter signal VI.  So that background signals in the actual experiment do not read as "spikes," I have been instructed to include a high-pass filter in the VI.  However, every time I use the high-pass filter VI, it botches my signal and turns it into a bunch of noise!  Neither I nor my graduate mentor (who isn't too well-versed in LabVIEW) has any idea why this is; we've tried using different types of filters to no avail.  
    Lastly, I would like to talk to Peter about a few questions I had about LabVIEW design.  In case you're still around, I will write another post later today with more detail.  In the meantime, I will try to find some of the example VIs about shift registers.  All who read this have a great day!
    Attachments:
    count spikes pic.png ‏29 KB

  • DAQ Assistant in subvi not updating output to DAQ board with each call...

    Hi All,
    I am calling a simple subVI that creates a user-defined number of pulses with Square Waveform.vi.  This square wave (with the given total number of pulses) is then used as an input to a DAQ Assistant controlling an analog output signal on an NI USB-6259 DAQ board.  I am using LabVIEW 8.5 right now.
    However, each time I call this subVI from my main program, the output I measure from the DAQ board is identical to whatever I set in the first call (i.e., if I created two pulses in the first call, I get two pulses on every call, regardless of the input I feed to the subVI).  The multiple calls to this subVI are made in sequential frames of a stacked sequence.  I believe stacked sequences are frowned upon by good LabVIEW people, right?  But putting that aside for the moment...
    The "#-of-pulses" input I give to the subVI is correctly reflected in a front-panel number indicator and a graph of the waveform inside the subVI, just not in the real output I measure from the board.  Why is the hardware output being asserted (with the original input value) before the new number can reach the DAQ Assistant?
    The sloppy fix is to put that square-wave creation code in my main program each time I need it.  This works and fixes my problem; however, I would like to use subVIs to keep things clean.
    I am not a good LabVIEW programmer, but I have used this software for a number of projects and am stumped by this.  Any ideas?
    Thanks,
    John

    Hi John,
    I am running your code over here and seeing the same results.  I believe the problem is that the DAQ Assistant is being called inside a loop (really a sequence structure, but nonetheless more than once).  Sometimes it is difficult to troubleshoot the DAQ Assistant in cases like this: it tries to be "smart" and seems to avoid re-configuring its parameters inside the loop.  This is intended to improve loop speed when customers are performing continuous operations.  In this case, it is performing a finite generation, and the number of samples to generate appears to carry over from one loop iteration to the next.
    It sounds like you have discovered one workaround for this already: putting a DAQ Assistant in each frame of the main VI.  Two other options that come to mind are:
    1. Use the lower-level DAQmx functions inside the subVI.  There you will have explicit control over when the task is created and cleared, and when parameters are set.  You can find examples of how to use the DAQmx API in the Example Finder at Help >> Find Examples... >> Hardware Input and Output >> DAQmx.
    2. Write a consistent number of samples to the DAQ Assistant by "zero-padding" your signal.  For example, instead of writing [10, 1010], try writing [1000, 1010].  That way it wouldn't need to reconfigure the number of samples to generate.
    One lesson to take away here is that the DAQ Assistant is good for basic functionality, but for more advanced control over the execution and configuration of your task you should learn to use the lower-level DAQmx functions.  In this case the behavior actually sounds like a bug: the DAQ Assistant is not checking for waveform timing changes even though your timing is set to Use Waveform Timing.  I'll file a bug report.
    Thank you for pointing out this odd behavior.  Out of curiosity, which version of DAQmx are you using? 
    -John
    John Passiak
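
    A sketch of option 1 for readers landing here: the thread is LabVIEW 8.5, but the same explicit-task pattern is easiest to show in text with the nidaqmx Python API (device/channel names, rate, and the 5 V amplitude are all assumptions):

    ```python
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    def generate_pulses(n_pulses, rate=1000.0, samples_per_pulse=100):
        """Finite analog-output generation with an explicit task lifetime.
        Creating and clearing the task on every call forces the sample
        count to be reconfigured, unlike a cached DAQ Assistant."""
        half = samples_per_pulse // 2
        pulse = np.concatenate([np.full(half, 5.0), np.zeros(half)])
        data = np.tile(pulse, n_pulses)

        with nidaqmx.Task() as task:          # task is cleared on exit
            task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
            task.timing.cfg_samp_clk_timing(
                rate, sample_mode=AcquisitionType.FINITE,
                samps_per_chan=data.size)
            task.write(data)
            task.start()
            task.wait_until_done()

    generate_pulses(2)    # two pulses
    generate_pulses(5)    # five pulses -- the count updates on every call
    ```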

  • In Labview 8.5, what happens if the signal input exceeds the signal input range set by the DAQ Assistant?

    Hello all,
    This should be a pretty simple question, but I can't seem to find the answer on-line and don't currently have the capabilities to test this:
    I'm using LabVIEW 8.5 and have a VI that imports sensor data through the DAQ Assistant. In the configuration tab there is a signal input range. What happens if my sensor exceeds this range? Will I get a warning? Will the value default to the maximum (or minimum)? I was interested in writing some code to display an error as I approach the limits of this range, but was unsure whether I also needed to include code to display an error if the range is exceeded.
    Thanks for the help,
    Tristan

    Hello Tristan,
    The behavior depends on the range you choose and the device you are using.
    If you are using a device with only one valid input range, we will use this range even if you set a smaller minimum and maximum in the DAQ Assistant.  Thus, if your device only supports ±10V and you set the range to ±8V, you will still continue to get valid data after your sensor exceeds 8V, up until you approach 10V.  Once you reach the limit of your device's range, the output will "rail" and just return the maximum value until the signal drops below it again.
    Note: a device that is nominally ±10V usually has some overshoot (like ±10.2V) that is typically specced in the manual.
    However, if you are using a device with multiple input ranges, things get more complex.
    The NI-DAQmx driver will pick the smallest range that fully encompasses the range you choose.  So, suppose your device supports the input ranges ±0.2V, ±1V, ±5V, and ±10V, and you choose 0V to 3V as the range in the DAQ Assistant.  The driver will look at your requested range and the list of ranges your hardware supports and choose the smallest one that encompasses the full range you set.  This would be ±5V, because that's the smallest range that contains values up to 3V.  As a result, any input signal within ±5V will be returned, and anything outside this range will "rail" to either the maximum or minimum value.
    We do this because smaller ranges make more effective use of the resolution of the ADC.  Thus we try to use the most efficient range based on what you request, without picking a range that would cause you to miss data.
    Let me know if I can clarify this further. 
    Seth B.
    Staff Test Engineer | National Instruments
    Certified LabVIEW Developer
    Certified TestStand Developer
    “Engineers like to solve problems. If there are no problems handily available, they will create their own problems.”- Scott Adams
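
    The selection rule Seth describes is easy to state in code; a short illustrative Python sketch (the range list mirrors his example):

    ```python
    def pick_device_range(requested_min, requested_max, device_ranges):
        """Pick the smallest device input range that fully encompasses the
        requested range, mirroring the NI-DAQmx rule described above.
        `device_ranges` is a list of (min, max) pairs the hardware supports."""
        candidates = [(lo, hi) for lo, hi in device_ranges
                      if lo <= requested_min and hi >= requested_max]
        if not candidates:
            raise ValueError("requested range exceeds device capability")
        # The smallest span makes the best use of ADC resolution.
        return min(candidates, key=lambda r: r[1] - r[0])

    ranges = [(-0.2, 0.2), (-1, 1), (-5, 5), (-10, 10)]
    print(pick_device_range(0.0, 3.0, ranges))  # -> (-5, 5), as in the example
    ```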

  • DAQ Assistant and Graphing

    A very simple problem...
    I'm very new to LabVIEW, and I am struggling with wiring up my DAQ Assistant in order to graph data from a load cell. I've connected my load cell to the DAQ and want to measure force readings over a span of time, until I press Stop. When I run my program, it only graphs a finite number of readings and then erases the graph to draw new readings on top. I put my graph outside of the while loop so that it would graph one reading at a time as they were read, but it's not working. If my wiring isn't what's wrong, I have a feeling that my timing settings for the DAQ Assistant are not right (and I don't know how to set those either). I don't understand the description/effects of Rate and Samples to Read.
    Thanks for your help.
    Attachments:
    Learning Load Test.vi ‏61 KB

    Hi AFLR,
    I think the settings are fine: you have set the DAQ to read 100 samples at a rate of 100 samples/second, so you'll get 100 samples every second.
    Now, in order to retain the previous data in the graph (which is not in the nature of a graph, unlike a chart), you need to preserve it with some extra code.
    If you already know about:
    1. Shift registers, and
    2. The components of a waveform,
    you can easily implement this requirement; find the attached VI for your reference.
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.
    Attachments:
    Learning Load Test_Modified.vi ‏79 KB
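
    The shift-register pattern in the attached VI amounts to appending each read to a growing history. An illustrative Python sketch of the same idea (the data is simulated; numpy assumed):

    ```python
    import numpy as np

    def accumulate(history, new_block):
        """Shift-register pattern: append each DAQ read to the running
        history so the graph shows the whole test, not just the latest
        100-sample block."""
        return np.concatenate([history, new_block])

    # Simulated acquisition: three 100-sample reads at 100 S/s.
    history = np.empty(0)
    for _ in range(3):
        block = np.random.default_rng().normal(size=100)  # stand-in for DAQ data
        history = accumulate(history, block)

    time_axis = np.arange(history.size) / 100.0  # seconds, at 100 samples/s
    print(history.size, time_axis[-1])           # 300 samples spanning ~3 s
    ```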

  • Relay module and DAQ Assistant

    Hi everyone!
    I'm a LabVIEW beginner, currently dealing with what looked a priori like an easy task. Recently I got a USB relay module to be incorporated into an alarm system. Let's say if one value gets bigger than another, the relay should close and activate a siren (see the example attached). For this I've used the DAQ Assistant and configured one of the module's output channels. By using a simple boolean switch, I can easily close and open the relay. However, if I use a case structure, an error is obtained, as the DAQ Assistant for one output can only be used once. I mean, if the relay is closed and I'd like to get back to the original situation, i.e. an opened relay, what should I do?
    Schematically:
    -if A>B, then closed relay
    -if A<B, then opened relay
    Sorry for the messy explanation, but I think you get the point.
    Thanks in advance 
    Attachments:
    alarm test.vi ‏44 KB

    I just got it. The problem was that in the main VI (not in the example I attached), the same output was configured via the DAQ Assistant in two different case structures. Obviously the relay module was going completely mad, since I had two independent pairs of TRUE and FALSE working at the same time. If I get, for instance, TRUE from one case structure and FALSE from the other, the switch doesn't know what to do. Hope now it's clear...
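
    In text form, the fix amounts to a single writer driving the relay line. A hedged nidaqmx (Python) sketch of the A>B logic, with made-up channel names and values:

    ```python
    import nidaqmx

    # One task owns the relay line for the whole run; write True/False
    # from a single place instead of two competing case structures.
    with nidaqmx.Task() as relay:
        relay.do_channels.add_do_chan("Dev1/port0/line0")
        for a, b in [(3.2, 1.0), (0.5, 1.0)]:    # stand-in comparisons
            relay.write(a > b)                   # True closes, False opens
    ```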

  • DAQ Assistant to typedef data cluster

    Hi All,
    I was hoping someone would be able to shed some light on an issue I have been pulling my hair out over for a week now.  I am fairly new to LabVIEW (less than 6 months), and up until now I have been able to research my way out of any issues I ran into.  The problem I'm having is as follows:
    Basically, I am using an SCXI-1581 and an SCXI-1102B (along with multiple other pieces of hardware) for a device we are creating.  Generating the signals with the DAQ Assistant was quite simple.  The problem I am running into is finding a way to relay these signals to an array or cluster, so that multiple other subVIs can quickly "Unbundle by Name" to wire to their specified indicators on each page.  I have tried arrays, clusters, typedefs, etc. to no avail, and I'm not sure where else to look for help.  Unbundling the typedef cluster in the subVIs works like a charm; the problem is the typedef .ctl is not getting any data sent to it, so the indicators on the subVIs never receive a value other than the 0 in the cluster typedef.  If there is a better way to go about doing this, I would sure appreciate the help. 
    Breakdown:  I have 41 thermistor signals coming out of the DAQ Assistant.  I need to somehow convert this data into a form where it can be constantly updated and read from a typedef/global/something.  I have taken a screenshot of what I was working on when I decided to post here.  There are broken wires at the moment, because I was trying everything I could think of: Signal Manipulation/From DDT, Array to Cluster, Build Cluster by Name, Build Array, numerous things.  I basically want all 41 thermistor signals to leave the DAQ Assistant, be split by channel name/value, and be input into an array or cluster that can become global.  I have broken it down to 4 channels in my screenshot for simplicity's sake.
    My apologies for repeating myself or running on; I'm working on very little sleep at the moment.
    I appreciate any help anyone can shed on the situation.  Thank you!
    -Justin
    Attachments:
    DAQ_Cluster.png ‏17 KB

    I didn't have it connected in that particular screenshot, but I have tried that before.  I just redid that scenario and took another screenshot: bringing in a cluster typedef and dropping it in, switching it to an indicator so it will accept an input, and changing the original cluster's Visible property to FALSE (just so two clusters of 4 numeric indicators are not both visible).  Using the attached screenshot, the numeric values do update (on the front panel of this VI), but as soon as I connect that typedef to another VI and unbundle, those values do not populate in the subVI.  I changed the DAQ Assistant in this current example (to test), as I am at home and do not have the 1581/1102B hardware here with me. 
    I will note that I do have this functionality working in another section of the GUI I am working on.  That working portion has 8 boolean values in a typedef cluster and is exported and unbundled in the same manner as I am trying to get working with these data values on the subVI.  The way the values get sent to the typedef is different, but overall it is the same approach I would like to aim for in the DAQ_Cluster2 image.  Two screenshots enclosed this time.  I can go into much further detail if necessary.  Thanks again for the feedback.  I'm going to knock out some more of the pages involved for now; if anyone has any more feedback, that would be fantastic.
    Also, aeastet, thank you for your reply as well.  I started reading through your link to Ben's nugget, but my brain is mush right now and I am not too familiar with some of that functionality.  I will look at it more soon.
    Thanks again!
    Attachments:
    DAQ_Cluster2.png ‏27 KB
    FCV_Cluster.png ‏8 KB
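
    As a text-based analogue of the split-by-channel-name step, a minimal nidaqmx (Python) sketch (the channel names and the four-channel scope are invented, mirroring the simplified screenshot):

    ```python
    import nidaqmx

    channel_names = ["therm_inlet", "therm_outlet", "therm_case", "therm_ambient"]

    with nidaqmx.Task() as task:
        # Four channels for simplicity's sake, as in the screenshot.
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
        readings = task.read()                        # one sample per channel
        by_name = dict(zip(channel_names, readings))  # "Unbundle by Name" analogue

    print(by_name)
    ```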

  • Error message from LabVIEW when trying to set up the DAQ Assistant

    I receive a message from LabVIEW when I try to set up the DAQ Assistant and select a channel to use. The error states: "An exception occurred within the external code called by a Call Library Node. This might have corrupted LabVIEW's memory. Save any work to a new location and restart LabVIEW." LabVIEW then freezes. I have reset the DAQ device in trying to solve this, but I still get the same message. How do I solve this? Thanks - mars2006

    Hi Mars-
    It sounds like your NI-DAQ installation may have become corrupted. I would suggest uninstalling and reinstalling the DAQmx 7.4 driver to correct this problem and ensure that you're up to date. This download is available here: NI-DAQ 7.4
    If the problem persists you may want to uninstall and reinstall LabVIEW and then NI-DAQ in that order. The error message will usually give an indication as to which VI the error occurred in. Please let us know which VI is failing if you're unable to avoid the error with these suggestions.
    Have a good day-
    Tom W
    National Instruments

  • How to deal with Error -89130 from the DAQ Assistant?

    Once I press the "Test" button of the DAQ Assistant, "Error -89130 occurred at DAQ Assistant" shows up. Unsurprisingly, a LabVIEW program with a DAQ Assistant can't be run either; instead, "Error -88303 occurred at DAQmx Start Task.vi:1" or "Error -88304..." shows up, while other LabVIEW programs without a DAQ Assistant run correctly. I have reset and reinstalled DAQmx, but it didn't work. Any other ideas for troubleshooting this? Thanks

    Please refer to this KB and see whether it works. If the problem persists, please tell us the DAQmx version and the DAQ card name.
    Haifeng Xu
    NISH AE

  • How to properly read data from one DAQ-assistant and write simultaneously with another DAQ-assistant (which is inside a loop)

    Hello.
    I'm a newbie working on my Master's thesis, concerning a project that is based on old G code made by another newbie, so bear with me.
    I need to create a sequence of output controls. For this I'm using a for loop that eventually creates two triangular ramps over a period of 90 seconds. I've confirmed that this function works properly by measuring the actual output of the DAQ device (NI USB-6353).
    The problem is the following: during this control cycle I need to simultaneously collect data from the same DAQ device. At this point there is only one DAQ Assistant output block in the main loop of the program, and all the signals are derived from it to where they are needed. There is a case structure (the bottom case structure in the picture) that contains the functions needed to collect the data during the test cycle. However, these two actions, outputting data and inputting data, are not synchronized in any way, which may be why I get error -200279, or alternatively error -200284, during the test cycle. I've tried changing the sample rate, buffer size, and timeout as advised, but nothing seems to help.
    What would be the simplest way to solve this problem?
    Help is greatly appreciated!
    Attachments:
    problem.jpg ‏206 KB

    Thanks for quick reply.
    However, I did try it (see the picture), but I still have a problem: I only get 100 samples/channel in total during the test sequence (all from the first seconds of the sequence), even though I've set the data-acquiring DAQ Assistant to "continuous" with "samples to read" = 95k and a rate of 1000 Hz.
    Edit:
    And lastly, I have trouble adding this "extra" DAQ Assistant to the VI, because I get an error about a resource (the 6353) being reserved, even though I connected a false constant to the "STOP" input of the main DAQ Assistant.
    Attachments:
    is_this_what_you_meant.jpg ‏212 KB
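
    One common way out is two explicit DAQmx tasks sharing a single sample clock, rather than two DAQ Assistants contending for the device. A hedged nidaqmx (Python) sketch of the idea; the names, rate, and ramp shape are assumptions patterned on the post:

    ```python
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    rate, duration = 1000.0, 90.0
    n = int(rate * duration)
    # Two triangular ramps spanning the 90 s cycle.
    tri = np.concatenate([np.linspace(0, 5, n // 4), np.linspace(5, 0, n // 4)])
    ramp = np.tile(tri, 2)

    # Separate AI and AO tasks; clocking AO from the AI sample clock keeps
    # them hardware-synchronized and avoids device-reservation conflicts.
    with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=n)
        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ao.timing.cfg_samp_clk_timing(rate, source="/Dev1/ai/SampleClock",
                                      sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=n)
        ao.write(ramp)
        ao.start()      # AO arms first and waits for the AI clock
        ai.start()
        data = ai.read(number_of_samples_per_channel=n, timeout=duration + 10)
    ```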

  • How to control a digital output signal using the DAQ assistant?

    I am using a USB-6251 DAQ board and would like to control a switch. I am gathering airflow, pressure, and acoustic data on the same board (analog input) and would like the switch to trigger when the airflow value is within a predetermined range. Finally, I would like to reverse the polarity of the switch (to off) about a second after the initial digital signal, all during continuous data acquisition.
    I am able to actuate the switch only when pressing the "Run" button in the DAQ Assistant window. I am proficient with data acquisition but have never tried programming an output... Please help!
    Thanks!
    -a troubled researcher
    P.S. I am running LabVIEW 8.5 as well.

    blsmith4,
    You probably won't get the control of the digital port on your card by only using the DAQ Assistant. One of the following examples should provide you the functionality that you would like out of the box:
    Digital - SW - Timed Output (Simple)
    Digital - Continuous Output (More Complex)
    Continuous Write Digital Port - External Clock - Non Regeneration (Most Complex)
    Let me know if these work better for you.
    Best,
    Jason M.
    Applications Engineer
    National Instruments
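
    Alongside those shipping examples, the core logic is small; a hedged nidaqmx (Python) sketch of the trigger-and-release behavior (the channel names and the airflow window are invented):

    ```python
    import time
    import nidaqmx

    LOW, HIGH = 2.0, 4.0          # hypothetical airflow window, in volts

    with nidaqmx.Task() as ai, nidaqmx.Task() as do:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # airflow channel
        do.do_channels.add_do_chan("Dev1/port0/line0")   # switch line
        while True:
            airflow = ai.read()          # on-demand single sample
            if LOW <= airflow <= HIGH:
                do.write(True)           # trigger the switch
                time.sleep(1.0)          # hold for about a second
                do.write(False)          # reverse polarity (off)
                break
    ```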

  • Struggling to set up 2 DAQ Assistants in the same VI in LabVIEW

    hi all,
    hope someone can help me out
    I am currently using the SCXI-2345 with some SCC-FT01 feedthrough modules; these are just to record some voltages from some transducers.
    What I am struggling with is setting up two DAQ Assistants in a single VI: it comes back with an error stating that the specified resource is reserved, and then a window saying "learn more about automatic error handling." How would I also have these two values logged on the same graph or chart?
    Anyone any ideas how to set these up?
    I ideally want to have three DAQ Assistants in the same VI, but I'm not sure if this is possible. It would be one for thermocouple input and two for voltage, as stated above.
    many thanks
    shane dover

    Hi Shane and Dennis,
    I've been working my way around the same type of problems, and I am currently rewriting everything to use one Assistant per board on my PXI chassis, so that no resources become reserved.
    As a result, I have to separate the signals on the output dynamic data in order to do different post-processing.  So I have been toying with staying with dynamic data, inputting the dynamic data into an array, and inputting it into a cluster.  It seems that each has advantages and drawbacks, but arrays seem to be the easiest to work with when splitting signals coming from a DAQmx Assistant and funneling some of the signals into a post-processing subVI.  The drawback is that the timestamps in the dynamic data are lost if you want to record both the data and the time of the reading. 
    Question:  Is there an easy way to pipe dynamic data into a calculation subVI?  Is it workable after it's been split from other streams? 
    Thanks,
    Brad
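
    On Brad's timestamp point, one text-based workaround is to keep the raw per-channel arrays plus the sample rate and rebuild the time axis; a nidaqmx (Python) sketch under assumed names and rates:

    ```python
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    rate, n = 1000.0, 100
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")  # two voltage inputs
        task.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=n)
        data = np.asarray(task.read(number_of_samples_per_channel=n))

    t = np.arange(n) / rate   # reconstruct the time axis lost with dynamic data
    ch0, ch1 = data           # split the signals for separate post-processing
    ```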
