DAQ Assistant Tasks

How do I create a task in the LabVIEW DAQ Assistant for one of our cDAQ modules without actually being connected to the cDAQ module?
Solved!
Go to Solution.

You can simulate a wide range of the instruments supported by DAQmx with Measurement & Automation Explorer.
Right-click on NI-DAQmx Devices, then Create New, then Simulated Device, and choose from the list of supported devices.
For cDAQ, create your chassis first, then your module.
Excuse me, but I don't have an English version of MAX, so I don't have the exact translation of the commands...
When your simulated device is configured, you can use it with the LabVIEW DAQ Assistant.
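Once the simulated device exists, it also answers ordinary DAQmx calls, so you can exercise code against it before the real cDAQ arrives. As a rough illustration (shown in Python with the nidaqmx package, since G code can't be pasted as text; the module name cDAQ1Mod1 is a placeholder), listing devices and reading one channel might look like:

    import nidaqmx
    from nidaqmx.system import System

    # list every device MAX knows about, simulated modules included
    for dev in System.local().devices:
        print(dev.name, dev.product_type)

    # a simulated module answers reads just like real hardware
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder name
        print(task.read())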

Similar Messages

  • NI-DAQmx task works in MAX or DAQ Assistant test panel but not in LabVIEW

    I am attempting to read a single AI channel from a PCI-6024E card via an SCB-68. I have created an NI-DAQmx Analog Input Voltage Task in MAX for this channel, sampling in continuous acquisition mode at 100 kHz, 10000 samples at a time, with RSE terminal config. If I use the Test feature from MAX, the channel acquires data as expected.
    In LabVIEW, I call this task using a DAQmx Task Name Constant. If I right-click this constant and select "Edit Task", the DAQ Assistant opens and I can use the Test feature from the DAQ Assistant to see that the data is still being acquired as expected.
    However, when I try to programmatically read this channel in LabVIEW using the VI "DAQmx Read (Analog Wfm 1Chan NSamp).vi", the VI returns a constant DC value of 500 mV, which I know is incorrect (I can monitor the signal across the two terminals in the SCB-68 with a DMM to know that the signal coming in varies as expected, and as I read using the test panels). This erroneous reading occurs even if I make a new VI, drop the task name constant on the diagram, right-click the task name constant and select "Generate Code - Example" and let LabVIEW create its own acquisition loop (which is very similar to what I already set up, using the "DAQmx Read" VI).
    Any ideas why the Test Panels work correctly but the LabVIEW code does not?

    Hello bentnail,
    I'm not sure why the test panels are reading the value correctly but the LabVIEW code does not, but there are a couple of things we can try.
    1) What happens if you just use the DAQ Assistant and place it on the block diagram? Does it read out the correct values?
    2) Try running a shipping example that comes with LabVIEW. "Acq&Graph Voltage-Int Clk.vi" should work well.
    3) What kind of signal are you expecting to read (peak to peak voltage, frequency, etc.)?
    Thanks,
    E.Lee
    Eric
    DE For Life!
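    For comparison, the same settings expressed with the low-level driver calls can help confirm whether the 500 mV reading comes from the task configuration or from the VI. A rough sketch of that configuration (in Python with the nidaqmx package rather than G; the channel name is a placeholder):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType, TerminalConfiguration

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan(
                "Dev1/ai0",                                  # placeholder channel
                terminal_config=TerminalConfiguration.RSE,
                min_val=-10.0, max_val=10.0)
            task.timing.cfg_samp_clk_timing(
                100000,                                      # 100 kHz sample clock
                sample_mode=AcquisitionType.CONTINUOUS,
                samps_per_chan=10000)
            task.start()
            data = task.read(number_of_samples_per_channel=10000)  # one 10000-sample block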

  • How to deal with the Error-89130 about the DAQ assistant?

    Once I press the "Test" button of the DAQ Assistant, "Error -89130 occurred at DAQ Assistant" shows up. There's no doubt that the LabVIEW program with a DAQ Assistant can't be run; instead, "Error -88303 occurred at DAQmx Start Task.vi:1" or "Error -88304..." shows up, while other LabVIEW programs without a DAQ Assistant run correctly. I have reset or reinstalled DAQmx, but it didn't work. Any other ideas for troubleshooting this? Thanks

    Please refer to this KB, and see whether it works. If the problem persists, please tell us the DAQmx version and the DAQ card name.
    Haifeng Xu
    NISH AE

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hi,
    First post here.  
    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input.  My hardware, an SCXI-1102C, does not allow changing this property on a running task, so I'd like to either set the analog input voltage range before the DAQ Assistant activates, or pause the DAQ Assistant immediately after it starts, set the values and then resume.
    I don't know how to edit the task ahead of time because the DAQ assistant creates the task when it runs, and there is no task before that.
    In the attached picture, I have a conditional section, set to run only if the while loop iteration is 0.  I take the task from the DAQ Assistant, send it to a stop task vi, set the property, and then send the task on to the start task vi. I can watch it run with the debug light on, and everything seems to work correctly, but on the second (and all the other) iterations of the loop, I read out AI.Max and it seems like the DAQ Assistant has reset it back to 5 V.  Can anyone see what is going wrong here?
    BTW, this is continuous acquisition and the code does not produce error messages when it runs.
    I did come across a similar question that someone posted here back in 2006, but his question was specifically aimed at a LabVIEW API (VB, I think), and not an actual G solution.
    Attached are the actual vi in question and a png image of the block diagram.
    Thanks! 
    Ruby K
    Solved!
    Go to Solution.
    Attachments:
    Labview_question.PNG ‏14 KB
    Sample_AIV.vi ‏91 KB

    First, if you want to start getting beyond the basics with DAQ, you are going to have to stop using the DAQ Assistant and do it with lower-level DAQmx VIs.  There are hundreds of examples in the Example Finder.  You can even right-click on the DAQ Assistant and select Open Front Panel.  That will create a subVI that you can open and see what is going on behind the scenes.  Do it.  I think you'll find the DAQ task is being recreated on each iteration (though I'm not 100% sure of how the settings are established or maintained in each section of that subVI).
    The second problem is you have a bit of a race condition on iteration 0.  Those two DAQ property nodes are running at the same time.  So when you read the AI.Max, it may be happening before or after the AI.Max is set in your case structure.
    Third, make sure you wire up your error wires.

  • When using DAQ assistant to read frequency

    When using DAQ assistant to read frequency and Task timing is set to:
    N Samples, Clock settings to read 26,
    Frequency setup to rising edge,
    1 counter with 10 kHz to 1 kHz range.
    I get back a single number.
    Can I assume this is an average reading of 25 samples with the first sample unused?
    What is the base clock used?
    Is the “26” 26 cycles of the frequency to be measured?

    Hello,
    If you choose to acquire N samples from the DAQ Assistant then it will acquire all of these samples and return them to LabVIEW as an array.  However, when you use the DAQ Assistant it outputs the data in the dynamic data type first.  This data type makes it easy to graph and run the data through other express VIs.  If you were to create a Numeric Indicator from this data type it would just display the last element from the dynamic data array.  To display this data properly in a numeric format convert the dynamic data to an array of doubles by using the Convert From Dynamic Data function in LabVIEW.  Then you can select to convert it to a 1D array of scalars and when you create an indicator off of the output of this function all of the data should be displayed.
    The timebase that is used for lower frequency measurements is the onboard clock, which is internally connected to the Source.  Then you connect your signal to be measured to the Gate of the counter.  Since the frequency of the onboard clock is known it can be used to calculate the frequency of an unknown source based on when the counter is on and off (determined by the Gate). 
    Have a good day,
    Brian P.
    Applications Engineer
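    As a rough illustration of the same N-sample frequency measurement and the "convert to an array of scalars" step (a sketch in Python with the nidaqmx package rather than G; the counter name and range are placeholders):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType, Edge

        with nidaqmx.Task() as task:
            task.ci_channels.add_ci_freq_chan(
                "Dev1/ctr0",          # placeholder counter
                min_val=1000.0,       # expected range: 1 kHz to 10 kHz
                max_val=10000.0,
                edge=Edge.RISING)
            # implicit timing: each period of the measured signal yields one sample
            task.timing.cfg_implicit_timing(
                sample_mode=AcquisitionType.FINITE, samps_per_chan=26)
            freqs = task.read(number_of_samples_per_channel=26)  # an array, not a single number
            print(sum(freqs) / len(freqs))                       # average of the readings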

  • DAQ Assistant in subvi not updating output to DAQ board with each call...

    Hi All,
    I am calling a simple subvi that creates a user-defined number of pulses with "Square Waveform.vi."  This square wave (with the given total number of pulses) is then used as an input to a DAQ Assistant controlling an analog output signal on an NI USB-6259 DAQ board.  I am using LabVIEW 8.5 right now.
    However, each time I call this subvi from my main program, the output I measure from the DAQ board is identical to whatever I set in the first call (i.e., if I created two pulses in the first call, I get two pulses on every call, regardless of the input I feed to the subvi).  The multiple calls to this subvi are made in sequential frames in a stacked sequence.  I believe stacked sequences are frowned upon by good LabVIEW people, right?  But putting that aside for the moment...
    The "#-of-pulses" input I give to the subvi is updated in a subvi front panel number indicator and a graph of this waveform.  Just not in the real output I measure from the board.  Why is the hardware output being asserted (with the original input value) before this new number can reach the DAQ Assistant?
    The sloppy fix to this is just to put that square wave creation code in my main program each time I need it.  This does work and fixes my problem.  However, I would like to use subvis to keep things clean.
    I am not a good LabVIEW programmer, but have used this software for a number of projects and am stumped by this.  Any ideas?
    Thanks,
    John

    Hi John,
    I am running your code over here and seeing the same results.  I believe the problem is that the DAQ Assistant is being called inside a loop (really a sequence structure, but nonetheless more than once).  Sometimes it is difficult to troubleshoot the DAQ Assistant in cases like this--it is trying to be "smart" and seems to be avoiding re-configuring its parameters inside the loop.  This is intended to improve loop speed when customers are performing continuous operations.  In this case, it is performing a finite generation, and the number of samples generated appears to carry over from one loop iteration to the other.
    It sounds like you have discovered one workaround for this already: putting a DAQ Assistant in each frame of the main VI.  Two other options that come to mind are:
    1) Use the lower-level DAQmx functions inside the subVI.  Here you will have explicit control over when the task is created and cleared, and when parameters are set.  You can find examples of how to use the DAQmx API in the Example Finder at: Help >> Find Examples... >> Hardware Input and Output >> DAQmx
    2) Write a consistent number of samples to the DAQ Assistant by "zero-padding" your signal.  For example, instead of writing [10, 1010], try writing [1000, 1010].  In this case, it wouldn't need to reconfigure the number of samples to generate.
    One lesson to take away here is that the DAQ Assistant is good for basic functionality, but for more advanced control over the execution and configuration of your task you should learn to use the lower-level DAQmx functions.  In this case it sounds like the problem is actually a bug.  I'll file a bug report, since the DAQ Assistant is not checking for waveform timing changes even though your timing is set to Use Waveform Timing.
    Thank you for pointing out this odd behavior--out of curiosity, which version of DAQmx are you using?
    -John
    John Passiak
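    The first workaround (create, configure, and clear the task explicitly on every call instead of relying on the assistant) looks roughly like this in text form (a sketch in Python with the nidaqmx package rather than G; the channel name is a placeholder):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        def write_pulse_train(waveform, rate=10000.0, channel="Dev1/ao0"):
            # the task is created and cleared on every call, so the number of
            # samples is reconfigured each time instead of being cached
            with nidaqmx.Task() as task:
                task.ao_channels.add_ao_voltage_chan(channel)   # placeholder channel
                task.timing.cfg_samp_clk_timing(
                    rate,
                    sample_mode=AcquisitionType.FINITE,
                    samps_per_chan=len(waveform))
                task.write(waveform, auto_start=True)
                task.wait_until_done()

        write_pulse_train([5.0, 0.0] * 2)   # two pulses
        write_pulse_train([5.0, 0.0] * 5)   # five pulses, not two: the task was rebuilt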

  • DAQ Assistant: How Can I Control this at the lowest Level?

    DAQ Assistant:
    I have the PCI slot version of this which I use to generate specific signatures. However, I would like to get down to the very low level of this block to see what makes it tick, if only to add some other features such as having it dynamically change high and low times after "N" number of pulses/bursts.
    At the current state I can't seem to get any further than the GUI that I work with right now. I can provide the VI upon request; however, this VI is included with LabVIEW from what I understand. I have version 8.2. If anyone wants a copy of the VI which contains the DAQ Assistant block, I will be more than happy to include it.
    DAQ Assistant Location: Right click on the block diagram of a VI --> Measurement I/O --> NI-DAQmx --> DAQ Assist (This is an icon in itself)
    I see the read/write nodes; I have played with them and tried to see what they do, but I have had no such luck. If anyone can point me in the right direction I would be grateful.
    Thank You.
    Solved!
    Go to Solution.

    You've got a couple of misconceptions about the DAQ Assistant. First, there is no such thing as a PCI version. There are some slight differences between versions of the DAQmx driver. Second, the DAQ Assistant is a code generator. When you start it up, it will create custom code for the type of task you want (digital I/O, analog I/O, etc.), so the VI you eventually have on the block diagram is not included in any version of LabVIEW.
    Once you have configured the assistant, you can right-click on it and select 'Open Front Panel'. This will convert the assistant task to a normal subVI whose block diagram you can open and view. You can also right-click and select 'Create DAQmx Code'. This will place the low-level DAQmx functions on your block diagram. You could also skip the whole assistant and just start with the low-level DAQmx functions in the first place. There is help associated with each, and you have all of the shipping examples to look at. There is also the Getting Started with NI-DAQmx page.

  • Labview ignores all but 1 Daq Assistant, how can I avoid this?

    When I am putting together my block diagram, I use the DAQ Assistant to orchestrate everything.  Then when I go to create another DAQ Assistant instance, LabVIEW seems to ignore the first one.  It appears as though I can only use one DAQ Assistant at a time in my VIs, is this the case?  I am a bit of a novice, is there an easy way to avoid this, or change the original DAQ Assistant into some other form so I can use another DAQ Assistant for developing another part of my VI?
    I have looked long and hard for a good explanation of the difference between Tasks and Virtual Channels and while I have found efforts at explaining it, I do not understand.  I see that I can convert my DAQ Assistant into a task, but then I have a task output which does not help me (I need a data output for my VI).
    I am sure this is obvious to someone that has done it a million times, but I just can't find an answer wading through all of the knowledge base information or the LabVIEW help.
    Thanks a bunch!
    ~milqman

    Hello milq,
    The following tutorial, which Novatron linked earlier, is an awesome resource for becoming familiar with NI-DAQmx programming.
    Learn 10 Functions in NI-DAQmx and Solve 80% of Data Acquisition Applications
    An important point to take away from this tutorial is that you can either create a DAQmx task using the wizard-style interfaces provided by the DAQ Assistant and within Measurement & Automation Explorer (MAX), or you can programmatically create and configure your DAQ Task in your LabVIEW code using the DAQmx Create Virtual Channel VI, DAQmx Timing VI, etc.  When you create a new DAQmx Task using the wizard-style interface, you are configuring all of the settings for your task manually.  When you use that DAQmx Task in LabVIEW (with a DAQmx Task Control or Constant), you are referencing all of those configuration options you've manually set.  To actually perform a read or write operation based on those settings, you need to wire the DAQmx Task into the 'task in' input of a DAQmx Read or DAQmx Write VI. 
    So for your analog input operation, you can wire the task directly into a DAQmx Read VI.  In the DAQmx Read VI, you'll want to select the type of task you're reading from in the drop-down box (Analog), whether you're reading from a single channel or multiple channels (1 Chan or N Chan), and whether you're taking just one or multiple measurements with each call to the DAQmx Read VI (1 Samp or N Samp).
    For your digital output operation, you'll want to configure a separate digital output task and wire it directly to a DAQmx Write VI.  Again, select the type of task you're writing to (Digital), whether you're writing to a single or multiple channels, and whether you want to perform a single write or multiple writes.
    To help with getting started with DAQmx programming in LabVIEW, I would highly recommend taking a look at the DAQmx examples located in the LabVIEW Example Finder (Help > Find Examples).  A difference you'll note in these examples is that all the task configurations that you have been making with the wizard-style DAQ Assistant are done programmatically with the DAQmx Create Channel and DAQmx Timing VIs.  I hope this helps and good luck!
    Travis G.
    Applications Engineering
    National Instruments
    www.ni.com/support
    Message Edited by Travis G. on 06-21-2006 05:23 PM
    Attachments:
    AnalogInDigitalOut.Jpg ‏21 KB
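    A minimal text sketch of the two separate tasks described above, one analog input read and one digital output write (in Python with the nidaqmx package rather than G; channel names are placeholders):

        import nidaqmx

        with nidaqmx.Task() as ai_task, nidaqmx.Task() as do_task:
            ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # placeholder channels
            do_task.do_channels.add_do_chan("Dev1/port0/line0")

            voltage = ai_task.read()       # DAQmx Read: Analog, 1 Chan, 1 Samp
            do_task.write(voltage > 2.5)   # DAQmx Write: Digital, 1 Chan, 1 Samp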

  • Simulated device in MAX, self tests without error and has working Test Panels, but doesn't show up in DAQ assistant.

    I'm trying to create a development machine where we can test new code without using our physical hardware. I've followed this guide in setting up a simulated device. I can get to step 3.2b, but the device does not show up in the DAQ assistant. In MAX, the device self tests and self calibrates successfully, and when I open the test panels, I see some sort of signal. I assume this is a default simulated input since I haven't told the device to look for anything? Note that the two devices I'm trying to create show up in the Devices and Interfaces section, but that even after running Self-Calibrate, the Self-Calibration date is still unspecified.
    When I try to test the device and create a voltage input according to the guide, I am unable to see either device in the DAQ task creator.
    Steps 1 and 2 of this guide are obviously satisfied. Step 3 is not, but this is unsurprising since a simulated device wouldn't be found in the Device Manager anyways. Also, I am not running RT, so step 4 is satisfied.
    Does anyone have any ideas?
    Solved!
    Go to Solution.

    That would be because the PXI-5124 is a digitizer, not an analog input device.  You need to use the NI-SCOPE driver, not NI-DAQmx.
    Jeff

  • Error -2147220733 occurred at DAQ Assistant (in Measurement and Automation Explorer)

    Ok!  Just before the weekend I figured out how to make channels in Measurement and Automation Explorer for inputs through a couple different NI input devices (USB-9211A & PCI-6229 DAQ).  Things were going well.  Loaded up the computer today, added a few more channels.  Worked fine.  Now all of a sudden, any channel I make has an error and if I try modifying existing channels and saving, I get same error:
    "This global channel currently has an error.  You can save the global channel, but it cannot be used until all errors have been fixed.  Press "Yes" to save anyway, or "No" to cancel and show the error."
    If I hit yes, the channel is non functional.  After hitting no, I get:
    "Error -2147220733 occurred at DAQ Assistant.  Possible Reason(s): "
    with no possible reasons listed.  I tried restarting the program.  I tried restarting the computer.  I verified connections.  Searched for the above stated error number on ni.com, google.ca, and in the MAX help files, and found absolutely nothing.  I have no clue what to do.  Any help would indeed be very much appreciated.

    Hi there,
    This is my first time on here and I've only been using this software for a couple of days so it's possible I'm making a trivial error but I thought I'd post here anyway as I can't find anything on the net about my problem.
    I also get an error while trying to save:
    Error -2147220733 occurred at DAQ Assistant
    Possible Reason(s):
    Requested Code: -2147220733
    But mine comes about in different circumstances to the one in the original post on this thread by Kahless. I was editing a VI Logger task and within that I was trying to edit an NI-DAQmx task. I was trying to change the clock settings, specifically I was attempting to change the Rate (Hz) from the 1k default to just 20, or 200. I got the error while trying to save, so I changed it back to 1k, but the error persisted even though I'd set it back to what it was.
    I thought at first it might be simply that I was trying to edit an NI-DAQmx task within VI Logger, so I tried making the amendment directly, but the problem was still there.
    Any ideas?
    Many thanks.

  • Signal Express TC calibration to DAQ Assistant calibration

    I ran some calibrations in Signal Express for K-type thermocouples. I didn't realize that those calibrations don't carry over to other NI programs. I would like those same calibration values to be used in the DAQ Assistant, rather than completing all my calibrations over again. Is there a way that LabVIEW can grab the Signal Express calibrations and use them?

    Hi Jdezman,
    I will try and help you with this aspect of hardware calibration.
    NI-DAQmx will allow you to do a system calibration. Once you create your task in Signal Express, there will be a 'Calibration' tab under the 'Thermocouple Setup'. This will direct you to the 'Calibration Wizard'. This wizard is the same in both Signal Express and Measurement & Automation Explorer (MAX).
    There is a KnowledgeBase article which will guide you through this process:
    http://zone.ni.com/devzone/cda/tut/p/id/4224
    Hope this helps with your system calibration. Please let me know how you get on,
    Regards,
    Aaron. E
    Applications Engineer Team Lead
    National Instruments
    ni.com/support

  • Change pin configuration in linear position DIO DAQ Assistant

    Hi,
    I have a device for interferometric displacement measurements that outputs a digital A quad B signal plus the clock signal (runs at 10 MHz).
    I wanted to use the DAQ Assistant to calculate the displacement. The problem is, if I select the "Linear Position" option it wants me to use the inputs PFI8 and PFI10 for A and B.
    Unfortunately we use a BNC-2110 box which only connects PFI0 - PFI9, leading to my question: is there a way to change that pin configuration?
    If not, are there complete VIs available for download where I can select the inputs for A, B, and clk and that at least give me the A and B signals in LabVIEW?
    Setup-wise we are using a 6259 M Series PCI DAQ card, which should be able to handle 10 MHz.
    I would be really happy if anyone can help me with that!
    Cheers,
    Daniel

    Hi Daniel!
    Unfortunately the DAQmx assistant is limited to a small subset of the DAQmx capabilities. If you want to change the PFI assignment for a counter task, you have to use the low-level DAQmx functions provided in the Measurement I/O > DAQmx palette. I attached a small example where you can see how the PFIs for a Counter Linear Position task are assigned with a DAQmx property node. Hope that helps!
    Regards,
      Georg
    Attachments:
    Counter - Read Encoder.vi ‏41 KB
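    For reference, reassigning the A and B input terminals on a linear-encoder counter task (what the property node in Georg's example does) corresponds to something like the following in text form (a sketch in Python with the nidaqmx package rather than G; the counter, terminals, and exact property names are assumptions on my part):

        import nidaqmx
        from nidaqmx.constants import EncoderType

        task = nidaqmx.Task()
        chan = task.ci_channels.add_ci_lin_encoder_chan(
            "Dev1/ctr0",                      # placeholder counter
            decoding_type=EncoderType.X_4,
            dist_per_pulse=1e-6)              # example distance per pulse
        # steer the A and B inputs away from the default PFI8/PFI10
        # onto terminals the BNC-2110 actually exposes
        chan.ci_encoder_a_input_term = "/Dev1/PFI0"
        chan.ci_encoder_b_input_term = "/Dev1/PFI1"
        task.start()
        position = task.read()
        task.close()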

  • Error with 2 Daq assistant

    Hello
    Another noob joining the community. Here is my question.
    What am I doing wrong? I use 2 DAQ Assistants on the NI 9401 module; this is for digital input and output. I use it for generating 2 signals. If I test it with 1 it works, but if I go to 2 or more it fails. This is the error I get. I have also added the program as an attachment.
    Thanks in advance.
    Solved!
    Go to Solution.

    Hello,
    That error means that the DIO task is reserved and cannot be obtained by one of the DAQ assistants. There is actually a nice example program on what you want to do. It can be found here:
    https://decibel.ni.com/content/docs/DOC-11632
    This example does not use the DAQ Assistant but the DAQmx API VIs (which the DAQ Assistant also uses anyway). If the order is not important (and since these DIO tasks are software-timed, it is probably not an issue), you can also choose to use the error cluster from one DAQ Assistant and wire it to the other (see attached VI). Doing it that way, you force one DAQ task to wait for the other one to finish, so they cannot run at the same time, which prevents the error.
    regards,
    Rik Prins, CLD
    Applications Engineering Specialist Northern Europe, National Instruments
    Please tip your answer providers with kudos.
    Any attached Code is provided As Is. It has not been tested or validated as a product, for use in a deployed application or system,
    or for use in hazardous environments. You assume all risks for use of the Code and use of the Code is subject
    to the Sample Code License Terms which can be found at: http://ni.com/samplecodelicense
    Attachments:
    2 Daq assistant generating error (3).vi ‏169 KB
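    The error-wire fix simply forces the two tasks to run one after the other so they never try to reserve the NI 9401 at the same time. In text form that sequencing looks roughly like this (a sketch in Python with the nidaqmx package rather than G; line names are placeholders):

        import nidaqmx

        # the first task finishes (and releases the module) before the second starts,
        # which is what chaining the error wire enforces in G
        with nidaqmx.Task() as task1:
            task1.do_channels.add_do_chan("cDAQ1Mod1/port0/line0")   # placeholder lines
            task1.write(True)

        with nidaqmx.Task() as task2:
            task2.do_channels.add_do_chan("cDAQ1Mod1/port0/line1")
            task2.write(True)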

  • DAQ Assistant with multichannels causing Simulation Loop slow?

    Hi, another LabVIEW newbie here.
    I have in a Real Time Target (NI 9132)  a Control & Simulation Loop with DAQ Assistant block inside, whose signals are fed into a Discrete State Space block. The discrete state space model has 1 second time step. I have set the Simulation Loop parameters so that it executes every 1 second as well (see Fig. A below). *sorry for the big white gap under the figures..
    The DAQ Assistant acquisition mode is set as "1 Sample (On Demand)".
    However, when I run the VI, the plot seemed to be updated much slower than 1 second rate. To confirm this, I put an "Elapsed Time" block inside the Simulation Loop. The "elapsed time" shows the actual time in seconds while the simulation plot show slower time (see Fig. B below).
    I tried to isolate the problem by removing the blocks one by one. Finally, I found out that this problem was caused by (at least) the DAQ Assistant, which acquires multichannel data from the NI 9214. When I remove some channels and leave one or two channels, the VI runs at the actual time (see Fig. C below). But when I add more channel readings, it becomes slower again.
    Here is the snippet of the block diagram (after all other blocks were removed):
    What am I doing wrong here? I'm going to use all of NI 9214 channels so how not to have similar problem like this?
    I look forward to hearing any relevant comments from the members. Thanks in advance.
    Tian

    Hi Tian,
    why do you need a Sim loop anyway?
    - When it comes to speed you shouldn't use the DAQ Assistant. Use basic DAQmx functions…
    - Use parallel running loops for each task. Put DAQmx functions in their own loop, running in parallel to your Sim loop…
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
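    The suggestion amounts to doing the acquisition with plain DAQmx calls in a loop of its own and putting all the NI 9214 channels in a single task, so one on-demand read returns one value per channel. A rough text sketch (in Python with the nidaqmx package rather than G; the module name is a placeholder, and the channels are shown as voltage channels for brevity where a real NI 9214 task would use thermocouple channels via add_ai_thrmcpl_chan):

        import nidaqmx

        with nidaqmx.Task() as task:
            # one task covering all 16 channels of the module
            task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0:15")   # placeholder name
            for _ in range(10):
                sample = task.read()   # on-demand read: a list with one value per channel
                print(sample)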

  • Sample clock of DAQ assistant

    I am using a DAQ Assistant to generate an output voltage and another DAQ Assistant to measure an input voltage. I am to specify the clock type for the two DAQs. I want the same clock type for both DAQs so that the data from both DAQs are synchronized, that is, run with the same clock. I'm not using any external clock and want software timing. I want to know if it is okay to select the internal clock type for the output voltage DAQ Assistant and then select the external clock type for the input voltage DAQ Assistant and select the clock source as the analog output sample clock?
    Or, if I select the clock type as internal for both DAQ Assistants, will the data be synchronized?

    Hi Amber,
    Analog Input is not retriggerable, meaning that once you stop the task you have to restart the entire task (including the analog output). But if you want to just stop displaying the data on the input channel, simply encase the Measurement output in a case structure with a Boolean control that you can click to update the graph or not.
    If that doesn’t work for what you’re trying to do, please give more details as to your overall application so the community can help answer your specific questions.
    Mark E.
    Precision DC Product Support Engineer
    National Instruments
    Digital Multimeters (DMMs) and LCR Meters
    Programmable Power Supplies and Source Measure Units
    Attachments:
    Not update voltage.png ‏5 KB
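    Regarding the original question, a common way to run both tasks off the same clock is the arrangement described in the question itself: leave the output task on its internal sample clock and point the input task's sample clock source at the analog output sample clock terminal. A rough text sketch (in Python with the nidaqmx package rather than G; the device name is a placeholder):

        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        ao_task = nidaqmx.Task()
        ai_task = nidaqmx.Task()
        ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")   # placeholder channels
        ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")

        ao_task.timing.cfg_samp_clk_timing(
            1000, sample_mode=AcquisitionType.FINITE, samps_per_chan=1000)
        # drive the input task from the output task's sample clock so both share one clock
        ai_task.timing.cfg_samp_clk_timing(
            1000, source="/Dev1/ao/SampleClock",
            sample_mode=AcquisitionType.FINITE, samps_per_chan=1000)

        ai_task.start()                         # start AI first; it waits for the shared clock
        ao_task.write([0.0] * 1000, auto_start=False)
        ao_task.start()
        data = ai_task.read(number_of_samples_per_channel=1000)
        ao_task.close()
        ai_task.close()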
