DAQ Assistant with multiple channels causing slow Simulation Loop?

Hi, another LabVIEW newbie here.
On a Real-Time target (NI 9132) I have a Control & Simulation Loop with a DAQ Assistant block inside, whose signals are fed into a Discrete State Space block. The discrete state-space model has a 1-second time step, and I have set the Simulation Loop parameters so that it executes every 1 second as well (see Fig. A below). *Sorry for the big white gap under the figures..
The DAQ Assistant acquisition mode is set to "1 Sample (On Demand)".
However, when I run the VI, the plot updates much more slowly than once per second. To confirm this, I put an "Elapsed Time" block inside the Simulation Loop: "Elapsed Time" shows the actual time in seconds, while the simulation plot shows a slower time (see Fig. B below).
I tried to isolate the problem by removing the blocks one by one, and found that it is caused by (at least) the DAQ Assistant, which acquires multichannel data from an NI 9214. When I remove some channels and leave only one or two, the VI runs in real time (see Fig. C below), but when I add channels back it slows down again.
Here is a snippet of the block diagram (after all other blocks were removed):
What am I doing wrong here? I'm going to use all of the NI 9214's channels, so how can I avoid this problem?
I look forward to hearing any relevant comments from the members. Thanks in advance.
Tian

Hi Tian,
Why do you need a Sim loop anyway?
- When it comes to speed you shouldn't use the DAQ Assistant. Use basic DAQmx functions…
- Use parallel loops for each task: put the DAQmx functions in their own loop, running in parallel to your Sim loop…
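GerdW's second point is the classic producer/consumer pattern: the hardware reads live in their own loop and hand data to the processing loop through a queue, so slow acquisition can't stall the processing loop's timing. A minimal Python sketch of the pattern (the hardware read is replaced by a stub; a real LabVIEW version would use a DAQmx loop feeding a queue or RT FIFO):

```python
import queue
import threading

def acquire(q, n_reads):
    """Producer loop: stands in for the DAQmx read loop."""
    for i in range(n_reads):
        q.put(i)          # stub for one multichannel hardware read
    q.put(None)           # sentinel: tell the consumer we're done

def process(q, results):
    """Consumer loop: stands in for the simulation/processing loop."""
    while True:
        sample = q.get()
        if sample is None:
            break
        results.append(sample * 2)   # stub for the model update

q = queue.Queue()
results = []
producer = threading.Thread(target=acquire, args=(q, 5))
producer.start()
process(q, results)
producer.join()
print(results)  # [0, 2, 4, 6, 8]
```

The point is that the slow multichannel reads never sit inside the timed loop: the processing side only ever waits on the queue, not on the hardware.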
Best regards,
GerdW
CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
Kudos are welcome

Similar Messages

  • Dynamically passing data into a running while loop from a DAQ Assistant in an outside while loop

    Hello, I'm currently a student working on a senior project, trying to build a state machine that will turn a compressor off and on depending on time and compressor failure.
    In the run state, which is case #1 in the case structure, I placed the DAQ Assistant, which takes in data from an accelerometer. If the accelerometer's value goes above the limit four times, the loop ends; it also ends if the time runs out.
    The problem I am having is that I need to run four compressors. I was thinking about having four case structures, all within the outside loop, but I can only have the DAQ Assistant in one location. This means I either have to move the DAQ Assistant out of the run loop or run all four compressors in the one case structure. If I remove the DAQ Assistant, it only takes in data once, when the loop starts, and never again. I understand why, but is there a way to dynamically pass data from a DAQ Assistant into a running loop?
    Also, on a side note, I can't find a tutorial on how to really create a state machine using enums. Does anyone know where to find one?
    I have attached my current program.
    Thank you for your help,
    Ryan
    Attachments:
    TEST STAND STATE MACHINE 2-28-07.vi ‏288 KB

    In LabVIEW choose File -> New... and pick Standard State Machine; instructions are included.
    You can create a custom control using an enum, text ring, or menu ring. Edit the values, save the control, drop it into your VI, and wire it to your case structure.
    - James
    Using LV 2012 on Windows 7 64 bit
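The enum-driven state machine James describes translates directly to text languages as well. A minimal Python sketch under the poster's rules (end after four over-limit readings or when time runs out; all names here are illustrative, not LabVIEW functions):

```python
from enum import Enum, auto

class State(Enum):
    INIT = auto()
    RUN = auto()
    STOP = auto()

def step(state, over_limit_count, time_left):
    """One iteration of the state machine: return the next state."""
    if state is State.INIT:
        return State.RUN                 # initialization done, start running
    if state is State.RUN:
        if over_limit_count >= 4 or time_left <= 0:
            return State.STOP            # limit hit 4 times, or time ran out
        return State.RUN
    return State.STOP                    # STOP is terminal

state = State.INIT
for hits, t in [(0, 10), (1, 9), (4, 8)]:
    state = step(state, hits, t)
print(state)  # State.STOP
```

In LabVIEW the enum wired to the case structure plays the role of `State`, and a shift register carries the current state between iterations.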

  • How to modify the input range (DAQ Assistant) with a numeric control?

    Hello everyone,
    I am currently working with the NI USB-6218 acquisition card.
    In order to acquire a signal, I would like to be able to choose the input range of the DAQ Assistant with a numeric control (and not by opening the DAQ Assistant window), like the "number of samples" and "rate" inputs...
    Is it possible, and if so, how?
    Thanks so much in advance for your answers!

    Hello,
    sorry, I forgot it. After spending a few hours on the problem this morning, I solved it. Apparently I had corrupted the general DAQmx assistant last Friday; restarting my computer this morning fixed the issue, and I was able to turn the constant into a control for the input range.
    Thanks a lot for your help!!!
    Guyua

  • How to properly read data from one DAQ Assistant and simultaneously write with another DAQ Assistant (which is inside a loop)

    Hello.
    I'm a newbie working on my Master's thesis concerning a project based on old G code made by another newbie, so bear with me.
    I need to create a sequence of output controls. For this I'm using a for loop that eventually creates two triangular ramps over a period of 90 seconds. I've confirmed that this works properly by measuring the actual output of the DAQ device (NI USB 6353).
    The problem is the following: during this control cycle I need to simultaneously collect data from the same DAQ device. At this point there is only one DAQ Assistant output block in the main loop of the program, and all the signals are derived from it to where they are needed. There is a case structure (the bottom one in the picture) that contains the functions needed to collect the data during the test cycle. However, these two actions, outputting data and inputting data, are not synchronized in any way, which may be why I get error -200279 or alternatively error -200284 during the test cycle. I've tried changing the sample rate, buffer size, and timeout as advised, but nothing seems to help.
    What would be the simplest way to solve this problem?
    Help is greatly appreciated!
    Attachments:
    problem.jpg ‏206 KB

    Thanks for the quick reply.
    However, I did try it (see the picture) but I still have a problem: I only get 100 samples/channel in total during the test sequence (all from its first seconds), even though I've set the data-acquiring DAQ Assistant to "continuous" with "samples to read = 95k" and a rate of 1000 Hz.
    Edit:
    Lastly, I have trouble adding this "extra" DAQ Assistant to the VI, because I get an error about a resource (the 6353) being reserved, even though I connected a false constant to the "STOP" input of the main DAQ Assistant.
    Attachments:
    is_this_what_you_meant.jpg ‏212 KB

  • Multiple DAQ devices with AI scan clk loop control

    I have an application running under RT on a PXI controller that acquires AI data from a 6052E board. The time critical loop uses an external digital trigger for the scan clock. AI Single scan controls loop timing since it puts the thread to sleep until the arrival of the digital trigger. It works great. The attached VI is a simplified example of how the system works.
    However, I needed more AI input lines and have purchased a second 6052E board. I now need to scan multiple devices within the loop and am having trouble coming up with the best solution. Ideally, it would be great if I could group the AI channels on both boards and scan them with one node, as in the example, but that seems impossible. I'll probably have to synchronize the two boards and use two AI Single Scan nodes to get the data. Does anyone have an idea about the best way to configure the clock/trigger for the second board to accomplish this? I've looked through examples and haven't yet found anything that exactly fits the bill.
    I'm using LV 7.0 and RT 7.0 on Mac OS X.
    Thanks.
    Attachments:
    Single_Device_AI_cntrld_loop.vi ‏95 KB

    Byron,
    You will need to route the scan clock for the PXI-6052E (master) that is receiving the external clock to the second PXI-6052E (slave). In the example you attached to your post, all you would need to do is duplicate the code from AI Config.vi to AI Clear.vi for the slave device. In your actual application, since the scan clock is not already available on one of the PXI Trigger (RTSI) lines, you will need to use Route Signal.vi to route the master device's AI Scan Start (scan clock) signal to a RTSI line. You can then select this RTSI line as the source of the scan clock for the slave device.
    Good luck with your application.
    Spencer S.

  • Alert Component with OnKillFocus Causes Infinite Loop

    I have this script...
    txtPPM.onKillFocus = function () {
        import mx.controls.Alert;
        Alert.show("blah blah");
    };
    When my focus shifts away from the txtPPM textfield I get an
    infinite loop error. I haven't been able to find a successful
    method to fix this. Any suggestions?

    Probably related to the FocusManager (V2 components). There may be other ways to work around it, but here's something quick:
    import mx.controls.Alert;
    var alertClickHandler:Function = function (evt_obj:Object) {
        switch (evt_obj.detail) {
            case Alert.OK :
                trace("You clicked: " + Alert.okLabel);
                break;
            case Alert.CANCEL :
                trace("You clicked: " + Alert.cancelLabel);
                break;
        }
    };
    function showAlert() {
        Alert.show("blah blah blah", "", undefined, this, alertClickHandler);
    }
    function txtFocusHandler(newFocus:Object) {
        this.onKillFocus = undefined;          // detach so the Alert can't re-trigger onKillFocus
        showAlert();
        txtPPM.onKillFocus = txtFocusHandler;  // re-attach for the next focus change
    }
    txtPPM.onKillFocus = txtFocusHandler;

  • Stopping the VI when running an infinite simulation loop

    Hi,
    This is my first time on this forum and, for that matter, I started with LabVIEW only a few months back. I am developing a standalone application for servo motor control through a USB-6211. The motor control is complete, and I created a DLL that I can call from VC++. My application requires me to call this DLL (or an EXE based on this VI) repeatedly with direction, velocity, and angle parameters, and the remainder of the program depends on this executable finishing.
    The issue is that the VI does not stop executing after the motor is turned off, I suppose due to the simulation loop. I tried using the Abort Execution button but was advised to avoid doing so. My next step was to move all the DAQ Assistant blocks into a while loop and couple the motor ON/OFF control with the stop button, but this hasn't helped either. I am attaching the VI and the subVI here. I went through the board but did not come across a query involving a simulation loop. Any suggestions?
    Attachments:
    rotate.vi ‏551 KB
    Velocity_Con.vi ‏430 KB

    Hi Vivek!
    Thank you for contacting National Instruments.  From the information you have provided here, along with the attached VIs, I would agree that it is the simulation loop that is causing the problems when stopping the VI.  It looks like you are using the LabVIEW Simulation Module. 
    When using these loops there are two primary ways of stopping their execution.  The total simulation time can be controlled from the input node, the box at the upper left of the loop, or the Halt Simulation VI can be used from within the Utilities palette.  I would suggest taking a look at the detailed help for the simulation loop in order to better understand the methods of stopping this execution.  As you mentioned it is always good programming practice to avoid using the abort button because this can result in open references being left without any programmatic resolution.
    I hope this helps!  Let me know if there is anything else I can help with or clarify.  Have a great day!
    Jason W.
    National Instruments
    Applications Engineer

  • Error 200279 Continuous DAQ Assistant

    I am getting error -200279 while running my VI. I am using a DAQ Assistant with four channels set to Continuous Samples at a rate of 1 kHz with Samples to Read at 100. The error comes up at different times: it can happen right after starting the VI, or as late as 2 1/2 hours in. I added a DAQmx property node to observe the number of available samples per channel. It stays at 0, but then spikes within a few seconds until it goes over 10,000 available samples and the error pops up.
    I'm at a loss as to what might be causing the sudden spike of samples, because the program does the same calculations each iteration of the loops. Therefore, I don't believe the program is running too slowly to read the samples.
    Thanks for the help,
    Tony
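For context on what -200279 means here: samples enter the driver's buffer at the configured rate (1 kHz) whether or not the loop reads them, and each iteration removes only 100, so any stretch where iterations take longer than 100 ms lets the backlog grow until the buffer overflows. A rough Python sketch of that arithmetic (the 120 ms loop period is an illustrative assumption, not a measured value):

```python
# Rates from the post: acquire at 1 kHz, read 100 samples per iteration.
rate = 1000          # samples/s flowing into the driver's buffer
samples_per_read = 100
loop_period = 0.12   # s: assume an iteration that has slowed to 120 ms

backlog = 0.0        # samples sitting unread in the buffer
for _ in range(50):                  # 50 iterations = 6 s of run time
    backlog += rate * loop_period    # produced while this iteration ran
    backlog -= samples_per_read      # consumed by this iteration's read
print(int(backlog))  # 1000: the backlog grows by 20 samples every iteration
```

So a momentary slowdown anywhere in the loop (file I/O, UI updates) is enough to start the available-samples count climbing, even if the steady-state math looks fine.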

    I am writing to a file called datalog.csv. At first I thought it might be the cause. As I received the VI, data is recorded to datalog.csv at a period determined by an input, Sample Rate. The fastest I have written to the file is every 20 ms (50 times a second), which meant the program was opening, writing to, and closing datalog.csv 50 times a second. So I changed how the data was saved, moving the writing to a separate loop and opening and closing the file only once, but I still got the error, so I changed the program back to the way it was.
    A little background on the VI: it is for controlling a tensile strength tester, and it controls the amount of tension being applied to a powerline cable. The controller in the VI is the bottom loop; writing to datalog and the data acquisition are in the top loop.
    Thanks
    Attachments:
    cabletest_v2_input_test.vi ‏747 KB

  • Using the DAQ Assistant to input a waveform; need help building a counter to count voltage "spikes"

    Hey all! I'm pretty new to LabVIEW and even newer to this forum, but it's nice to meet you all... I hope that perhaps someone can help me with my problem.
    Allow me to begin by detailing the specifications of the problem. I am an undergraduate student with a job doing research in a MEMS (micro/nanotech) lab. The graduate student I am making this program for is working on biomedical applications; eventually, the program will be connected to a microdevice that has a tiny channel in it, cut through a wee little capacitor, which blood will run through. As red blood cells pass this capacitor, the voltage will spike; so for each voltage spike, we can (and are trying to) count one red blood cell.
    However, I am still in the early development of the program, so the specifics above are not that important. Basically, I am using a function generator to input a waveform of, say, 500 mV to the DAQ Assistant. I am trying to write a program that increments a counter every time I turn the voltage above, say, 550 mV (peak-to-peak), counting the number of simulated "spikes." I have tried quite a lot to write a working program, and although I have not gotten it to work yet, I will post a screenshot of my most recent attempt HERE:
    I thank you in advance for any helpful tips, advice, or straight up assistance you may be able to give me.  Please ask me any clarifying questions about the program I wrote or the application, or anything.  Happy Friday! 
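One way to think about the counting logic: increment only on a rising edge through the threshold, so a spike that stays above the limit for several samples is counted once, not once per sample. A hedged Python sketch of that logic (`count_spikes` is an illustrative name; in LabVIEW this maps to a comparison, a shift register holding the previous above/below state, and a counter):

```python
def count_spikes(samples, threshold):
    """Count rising-edge crossings of `threshold`.

    A spike is counted when the signal goes from at-or-below the
    threshold to above it, so a long excursion counts only once.
    """
    count = 0
    above = False
    for v in samples:
        if v > threshold and not above:
            count += 1          # rising edge: a new spike begins
        above = v > threshold   # remember state for the next sample
    return count

# Two simulated spikes above 0.55 V in a 0.5 V background:
signal = [0.50, 0.52, 0.60, 0.62, 0.51, 0.49, 0.58, 0.50]
print(count_spikes(signal, 0.55))  # 2
```

Without the previous-state memory (the shift register in LabVIEW terms), every sample above the threshold would bump the counter, which is the usual first bug in this kind of VI.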

    Hey guys, it's been a while! A lot of stuff has been happening in my life and I have had virtually no time to work on my LabVIEW project.
    I did create a LabVIEW program based on IanW's recommendation. I am unsure of exactly what is going wrong, but when I run it, only a simple "snapshot" of a waveform from the DAQ shows up in the graph. Even when I put the DAQ Assistant in a separate while loop, the same thing happens. I am including a screenshot of the project in case I am messing something entirely different up. If you happen to read this, I really appreciate your help, and thank you, Ian!
    I am also having a random issue with the filter signal VI. So that background signals in the actual experiment do not read as "spikes," I have been instructed to include a high-pass filter in the VI. However, every time I use the high-pass filter VI, it botches my signal and turns it into a bunch of noise! Neither I nor my graduate mentor (who isn't too well-versed in LabVIEW) has any idea why; we've tried using different types of filters to no avail.
    Lastly, I would like to talk to Peter about a few questions I had about LabVIEW design. In case you're still around, I will write another post later today with more detail. In the meantime, I will try to find some of the example VIs about shift registers. All who read this have a great day!
    Attachments:
    count spikes pic.png ‏29 KB

  • Control & Simulation Loop failed to compile

    Dear Forum Members,
    I have a problem with a Control & Simulation Loop program (attached) that just doesn't compile and run. I believe the problem is associated with the 'Feedback Node' at the bottom of this Control & Simulation frame, since if it is taken out the program runs OK.
    Can anyone please advise me whether this 'Feedback Node' is incorrectly used or in violation of anything? I have tried various ways to overcome the problem, e.g. initialising it at the start, but nothing works. The error message I receive just says "VI failed to compile".
    Appreciate any help with this.

    Hello bunnykins, 
    The Feedback Node really isn't a supported function in Control Design and Simulation. The behavior you're reporting and the workaround are both documented in the known issues of the module here:
    201449
    Return
    A Feedback Node on a Simulation Diagram causes the VI to fail to compile
    The Feedback Node does not make sense semantically within a Simulation Diagram, due to the fact that most ODE Solvers will execute the diagram multiple times per iteration and may need to reject steps and try again, filling the Feedback Node with bad data.
    Workaround: Use the Memory block from the Simulation Utility palette. If a delay of greater than 1 is desired you can chain multiple memory blocks in sequence.
    Reported Version: 8.5
    Resolved Version: N/A
    Added: 07/31/2010
    Applications Engineer
    National Instruments
    CLD Certified

  • Multiple readings from Express DAQ Assistant VI

    Hello All!
    First, I'm a fresh noob to LabVIEW and although I understand most of the basic concepts, I'm also new to programming in general. I'm working on a project to remotely monitor several different parameters of a system. Right now I'm working on a test program to monitor three parameters in my shop across the street. I'm using an NI USB-6008 as a DAQ device, DAQmx 8.0, and LabVIEW 8.0. All of my input signals are 4-20 mA signals coming from my three sensing devices.
    Here's my problem: I need to monitor all three signals on digital displays. I can accomplish this by using three different instances of the Express DAQ Assistant, but then I get the dreaded "50103" error unless I wire the stopped output of one assistant to the stop input of the next in a feedback loop, and even then I get the 50103 error on the first iteration of the loop. From my research I understand that this isn't the correct way to program this: I should use one DAQ Assistant with multiple channels, one per parameter. I've done this, but now how do I make each one read out on an individual digital display? I only have one output from my DAQ Assistant.

    Hi George,
    Use Split Signals from the Signal Manipulation functions palette to get your 3 individual signals on their respective indicators.
    Regards
    Dev
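Conceptually, an N-channel read is just N rows of data, and Split Signals hands each indicator its own row. A small Python sketch of the same split (plain lists standing in for the DAQ Assistant's dynamic-data output; the values are made up):

```python
# One row per channel, as a 3-channel read of 4 samples might return.
data = [
    [4.1, 4.2, 4.0, 4.1],      # channel 0
    [12.0, 12.1, 11.9, 12.0],  # channel 1
    [19.8, 20.0, 19.9, 20.1],  # channel 2
]

# "Split Signals": each display simply gets its own row.
sensor1, sensor2, sensor3 = data
latest = [row[-1] for row in data]   # newest reading per channel
print(latest)  # [4.1, 12.0, 20.1]
```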
    Attachments:
    SPLIT.vi ‏85 KB

  • Missing wire connectors on DAQ Assistant express VI's

    I'm seeing some strange behavior on one specific computer running LabVIEW: DAQ Assistant VIs have no connector points. I have nine other identical computers (used in a classroom setting), but I don't see this behavior on the others (I haven't tested them all).
    I've removed LabVIEW (and all other NI software) and re-installed it (LabVIEW 8.6.1). I also tried a more recent version of LabVIEW (2010?). Same behavior: no connector dots on DAQ Assistant VIs.
    The computers use DAQmx with USB hardware (cDAQ).
    I guess my next step is to format the hard drive and re-install everything. My theory is there's some kind of driver corruption. What do you think?

    Allie,
    Here are the steps I use to create the Express VI using the DAQ Assistant:
    * On the block diagram:
    Right click -> Measurement I/O -> MX -> DAQ Assistant
    * Drop Blue box on block diagram.
    * Configuration Screen Appears
    Acquire Signals -> Analog Input -> Voltage -> Dev1 (USB 9219) -> ai0 -> Finish
    * Voltage Input Setup Appears
    Accept default settings -> click OK
    * Voltage Input screen disappears, a blue DAQ Assistant appears on block diagram. No connection points.
    When I execute the exact same procedure on another computer I get a DAQ Assistant with connection points.
    Weird eh?
    thanks!

  • DAQ Assist Fails

    Hi,
    I have problems with my DAQ card 6024E and LabVIEW 7. My DAQ card is installed in MAX, and all tests work properly.
    I then start a new VI and put the DAQ Assistant on the block diagram. When I start the DAQ wizard to configure an analog input (or anything else), for "my physical channels" the wizard says: 'no supported devices found'. ?!? How can that be? I have a card that works properly in MAX but not in the DAQ Assistant.
    I have LabVIEW 7 and DAQ 7.01.
    Thanks
    Vedran Divkovic
    Ruhr Uni Bochum
    Departement: PY
    Germany

    The DAQ-card 6024E isn't supported in the DAQmx API in DAQ 7.01. You need to upgrade to DAQ 7.1 if you want to use the DAQ Assistant with this device.

  • DAQ Assistant in subvi not updating output to DAQ board with each call...

    Hi All,
    I am calling a simple subVI that creates a user-defined number of pulses with "Square Waveform.vi". This square wave (with the given total number of pulses) is then used as an input to a DAQ Assistant controlling an analog output signal on an NI USB-6259 DAQ board. I am using LabVIEW 8.5.
    However, each time I call this subVI from my main program, the output I measure from the DAQ board is identical to whatever I set in the first call (i.e., if I created two pulses in the first call, I get two pulses on every call, regardless of the input I feed to the subVI). The multiple calls to this subVI are made in sequential frames of a stacked sequence. I believe stacked sequences are frowned upon by good LabVIEW people, right? But putting that aside for the moment...
    The "#-of-pulses" input I give to the subVI is updated in a front panel numeric indicator and a graph of the waveform, just not in the real output I measure from the board. Why is the hardware output being asserted (with the original input value) before this new number can reach the DAQ Assistant?
    The sloppy fix is to put the square wave creation code in my main program each time I need it. This works and fixes my problem. However, I would like to use subVIs to keep things clean.
    I am not a good LabVIEW programmer, but I have used this software for a number of projects and am stumped by this. Any ideas?
    Thanks,
    John

    Hi John,
    I am running your code over here and seeing the same results. I believe the problem is that the DAQ Assistant is being called inside a loop (really a sequence structure, but nonetheless more than once). Sometimes it is difficult to troubleshoot the DAQ Assistant in cases like this: it tries to be "smart" and seems to avoid re-configuring its parameters inside the loop. This is intended to improve loop speed when customers are performing continuous operations. In this case, it is performing a finite generation, and the number of samples to generate appears to carry over from one loop iteration to the next.
    It sounds like you have discovered one workaround already: putting a DAQ Assistant in each frame of the main VI. Two other options come to mind:
    - Use the lower-level DAQmx functions inside the subVI. There you will have explicit control over when the task is created and cleared, and when parameters are set. You can find examples of how to use the DAQmx API in the Example Finder at: Help >> Find Examples... >> Hardware Input and Output >> DAQmx
    - Write a consistent number of samples to the DAQ Assistant by "zero-padding" your signal. For example, instead of writing [10, 1010], try writing [1000, 1010]. In this case, it wouldn't need to reconfigure the number of samples to generate.
    One lesson to take away here is that the DAQ Assistant is good for basic functionality, but for more advanced control over the execution and configuration of your task you should learn to use the lower-level DAQmx functions. In this case the behavior sounds like an actual bug: I'll file a bug report, since the DAQ Assistant is not checking for waveform timing changes even though your timing is set to Use Waveform Timing.
    Thank you for pointing out this odd behavior. Out of curiosity, which version of DAQmx are you using?
    -John
    John Passiak
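The zero-padding option amounts to always handing the device a buffer of the same fixed length, with the real pulses at the front, so the sample count never needs reconfiguring between calls. A small Python sketch of the idea (`zero_pad` and the values are illustrative, not part of any NI API):

```python
def zero_pad(pulses, buffer_len):
    """Extend `pulses` with zeros to a fixed length so the device
    always receives the same number of samples per write."""
    if len(pulses) > buffer_len:
        raise ValueError("pulse train longer than the fixed buffer")
    return pulses + [0.0] * (buffer_len - len(pulses))

# A 2-pulse train padded into a fixed 8-sample write buffer:
two_pulses = [5.0, 0.0, 5.0, 0.0]
print(zero_pad(two_pulses, 8))  # [5.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

The trade-off is that every write takes as long as the longest possible pulse train, since the trailing zeros are still generated on the output.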

  • Is the DAQ Assistant compatible with LabVIEW 7?

    I am equipped with LabVIEW 7 and an NI DAQPad-6015 and am trying to monitor temperature using two thermocouples. I have searched for tutorials online, but most suggest using the DAQ Assistant, which is not listed in the Input section of my Functions palette. I have downloaded the most recent NI-DAQmx driver, which is supposed to be compatible with LabVIEW 7.x. I'm not sure if this means it is compatible with LabVIEW 7.0. I still do not see the DAQ Assistant appear.
    Any help or suggestions (about DAQ assistant or how to program in LabView7 to solve my problem) is greatly appreciated!
    Thanks.

    I have installed DAQmx 8.1 and now the DAQ Assistant appears. However, when I select Analog Input >> Temperature >> Thermocouple and arrive at the physical channels screen, it says "No supported devices found."
    I also notice that when I open Measurement & Automation Explorer, it does not appear to detect my DAQPad, even though the light on it is blinking Ready. A window pops up that says "Can not find RTE directory: C:\Windows\system32\cvirte" followed by "Unable to open c:\program files\national instruments\max\bin\cvirte.rsc". After clicking OK in both windows, I check under Devices and Interfaces and see that my DAQPad-6015 is not listed.
    Before I uninstalled everything and reinstalled with DAQmx 8.1, my device was detected, but now it doesn't seem to be detected anymore.
    Also, I just realized that while it says everywhere that I have LabVIEW 7, the opening window when I start the program says LabVIEW 7 Express. I hope this isn't the cause of all the troubles.
    Message Edited by Particle42 on 05-28-2009 03:19 PM
