Control & Simulation Loop failed to compile

Dear Forum Members,
I have a problem with a Control & Simulation Loop program (attached) that simply won't compile and run. I believe the problem is associated with the 'Feedback Node' at the bottom of the Control & Simulation frame, since the program runs fine if it is removed.
Can anyone advise whether this 'Feedback Node' is being used incorrectly or is in violation of anything? I have tried various ways to overcome the problem, e.g. initialising it at the start, but nothing works. The only error message I receive says "VI failed to compile".
Appreciate any help with this.

Hello bunnykins, 
The Feedback Node isn't a supported function in Control Design and Simulation. The behavior you're reporting and the workaround are both documented in the known issues of the module here:
201449
A Feedback Node on a Simulation Diagram causes the VI to fail to compile
The Feedback Node does not make sense semantically within a Simulation Diagram, because most ODE solvers execute the diagram multiple times per iteration and may need to reject steps and try again, filling the Feedback Node with bad data.
Workaround: Use the Memory block from the Simulation Utility palette. If a delay of greater than 1 is desired you can chain multiple memory blocks in sequence.
Reported Version: 8.5
Resolved Version: N/A
Added: 07/31/2010
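To see why the workaround matters, here is a minimal Python sketch (illustrative only, not NI code) contrasting a delay that commits state on every evaluation, as a Feedback Node would, with one that commits only on accepted major steps, as the Memory block does:

```python
# Minimal sketch (not NI code): why a naive one-sample delay misbehaves
# when a variable-step ODE solver re-evaluates the diagram and rejects steps.

class FeedbackDelay:
    """Commits state on EVERY evaluation -- like a Feedback Node would."""
    def __init__(self, init=0.0):
        self.state = init
    def __call__(self, x):
        prev, self.state = self.state, x
        return prev

class MemoryBlock:
    """Commits state only on accepted (major) steps -- like the Memory block."""
    def __init__(self, init=0.0):
        self.state = init
        self._pending = init
    def __call__(self, x):
        self._pending = x
        return self.state
    def commit(self):          # called once per accepted solver step
        self.state = self._pending

def solver_step(delay, x, reject_first_try=True):
    """Evaluate the 'diagram' twice: one rejected trial, then the accepted pass."""
    if reject_first_try:
        delay(x * 0.5)         # trial evaluation that the solver throws away
    return delay(x)

fb, mem = FeedbackDelay(), MemoryBlock()
out_fb = solver_step(fb, 1.0)        # the rejected trial already polluted fb.state
out_mem = solver_step(mem, 1.0)
mem.commit()
print(out_fb, out_mem)   # 0.5 0.0 -- fb returned the bogus trial value; mem the true previous value
```

For a delay greater than 1, the same commit-on-accept idea extends to a chain of such memory elements, matching the chained Memory blocks the known issue suggests.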
Applications Engineer
National Instruments
CLD Certified

Similar Messages

  • Simple counter within Control & Simulation Loop

    Does anyone know a simple way of creating an incremental counter within the Control & Simulation Loop? It's not possible to have a For Loop within the Control & Simulation Loop, so the shift-register method is out.

    The "Simulation Parameters" function on the Utilities palette outputs a "Timestep Index" that is incremented each step. That's the simplest option. Alternatively, you could use a subVI that executes on major steps and contains a For Loop.
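    For illustration, the memory-plus-increment alternative can be sketched in Python (a hypothetical stand-in for the simulation diagram, not LabVIEW code):

```python
# Hypothetical sketch: a step counter built from a one-step memory element
# feeding an increment, mirroring the Memory-block pattern inside a sim loop
# where shift registers are unavailable.
count_prev = 0                 # memory element's initial value
for t in range(10):            # each pass = one major simulation step
    count = count_prev + 1     # increment block
    count_prev = count         # memory commits at the end of the step
print(count)  # 10
```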

  • Extract Signal Tone Information in Control & Simulation Loop

    Hi,
    I have a simple block diagram to try the Control & Simulation Loop. I added a Sine Wave signal and a Waveform Chart connected to it. They work just fine.
    I tried to add an Extract Single Tone Information VI, from Signal Processing, and connect it to the output of the Sine Wave generator, but I do not know how to create a proper time signal.
    I thought of storing the output of the Sine Wave in an array, but I was unsuccessful since I could not add any shift register to this type of loop.
    Any help or suggestions?
    Thanks
    Attachments:
    Control_n_Sim_Test.PNG ‏9 KB

    Hi Siamak,
    Thanks for your reply mate.
    However, when I built the same BD as the one you posted, it somehow didn't work. I have attached a photo of my BD here. I used gauges to monitor the programme while it runs; the gauge shows there is no signal output after the "Collector" VI (gauge no. 13). Would you please check it for me? Many thanks!
    BR
    Floyd
    Attachments:
    tone info extraction.png ‏134 KB

  • Control & simulation loop, VI failed to compile

    hi!
    I want to simulate an asynchronous generator, but I can't run my VI. It failed to compile. I have read some reasons in this forum, but I can't fix it. Can anyone help me?
    The name of the main VI is: mpdb_2.vi
    Thank you!
    Attachments:
    MPDB_2.zip ‏1951 KB

    Which version of LabVIEW are you using?
    In 2013, the error says: Tmech_Ef_1.vi 'Tmechc_Ef': subsystem has illegal term names. On analyzing the subsystem, I noticed that the Enum where you choose "EF or Tmech" has a "2 <space>" as a label.
    To fix this, just use a simple name without spaces at the beginning or end. After doing this, the VI is runnable.
    One more note: I noticed you are trying to pass parameters between MathScript and Control Design and Simulation. We have a feature called Parameter Hierarchy that lets you transfer information between both models of computation without going through wires. We also have a plug-in for MathScript that you can use for this purpose. Please see the shipping examples in the paths below and the documentation for more information:
    C:\Program Files (x86)\National Instruments\LabVIEW 2013\examples\Control and Simulation\Simulation\Mathscript Integration
    C:\Program Files (x86)\National Instruments\LabVIEW 2013\examples\Control and Simulation\Simulation\Get-Set Model Parameters
    Barp - Control and Simulation Group - LabVIEW R&D - National Instruments

  • Memory function in Control and Simulation loop - ODE solver problem

    Hello,
    I am currently using the Control & Simulation Loop to simulate the behaviour of what is essentially a spring-damper-mass system. In the process, the change in time (dt) is used to integrate an arbitrary value. I am using the built-in Memory function to store the time and calculate the time change (dt).
    The simulation is rather complex; because of the accuracy needed, not all the ODE solvers can handle it. Currently I am using the Adams-Moulton method, which works fine for the simulation itself. However, it cannot detect the change in time; the change is constantly zero. The problem went away when I used another ODE solver, but then the simulation results were rather messed up (even after I tuned the step sizes and tolerances). So I am quite confident that Adams-Moulton is one of the best-suited ODE solvers for the problem at hand.
    Is there another way to store the previous time and use it to calculate the time difference, other than the Memory function? Has anyone experienced such problems before?
    I have done a lot of error searching using probes, and I am quite sure the problem lies between the ODE solver and the Memory function. See the picture below, showing in basic terms how the change in time is calculated.
    I am rather new to LabVIEW, so if there is something else I have missed, I will be glad to hear it.
    PS: I have tuned the minimum step size and the relative and absolute tolerances for Adams-Moulton so that it simulates the behaviour of the system correctly.
    Solved!
    Go to Solution.

    Hi Willy,
    I am sorry, I can not upload the VI some of the content is confidential. I have attached a larger picture of the section were the change in volume and time is calculated, dV and dt. Also I have marked the two memory functions used. Hopefully this can help.
    My parameters:
    - ODE solver: Adams-Moulton
    - Relative tolerance: 1e-8
    - Absolute tolerance: 1e-7
    - Minimum step size: 0.0005
    - Maximum step size: 0.01
    - Initial step size: 0.01
    - Auto discrete time: on
    - Decimation: 0
    - Sync loop to timing source: off
    Attachments:
    06-05-2014 21-20-00.png ‏52 KB
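    As a rough illustration of the reported symptom (assumed solver behaviour, not a reproduction of the actual VI): a multi-step solver such as Adams-Moulton may evaluate the diagram several times around the same simulation time, so a dt computed as t minus a memorised previous t can read zero on many of those evaluations:

```python
# Hedged illustration (assumed solver behaviour): a multi-step corrector may
# evaluate the diagram repeatedly at the same simulation time, so
# dt = t - t_prev computed via a memory element reads zero on those passes.
t_prev = 0.0
observed_dt = []
eval_times = [0.0, 0.1, 0.1, 0.1, 0.2]   # corrector re-evaluates at t = 0.1
for t in eval_times:
    observed_dt.append(t - t_prev)
    t_prev = t                            # memory commits on every evaluation
print(observed_dt)  # [0.0, 0.1, 0.0, 0.0, 0.1] -- dt is often zero
```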

  • DAQ Assistant with multiple channels making the Simulation Loop slow?

    Hi, another LabVIEW newbie here.
    On a Real-Time target (NI 9132) I have a Control & Simulation Loop with a DAQ Assistant block inside, whose signals are fed into a Discrete State-Space block. The discrete state-space model has a 1-second time step. I have set the Simulation Loop parameters so that it executes every 1 second as well (see Fig. A below).
    The DAQ Assistant acquisition mode is set to "1 Sample (On Demand)".
    However, when I run the VI, the plot seems to be updated much more slowly than the 1-second rate. To confirm this, I put an "Elapsed Time" block inside the Simulation Loop. The elapsed time shows the actual time in seconds, while the simulation plot shows slower time (see Fig. B below).
    I tried to isolate the problem by removing the blocks one by one. Finally, I found out that the problem was caused by (at least) the DAQ Assistant, which acquires multichannel data from an NI 9214. When I remove some channels and leave only one or two, the VI runs at the actual rate (see Fig. C below). But when I add more channel reads, it becomes slow again.
    Here is the snippet of the block diagram (after all other blocks were removed):
    What am I doing wrong here? I'm going to use all of NI 9214 channels so how not to have similar problem like this?
    I look forward to hearing any relevant comments from the members. Thanks in advance.
    Tian

    Hi Tian,
    why do you need a Sim loop anyway?
    - When it comes to speed, you shouldn't use the DAQ Assistant. Use the basic DAQmx functions…
    - Use parallel loops for each task. Put the DAQmx functions in their own loop, running in parallel to your Sim loop…
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
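    GerdW's second point - decoupling acquisition from the simulation - can be sketched with a generic producer/consumer pattern (plain Python with a queue standing in for the DAQmx and Sim loops; the function names are hypothetical, not DAQmx calls):

```python
# Hedged sketch (queue-based producer/consumer, not DAQmx code): keep
# acquisition in its own loop and pass data to the control loop via a queue,
# so slow multi-channel reads never stall the simulation loop's timing.
import queue
import threading
import time

q = queue.Queue(maxsize=100)

def daq_loop():                      # stands in for the DAQmx read loop
    for i in range(5):
        q.put([i] * 16)              # e.g. one sample from 16 channels
        time.sleep(0.01)

def sim_loop(results):               # stands in for the Control & Simulation loop
    for _ in range(5):
        results.append(sum(q.get()))

results = []
t = threading.Thread(target=daq_loop)
t.start()
sim_loop(results)
t.join()
print(results)  # [0, 16, 32, 48, 64]
```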

  • How to use Fuzzy Logic Controller for transfer function in labview control and simulation loop?

    I am facing a problem with a fuzzy PD logic controller for a transfer function in a Control and Simulation Loop.
    Please help me in this regard.
    I have attached a snapshot of my program.
    Attachments:
    fuzzy in simulation loop.JPG ‏52 KB

    Hi Sankhadip,
    Sorry for the late response. I was looking at your code and noticed that the graph scale does not start from zero. That might be the reason why you don't see the transient in the simulation. To change the scales, simply double-click on the lower limit and set it to zero. If this does not give the expected results, can you please post the expected results so we can see the differences? Also, you might be using different solvers, which gives different results as well.
    Thanks and have a great day.
    Esmail Hamdan | Applications Engineering | National Instruments

  • How to control DC motor in a simulation loop??

    Hi,
    I am Xiaofei, a beginner with LabVIEW. We plan to use feedback control on 2 DC motors using a Simulation Loop. The question is: I am not quite sure whether the Simulation Loop is only a virtual simulation of the system, or whether it can be used to control the real system. We will use 2 encoders to detect the actual position of the motor shafts, and the decoding code works well in a While Loop, as in the attached VI. But we don't know how to make it work in a Simulation Loop: whenever we put it in a Simulation Loop and click "run", there is no response no matter what we do to the system. If you know, could you tell me whether the Simulation Loop is suitable for our application and what we can do to run the encoders in it? Thanks a million!
    Xiaofei

    Hi Xiaofei,
    Thanks for posting on the NI forums!  Can you tell me a little more about your setup?  What do you mean by Simulation loop?  Do you mean Simulation Module or Simulation Interface Toolkit?  What hardware are you using to control the motors and read the encoder feedback?  What kind of motors do you have?  Can you read the encoder inputs in LabVIEW when you turn the motors by hand?
    Thanks,
    Maclean G.
    National Instruments
    Applications Engineer

  • Synchronize Control and Simulation loops

    When simulating control systems with LV Control and Simulation loops, I often have multiple loops running at different rates. For example, I have a PWM loop running at 20 kHz, a data acquisition loop running at 100 kHz, and a control loop running at 10 kHz. How can I synchronize all of these loops so that they stay on the same time base? Obviously the master time base will need to be at least as fast as the fastest simulation loop.
    I've tried synchronizing all loops to the 1 kHz clock (I'm running on Windows), but each loop runs one period per clock tick (e.g. my 20 kHz loop counts up 50us per clock tick, my 100 kHz loop counts up 10us per clock tick, etc). I need all of the loops to be synchronized to one master time base so the simulation time is identical in each loop, but each loop will be executed at a different rate.
    Any thoughts?
    Solved!
    Go to Solution.

    Hello,
    A quick suggestion: why can't you run all three systems in a single simulation loop, but with different sample rates for the blocks of each system?
    Is your system entirely digital, or a mixture of continuous and digital? It may simplify things if you can convert everything to discrete time.
    Hope this helps,
    Andy Clegg
    Consultant Control Engineer
    www-isc-ltd.com
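    Andy's single-loop suggestion can be illustrated with a small Python sketch (rates taken from the question; the decimation scheme is an assumption for illustration, not Simulation Module code):

```python
# Illustrative sketch: one master time base at the fastest rate (100 kHz);
# each subsystem runs only on ticks that are multiples of its decimation,
# so every block shares the identical simulation time t.
MASTER_HZ = 100_000
rates = {"daq": 100_000, "pwm": 20_000, "control": 10_000}
decim = {name: MASTER_HZ // hz for name, hz in rates.items()}

runs = {name: 0 for name in rates}
for tick in range(100):                 # 100 master ticks = 1 ms of sim time
    t = tick / MASTER_HZ                # identical time base for every block
    for name, d in decim.items():
        if tick % d == 0:
            runs[name] += 1             # subsystem executes at its own rate

print(runs)  # {'daq': 100, 'pwm': 20, 'control': 10}
```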

  • Resetting Integrator in Control and Simulation Loop

    Hello,
    I am trying to run a real-time simulation in LabVIEW 14. I have prepared front-panel controls and data flow such that I can reset the simulation to some preset initialization values upon clicking a button. However, I cannot figure out how to reset the integrators in the Control and Simulation Loop. Even after reinitialization of all the values, the integrators overwrite the initialization values with whatever they were holding. Please help me find a way out.
    Thanks

    Please share what code you have so that we can see what you've tried and understand exactly what you're trying to do.

  • How to reset indicators in a control design and simulation loop before each run?

    Hi,
    I am new to LabVIEW. I have tried using a property node to reset indicators before each run, but since it is in a simulation loop, it gets reset many times. I am only interested in resetting before each run.
    Any hint will be of great help.
    Thank you!
    Regards,
    Divya

    Hi Divya, 
    This KnowledgeBase article describes 2 methods for resetting your Front Panel programmatically:
    http://digital.ni.com/public.nsf/allkb/08E7DDAE66A7D02C86256DDA00630E75
    However, if you have tried these methods and are still not getting the behaviour you want, it would be easier to help if you could post your code (or even a screenshot of the section of code where you're trying to reset your values).

  • ERROR:Simulator:861 - Failed to link the design

    Hi,
    I'm a very new student user of the Xilinx 14.2 design tools; I have only used them for 2 weeks.
    This week I need to run "Simulate Behavioral Model", but I can't.
    I just see the messages below, and I don't know what the matter is.
    Started : "Simulate Behavioral Model".
    Determining files marked for global include in the design...
    Running fuse...
    Command Line: fuse -intstyle ise -incremental -lib unisims_ver -lib unimacro_ver -lib xilinxcorelib_ver -o {E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1_TB_isim_beh.exe} -prj {E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1_TB_beh.prj} work.full_adder1_TB work.glbl {}
    Running: C:\Xilinx\14.2\ISE_DS\ISE\bin\nt\unwrapped\fuse.exe -intstyle ise -incremental -lib unisims_ver -lib unimacro_ver -lib xilinxcorelib_ver -o E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1_TB_isim_beh.exe -prj E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1_TB_beh.prj work.full_adder1_TB work.glbl
    ISim P.28xd (signature 0x1048c146)
    Number of CPUs detected in this system: 2
    Turning on mult-threading, number of parallel sub-compilation jobs: 4
    Determining compilation order of HDL files
    Analyzing Verilog file "E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/half_adder1.v" into library work
    Analyzing Verilog file "E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1.v" into library work
    Analyzing Verilog file "E:/2012_fall_ISE Project/3rd week/1bit full adder/full_adder1/full_adder1_TB.v" into library work
    Analyzing Verilog file "C:/Xilinx/14.2/ISE_DS/ISE//verilog/src/glbl.v" into library work
    Starting static elaboration
    Completed static elaboration
    Fuse Memory Usage: 104956 KB
    Fuse CPU Usage: 327 ms
    Compiling module half_adder
    Compiling module full_adder
    Compiling module full_adder1_TB
    Compiling module glbl
    Time Resolution for simulation is 1ps.
    Waiting for 3 sub-compilation(s) to finish...
    ERROR:Simulator:861 - Failed to link the design
    Process "Simulate Behavioral Model" failed
    I've followed the procedure in my reference, but it's not working.
    I rethought it and redid it, but I don't know why it fails.
    On my friend's computer running Windows 7, the same source works well, but not on mine.
    By any chance, does Xilinx 14.2 not support Windows 8 yet?
    My OS is Windows 8 32-bit Enterprise K, RTM version.
    Is this an OS compatibility problem, or just a mistake in my procedure?
    I want to know if there is anyone whose OS is Windows 8 and for whom ISim works well!
    I can't find anything about running Xilinx on Windows 8 on my country's sites.
    Please tell me :)
    P.S. Sorry for my poor English; I'm Korean, and if I could use Korean I would say what I think more concretely.
    Anyway, thanks for reading, and please reply if you know!

    This may be the same fix that ningunos2010 provided, but I got this idea from anthbs's post. I looked up which .exe files the ISim tool used, and then simply copied the nt64 files over to the nt folder so that it would run the 64-bit ISim that anthbs claimed would work instead of the 32-bit one. At first it didn't report any errors and said the simulation ran successfully, but it wouldn't open the actual ISim GUI. To fix this, I went to the isimgui.exe file and unticked "Run as administrator". I'm not sure if that's set by default or if I had changed it at some point while trying to fix the issue.
    Anyway these are the files I copied over from nt64 to nt folder:
    fuse.exe
    isimgui.exe
    vhpcomp.exe
    vlogcomp.exe

  • Control Simulation and Design with DAQmx

    Hi all, thanks in advance for any helpful direction you can provide.
    I was recently set up with the LabVIEW 2013 Developer Suite and the Control Design and Simulation Module, and was hoping to get a little nudge in the right direction.
    I have used LabVIEW and DAQmx for regular external measurements before, but never integrated with the Control Design and Simulation tools. I am trying to use them to model a simple system (to start) in conjunction with the DAQmx setup I have, and am somewhat at a loss.
    For example, I want to use Control and Simulation to simulate a simple integrator (1/s). The input to the integrator is an external analog voltage (from separate electronics) coming in through DAQmx, and the output of the integrator is also an external analog voltage going back to the external electronics. In essence, a very simple control loop.
    I have been digging through tutorials and searching for examples, and there have been many excellent ones. However, I have yet to find an example where the Control & Simulation Loop talks to the outside world through the DAQmx interface. I do not believe the RT module or an RT target is necessary for timing, as the fastest we will run the loop is around 10 Hz. I have found some small examples discussing how the timing can be controlled by an RT target or even by DAQmx, but I am still struggling with how to pass data in and out of the loop and how to set the timing parameters. How do you get it to simply run endlessly, for example?
    Again, thanks for any information or direction you can provide.
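    The loop described above can be sketched in Python (the placeholders read_ai/write_ao are hypothetical stand-ins for the DAQmx calls; this shows the shape of the loop, not working DAQmx code):

```python
# Hypothetical sketch (no real DAQmx calls): a fixed-rate loop that reads an
# analog input, integrates it (1/s discretised with forward Euler), and writes
# the result back out.
DT = 0.1                      # 10 Hz loop, as in the post

def read_ai():                # placeholder for a DAQmx analog read
    return 1.0                # pretend the input is a constant 1 V

def write_ao(v):              # placeholder for a DAQmx analog write
    pass

y = 0.0                       # integrator state
for _ in range(10):           # 1 s of run time
    u = read_ai()
    y += u * DT               # forward-Euler integration of 1/s
    write_ao(y)
print(round(y, 6))  # 1.0 -- integrating 1 V for 1 s
```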

    Hey Cabala,
    Based on your description this would be a great starting example. 
    https://decibel.ni.com/content/docs/DOC-11521 
    Depending on your hardware you may or may not be able to use hardware-timed single point for acquisition, but you could set it up for continuous acquisition as well and use the same structure of 1 read, at 1 rate, per cycle of the loop.
    You can replace the proportional gain function with the CD & Sim Loop and place an Integrator function within it.
    Sorry, it appears one of the dependencies for that example is missing, so you can't run it immediately.
    Please let us know if you have any other questions. 
    Kyle Hartley
    RIO Product Support Engineer
    National Instruments

  • Stopping the VI when running an infinite simulation loop

    Hi,
    This is my first time on this forum and, for that matter, I started with LabVIEW only a few months back. I am developing a standalone application for servo motor control through a USB-6211. The motor control part is complete, and I created a DLL which I can call from VC++. My application requires me to call this DLL, or an EXE based on this VI, repeatedly with direction, velocity and angle parameters, and the remainder of the program depends on this executable finishing.
    The issue is that the VI does not stop execution after the motor is turned off, I suppose due to the simulation loop. I tried using the abort execution button but was advised to avoid doing so. My next step was to move all the DAQ Assistant blocks into a While Loop and couple the motor ON/OFF control with the stop button, but this hasn't helped either. I am attaching the VI and the subVI here. I went through the board but did not come across a query involving a simulation loop. Any suggestions?
    Attachments:
    rotate.vi ‏551 KB
    Velocity_Con.vi ‏430 KB

    Hi Vivek!
    Thank you for contacting National Instruments.  From the information you have provided here, along with the attached VIs, I would agree that it is the simulation loop that is causing the problems when stopping the VI.  It looks like you are using the LabVIEW Simulation Module. 
    When using these loops there are two primary ways of stopping their execution.  The total simulation time can be controlled from the input node, the box at the upper left of the loop, or the Halt Simulation VI can be used from within the Utilities palette.  I would suggest taking a look at the detailed help for the simulation loop in order to better understand the methods of stopping this execution.  As you mentioned it is always good programming practice to avoid using the abort button because this can result in open references being left without any programmatic resolution.
    I hope this helps!  Let me know if there is anything else I can help with or clarify.  Have a great day!
    Jason W.
    National Instruments
    Applications Engineer

  • Toolkit for feedback system? Motion; PID; Control/Simulation???

    Hi, I have to develop/program an organ-bath system - a feedback system mimicking real sinusoidal breathing oscillations (shown in the attached images). I have LabVIEW 8.5, NI-Motion 7.6, a linear motor (M-235 DC-Mike actuator), an MID-7654 Servo Power Motor Drive and a pressure transducer. I believe I will need PID control and am aware of the PID Control Toolkit as well as the Control Design and Simulation Module for LabVIEW. However, is it possible to control the system using the NI-Motion software I have at the moment? If not, do I have to purchase both the PID Toolkit and the Control Design and Simulation Module, or just one? Thanks in advance...
    Attachments:
    feedback design1.JPG ‏25 KB
    feedback system1.JPG ‏42 KB

    Dear Garry,
    Do you have a motion controller to interface the MID-7654 to your computer and LabVIEW? This would be the PCI-734x or PCI-735x. If you do, I believe that you could implement your application with LabVIEW and NI-Motion. You could do so by using the analog feedback feature for the control loops for each axis. Then, you could specify the optimal sinusoidal moves/pressure patterns that mimic real breathing patterns. The analog feedback from your pressure transducer will be used during the move(s) to maintain the pressure that you want.
    Please see Chapter 13 here for more details:
    NI-Motion User Manual
    http://www.ni.com/pdf/manuals/371242c.pdf
    Here is also a good discussion forum post on analog feedback:
    Can i use NI-7358 to implement a hybrid position force/torque control system?
    http://forums.ni.com/ni/board/message?board.id=240&thread.id=2976&view=by_date_ascending&page=1
    I believe that the above option would work for you, and you would not have to use the Control Design and Simulation Module or the PID Toolkit. Please let me know if you have any additional questions. I haven't actually set up a system with analog feedback from a pressure transducer before, but I believe that the above would be a very viable option.
    Best Regards,
    ~Nate
