Simulations in FPGA

Hi all,
I started learning LabVIEW FPGA very recently and I have a query regarding the simulations that can be done. I don't have any NI FPGA hardware with me. I am using only the development computer for programming in LabVIEW FPGA.
 I am using "Execute VI on»Development Computer with Simulated I/O"option when I add only FPGA in my project to simulate some simple codes. But when I try to add Real Time system and then add FPGA, I/O ports never get listed under FPGA. I am attaching a snapshot of Course Manual of LabVIEW FPGA and project file I tried to replicate from the same. You can see that there are no I/O ports listed in my project file. 
How to get these ports appear in my project file without posessing any Hardware? ( I have replicated the steps given to simulate CompactRIO or Single-Board RIO in http://digital.ni.com/public.nsf/allkb/F466AD83D24F041D8625714900709583 , but when I added a program and compiled it, it gave error after it couldn't find the real time target). 
If I am able to add the ports, is it possible for me to get the simulated waveforms from those ports like I can view waveforms using ModelSim? Is it not possible to create a complete project and check the input and output waveforms in simulator on the development computer itself without having any of the hardware?
Kindly help me out. Thanks in advance.
Sharath

Well, what you're asking for is a very common request, and I definitely agree that being able to see waveforms from within LabVIEW would be extremely useful.
It is really straightforward to use ModelSim, as you mentioned, or even ISim (which ships with the LabVIEW FPGA Module Compilation Tools), to get simulated waveforms.
ISim can get a little difficult if you need to simulate the host interface, since that requires VHDL knowledge to write a testbench. Fortunately, there is an easy way to avoid writing VHDL just to provide the diagram with data. My usual method is shown here:
Using a conditional disable structure like this lets you pre-load a memory block with I/O data. You can even use an initialization VI to load in TDMS or CSV data if you want (the default case just has my I/O item).
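(For reference, the preprocessing behind such an initialization VI is nothing exotic. Here is a minimal Python sketch of the idea, reading one column of recorded I/O samples from a CSV file so they could be replayed as stimulus; the file name and column layout are hypothetical.)

import csv

def load_stimulus(path):
    """Read one column of recorded I/O samples from a CSV file.

    Returns a list of integers that could be used to pre-load a
    simulated memory block with stimulus data.
    """
    samples = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:                      # skip blank lines
                continue
            samples.append(int(row[0]))      # first column holds the I/O value
    return samples

if __name__ == "__main__":
    data = load_stimulus("io_trace.csv")     # hypothetical file name
    print(f"Loaded {len(data)} samples, first 5: {data[:5]}")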
If you do have access to ModelSIM PE, Cycle Accurate Co-Simulation is a really powerful tool, and I highly recommend it. It's probably less tedious than you think once you get used to it.
Cycle-accurate simulation is currently only supported on some targets, and unfortunately not the 9074 that you appear to be using. For simulation, I would recommend adding an R Series (784x or 785x) or FlexRIO device to the project to simulate small pieces of code.
Cheers!
TJ G

Similar Messages

  • FPGA simulation with simulated I/O on development computer - synchronization to clock

    I would like to simulate an FPGA design using the "Simulated IO on development computer" concept.
    But for more complex designs, such as ADC inputs or, in my case, Camera Link, it is necessary to synchronize the stimulus and the response to the system clock.
    Is that possible using "Simulated IO on development computer"?

    Hi etgohomeok,
    I believe I understand your explanation, but please correct me if at any point I misread some of the information above.
    Debugging LabVIEW FPGA code on the host computer definitely has a lot of benefits, such as the traditional LabVIEW debugging tools, visualization tools, and LabVIEW libraries that cannot be used in LabVIEW FPGA VIs. Most of the time, the goal of running the code in simulation mode is to verify the logic in the VI without the pain of long compile times.
    With that said, execution on a development computer will definitely not run at the rates of your FPGA VI; this is described in the Understanding Simulated Time on the Host section of the Debugging FPGA VIs Using Simulation Mode (FPGA Module) help document.
    When executing the code on the development machine, LabVIEW delays as long as necessary to run the logic in the VI rather than meeting the strict timing requirements that compiling the FPGA VI would enforce.
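    (As a rough illustration of simulated time, and not of how LabVIEW implements it: the simulation keeps its own tick counter that jumps forward by however many ticks the logic needs, regardless of how long the host actually takes. The tick count per iteration below is an arbitrary assumption.)

    import time

    TICKS_PER_ITERATION = 25          # assumed cost of one loop iteration, in FPGA ticks

    def run_simulated(iterations):
        """Advance a simulated tick counter; host wall-clock time is unrelated."""
        sim_ticks = 0
        start = time.perf_counter()
        for _ in range(iterations):
            # ... the diagram logic would be evaluated here ...
            sim_ticks += TICKS_PER_ITERATION      # simulated time jumps forward
        elapsed = time.perf_counter() - start
        print(f"simulated {sim_ticks} ticks in {elapsed:.6f} s of host time")

    run_simulated(1000)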
    I hope this information helps!
    Sam B.
    Applications Engineer
    National Instruments

  • FPGA Simulation ISim error

    I'm looking to start simulating my FPGA Target in ISim and have run into an interesting error.
    I have built and exported the simulation file (it runs in ISim, although without a testbench it doesn't show much).
    When I try to write a LabVIEW VI using the Simulation VIs, I get an error as shown below.
    Is this functionality not supported for ISim?  THIS webpage seems to suggest that editing the VHD files is required for interfacing in this way, which is really not very comfortable.
    Can someone comment on this, because information on this topic is not exactly easy to find on the NI website.  I have found that Xilinx has abandoned ModelSim, so hopefully this functionality is (or will be) supported again.
    Shane
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

    I know ISim can simulate, but my problem is the difference between the two available ways of simulating.
    One is to write a test routine in VHDL, which, having written the FPGA code in LabVIEW, is really not comfortable at all. (This is possible with ISim.)
    The better of the two methods is to actually use LabVIEW to write the test code, which then interfaces with the simulated hardware, allowing us to implement software for testing much more rapidly and also to utilise actual project code.
    At the moment this second version is only available using ModelSim SE (which apparently no longer exists; it has changed names), which costs in the region of 20k Euros.  Kind of makes LabVIEW look cheap.
    So my request was whether this second, far more comfortable method would be possible to implement in ISim, or whether it already IS implemented but I have performed some wrong action in trying to get it to work.
    Best regards
    Shane.
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

  • DMA RT to FPGA guaranteed order?

    I have a question regarding the sending of data via FIFO to an FPGA card via DMA.  I would assume that if I have several locations in my RT code sending data via DMA FIFO, it is still guaranteed that any given DMA transfer (let's say 12 data values) is delivered atomically.  That is to say, each DMA node's data is sent as a contiguous block.
    Take the following example.  I have two DMA FIFO nodes in parallel.  I don't know which is going to be executed first, and they will most of the time be vying for bandwidth.  Both nodes send over the SAME DMA FIFO.  Does the data arrive interleaved, or is each sent block guaranteed to be contiguous on the receiving end?  Do I end up with
    Data0 (Faster node)
    Data1 (Faster node)
    Data2 (Faster node)
    Data3 (Faster node)
    Data11 (Faster node)
    Data0 (Slower node)
    Data1 (Slower node)
    Data11 (Slower node)
    or do the individual items get interleaved?
    I'm kind of assuming that they remain in a contiguous block, which I'm also hoping for, because I want to abuse the DMA FIFO as a built-in timing source for a specific functionality I require on my FPGA board.  I can then use the RT-to-FPGA DMA buffer to queue up my commands and still have them execute with perfect determinism (until the end of the data in my single DMA transfer, of course).
    Shane.
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

    Woah, new avatar. Confusing! 
    I am going to preface this by saying that I am making assumptions, and in no way is this a definitive answer. In general, I have always had to do FPGA to RT streaming through a FIFO and not the other way around.
    When writing to the FPGA from the RT, does the FIFO.Write method accept an array of data as the input? I'm assuming it does. If so, I'd then make the assumption that the node is blocking (like most everything else in LabVIEW), in which case the data would all be queued up contiguously. Interleaving would imply that two parallel writes would have to know about each other, and the program would wait for both writes to execute so that it could interleave the data. That doesn't seem possible (or if it were, it would be an awful design decision, because what if one of the writes never executed?).
    Of course, this is all assuming that I am understanding what you are asking.
    You're probably safe assuming the blocks are contiguous. Can you test this by simulating the FPGA? If you're really worried about interleaving, could you just wrap the FIFO.Write method up in a subVI, so that you are 100% sure of the blocking?
    Edit: Had a thought after I posted: how can you guarantee the order things are written to the FIFO? For example, what if the "slow" write actually executes first? Then your commands, while contiguous, will be "slower node" 1-12 then "faster node" 1-12. It seems to me you would have to serialize the two to ensure anything.
    Sorry if I'm not fully understanding your question.
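    (To make the contiguity-versus-ordering point concrete, here is a rough Python analogy, not LabVIEW's DMA implementation: two producers write 12-element blocks into one shared queue, and a lock plays the role of the wrapping subVI. Each block stays contiguous, but which block lands first still depends on scheduling.)

    import threading
    import queue

    fifo = queue.Queue()            # stand-in for the shared RT-to-FPGA DMA FIFO
    write_lock = threading.Lock()   # stand-in for wrapping FIFO.Write in one subVI

    def write_block(label, n=12):
        """Write n values as one contiguous block; the lock prevents interleaving."""
        with write_lock:
            for i in range(n):
                fifo.put(f"{label} Data{i}")

    fast = threading.Thread(target=write_block, args=("Faster node",))
    slow = threading.Thread(target=write_block, args=("Slower node",))
    fast.start()
    slow.start()
    fast.join()
    slow.join()

    # Each block comes out contiguous, but whether "Faster node" or "Slower node"
    # comes first depends on which thread acquired the lock -- the ordering caveat above.
    while not fifo.empty():
        print(fifo.get())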
    CLA, LabVIEW Versions 2010-2013

  • Problem creating the simulation DLL using SIT 2.0.3, MATLAB 7.1, LabVIEW 7.1

    I am currently using SIT 2.0.3; using this toolkit I am trying to create the simulation DLL for LabVIEW, but I am getting the following error in the MATLAB command window:
    Error: File: C:\SimulationInterfaceToolkit\ModelInterface\basic.tlc Line: 117 Column: 7
    Undefined identifier ReleaseVersion
    Error: File: C:\SimulationInterfaceToolkit\ModelInterface\basic.tlc Line: 117 Column: 25
    The == and != operators can only be used to compare values of the same type
    Error: File: C:\SimulationInterfaceToolkit\ModelInterface\basic.tlc Line: 260 Column: 26
    Undefined identifier ReleaseVersion
    Error: File: C:\SimulationInterfaceToolkit\ModelInterface\basic.tlc Line: 260 Column: 44
    The == and != operators can only be used to compare values of the same type
    Is this a version problem?
    I am using:
    Simulation Interface Toolkit 2.0.3
    MATLAB 7.1.0.246 (R14) Service Pack
    Real-Time Workshop V6.3
    LabVIEW 7.1
    Microsoft Visual C++ 6.0
    Can anybody help me solve this issue?

    Hi Jayasheela,
    Here are the Read Me files for different versions of Simulation Interface Toolkit. Usually, the toolkit will only work with the versions of software that are explicitly stated.
    Simulation Interface Toolkit 3.0 Readme
    The MathWorks, Inc. MATLAB® / Simulink® application software release 13.x or 14.0, 14.1, 14.2, 14.3
    The MathWorks, Inc. Real-Time Workshop® release 13.x or 14.0, 14.1, 14.2, 14.3
    Microsoft Visual C++ 6.0. You can use Microsoft Visual C++ .NET 2003 only if you installed the LabVIEW 7.1.1 maintenance release.
    National Instruments LabVIEW Real-Time Module 7.1.x for ETS Targets.
    (Optional) National Instruments LabVIEW FPGA Module 1.1.x, for customized FPGA VIs used in real-time simulations involving FPGA devices
    Simulation Interface Toolkit 3.0.1 Readme
    The Simulation Interface Toolkit (SIT) 3.0.1 updates SIT 3.0 to support LabVIEW 8.0, the LabVIEW 8.0 Real-Time Module, and the LabVIEW 8.0 FPGA Module. You also can use SIT 3.0.1 with LabVIEW 7.1.x. However, you cannot install SIT 3.0.1 for both LabVIEW 8.0 and LabVIEW 7.1.x.
    Simulation Interface Toolkit 3.0.2 Readme
    The Simulation Interface Toolkit (SIT) 3.0.2 updates SIT 3.0.1 to support LabVIEW 8.2, the LabVIEW 8.2 Real-Time Module, and the LabVIEW 8.2 FPGA Module. You also can use SIT 3.0.2 with LabVIEW 7.1.x or with LabVIEW 8.0.x. However, you cannot install SIT 3.0.2 for more than one version of LabVIEW on the same computer.
    This maintenance release also adds support for the following products:
    The MathWorks, Inc. MATLAB® / Simulink® application software R2006a.
    The MathWorks, Inc. Real-Time Workshop® R2006a.
    If you install SIT 3.0.2 for LabVIEW 7.1, you can use Microsoft Visual C++ 6.0 to convert models to model DLLs. If you install SIT 3.0.2 for LabVIEW 7.1.1, 8.0, 8.0.1, or 8.2, you can use either Visual C++ 6.0 or Visual C++ .NET 2003.
    Simulation Interface Toolkit 4.0 Readme
    The MathWorks, Inc. MATLAB® / Simulink® application software release 13.x, 14.x, or 2006a, 2006b, 2007a
    The MathWorks, Inc. Real-Time Workshop® release 13.x, 14.x or 2006a, 2006b, 2007a
    Microsoft Visual C++ 6.0 or .NET 2003
    You can also take a look at this KnowledgeBase article, which condenses some of the previous information.
    MATLAB®, Real-Time Workshop®, and Simulink® are the registered trademarks of The MathWorks, Inc. Further, other product and company names mentioned herein are trademarks, registered trademarks, or trade names of their respective companies.
    Amanda Howard
    Americas Services and Support Recruiting Manager
    National Instruments

  • How exactly does the memory in LabVIEW FPGA work?

    I am using a PXI-7853 and for the past few days I have been playing around with using memory blocks in the FPGA.
    Now, I am relatively new to LabVIEW FPGA programming, and hence I would be grateful if someone could provide me with clarifications on the following:
    a) Since I am working on the development host computer, when I initialize the memory using a memory initialization VI, how exactly does this happen in the back end? What I mean to ask is: when I change the memory values on the development computer and then compile the FPGA VI onto the board, is the memory information ported into the FPGA? If so, in what form are the values initially saved on the development computer?
    b) Is it possible for me to use the initialization VI method to change the values in memory while the FPGA VI is running? If not, would it make a difference if I stop the VI and then change the values using the initialization method? Would that actually be reflected on the FPGA, or do I have to re-compile the FPGA VI every time I change the memory values on the development computer using the 'initialization VI' method (available as an option when right-clicking the memory block in the Project Explorer window)?
    I tried testing this with FPGA VI simulation and found that when I try to change the memory values by running the initialization VI, a pop-up appears saying that this is not possible because the FPGA VI is still in use.
    Any light on this or guidance with links would be highly appreciated.
    Cheers 
    sbkr

    sbkr wrote:
    a) Since I am working on the development host computer, when I initialize the memory using a memory initialization VI, how exactly does this happen in the back end? What I mean to ask is: when I change the memory values on the development computer and then compile the FPGA VI onto the board, is the memory information ported into the FPGA? If so, in what form are the values initially saved on the development computer?
    When you compile the FPGA VI, it will include the values used to initialize the memory.
    Are you asking what happens if you run the FPGA VI on your development computer, and your FPGA VI writes to the memory block, will the new values be included when you compile the FPGA VI? No, those values will be lost. The values that are included in the bitfile are the values that you used to initialize the memory block, as defined in the memory properties dialog box. The initial values are saved in the LabVIEW project file along with the memory block definition.
    sbkr wrote:
    b) Is it possible for me to use the initialization VI method to change the values in memory while the FPGA VI is running? If not, would it make a difference if I stop the VI and then change the values using the initialization method? Would that actually be reflected on the FPGA, or do I have to re-compile the FPGA VI every time I change the memory values on the development computer using the 'initialization VI' method (available as an option when right-clicking the memory block in the Project Explorer window)?
    You need to recompile the FPGA in order to use new initialization values, because those values are part of the FPGA bitfile.
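    (A toy Python model of that distinction, purely as an analogy and not NI's implementation: the initialization values are captured when the "bitfile" is built, run-time writes live only in the running instance, and reloading the bitfile restores the baked-in values.)

    class SimulatedMemoryBlock:
        """Toy model of an FPGA memory block's initialization behavior."""

        def __init__(self, init_values):
            self._compiled_init = tuple(init_values)   # baked in at compile time
            self._data = list(init_values)             # working copy at run time

        def write(self, address, value):
            self._data[address] = value                # run-time write

        def read(self, address):
            return self._data[address]

        def reload_bitfile(self):
            self._data = list(self._compiled_init)     # run-time writes are lost

    mem = SimulatedMemoryBlock([0, 0, 0, 0])
    mem.write(1, 42)
    print(mem.read(1))      # 42 while the VI is running
    mem.reload_bitfile()
    print(mem.read(1))      # 0 again -- only recompiling changes the baked-in values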

  • Inverted pendulum

    Thank you in advance for any help you can give me...
    The university where I am studying has 2 QNET modules (inverted pendulums) that I want to run on NI ELVIS I with ELVIS 3.0; I have spent more than fifteen days trying to get them to work but have not succeeded. The university has LabVIEW 2009 installed, with the following modules:
                   Application Builder
                   Control Design and Simulation Module
                   FPGA
                   Mobile Module
                   Real-Time Module
                   Statechart Module
    and the following toolkits:
                   Adaptive Filter
                   Advanced Signal Processing
                   Digital Filter Design
                   PID and Fuzzy Logic
    and also:
                   Report Generation Toolkit for Microsoft Office
                   System Identification
    I have a CD with several *.vi files and several PDF manuals and guides; when I try to run, for example, "QNET_ROTPEN_Lab_03_Gantry_Control", it asks me for the following files:
       load
       896       CD Convert SS to SIM MMO Space.vi
       920       CD_Create Symbolic State Space Model.vi
       1067     cd_Minimal State-Space Realization.vi
       1193     CD ManageSnZgrid.vi
       1195     cd_Pole-Zero Map (State-Space).vi
       1197     cd_Step Response (State-Space).vi
       1287     cd_Stability (State-Space).vi
       1352     cd_Parametric Time Response (SS).vi
       1353     CD Parametric Time Response.vi
       1394     cd_Pole-Zero Map (State-Space) [Obsolete].vi
       1396     cd_Pole-Zero Map (State-Space) [Obsolete].vi
    When the application loads, an error message appears:
                 While loading QNET_ROOTPEN_Lab_03_Gantry_Control.vi, LabVIEW reported 1 warning:
                 - Constant changed to hidden control (1 warning)
    I look forward to a reply...
    Thank you very much

    Hello raurelio,
    Thank you for using the National Instruments forums. The Quanser QNET trainer boards are compatible with NI ELVIS and NI ELVIS II. To communicate with NI ELVIS and LabVIEW, you need to have the NI DAQmx and NI ELVIS drivers installed, available at the following link:
      http://digital.ni.com/softlib.nsf/webcategories/85256410006C055586256BBB002C128D?opendocument&node=1...
      If you are using NI ELVIS II, you will need the NI ELVISmx driver, available at the following link:
      http://joule.ni.com/nidu/cds/view/p/id/1133/lang/es
      These drivers are sufficient to communicate with LabVIEW and NI ELVIS.
     Regards,
    Carlos Pazos
    Applications Engineer
    National Instruments Mexico

  • Parallel FPGA in LabVIEW/Multisim co-simulation

    Hi guys, is it possible to put 3 or 4 FPGA modules in a LabVIEW model and then co-simulate with Multisim running 1 plant model? I want to simulate a solar energy converter using multiple parallel FPGA cores (this part is in LabVIEW) driving multiple inverter bridges interacting with the grid (this part is the plant model in Multisim).

    Hey hacmachdien,
    This should be possible. If you have the Control Design and Simulation toolkit, you can use the Control and Simulation Loop to co-simulate with Multisim after installing the co-simulation plugin that comes with Multisim. See the white paper here for more information.
    If you encapsulate your LabVIEW FPGA logic inside of a subVI, you should be able to use those subVIs in the Control and Simulation Loop. There are a few caveats with this. First, the rate for the LabVIEW FPGA subVIs will need to be configured according to how they will run on the real hardware. For example, if the subVI is inside of a Single-Cycle Timed Loop that is configured for a 40 MHz clock, you will need to configure the period for that subVI to be 25 ns. You can configure the period by right-clicking on a subVI within a Control and Simulation Loop and going to SubVI Node Setup. From this menu, you can set the execution type to be discrete and then configure the discrete timing. Another caveat is that certain things, such as I/O, will not be supported. You can usually work around this by leaving your I/O at the top level of your application and just passing the values into the subVI through controls and indicators. Let us know if you have more questions!
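    (The 25 ns figure is just the clock period of the single-cycle Timed Loop. A quick sanity check of that arithmetic:)

    # Discrete period for a subVI that models logic inside a single-cycle Timed Loop:
    # one iteration per clock cycle, so the period is simply 1 / f_clock.
    clock_hz = 40e6                      # 40 MHz SCTL clock from the example above
    period_s = 1.0 / clock_hz
    print(f"{period_s * 1e9:.0f} ns")    # -> 25 ns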
    Pat P.
    Software Engineer
    National Instruments

  • Tick Count Express VI outputs '0' on FPGA target running with Simulated I/O

    When I set my target to "Execute VI on Development Computer with Simulated I/O", the Tick Count VIs all output '0' every time they execute. How can I get them to output a progressive count (in the "ticks" instance) or a proper timestamp (in the "ms" instance)?

    I used LabVIEW 2013 SP1 and I was unable to reproduce this issue on my end. The screenshot below shows my result.
    As shown on the Front Panel, the output from the Tick Count Express VI was not 0 on every iteration of the loop.
    To make sure we are comparing the same code, can you reproduce this issue with the Tick Count shipping example?  You can find this shipping example in the Toolkits and Modules>>FPGA>>CompactRIO>>Fundamentals>>Clocks and Timing>>Tick Count section in the LabVIEW Example Finder.  
    Regards,
    Tunde S.
    Applications Engineer
    National Instruments

  • FlexRIO FPGA DMA and DRAM simulation

    I have a pair of FlexRIO 7966R boards where I am trying to perform DRAM-to-DMA transfers.  Each FPGA uses both banks of DRAM: one bank for capturing Camera Link frames, the other bank for capturing sub-frames from within each frame (and performing some processing on the sub-frame data).
    Each DRAM bank is written to from its own target-scoped FIFO.
    Each DRAM bank is read out into its own target-to-host DMA FIFO.
    When only one side or the other is operating (capturing sub-frames by themselves or full frames by themselves), everything flows nicely.  But when I capture both at the same time, there appears to be some sort of contention in the DRAM (I suspect from the DMA engine).  Since I am simulating all of this, I would like to ask if anyone has detailed descriptions of the DRAM interface signals below?  These are generated by LabVIEW, but I have found no explanation of what they mean in any documentation.
    Also, in the simulation build, there is obviously a DMA simulation.  But from within the simulator, I can find no signals related to the FPGA-based DMA FIFOs or the simulated DMA transfers.  All I can infer about the DMA transfers is their effect on the DRAM above.  The DMA FIFO is being populated directly from the DRAM (yes, this is a highly modified variant of the 10-tap Camera Link (with DRAM) example from the NI-1483 examples).
    Does anyone know how I can see the DMA behavior from within a simulation?  This would most likely allow me to see exactly why the contention is taking place.
    Thanks!

    Hey xl600,
    I'm not immediately sure how to have ISim display the DMA engine behavior, but I'll see if I can dig anything up for you. I've come across a couple of other users encountering issues with FIFO signals appearing in ISim over on the Xilinx forums, so it might be worthwhile to post there as well in case it happens to be due to ISim itself.
    Regards,
    Ryan

  • FlexRIO: update FPGA simulation libraries error

    I downgraded to ModelSim 6.5c to remain fully compatible with niFPGA.  I am now trying to generate simulation VHDL files and need to update my FPGA simulation libraries.  When I do so, I get the following error:
    "An error occured while updating the FPGA simulation libraries.  Make sure ModelSim is installed correctly and the files in the ModelSim directory are not read-only."  
    Not the most helpful of messages.  ModelSim is installed correctly and runs.  All files in all subdirectories have read/write permissions.  What is LabVIEW looking for that it is not finding?  Maybe that will help narrow down the issues I am having.
    Thanks in advance.

    Hello again manjagu.
    There was a Corrective Action Request (CAR) filed regarding this sort of behavior; however, it indicates that there should not be an issue with 6.5c and LabVIEW 2010 SP1.  I am curious as to whether the problem may be the result of an artifact left by the installation of version 10 or 6.6d.  Did you uninstall these versions completely prior to installing version 6.5c?  One thing that might possibly remedy this situation is to remove the ModelSim environment variable from your system, as that seems to be related to the issue.  For information on modifying environment variables, in case you are unfamiliar with this process, please reference the article below.
    How to Add, Remove or Edit Environment variables in Windows 7?
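    (If it helps, a quick way to see whether such a variable is still set before editing anything is a two-line check; the variable names below are assumptions, MODELSIM and MODEL_TECH being ones ModelSim installers commonly create.)

    import os

    # Check for leftover ModelSim environment variables (names are an assumption).
    for name in ("MODELSIM", "MODEL_TECH"):
        value = os.environ.get(name)
        print(f"{name} = {value!r}" if value else f"{name} is not set")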
    Hope this helps.
    Regards,
    Michael G.
    Applications Engineer
    National Instruments
    Self-realization: I was thinking of the immortal words of Socrates, who said, "... I drank what?"

  • FPGA code simulation on dev computer doesn't run

    I wrote some code in LV2013 for the NI 5640R FPGA, and when I tried to run it on the development computer, nothing happened. The VI isn't running, and there are no errors or popup windows.
    While looking for the cause of the problem, I noticed that in Tools->Options->FPGA Module, the Simulator was set to "<None>". ModelSim is not installed, and ISim doesn't work for some reason (it can be selected, but the simulation still doesn't run).
    I've tried to repair NI FPGA and Xilinx Tools 10 and 14 with no result.
    Any ideas?

    hey thu^^
    This is actually probably caused by changing your LabVIEW Default Directory under Tools>>Options>>Paths. You can try changing it back, or, if you don't have any specific settings in your LabVIEW environment that you want to keep, just rename your labview.ini file (C:\Program Files (x86)\National Instruments\LabVIEW 201x) to labview.ini.bak and restart LabVIEW. You could also just do that as a quick test.
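    (A minimal sketch of that rename step, assuming the LabVIEW 2013 folder mentioned in your post; adjust the path to your installation.)

    import os

    # Back up labview.ini so LabVIEW recreates a fresh one on the next launch.
    ini_path = r"C:\Program Files (x86)\National Instruments\LabVIEW 2013\labview.ini"

    if os.path.exists(ini_path):
        os.rename(ini_path, ini_path + ".bak")
        print("Renamed to labview.ini.bak -- restart LabVIEW")
    else:
        print("labview.ini not found -- check the LabVIEW version folder")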
    Cheers!
    TJ G

  • FPGA simulation from custom VIs: problem reading TDMS data for simulation

    Hi there. I am having a small, no, big problem trying to use certain data for simulation purposes. All I/O are set up in custom VIs. Everything works fine when it is set up as in Pic. 1. Unfortunately, I would like to use certain data written to a TDMS file. I tried to do it as in Pic. 2. It works in a normal VI; it reads the file into an array. Unfortunately, I don't know how to make it work in the custom VI set up for FPGA simulation. I am getting 0.
    Pic. 1: Random data...
    Pic. 2: Data from a file.

    OK, I managed it myself. For some reason I had to skip the first number in the array.
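    (For anyone hitting the same thing, the workaround is just dropping the first element of the loaded array. A minimal Python sketch of the idea, using a CSV stand-in instead of TDMS so it stays self-contained; the file name and layout are hypothetical.)

    import csv

    def load_simulation_data(path):
        """Read one column of stimulus values, dropping the first entry."""
        with open(path, newline="") as f:
            values = [float(row[0]) for row in csv.reader(f) if row]
        return values[1:]                 # skip the first number, as described above

    print(load_simulation_data("stimulus.csv")[:5])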

  • FPGA Simulation - custom VI for I/O relative path

    Hello
    I'm trying to use FPGA simulated mode with a custom VI for FPGA I/O on different machines. The problem is that if I open the project on a different machine in a different location, the path to my VI is still absolute, which results in an error when running the FPGA. Is it possible to set this path relative to the project directory?
    This image says it all i think:

    Hi PiDi,
    Unfortunately, it is not possible to set this as a relative path.
    I found this idea on the Idea Exchange and have already voted for it. I invite you to vote, and hopefully there will be other users interested in this particular feature so that it will be implemented in future releases of LabVIEW.
    http://forums.ni.com/t5/LabVIEW-FPGA-Idea-Exchange/Use-relative-paths-or-project-items-for-custom-te...
    Best regards,
    IR

  • FPGA debug using simulated I/O via custom VI - error message

    I have a PXIe-7966R with an MXI interface to Win7 64-bit / LabVIEW 2013 32-bit. The setup is working; VIs can execute on the FPGA target.
    I want to debug an FPGA VI on the host computer using simulated I/O via a custom VI. I am trying to follow the tutorial in the LabVIEW Help: Tutorial: Creating Test Benches (FPGA Module). I have made the custom test VI and the "inverter.vi", and set the proper simulated execution mode.
    When running the "inverter.vi" (FPGA target, but for now simulated on the host), I get this error message dialog:
    Execution already in progress
    Another FPGA VI for this target is already executing on the development computer. Stop the other VI before running this vi.
    I can't see where I could stop this other FPGA VI. In fact, I can't find this other VI at all. Here is the project, which couldn't be any simpler:
    Any hints are welcome.

    Thanks for your quick reply.
    I tried restarting LabVIEW and sure enough this changed things. Now I get this error,
    which could make sense, as the I/O simulation VI (simTb.vi in my project) is not tested (I can't do that without the FPGA target under test). This is shown in part below. Note that it is the auto-generated template from the FPGA simulation setup dialog.
    And here is the fancy FPGA target VI under test (yes, I try to make life easy :-)).
    So there is no host VI for communication with the FPGA, just the simulation test bench.
    OK, I will work on from here and try to fix the reported error, although I don't see any reason why it reports that writing to "IO Module\TTL Out Enable 1" is not supported.
    But clearly a restart made a difference.
