Cycle accurate simulation - ModelSim Student & sbRIO

Does the current ModelSim PE Student Edition release (10.0a) support cycle-accurate simulation?
For a project with sbRIO, can I use cycle-accurate simulation?
Thanks in advance.

NI Brazil Support answer:
Some FPGA targets, including sbRIO, don't support simulation export. There is no roadmap for extending this feature to additional targets in future LabVIEW releases.

Similar Messages

  • Simulations in FPGA

    Hi all,
    I started learning LabVIEW FPGA very recently and I have a query regarding the simulations that can be done. I don't have any NI FPGA hardware with me. I am using only the development computer for programming in LabVIEW FPGA.
    I am using the "Execute VI on»Development Computer with Simulated I/O" option when I add only the FPGA target to my project, to simulate some simple code. But when I try to add a Real-Time system and then add the FPGA, the I/O ports never get listed under the FPGA. I am attaching a snapshot from the LabVIEW FPGA course manual and the project file I tried to replicate from it. You can see that there are no I/O ports listed in my project file.
    How can I get these ports to appear in my project file without possessing any hardware? (I have replicated the steps given to simulate CompactRIO or Single-Board RIO in http://digital.ni.com/public.nsf/allkb/F466AD83D24F041D8625714900709583 , but when I added a program and compiled it, it gave an error because it couldn't find the real-time target.)
    If I am able to add the ports, is it possible for me to get the simulated waveforms from those ports like I can view waveforms using ModelSim? Is it not possible to create a complete project and check the input and output waveforms in simulator on the development computer itself without having any of the hardware?
    Kindly help me out. Thanks in advance.
    Sharath

    Well, what you're asking for is a very common request, and I definitely agree that being able to see waveforms from within LabVIEW would be extremely useful.
    It is really straightforward to use ModelSim, as you mentioned, or even ISim (which ships with the LabVIEW FPGA Module Compilation Tools), to get simulated waveforms.
    ISim can get a little difficult if you need to simulate the host interface, as this requires VHDL knowledge to write a test bench. Fortunately, there is a really easy way to work around writing VHDL to provide the diagram with data. My usual method is shown here:
    Using a conditional disable structure like this lets you pre-load a memory block with IO data. You can even use an initialization VI to load in TDMS or CSV data if you want. (the default case just has my IO item)
    If you do have access to ModelSim PE, Cycle-Accurate Co-Simulation is a really powerful tool, and I highly recommend it. It's probably less tedious than you think once you get used to it.
    Cycle-accurate simulation is currently only supported on some targets, and unfortunately not the 9074 that you appear to be using. For simulation, I would recommend adding an R Series (784x or 785x) or FlexRIO device to the project to simulate small pieces of code.
    Cheers!
    TJ G

  • Microarchitecture simulator for T1

    I understand that SAM, the simulator for OpenSPARC T1, is an instruction-accurate simulator, and that SAS is a full-system simulator which includes SAM.
    But what I am looking for is a microarchitecture simulator like IBM's MET (Microarchitectural Exploration Toolset). In other words, I need a cycle-accurate simulator, not an instruction-accurate one.
    Kindly tell me what SAM is capable of. Is a cycle-accurate simulator available for the T1?
    Thanks

    The M5 Simulator has some basic support for SPARC. The 2.0 beta 3 version should appear on the M5 website, www.m5sim.org, in the next few days. It supports running binaries in our syscall emulation mode, where we fake syscalls (similar to the way SimpleScalar did it) but take real traps for things like register windows. Additionally, it has limited support for full-system simulation: it can boot a single processor, but it doesn't support any real devices yet (it uses the same hypervisor calls Legion does to read blocks off the disk image, and has no support for networking), and it hasn't been exhaustively tested. It does, however, boot to a Solaris prompt.
    Ali

  • How to simulate external hardware in LabView FPGA ?

    Hello,
    I have a NI 7952R connected to a 6583 IO module.
    This IO module is connected to a digital sensor that continuously sends patterned data.
    I am developing the code for the FPGA, and I would like to know how to perform a cycle-accurate simulation of the whole system.
    There is an example for a cycle-accurate simulation of the LabVIEW -> FPGA interface, but it doesn't include the behavior of external hardware connected to the FPGA I/O module.
    I have a VHDL simulation model (not synthesizable) that describes my sensor. How do I include that in the ISim cycle-accurate simulation?
    Do I have to alter the 6583 I/O module CLIP files by including my sensor description in them?

    Thank you for your answer; I think we now have a common understanding of your intent.
    Your I/O simulation model is integrated via a CLIP or IP Integration Node:
    "In addition to VHDL code for the FPGA VI, you must provide simulation models for any IP you include through the CLIP and IP Integration Nodes. You specify the models for CLIP simulation and the IP Integration Node simulation through their configuration wizards."
    Taken from:
    Introduction to Cycle-Accurate Simulation
    http://zone.ni.com/reference/en-XX/help/371599H-01/lvfpgaconcepts/fpga_simulation_intro/
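    For illustration, a simulation-only model registered through the CLIP configuration wizard might look like the following minimal VHDL sketch (the entity name, ports, and the incrementing test pattern here are hypothetical; a real model would reproduce the actual protocol and timing of your sensor):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Hypothetical, simulation-only sensor model that continuously streams
-- an incrementing 8-bit test pattern, one word per clock edge. It is
-- intended for behavioral simulation only and is not synthesizable.
entity sensor_model is
  port (
    clk      : in  std_logic;
    data_out : out std_logic_vector(7 downto 0)
  );
end entity sensor_model;

architecture behavioral of sensor_model is
begin
  process
    variable pattern : unsigned(7 downto 0) := (others => '0');
  begin
    wait until rising_edge(clk);
    data_out <= std_logic_vector(pattern);
    pattern  := pattern + 1;  -- advance to the next word of the pattern
  end process;
end architecture behavioral;
```

    Once a model like this is specified in the configuration wizard, the simulator elaborates it in place of the real I/O, so the CLIP itself does not need to be modified.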

  • How to measure fpga execution time

    Howdy--
    I'm hacking through my first FPGA project without yet having the hardware on hand, and I find I could answer a lot of my own questions if I could predict the execution time (ticks, msec, whatever) of bits of my code. Running FPGA VIs on the development computer with built-in tick counters does not seem to be the proper way to go. Is it possible to know the execution time of FPGA code before compiling and running it on the target?
    If it matters to anyone, my context for the question is a situation where a 10 µs loop is imposed by the sample time of my hardware (cRIO-9076, with a couple of 100 kS/s I/O cards), and I'm trying to figure out how much signal processing I can afford between samples.
    Thanks everyone, and have a great day.
    Solved!
    Go to Solution.

    bcro,
    You can look into cycle accurate simulation, which would give you a better understanding of how your code will work.  More information can be found here: http://zone.ni.com/devzone/cda/tut/p/id/12917
    As a rough measure, you can estimate that simple functions will take one tick to execute. However, there is no definitive list of what is and is not a simple function.
    You could also try placing code inside a single cycle timed loop (SCTL), which would then guarantee that all of the code in the loop will execute in 1 tick.  However, if you are doing a lot of operations or trying to acquire an analog input, this will fail compilation.
    Drew T.
    NIC AE Specialist

  • Why should I adopt LABVIEW FPGA as a tool for developing my FPGA projects?

    Dear Friends, 
    Since I started using LabVIEW FPGA, many questions have come to mind that are looking for answers!
    1-      Can anybody tell me "why should I adopt LabVIEW FPGA as a tool for developing my FPGA projects?"
    I mean, there are many great tools in this field (e.g. Xilinx ISE, ...); what makes LabVIEW FPGA the perfect tool that can save my time and my money?
    I'm looking for a comparison that covers the following points:
    ·         Code size and speed optimization.
    ·         Development time.
    ·         Compilation time.
    ·         Verification time.
    ·         Ability to develop in the future.
    ·         ...etc.
    2-      I have a Spartan-3E kit, and I'm so glad that LabVIEW supports this kit; I do enjoy programming it using LabVIEW FPGA, but there are too many obstacles!
    The examples that come with the Spartan-3E driver don't cover all the peripherals on the board (e.g. the LAN port is not covered)! There is a statement on the NI website, "LabVIEW FPGA drivers and examples for all on-board resources", located at http://digital.ni.com/express.nsf/bycode/spartan3e . I don't think that is true!
    Anyway, I will try to develop examples for the unsupported peripherals, but if the pins of these peripherals are not defined in the UCF file, the effort is worthless! Is the only solution in this case to develop VHDL code in ISE and use it in LabVIEW FPGA through an HDL node?
    3-      I wonder if NI has any plan to add support for processor setup in LabVIEW FPGA (like we do in EDK)?
    4-      I wonder if NI has any plan to develop a driver for the Virtex-5 OpenSPARC Evaluation Platform? http://www.digilentinc.com/Products/Detail.cfm?NavPath=2,400,599&Prod=XUPV5
    Thanks & regards, Walid
    Solved!
    Go to Solution.

    Thanks for your questions, and I hope I can answer them appropriately.
    1. LabVIEW FPGA uses the intuitive graphical dataflow language of LabVIEW to target FPGA technology. LabVIEW is particularly nice for FPGA programming because of its ability to represent the parallelism inherent to FPGAs. It also provides a software-like programming experience with loops and structures, which has become a focus of the industry lately with C-to-gates and other abstraction efforts. Here are some general comparisons along the vectors you mentioned.
    Code size and speed optimization - LabVIEW FPGA is a programming language. As such, one can program badly and create designs that are too big to fit on a chip and too slow to meet timing. However, there are two main programming paradigms you can use. Normal LabVIEW dataflow programming (meaning outside a single-cycle loop) adds registers in order to enforce dataflow and synchronization in parity with the LabVIEW model of computation. As with any abstraction, this use of registers is logic necessary to enforce LabVIEW dataflow and might not be what an expert HDL programmer would create; you trade off some efficiency for the simplicity of LabVIEW dataflow in this case. On the other hand, when you program inside a single-cycle timed loop you can achieve size and speed efficiencies comparable to many VHDL implementations. We have had many users who understand the way LabVIEW is transformed to hardware and program in such a way as to create very efficient and complex systems.
    Development time - Compared to VHDL, many of our users get near-infinite improvements in development time simply because they do not know (nor do they have to know) VHDL or Verilog. Someone who knows LabVIEW can now reach the speeds and parallelism afforded by FPGAs without learning a new language. For hardware engineers (who might actually have an alternative to LabVIEW) there are still extremely time-saving aspects of LabVIEW, including ready-made I/O interfaces, simple FIFO DMA transfers, stitchable IP blocks, and visualizable parallelism. I talk to many hardware engineers who are able to drastically improve development time with LabVIEW, especially since they are more knowledgeable about the target hardware.
    Compilation time - Comparable to slightly longer, due to the extra step of generating intermediate files from the LabVIEW diagram and the increased level of hierarchy in the design to handle abstraction.
    Verification time - One of our key development initiatives moving forward is increased debugging capability. Today we have the ability to functionally simulate anything included in LabVIEW FPGA, and we recently added simulation capabilities for imported IP through the IP Integration Node on NI Labs, plus the ability to excite your design with simulated I/O. This functional simulation is very fast and is great for verification and quick-turn design iteration. However, we still want to provide more debugging from the timing perspective with better cycle-accurate simulation, which, although significantly slower than functional simulation, gives us the next level of verification before compilation. The single-cycle loop running in emulation mode is cycle-accurate simulation, but we want more system-level simulation moving forward. Finally, we have worked to import things like Xilinx ChipScope (soon to be on NI Labs) for on-chip debugging, which is the final step in the verification process. In terms of verification time, there are aspects (like functional simulation) that are faster than traditional methods, others that are comparable, and still others that we are continuing to refine.
    Ability to develop in the future - I am not sure what you mean here, but we are certainly continuing to actively develop the RIO platform, which includes FPGA as the key differentiating technology. If you take a look at the NI Week keynote videos (ni.com/niweek), there is no doubt from both Day 1 and Day 2 that FPGA will be an important, well-maintained platform for many years to come.
    2. Apologies for the statement in the document. The sentence should read that there are examples for most board resources.
    3. We do have plans to support a processor on the FPGA through LabVIEW FPGA. In fact, you will see technology on NI Labs soon that addresses this with MicroBlaze.
    4. We do not currently have plans to support any other evaluation platforms. This support was created for our counterparts in the academic space to have a platform for learning the basics of digital design on a board that many schools already have in house. We are currently focusing on rounding out more of our off-the-shelf platform with new PCI Express R Series boards, FlexRIO with new adapter modules, cRIO with new Virtex-5 backplanes, and more.
    I hope this has answered some of your questions.
    Regards 
    Rick Kuhlman | LabVIEW FPGA Product Manager | National Instruments | ni.com/fpga
    Check out the FPGA IPNet for browsing, downloading, and learning about LabVIEW FPGA IP Cores

  • Modelsim PE Student Edition under Wine

    Hi all,
    I've been trying to run ModelSim (the free student edition is Windows-only) under Wine with no success. The program installs with no errors but does not start. Some googling shows that it should be trivially installable and usable under Wine, and it actually does work as expected under Xubuntu 14.04 (install, run, done).
    When I run the program via GUI, nothing seems to happen but the desktop GUI starts acting funny -- mostly the cursor disappearing under dialogs and windows.
    When running from the terminal, I get the following:
    fixme:font:freetype_SelectFont Untranslated charset 255
    Reading C:/Modeltech_pe_edu_10.3c/tcl/vsim/pref.tcl
    fixme:ole:RemUnknown_QueryInterface No interface for iid {00000019-0000-0000-c000-000000000046}
    What's the best way to figure out what's happening?  How can I check what the differences are in Ubuntu vs. Arch wine?
    I'm using MATE with wine, wine-mono, and wine_gecko from default repos.
    Thanks!


  • Off Cycle Payroll error - not encountered during Payroll Simulation

    Hello,
    We've modified the Canadian payroll schema K000 by creating some new subschemas, rules, and payroll operations to address the calculation of the factor /801. When running a payroll simulation for a test employee, everything looks fine. However, when we run off-cycle payroll for the same employee and period, we encounter an error. The off-cycle log shows that it is also accessing schema K000.
    What could have caused the off-cycle error? Is there a difference in what off-cycle and simulation access?
    Thanks.
    Malou Navera

    I assume that you copied K000 and created a new custom schema called ZK00 or YK00 or something else other than K000.
    Go to table V_T52OCV and set up an entry with
    CALC       RPCALCK0 PayrollVariantName
    where "PayrollVariantName" is your report variant with your custom schema for RPCALCK0.
    Or goto IMG
    Payroll: Canada
    Off-Cycle Activities
    Set up report variants for off-cycle activities

  • [fpga] How to view an integrated IP in Modelsim?

    Hi, 
    I integrated a simple adder into my FPGA VI, and I use ModelSim to simulate it.
    But I am wondering how to see my IP in the simulator in order to debug it, because I cannot find it.
    Thanks

    Hi Mhed !
    To start off in good conditions, it may be worth reading the document below:
    Cycle-Accurate Co-Simulation with Mentor Graphics ModelSim
    This document explains how to use ModelSim and LabVIEW FPGA from creation and configuration through to build. With it, you can check that you haven't forgotten a configuration step.
    Also be aware of the versions you're using: see "Versions Required".
    Hope it will help.
    Regards,
    Antonin G., Associate LabVIEW Developer
    National Instruments France
    French-speaking community of LabVIEW developers and STI2D teachers
    If you would like to share...

  • How to get a Verilog Simulator

    Hi,
    I am a Lamar University student. I read in a few threads here that students can get a free copy of the NC-Verilog or VCS Verilog simulator for the OpenSPARC project. I searched on Google for two days, but I am still not able to find a download link for university students. So I request those who know about this to guide me as soon as possible to getting one of these simulators for my project.

    Maybe you can use the ModelSim student edition.
    You can search for it on Google.

  • Regd: Unable to automatically find executables for simulator 'mti_se' from the following paths

    Hi,
    I am trying to simulate for a Spartan-6 FPGA in Xilinx ISE 14.7 on Ubuntu. But when I compile my files, I get an error that the path to the "ModelSim" simulator is not specified, and it is not able to simulate the file. I provided the test bench too.
    But I am still not able to compile.
    Any information will be useful
    Thanks
    Sukaniyaa

    If you want to run the simulation on ModelSim, then you have to install it. If you want to simulate your design using Xilinx ISim, you need not have ModelSim installed.
    Go to the design properties and change the simulator to ISim to run on the Xilinx simulator.

  • Student LifeCycle Management

    Hi,
    In the portal, is there any business package to configure Student Lifecycle Management, something like student self-service?
    Kindly let me know if it is available, where I can get it, and where to find the configuration document.
    Regards,
    Venkat

    Hi Venkat,
    for ECC 6.0 EhP3, the packages to be installed for SLCM are:
    - SAPPCUI_GP 603
    - IS-HER-CSS
    - BP_ERP5_HIGHER_EDU_RES
    But besides Audit there is not much to be found in the package...
    hth,
    Michael

  • Is MultiSim a fraudulent product?

    I'm thinking it is.  I'm thinking that way back when there was Electronics Workbench (EWB), a simple simulator that worked, but too simple for it to catch on any where but in schools, as students could afford it, but engineers couldn't use it.  Then, losing money, as graduating students can't use EWB for real engineering, so minimal return customers, the management of EWB decided to revamp things, via MultiSim, grab the schools' guaranteed cash flow on the coattails of EWB, but the original designers were no longer available to do it again as MultiSim.  But, going in the hole on the MultiSim planning, the product was released anyway, knowing it didn't work and was wall to wall bugs.  Now, years and years later, National Instruments has bought into the money pit that is the MultiSim design, once again seeing dollar signs through schools and the potential for persons such as myself to take MultiSim into engineering.  The problem is it still doesn't work enough to do that.  True, it's easier to use, but it doesn't work, so it can't be used.  The real testing and troubleshooting to fix it has not happened, even as of today.  The turnover rate over the years with EWB-MultiSim is high.  Few remain.  Anyone I've worked with is no longer there; I've gone through many names as I've been turning up these MultiSim bugs for all these years.  I look at it all and the "f" word looks more and more appropriate when it comes to MultiSim (and you pick the version, any version.)  They've jacked up the price fourfold on a product that has never worked enough to take it out of a school classroom.  Likewise, this is also why, as I've cruised around this forum, I've found loads of unanswered questions, students asking for help and getting little.  MultiSim works enough for unknowing students to buy what the teacher is selling, but in reality it's just smoke and mirrors full of industry-supplied parts fudged into somewhat working within the MultiSim kluge.  
The bugs will keep coming up until MultiSim is actually troubleshot by knowledgeable people and fixed, but they won't remain with the company long enough.  Why?  Because MultiSim Inc can't pay enough for knowledgeable people to remain AND deal with the, no doubt, tons of bureaucracy that must develop from selling a bad product for this long.
    But I started this thread to more hear your thoughts.  I'm so disgusted with MultiSim that I thought I'd just actually say what I've been thinking about the product, partially as a catalyst to hear people who know this product defend it.  However, that's not to say that if you tend to agree with my take above that you can't say something also.  As I see it, the worst that could happen is NI (the latest MultiSim owners) will kick me/us out and delete the thread.  But that won't hurt me a bit.  This forum has done me zero good, and, from the looks of it, NI doesn't even know what's said here.  (They may after my last correspondence with them though, but I doubt it.)   So, though it may be forum suicide, I don't fear such a death.  However, I do see some potential good coming in doing this.  Maybe, just maybe, I am right, that there is still a slight bit of hope for MultiSim.    

    Hello Euler’s Identity: 
    I’d like to address some of the frustrations you present in your above post (in addition to the email responses you’ve received from our AEs and myself).  I’m very sorry to hear that you’ve had a negative experience with the software.  As always, our Applications Engineers would be very happy to assist you with any specific technical issues you might be having (I didn’t see any in the original post, and I believe the issue with the LM311 was addressed earlier today.):  
    I’d like to provide some of the recent history of the Electronics Workbench line of software (Multisim, Ultiboard, Multisim MCU Module).  Electronics Workbench was acquired by National Instruments (NI) in February of 2005 for two primary reasons:
    1.)     Provide a World-Class Software and Hardware Solution for Teaching Electronics Education
    With NI graphical system design tools such as NI Multisim (industry-standard SPICE simulation) and NI LabVIEW and prototyping environments such as NI Educational Laboratory Virtual Instrumentation Suite (NI ELVIS), it is now possible to design, prototype, and compare the characteristics of simulated circuits with real-world measurements in a single electronics education platform.  This complete integration across the design cycle gives students a deeper understanding of the circuit theory they are studying, and ultimately prepares them for more productive careers as design professionals
    2.)     Enhance Professional Electronics Design with Tight Integration of Measurement and Analysis Technology
    The Multisim SPICE simulation environment integrates tightly with NI measurement technology such as LabVIEW to provide a seamless transfer of simulation and real-world test data.  Professional engineers can now more easily validate designs using simulation data as a benchmark and even create more accurate simulations using real-world data as stimuli.  
    Additionally, as part of the acquisition, NI committed to develop Multisim and Ultiboard with the same strict focus on quality as the balance of the NI platform – it is one of the key focus areas for new releases.  Version 10, released in January 2007, was the first edition of the software released fully under NI standard software development practices (post-acquisition) and has received very positive feedback from existing users.  Of course, there is always room for improvement and innovation, and you can expect to see continued quality enhancements with each new release of Multisim and Ultiboard.
    The NI Electronics Workbench Group team is passionate about making world-class capture, simulation, and layout products both for education and for professional design.  As always, we welcome your constructive feedback to continue to improve these products over time, and we will do everything we can to ensure that you have a good experience with the product.
    Best regards,
    Nicole McGarry
    Director of Sales & Marketing
    NI Electronics Workbench Group

  • Adding/changing SPARC instructions

    I am hoping to find out how complicated/easy it is to add/modify the SPARC instruction set for the T1 or T2 cores.
    1. Is there a high-level flow for adding/modifying instruction-decoding logic in the core RTL?
    2. Is the Sun compiler/code-generator open-source? Otherwise I guess I will have to write my own code-generator, or process the binary to patch in the modified instructions?
    3. Is there a cycle-accurate open-source instruction-set simulator that can be modified to quantify the effect of new/changed instructions?
    Thanks

    Hi Kernos,
    Hopefully those update versions will be available in Safari 3
    It doesn't seem you can change them; however, see this post.
    I trust this is what he means by Curl
    Hope this helps some, perhaps someone will pipe in w/ more for you.
    Eme '~[ )♥♪

  • NewObject crash in a PLI/C/JNI code

    Hi,
    I call the NewObject() function in a PLI/C/JNI app to create a java/util/Vector instance. My Verilog simulator (ModelSim PE 5.4b) crashes without displaying an error code.
    Please take a look , and see what is wrong with my code.
    jclass jclass_Vector;
    jmethodID jmethodID_Vector_new;
    jobject dataVector;
    /* Look up java.util.Vector */
    jclass_Vector = _env->FindClass("java/util/Vector");
    if (jclass_Vector == NULL) {
        return;
    }
    /* Get the no-argument constructor */
    jmethodID_Vector_new = _env->GetMethodID(jclass_Vector, "<init>", "()V");
    if (jmethodID_Vector_new == NULL) {
        return;
    }
    /* Invoke the constructor */
    dataVector = _env->NewObject(jclass_Vector, jmethodID_Vector_new);
    Thanks in advance,
    Marius

    As far as I can tell, there's nothing wrong with the small snippet of code that you've posted. The problem is elsewhere. Try making a small sample program that demonstrates the failure.
    Also, take a look at Jace - http://jace.reyelts.com/jace
    The equivalent code would be:
    // Create a new Vector
    Vector dataVector;
    A lot simpler...
    God bless,
    -Toby Reyelts
