Simulink and LabVIEW synchronization

I have a Simulink model whose inputs are controlled by LabVIEW and whose outputs are displayed in LabVIEW. On my desktop computer, Simulink and LabVIEW run in sync; their times always match. When I load the programs onto my two laptops, the Simulink model runs 2-10 times faster than the LabVIEW program, and when I change controls in LabVIEW they don't take effect in Simulink unless I change the control MANY times. I think this lack of communication between LabVIEW and Simulink on my laptops is happening because they are out of sync, but I'm not sure. Any suggestions?

Hello,
You mentioned that the program works fine when you are using a desktop. Are Simulink and LabVIEW running together on the same desktop, or do you have them networked together? By default, Simulink communicates through port 6011 using TCP/IP. It is possible that the communication is inhibited by some process on your laptop that is also using that port, causing only some of the Simulink data to get through. You can change ports when starting Simulink if you know that a port is blocked or in use by another process. When starting the SIT server, type NiMatlabServer('start', xxxx). You can choose any port that you want, but firewalls often close ports. Each firewall is different, so you will have to look into which ports you have available for TCP/IP communication with the SIT server. Have a great day.
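A quick way to check whether another process is already holding the SIT port is a short TCP probe. This is a plain-Python sketch offered as an illustration only (it is not part of SIT; port 6013 below is just an example alternative):

```python
import socket

# Probe a TCP port to see whether some other process is already using it.
# 6011 is the default SIT port mentioned above; 6013 is just an example.
def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

if port_in_use(6011):
    print("Port 6011 is busy; start the SIT server on another port,")
    print("e.g. in MATLAB: NiMatlabServer('start', 6013)")
else:
    print("Port 6011 appears to be free")
```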
Regards,
Danny G.
Applications Engineer
National Instruments

Similar Messages

  • Simulink and LabVIEW

    Hi all,
    Could I use SimMechanics (Simulink) to create a mechanical model, and then create a DLL with SIT (Simulation Interface Toolkit) to use it in LabVIEW?
    Thank you.
    Ziman 

    Hello!
    Yes, you can build a SimMechanics model into a DLL and then use it in LabVIEW through the Simulation Interface Toolkit.
    However, there are a few things to consider, since SimMechanics works quite differently from an ordinary Matlab/Simulink model:
    Quote:
    “A SimMechanics model differs significantly from other Simulink models in how it represents a machine. An ordinary Simulink model represents the mathematics of a machine's motion, i.e., the algebraic and differential equations that predict the machine's future state from its present state. The mathematical model enables Simulink to simulate the machine. By contrast, a SimMechanics model represents the structure of a machine, the geometric and kinematic relationships of its component bodies. SimMechanics converts this structural representation to an internal, equivalent mathematical model. This saves you from having to develop the mathematical model yourself.”
    I am not sure whether the mathematical models are available to the user for optimization purposes.
    Regards,
    Jimmie Adolph
    Systems Engineer Manager, National Instruments Northern Region
    Bring Me The Horizon - Sempiternal

  • Q10 contacts and calendar synchronization

    Using an Outlook account, synchronization of contacts and calendar over a USB connection fails, with the message at the end of the process: "writing of the commit on the device failed". It was not like this at the beginning. The computer and device software are updated. All other USB options work fine, such as syncing images and music. Does anybody know how to solve this?

    First I did a sync for the calendar only, and it went through. Then I did a sync for contacts only and got the same error, so in my case the problem is in the contacts. I then generated the BB Link logs by enabling them in BB Link, but without the Windows registry modification (I'm not that skilled). The log was created anyhow; I went through it and found the line "outgoing payload" (there was only one), where a problem with one contact was indicated. I deleted that contact on the phone and on the computer and synced again, but got the same error. So I created the log again and found a problem with another contact. After three tries I realised that each attempt produced a new error with a different contact (only one at a time). So the problem is not solved: calendar sync works fine when done separately, but contacts sync doesn't work properly. Contacts added on the phone are copied to the computer, but any change on the computer is not written to the phone. Maybe the way forward is to delete all the contacts on the device and redo the sync from scratch, but I do not know how to delete the contacts on the device other than one by one (I have more than 400), and the risk is never getting the contacts back on the device again.
    Bye and thanks again,
    Maurizio

  • Simulink to LabVIEW automatic conversion

    I used the Tools>>Control Design and Simulation>>Simulation Model Converter to convert a Simulink file (.mdl) to a .vi (multiple files). The problem is that the converted files have a lot of errors and therefore are not executable. The most frequent error is "You have two terminals of different types. The type of the source is cluster of 2 elements. The type of the sink is 1-D array of double [64 bit real (15 digit precision)]". How do I resolve this error? Is there a difference between Bus Creator/Bus Selector in Simulink and Bundle/Unbundle in LabVIEW? I attach a sample VI. Thanks.
    Attachments:
    Water Pump.vi ‏13 KB

    Hello AAR,
    I cannot speak as much to the exact functionality of the Bus Creator/Bus Selector, but I will see if I can describe Bundle/Unbundle to help clear up any confusion.
    The Bundle function creates a cluster in LabVIEW which can be a group of like or unlike data.  This cluster can be passed around the diagram and then the Unbundle or Unbundle by Name functions can be used to pass out a particular value in the cluster.
    I am not sure what your original model was doing, but it is easy to see why the VI you attached is broken.  You can resolve the broken wires in a few different ways, depending on your intention with the original model.
    The attached file builds several scalars and an array into a 1-D array.  This array is then Bundled with a second array.  The cluster output of this Bundle function is then passed to an indicator called Water and a global variable called Inlet_water.  The datatype of both Water and Inlet_water is an array of doubles.  In LabVIEW, the cluster datatype and array datatype are not interchangeable.  I am not sure how these arrays are used in the rest of the model, but here is where your options come in.
    You could replace the Bundle with a Build Array (concatenating the inputs to make one 1-D array rather than a 2-D array).  Then, wherever you need access to specific elements of the array, you can use the Index Array function; for this modification you will need to know the index of the element you are interested in.  Alternatively, convert your global variable and indicator to be clusters.  I think the first modification is probably the simpler one, but again it depends on how you intend to use the data in the rest of your model.
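    As a rough analogy only (Python stands in for LabVIEW here, since LabVIEW has no text syntax), the cluster-versus-array distinction looks like this:

```python
# Rough analogy: a LabVIEW cluster is a fixed group of possibly mixed-type
# values (like a tuple), while a 1-D array is a homogeneous sequence (a list).
cluster = (3.0, [1.0, 2.0, 3.0])     # "Bundle": a scalar and an array grouped
flat = [3.0] + [1.0, 2.0, 3.0]       # "Build Array" (concatenate): one 1-D array

scalar_part, array_part = cluster    # "Unbundle": pull the elements back out
first = flat[0]                      # "Index Array": element access by index
```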
    I hope this helps, please post back if you have further questions about this.
    Regards,
    Angela M
    Product Support Engineer

  • Capture Signal in Simulink and Use it in LabVIEW

    Hi,
    Is there an effective way of capturing a signal wave in Simulink and using it as an input to a transfer function in LabVIEW?

    Yes, but how could I use the transferred signal as input to a transfer function?
    To use a transfer function, you need a Control & Simulation Loop. How would you connect the signal from SIT to the transfer function? And what about the synchronization of the signal?
    At the moment I'm using a global variable and adding another while loop, into which I insert the Control & Simulation Loop with the transfer function in it.
    But my problem is that I cannot make the signal run synchronously in time as in Simulink. If in Simulink the signal runs for 10 s, in LabVIEW the signal waveform still looks the same but the time is off by hundreds of seconds. How can I synchronize the time value to be the same (i.e. 10 s)?
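    One way to picture a fix, under the assumption that the waveform is correct and only its time base is scaled wrong by a constant factor, is to rescale the time vector to the intended duration. This is a plain-Python sketch; `rescale_time` is an illustrative helper, not a LabVIEW or SIT function:

```python
# Sketch under an assumption: the waveform shape is right but its time axis
# is off by a constant factor. Rescaling the time vector to the intended
# duration (here 10 s) restores the Simulink timing.
def rescale_time(t, target_duration):
    """Linearly map an increasing time vector onto [0, target_duration]."""
    t0, t1 = t[0], t[-1]
    return [(x - t0) * target_duration / (t1 - t0) for x in t]

t_wrong = [0.0, 250.0, 500.0, 750.0, 1000.0]  # axis off by a factor of 100
t_fixed = rescale_time(t_wrong, 10.0)         # -> [0.0, 2.5, 5.0, 7.5, 10.0]
```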

  • Synchronization

    Centro (Sprint)
    05-02-2010 11:47 AM
    I tried to HotSync my Centro in February and was not able to complete the process. I was told I had too many duplicates, and that I could only eliminate them by deleting them manually.
    Last February I started deleting events. It took me forever, and in the meanwhile I still could not HotSync.
    I was deleting again on Friday, in my spare time, and I accidentally PURGED all of my diary up until this last week.
    Unfortunately, the last time I HotSynced to my desktop was sometime in Dec 09!
    I cannot HotSync now because my desktop has all these duplicates. There was once a time when I could eliminate the duplicates by pushing a button; I no longer see that application in the program.
    Do you have any experience with this?
    Is there any way to retrieve the info I purged if I did not HotSync for the last 6 months?
    Can I delete duplicates on the desktop without doing it manually?

    Are you still experiencing this issue? If so, try doing a clean uninstall of the BlackBerry Desktop Software and then reinstall it. Here is the KB article for instructions. http://bbry.lv/aQCW6K
    -FS
    Come follow your BlackBerry Technical Team on Twitter! @BlackBerryHelp
    Be sure to click Kudos! for those who have helped you.
    Click Solution? for posts that have solved your issue(s)!

  • Fuzzy system from Simulink to LabVIEW

    Hey all, hope you are well. 
    I have developed a fuzzy system in Simulink; how would I go about transferring it over to LabVIEW? If anyone has any help, or has developed a model that can do this, any advice is well received.
    regards
    Alim Guy

    Hi Alim,
    This can be achieved using the Simulation Interface Toolkit (which can be purchased from this URL), which allows you to interact directly with models developed in Simulink. This toolkit imports the developed files into LabVIEW. In terms of researching fuzzy systems created within LabVIEW: if you have the Fuzzy Logic Toolkit, you can go to the NI Example Finder (LabVIEW > Help > Find Examples...) and search for the keyword 'Fuzzy' to view fuzzy systems developed purely in LabVIEW. You can learn in depth about the Fuzzy Logic Toolkit here to see if these tools suit your specific requirements.
    In terms of ease of implementation, it really depends on what you're after. It's not so much which process is easiest, but which suits your current process of development. If you're used to developing systems in Simulink and it's the standard at your workplace, then it's probably more suitable to implement your fuzzy system through the Simulation Interface Toolkit; there are some extra steps required when importing the data, in terms of referencing the variables appropriately in LabVIEW, but it's not a lot of extra work. If you're interested in developing and implementing your systems entirely from the LabVIEW environment, it will be worth using the Fuzzy Logic Toolkit.
    Regards, 
    Alex Thomas, University of Manchester School of EEE LabVIEW Ambassador (CLAD)

  • Error involving Report Generation Toolkit and LabVIEW Run-Time Engine

    I developed an application using LabVIEW 6.1 and the LabVIEW Report Generation Toolkit for Microsoft Office 1.0.1. From there, I tried to build a shared application for use with the LabVIEW Run-Time Engine. The run-time version functions properly until "New Report.vi" is called, and then an error is generated: code 7, stating that "Open VI Reference in New Report.vi" could not be found. When building the application, I did include "NI Reports Support" in the advanced installer options. The machine used for original development and the application build is running Windows XP Pro and Office XP. Any suggestions?

    I am having the exact same problem, but with LV 6.1 and MS Word 2000. It appears that "New Report.vi" is trying to open "C:\APP.DIR\Word_Open.vi" and "C:\APP.Dir\Word_Open_Document.vi" by reference. The "OFFICE 2000.TXT" file says that "_exclsub.llb and _wordsub.llb must be added as support files when building an application or a dynamic link library with the application builder." I added them as support files and copied them to "C:\TESTER\", where TESTER.EXE is, and I still get ERROR 7 in "NEW REPORT.VI" at Open VI Reference.
    Do I need to make a "C:\TESTER\DATA\" sub-directory and put the support files there?
    I am building on MY COMPUTER, on an F: drive on a network, and transporting the files to the real tester.
    I displayed my App.Dir property at start-up and it is C:\TESTER\! How would my application know that "Word_Open.vi" and "Word_Open_Document.vi" are actually inside _wordsub.llb?
    Any ideas?
    Greg Klocek

  • Communication problem with FP-2000 and LabVIEW 7.1

    I am using the FP-2000, DO-403, and TC-120 modules with LabVIEW 7.1, run on a host computer via an Ethernet connection.  The setup has operated flawlessly for ~2 years; recently, I have noticed that while a VI is running on my PC, the connection to the FP module ceases.  There is no pop-up "connection error" in my VI, and all the power and status LEDs on the FP modules are normal.  I can still interact with the VI's front panel while it is running, but nothing is passed to the FP module.  I have tried communicating with MAX when this occurs and receive a "no connection" message.  Is there a simple solution here, such as rebooting the FP module?  Or is it time to invest in new hardware?
    Justin

    I am having a very similar problem with my cFP-1804 and LabVIEW 7.0.  I have no problem initiating communication over Ethernet, and the VI will run perfectly for hours or even days.  Inevitably, however, at some point communication with the cFP-1804 stops.  Data is no longer passed either to or from the cFP-1804, and MAX shows that the device is not connected.  I also see no pop-up error message saying the connection has been lost, either from Windows or from LabVIEW; the VI just continues to run as if nothing has happened.  Turning the power to the cFP-1804 off and then on always resolves the problem.  Since I am using the cFP-1804 primarily for data logging, however, this behavior is particularly problematic, as the system records nothing but zeroes until I discover that communication has been severed and power-cycle it manually.
    I have been unable thus far to determine the cause of this problem.  Any help would be greatly appreciated.
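    One possible software-side mitigation, sketched in Python with an illustrative helper name (`link_suspect`), is to alarm on the all-zero symptom itself rather than waiting to discover it by hand:

```python
# Hypothetical watchdog for the symptom above: the link dies silently and the
# log fills with zeroes, so a long run of exactly-zero readings is treated as
# a "connection probably lost" alarm worth raising to the operator.
def link_suspect(readings, zero_run=10):
    """Return True if the last zero_run readings are all exactly zero."""
    tail = readings[-zero_run:]
    return len(tail) == zero_run and all(r == 0.0 for r in tail)
```

On alarm, the logging VI could pop a warning or trigger a programmatic power-cycle instead of silently recording zeroes.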
    Thanks,
    Derrick

  • I have a problem with simulation in MATLAB 6.5 and LabVIEW for PID controllers

    I have a problem with simulation in MATLAB 6.5 and LabVIEW. I have some methods for tuning PID controllers that work in MATLAB but not in LabVIEW. For a second-degree transfer function they transfer fine, but past the fourth degree they no longer work. We changed the formula for calculating the parameters for the fourth degree, and it gave me good values in MATLAB, but when I put them into LabVIEW they did not settle. The formulas are available in the attached PDF. Please help me if possible. Thanks.
    lim.4 generates the parameters in the MATLAB program, and the comparison methods are for a second-degree transfer function.
    Solved!
    Go to Solution.
    Attachments:
    Pt net.zip ‏2183 KB

    These are the VIs I tried to make, but they do not work. This PDF document was used to create the last PID VIs. Thank you for your collaboration.
    Attachments:
    PID create by me.vi ‏312 KB
    tut_3782.pdf ‏75 KB

  • Google Earth Plugin and LabVIEW: High CPU usage when adding placemarks

    Hi,
    I posted this question on stackoverflow earlier this week but feel it might be better suited to the LabVIEW community specifically so I'm reposting here:
    I'm writing an application which uses the Google Earth Plugin to display events on the globe. Each event is a single point kml placemark with an icon which is a 3kb png file. Placemarks are uploaded to the plugin as they are received by the software. I am experiencing increasing CPU usage with the number of placemarks that are added.
    I have tested displaying a new placemark every second and running until the software hosting the plugin completely froze (graph attached). The GEPlugin (green trace) stopped responding (i.e. the globe did not respond to the mouse) at around 1200 placemarks added, when CPU usage was at ~30%. When the software itself (red trace) froze, the plugin was using around 50% CPU and ~3700 placemarks had been added. After the freeze, no new placemarks were added, which allowed the software (but not the plugin) to respond, so I could clear all the placemarks. After the placemarks were cleared from the globe, the plugin's CPU usage returned to around 5%.
    So what I've seen is that GEPlugin CPU usage increases linearly with each kml placemark added. Is this the expected behaviour/ a normal limitation of the plugin? If not is there a more efficient way of adding many placemarks to the globe?
    I am using GEPlugin version 7.1.1.1580 (API version 1.010) and LabVIEW 12.0f3
    Please see the test results attached. Any input greatly appreciated!
    Original stackoverflow post:
    http://stackoverflow.com/questions/20994323/google-earth-plugin-with-labview-high-cpu-usage-when-add...
    Attachments:
    Performance Log 020114_095115.png ‏82 KB

    Hello,
    I have had a look at your graphs and understand what you are trying to do. To me it seems that as the image gets more complex it gets harder to render, which would likely cause the increase in CPU usage resulting in the freeze. I would suggest you try running the program on another computer to check on the RAM front of things. If this is a limitation of the GE Plugin then unfortunately I cannot do much to help, but if you think this is a problem coming from your LabVIEW code then you can post your code here and I can take a look.
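    If the plugin cannot be made cheaper per placemark, one hypothetical workaround is to cap how many placemarks are on the globe at once. This Python sketch uses stand-in callbacks (`add_placemark`, `remove_placemark`) for whatever calls the LabVIEW code actually makes into the plugin:

```python
from collections import deque

# Hypothetical mitigation: keep only the most recent `limit` placemarks on
# the globe so plugin CPU does not grow without bound. add_placemark and
# remove_placemark stand in for the calls your code makes into the plugin.
class PlacemarkBuffer:
    def __init__(self, limit, add_placemark, remove_placemark):
        self.buf = deque()
        self.limit = limit
        self.add_placemark = add_placemark
        self.remove_placemark = remove_placemark

    def push(self, placemark_id):
        if len(self.buf) >= self.limit:
            self.remove_placemark(self.buf.popleft())  # evict the oldest one
        self.buf.append(placemark_id)
        self.add_placemark(placemark_id)
```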

  • Run-time engine problem in LabVIEW 2009 and LabVIEW 2010

    I have a problem with LabVIEW 2009 and LabVIEW 2010. I updated my LabVIEW 2009 to 2010, but it turned out to be a trial version, because I did not have the serial number. So I uninstalled LabVIEW 2010; however, then the odd behavior started: I could no longer use my LabVIEW 2009 either. So I uninstalled LabVIEW 2009 as well, but then I could not reinstall it. Every time, I got a runtime error and could not proceed with the installation. In addition, any installation related to LabVIEW is not permitted, and the same error comes up every time. It is very annoying.
    So, What is the deal?
    I attached the error here. Any comments or advice are welcomed and appreciated.
    Attachments:
    error.docx ‏2305 KB

    By chance, is this machine's language set to any non-English locale?  You would check the locale setting by:
    1. Opening Control Panel.
    2. Opening "Regional and Language Options".
    3. Looking under "Regional Options" >> "Standards and Formats".
    If it is set to something besides English, try setting it to English, and please report back what locale it was set to (or whether this solves the problem).
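    For reference, the current locale can also be read programmatically. This small Python snippet (offered only as an alternative way to check, not as part of the installer) prints it:

```python
import locale

# Print the machine's locale so you can confirm whether it is non-English
# before retrying the installer (same information as the Control Panel path).
locale.setlocale(locale.LC_ALL, "")
print(locale.getlocale())
```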
    Regards,
    - WesW

  • Report Generation Toolkit and LabVIEW 8?

    I currently have LabVIEW 7.1 and LabVIEW 8 on my computer.  I have installed the Report Generation Toolkit 1.1.1, but when I try to open several of the Excel and Word VIs in LabVIEW 8, I get the error that "constants wired to case structures were changed to a hidden control to maintain compatibility with labview 7.1 and earlier", and therefore the subVIs don't run.  LabVIEW 7.1 doesn't have the toolkit installed, and I've even tried uninstalling 7.1, but nothing works.  How do I fix this?  Relinking to the subVI hasn't worked because it is not offered on any of the errored VIs or subVIs.  The broken wires go into Invoke Node VIs, but I don't know how to fix them; it doesn't allow me to rewire them.

    If you don't want to use it in LabVIEW 7.1, just do a mass compile with LabVIEW 8.0 on your package's directory.
    That can solve a lot of this kind of problem.
    Be sure that you do not need it with your previous version.
    Benoit
    Benoit Séguin
    Software Designer

  • How to acquire data through multiple channels in parallel using a PXI-6070E, a PXI-4071, and LabVIEW?

    Hi,
    I am using NI LabVIEW, an NI PXI-4071, and an NI PXI-6070E to measure current through a variable resistance. At the moment I am using one channel from the SCB-68, but I want to add another channel in parallel so that I can have two resistors instead of one and measure the current through both.
    I have attached a PDF file showing the hardware setup in use, and the LabVIEW code as well.
    Can anyone look at these files and give me guidelines or ideas to help me resolve this issue, please?
    Thanks in advance.
    Best Regards,
    Shaheen.
    Solved!
    Go to Solution.
    Attachments:
    IV copy for HS.vi ‏55 KB
    Layout of NI Cards.pdf ‏248 KB

    Your 4071 can only do one measurement at a time. Your DAQ card cannot measure resistance either, nor has it got any analogue inputs.
    However, you could use a multiplexer and multiplex your 4071 DMM. This won't give you simultaneous measurements, but you can acquire data one channel after another; the speed obviously depends on the multiplexer you choose!
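    The sequential scan described above can be pictured as a simple loop. In this Python sketch, `select_channel` and `read_dmm` are hypothetical stand-ins for whatever your multiplexer and DMM drivers actually provide:

```python
import time

# Sequential (multiplexed) measurement sketch: one DMM, several channels
# read one after the other. select_channel and read_dmm are hypothetical
# stand-ins for the real multiplexer/DMM driver calls.
def scan_channels(channels, select_channel, read_dmm, settle_s=0.0):
    """Route each channel to the DMM in turn and take one reading per channel."""
    readings = {}
    for ch in channels:
        select_channel(ch)    # switch the multiplexer to this channel
        time.sleep(settle_s)  # allow relay settling time before measuring
        readings[ch] = read_dmm()
    return readings
```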
    Hope this helps.
    Beginner? Try LabVIEW Basics
    Sharing bits of code? Try Snippets or LAVA Code Capture Tool
    Have you tried Quick Drop?, Visit QD Community.

  • MATLAB and LabVIEW Communication Optimal Performance

    I have tried my own code and searched through forums and examples to try to figure out the best method to communicate between LabVIEW and MATLAB.  Most of the information I found was over a year old, and I was wondering whether there is a better current solution.  My goal is to collect the data in LabVIEW, process it in MATLAB, and return the results to LabVIEW.  I have encountered some difficulty in my search, and before I delve further into one method in particular, I was wondering if anybody has an optimal solution for this communication, or solutions to the errors I have encountered thus far.
    I have looked at the following methods.
    1)TCP/IP and a very good example found here: http://www.mathworks.com/matlabcentral/fileexchange/11802-matlab-tcp-ip-code-example
    When I try to adjust even the example and communicate for my own purposes I get the following errors
    Error 63 if MATLAB server not running
    Error 66 occurs if the TCP/IP connection is closed by the peer. In this case, Windows notices that no data is returned within a reasonable time, and it closes the TCP/IP connection. When LabVIEW attempts to communicate after Windows closes the connection, error 66 is the result. 
    However, the example itself works perfectly and does not get these errors
    2) MathScript Node: works, but the post below states that the MATLAB node is faster.
    "computing fft of a 1024x1024 matrix ten times (attached code). Result is that Matlab node version takes 0.5s versus 1.6s for Mathscript node version."
     http://forums.ni.com/t5/LabVIEW/Why-are-mathscript-performances-much-below-matlab-s/m-p/2369760/high...
    3) MATLAB node, which uses ActiveX technology: seemingly works well, but loses time in data transfer.
    4) Using the ActiveX functions directly, or other automation options.
    5) Other solutions that I have not found that might be better suited.
    Thank you for any help or suggestions in advance. 
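    For what it's worth, the two TCP failure modes listed above (error 63 when nothing is listening, error 66 when the peer closes an open connection) can be reproduced with a few lines of plain Python; `try_connect` is an illustrative helper, not part of the MATLAB server example:

```python
import socket

# Sketch of the two failure modes described above, assuming a plain TCP
# server such as the MATLAB example provides.
def try_connect(host, port, timeout=2.0):
    """Return a connected socket, or None if nothing is listening (error 63)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect((host, port))
        return s              # server is up; LabVIEW would connect fine
    except (ConnectionRefusedError, socket.timeout):
        s.close()
        return None           # LabVIEW reports this situation as error 63

# Error 66 is the other case: the connection opened successfully, but the
# peer closed it mid-conversation, so a later read finds the socket dead.
```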

    Barp and Mikeporter,
    Thank you for your assistance:
    The reason I need to do the processing in MATLAB is, as you mentioned, that the processing script comes from another person who has already developed it in MATLAB.  I almost have to treat it as a black box.
    The interesting thing about the TCP/IP method is that none of the errors show up when I run the example, but if I try to modify it in a new VI I get the errors.
    I have attached a simple program with a basic Butterworth low-pass filter; I am trying to confirm whether it works in the MATLAB node.  Other simple codes I have made work, but this one does not seem to implement the appropriate filter.  The LabVIEW signal and LabVIEW filter seem to work at the default values (but not if I change the sampling rate) for the simulated signal; the MATLAB signal and MATLAB filter work, but the LabVIEW signal processed in MATLAB is not working.
    Ideally the signal would be band-pass filtered (0.1-30 Hz) at a sampling rate of 256 Hz and further processed from there, but I can't even get the low-pass filter to work in the MATLAB-to-LabVIEW communication.
    Any help would be greatly appreciated.  Once I have that working, I will have more of an idea of the constraints of the actual MATLAB processing code I will be using.
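    As a cross-check on what the MATLAB node should return, MATLAB's `filter(b, a, x)` can be reproduced in plain Python (direct form II transposed). The coefficients in the usage lines are an illustrative first-order low-pass, not the 0.1-30 Hz Butterworth design itself:

```python
# Pure-Python equivalent of MATLAB's filter(b, a, x) (direct form II
# transposed), usable to check what the MATLAB node should return for a
# given set of Butterworth coefficients.
def iir_filter(b, a, x):
    a0 = a[0]
    n = max(len(a), len(b))
    b = [c / a0 for c in b] + [0.0] * (n - len(b))   # normalize and zero-pad
    a = [c / a0 for c in a] + [0.0] * (n - len(a))
    if n == 1:                                       # pure gain, no state
        return [b[0] * s for s in x]
    z = [0.0] * (n - 1)                              # delay-line state
    y = []
    for s in x:
        out = b[0] * s + z[0]
        for i in range(1, n - 1):
            z[i - 1] = b[i] * s + z[i] - a[i] * out
        z[n - 2] = b[n - 1] * s - a[n - 1] * out
        y.append(out)
    return y

# Illustrative first-order low-pass: the step response rises toward 1.0.
alpha = 0.1
y = iir_filter([alpha], [1.0, alpha - 1.0], [1.0] * 20)
```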
    Thank you again.
    -cj18
    Attachments:
    labview_matlab_filter.vi ‏70 KB
