Compare LabVIEW and Lookout

I have one water distribution system which calls for LabVIEW and another which calls for Lookout. What are the specific differences between the two, and what are the applications where one would work better than the other?

Hi,
There are so many differences between the two packages that it is difficult to describe them all. One main difference is ease of programming versus programming capability. Lookout lets you get up and running with a program very quickly; however, Lookout is not easily customizable. This is where LabVIEW shines. LabVIEW can do anything that Lookout can do (as long as you have the Datalogging and Supervisory Control Module), and you can do much more. LabVIEW is also very easy to use compared to other programming languages.
I would suggest that you call us at 800-433-3488 and ask any specific questions that you may have.
Regards,
Mike
NI

Similar Messages

  • Security Vulnerabilities in LabVIEW DSC and Lookout - W32/Sdbot.worm

    The LabVIEW DSC module and Lookout install the Microsoft MSDE 2000 database. By default, the 'sa' password is left blank. Several viruses exploit this known security vulnerability in MSDE 2000.
    You can prevent infection by applying a secure 'sa' password to MSDE 2000.
    Make sure the MSSQL Server service is running
    Execute the following command line (replace new_password with the desired password):
    osql -U"sa" -P"" -Q"sp_password NULL, 'new_password', 'sa'"
    In particular, variants of the W32/Sdbot.worm virus are known to exploit this vulnerability.
    Refer to this KB or the DSC Module readme for more information.
    <http://digital.ni.com/public.nsf/websearch/42DFA4993437D7EE86256DE800570B39?OpenDocument>

    Ben wrote:
    Thank you for getting the word out fast!
    1) Which versions of LV DSC installed "MSDE 2000"?
    2) How do I "Make sure the MSSQL Server service is running"?
    3) How do I start " the MSSQL Server service "?
    4) When you said "Execute the following command line " you mean go to Start >>>Run... and paste in the string you provided, correct?
    5) If I am using a standard firewall from McAfee or the like, should I expect a notification when the attack occurs?
    Trying to be careful,
    Ben
    1) I believe all LabVIEW DSC versions since 7.0.
    2) You should see an SQL Server icon in the system tray; if you select it, you will see a green arrow if the service is running and a red square if it is not. If you have the full version of SQL Server installed, or a version such as 2005, this might look different. In my case, with the Microsoft SQL Server Development Edition, there is a separate application called SQL Server Configuration Manager in the Start menu that shows an overview of this and other things.
    3) Click on the icon in the system tray and select Start, or go to the above-mentioned Configuration Manager if available, or, at the lowest level, open Administrative Tools in the Control Panel, select Services, and start the SQL Server service from there.
    4) I would consider a real command console a bit more appropriate, since you can see the feedback if something goes wrong (see the example command sequence below).
    5) If the firewall is worth anything, I would strongly expect so, yes.
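    As a concrete illustration of points 2-4, the whole check-and-fix can be done from a command prompt. This is only a sketch: it assumes MSDE was installed as the default instance (service name MSSQLSERVER); a named instance would appear as MSSQL$InstanceName instead, and both the service commands and osql (via its -S option) would need that name.
    rem Check whether the SQL Server service is running
    sc query MSSQLSERVER
    rem Start it if it is stopped
    net start MSSQLSERVER
    rem Set a non-blank 'sa' password (replace new_password with your own)
    osql -U sa -P "" -Q "sp_password NULL, 'new_password', 'sa'"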
    Rolf Kalbermatter
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Labview DSC 8.5 and Lookout 6.1

    Ok... so I'm moving to some new hardware and new OSes, along with a LabVIEW upgrade and Lookout 6.1.
    I have not separated these stages yet so I'm hoping NI can help me sort this out.
    The scenario:
    Run Windows Server 2008 and install LabVIEW 8.5, then LabVIEW DSC 8.5, then Lookout 6.1
    Lookout hangs on startup!
    Now, Windows Vista with SP1 is essentially the same as Windows Server 2008 with SP1, so the question(s) is/are:
    1.)  Is Lookout 6.1 compatible with Vista SP1?
    2.)  Will Lookout 6.1 run properly installed on top of LabVIEW 8.5 w/ DSC?  and/or Can LabVIEW DSC 8.5 and Lookout 6.1 coexist and work properly?  If yes, can they run simultaneously on the same machine?
    3.) Does Vista SP1 break Lookout 6.1?  If not, why does it break on W2k8 server?
    I understand NI does not support server OSes (can't really figure that out, since Lookout actually runs more stably on a server OS...), but Lookout has always run fine on Windows Server OSes since NT4.
    I just want to know whether to bother setting up the same software scenario on Vista w/o SP1, or if you NI wizards know for sure that something else is breaking the scenario...
    Thanks,
    Ed

    1. We can't say whether Lookout 6.1 supports Vista SP1 or not. Officially, Lookout 6.1 does support Vista, but SP1 is newly released and we haven't tested it yet; if it runs fine on SP1, that's good, but we can't guarantee it. We will test the latest OS and service pack for the next release of Lookout.
    2. Yes. You can install both on the same machine and run both, at least on a supported OS.
    3. We haven't tested it yet.
    We don't support server OSes because we have never tested on a server OS, but that doesn't mean Lookout can't run on one. Maybe it runs fine, maybe not. So I'm not surprised that Lookout has always run fine on Windows Server OSes since NT4.
    For Vista SP1, I'm sorry, but you will need to try it yourself.
    For the SPC toolkit for DSC 8.5, it's a known issue. It has been fixed in DSC 8.5.1.
    Ryan Shi
    National Instruments

  • Large applications - Labview and other programming languages

    Hello Labview Users,
    Since the forum saw this very interesting thread about large applications programmed in LabVIEW
    (see: http://sine.ni.com/niforum/niforum?requireLogin=False&forumDU=http://forums.ni.com/ni/board/message?... ), I would like to ask the community about their experiences with LabVIEW applications in combination with other programming languages.
    In advance: I have several years of experience programming LabVIEW applications, from quick-and-dirty solutions which had to run within a few hours to complex test solutions. I saw LabVIEW grow and become better with each released version, and a lot of things I missed in former times have been implemented in the meantime. Currently I have to develop a complex ATE solution with numerous pieces of equipment to control and a large amount of data to be captured and archived. Despite Version 8, I still feel some drawbacks of the LabVIEW concept which make me hesitate to set up the solution completely in LabVIEW:
    1) It is always hard to re-use code from complex applications, since it is not possible to do any kind of global search and replace of functions, variables, etc. It is nearly impossible to re-use proven code structures (e.g. a state machine) if the "inner part" changes more than a little bit.
    2) If the application requires a certain flexibility (e.g. exchangeable test equipment from varying vendors), this is hard to implement, since you have to pass a lot of parameters through your hierarchy if you don't want to use global variables, which slow down your application and hide the code functionality.
    3) Despite modern PCs, the look and feel of LabVIEW applications appears somewhat slow compared to other applications. For complex user interfaces, the polling method generates a lot of complex code. (I don't have experience with the event structure.)
    Now my questions:
    Do you have experience implementing complex solutions by dividing the code modules between LabVIEW and other languages? Which other languages did you use? Why did you use those languages (speed, flexibility of text-based code, available library functions)? Did you find that this improved your development time and code maintainability?
    (I am considering a solution where I do the individual tests with LabVIEW VIs but delegate all the test sequencing and data collection to PERL, which allows for very compact code.)
    I'm curious what your experiences are.
    rainercats

    Given that you're asking these questions in a forum for LabVIEW users, the opinions given are going to be somewhat slanted towards liking LabVIEW. I've been working with LabVIEW for a long time, ever since version 2.something on a Mac. I've written numerous large-scale applications as well as "mundane" instrument drivers. As you've noted, you're experienced with LabVIEW, so you know some of its strengths and weaknesses.
    To address your specific questions:
    (1) Yes, that has always been a limitation in LabVIEW, but I don't believe it is an overriding one to choose C over LabVIEW. Putnam provided one workaround for the search and replace of VIs. Once you've programmed in LabVIEW long enough you get used to doing it this way. Is it clumsy? Yes. As for the re-use of code structures, that's not entirely true. You can create a "template" VI (a regular VI, not a .vit) that contains the code you want to re-use and place in your palette with the "Merge VI" option set. That way you can select it from your functions palette, plop it down on your diagram, and you get the "template" VI's diagram placed right into your new VI.
    (2) This is not something that is specific to LabVIEW, as this exists with any programming language. It's not the language that limits you here, it's how you've designed your code. In a language like C++ you would go with classes. You can do the same thing in LabVIEW. IVI is another option (though not preferred by me).
    (3) As Putnam mentioned, you should be using the event structure.
    Other thoughts:
    The biggest strength I see with LabVIEW is that each VI is a miniature program, which makes development and debugging of functions a snap. With a language like C you have to write another program to call a function in order to debug it. The biggest weakness? I would say the user interface. Yes, even with the event structure. Don't get me wrong, the event structure has vastly improved the way user interfaces and event handling in general are done in LabVIEW, but it simply doesn't measure up to a program written in C or VB. ActiveX, anyone? LabVIEW still doesn't do ActiveX properly in terms of actually getting the controls to work. Programming ActiveX controls is just plain aggravating, what with all the property nodes taking up so much diagram space.
    It certainly makes sense to use the best tools available to you to get the job done. In my recent projects I had to write software to run automated tests on some products my company made. The test modules were written in LabVIEW. The test executive was a proprietary engine driven by a SQL Server database. I had to write a "wrapper" DLL that interfaced between the LabVIEW code and the executive, since the executive hadn't been designed to call LabVIEW DLLs directly. This allowed us to use LabVIEW as the preferred language for developing the test modules and let the guys who were fiddling with the test executive do their bit. Of course, TestStand's premise is basically that.

  • Labview and multicore technology

    Greetings
    The scenario I'm in now is as follows:
    I'm a 1st-year PhD student, trying to find a new, innovative and impressive project in the area of wireless communications systems. While doing my literature review and exploring my favorite knowledge base, the www.ni.com website, I happened to read some titles about the new multicore technology. After watching most of NI's webcasts I got many ideas on how to exploit this technology together with LabVIEW to boost my yet-undefined PhD wireless communications project; in other words, I want to speed up an existing wireless technology (e.g. beamforming algorithms for smart antennas). However, when I met my supervisor I gave him a quick demo, and he wondered which wireless topic could make use of multicore technology together with LabVIEW; I suggested "improving wireless systems using multicore technology and LabVIEW coding". But a class-A student told us that the multithreading problems of multicore technology have already been solved by the software giants, and that there is nothing special about using LabVIEW to design parallel processing, since many other text-based languages can do the same, as their compilers are already designed for multicore systems. I actually got depressed after he told me this, because I intend to use LabVIEW (my favorite software) with multicore technology to simulate the performance of a wireless system and publish a paper about this topic.
    In short, gentlemen, I wish you would guide me and tell me anything related to multicore technology with LabVIEW: any new problem you would like to see simulated in LabVIEW so that I can work on it. Please also refer me to the latest updates about multicore and the latest webcasts, as I have already seen almost all the webcasts now available on the NI website. Any suggestion, any feedback, any new PhD research idea is welcomed and appreciated.
    thanks a lot
    Please help me find something related to wireless, anything new for research... please.
    Labview Lover

    Labview Lover wrote:
    ...use LabVIEW (my favorite software) with multicore technology to simulate the performance of a wireless system and publish a paper about this topic.
    Why not do a project using LabVIEW to create different models of wireless technologies, and create simulations to evaluate their potential performance? The benefit of multi-core programming would be to optimize the speed at which you can run simulations of the models. For instance, you can include spectral analysis using LabVIEW.
    An idea would be to do the above for RAKE receivers or Software Defined Radios.  There are many things you can research and investigate using LabVIEW.  You can create the hardware and compare physical to the model.  There are lots of project ideas in this area.
    R

  • How to sample an analog signal, simulated on labview and get the sample values.

    My project involves a particular detail where I have to sample a simulated sine wave, get the samples, and compare them so as to select a particular length for the Hanning window. It also requires me to experiment with the window size so as to get more useful data out of the sampled signals. Please help me with the sampling part and guide me on how to perform a Hanning window operation on the sampled signal. I have directly used the spectral analysis tool, which produces an FFT spectrum with a default Hanning window setting. But since I have to experiment with the window size and variations, please guide me on the sampling part and on applying a separate Hanning window. Thank you.

    What do you mean by "I have to sample a simulated sine wave"?
    The sine generator will provide you with a waveform (or an array of values you can use); there are basic VIs to do all this. Right-click on the VI and select Help to get detailed information.
    Post your VI if you have trouble with the details.
    How much do you know about LabVIEW and signal theory?
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'
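    For the windowing part of the question, the underlying math is simple enough to sketch. This is only an illustration in C++ (LabVIEW's built-in window VIs do the same thing); the block of samples and its length are placeholders:
    #include <cmath>
    #include <vector>
    // Apply a Hann (Hanning) window to one block of samples:
    // w[n] = 0.5 * (1 - cos(2*pi*n / (N - 1)))
    std::vector<double> applyHann(const std::vector<double>& block)
    {
        const std::size_t N = block.size();      // window length = block length
        std::vector<double> out(N);
        if (N < 2) return out;                   // nothing sensible to window
        const double pi = 3.14159265358979323846;
        for (std::size_t n = 0; n < N; ++n) {
            double w = 0.5 * (1.0 - std::cos(2.0 * pi * static_cast<double>(n) / (N - 1)));
            out[n] = block[n] * w;               // windowed sample, ready for the FFT
        }
        return out;
    }
    Experimenting with the window size then just means choosing how many samples go into each block before the FFT: a longer window gives finer frequency resolution, a shorter one gives better time resolution.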

  • Bridgeview and Lookout Protocol Driver

    I am a BridgeVIEW 2.1.1 Full Development System user and was shipped an upgrade disk without the IA Server disk (4.0?). Now, as I wait for it, I was trying to look for the driver set I had on my previous set of disks (BridgeVIEW 2.1 and IA Server 2.1), and I can't find it, although I have installed it from my previous set before. What am I doing wrong here? Thanks.
    Sudhi

    Sudhi,
    I assume you finally got the National Instruments IA Server CD. If not, you might contact your NI sales representative.
    Actually, the IA Servers (aka IAK Servers, configured through NI Server Explorer) are not available anymore; it is an obsolete NI product. However, if you want to communicate with PLCs and other 3rd-party hardware, I would recommend getting an OPC server from the manufacturer of the hardware. Another option would be to get the Lookout Protocol Drivers from National Instruments, which allow you to communicate through the OPC interface with different PLCs (P/N: 777616-01, Industrial Automation Servers with OPC Interface, Win NT/98/95, CD). Those drivers are included in the Control Edition packages for LabVIEW. Check it out: http://amp.ni.com/niwc/suite/control.jsp?node=10418
    National Instruments also has hardware with FREE drivers and OPC servers to make communication between LabVIEW and the hardware very easy; e.g. the widely known distributed FieldPoint devices and the well-known DAQ/SCXI systems come with OPC servers.
    Please check it out: www.ni.com/opc
    Hope this helps
    Roland

  • Comparing LabVIEW OOP Class Hierarchies

    Hello all,
    The Question
    Does anyone know of a tool for comparing LabVIEW class hierarchies?  (Like "Compare VI Hierarchies", but for .lvclass hierarchies?)
    For Example
    Let's say I have foo.lvclass in two projects and I want to compare the two instances and their respective members.  Does one instance have methods that the other doesn't?  Does a method or private data differ?
    Why?
    I have several applications with dozens of classes, many of which should be interchangeable, and I want to quickly determine which classes are the latest, perhaps consolidating some common base code.
    I thought about posting this as a suggestion on the Idea Exchange, but I wasn't sure if I'm just not seeing functionality that's already there.
    Thanks very much,
    Jim

    Hi Jim,
    There does not appear to be any way to do this. The closest functionality I could find was to use the Compare VIs/Compare VI Hierarchies to compare the .ctl files for two classes. The LabVIEW Help topic documents how to do this:
    http://zone.ni.com/reference/en-XX/help/371361L-01/lvhowto/comparing_vis/
    However, this does not really achieve the functionality you are looking for. Thank you for posting this idea to the Idea Exchange!
    Catherine B.
    Applications Engineer
    National Instruments

  • I am receiving data through RS232 in LabVIEW and I have to store the data into a Word file only if there is a change in the data, and I have to scan the data continuously. How can I do that?

    I am receiving data through RS232 in LabVIEW and I have to store the data into a Word or text file only if there is a change in the data. I have to scan the data continuously. How can I do that? I was able to store the data into the text or Word file, but I could not make it conditional. I am getting the data from RS232 in terms of 0 or 1, and I have to print it only if there is a change in the data from 0 to 1. If I use an if-loop, the data gets printed every time a 0 or 1 is present. I don't know how to do this program; please help me if anybody knows the answer.

    I have attached the VI. It receives the data from RS232 as a string, converts it to binary, and indicates it on LEDs; normally, if the data 1 comes in, the LEDs will be off. If 0 comes in, the corresponding data status is written to the text file. But the problem here is that the same data gets printed many times, so I have to make it so that it prints only once, on a transition from 1 to 0. How do I do that? I have been working on this for a few weeks; please reply if you know the answer.
    Thanking you,
    Attachments:
    MOTORTESTJIG.vi 729 KB
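    No answer is shown in this excerpt, but the logic being asked for is a simple falling-edge detector: remember the previous value and write only when the value changes from 1 to 0. A minimal C++ sketch of the idea (in LabVIEW the previous value would typically live in a while-loop shift register, with the file write inside a case structure):
    #include <fstream>
    // Write a line to the log only on a 1 -> 0 transition of the input bit.
    void logOnFallingEdge(int current, int& previous, std::ofstream& log)
    {
        if (previous == 1 && current == 0) {
            log << "transition detected: 1 -> 0\n";   // runs once per transition
        }
        previous = current;                           // remember for the next scan
    }
    The same pattern handles a 0 -> 1 transition by swapping the comparison.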

  • Problem with Labview and an ARM Cortex

    Good morning,
    I am currently trying to use Labview with a board from ST Microelectronics (MCBSTM32) with an ARM Cortex Processor.
    I use the SDK and have followed the tutorials.
    But, when I try to launch the program (the simple loop written in tutorial no. 2: http://zone.ni.com/devzone/cda/tut/p/id/7029 ), Keil gives me an error via LabVIEW:
    "Argument 'DARMSTM' not permitted for option 'device'."
    It seems that Keil does not allow an ARM Cortex from ST as the device.
    Moreover, after having this problem, I am unable to use any Keil project, even a project which worked before, without LabVIEW. I need to restart the computer.
    I also tried to launch the Keil project generated by LabVIEW without using LabVIEW, and it works. But as soon as I use LabVIEW, I get the error.
    Did anyone already have this error, or know how to solve it?
    Thank you for your answer, and sorry for my bad English.
    Regards,
    Raphaël VAISSIERE

    Hi Raphi,
    So let me make sure I understand:
    The project created in LabVIEW errors out with the message "Argument 'DARMSTM' not permitted for option 'device'."
    If you open the same project in Keil uVision, it runs fine.
    Here are my questions:
    1. How does the code run when run through Keil? Does it deploy and run fine?
    2. Did you follow the porting procedure completely?
    Your target, the STM32F103RB, is technically supported by Keil, but you need to port the RTX kernel to it. This paragraph explains it:
    To determine if your target already supports the RTX Real-Time Kernel, browse to the \Keil\ARM\Startup directory, then browse to the folder that corresponds to the manufacturer of your ARM microcontroller. If there is an RTX_Conf*.c file for your target, then the RTX Real-Time Kernel has already been ported for your ARM device. If no such file exists, skip to chapter 4 for more information on the RTX Real-Time Kernel and a guide for porting RTX to your ARM microcontroller. 
    You also need to port the Real-Time agent to it.
    I just want to make sure that you have followed the guidelines. If you have and are still having problems, we will continue to explore this.
    Thanks,
    National Instruments
    LabVIEW Embedded Product Support Engineer

  • I am trying to use Labview and RP1210 compliant hardware to connect to a truck J1939 bus and receive messages.

    I am trying to use LabVIEW and RP1210-compliant hardware to connect to a truck J1939 bus and receive messages.
    Specifically, I am attempting to read data frames using RP1210_READMESSAGE. I am able to configure the hardware and send a message to the J1939 bus. I think I have not configured something correctly on the receive side. I can use RP1210_SENDMESSAGE and see the message I have sent on the bus using CANalyzer. When I use RP1210_READMESSAGE, I get the timestamp from a message, and the return from the function reports the correct number of bytes (the number matches the number of bytes I sent out plus four bytes for the timestamp). What I am having trouble with is actually receiving the data. I have seen the same behavior with hardware from two different vendors (Vector CANcase XL and Nexiq USB Link), so I don't think the issue is vendor-specific.
    Has anyone been able to make the RP1210_RECIEVEMESSAGE function work correctly?
    Thanks for any help

    Thanks,
    I have already tried that. The links are for the NI RP1210 wrapper. The problem I am having is using LabVIEW to interface with the RP1210 layer. The ReadMessage parameter char *fpchAPIMessage is the output, a pointer to a character array. In this variable I can receive the timestamp of the message but not the message itself. The return shows that the correct number of bytes is available (18 for an 8-byte message), but I can only get the 4-byte timestamp. I think I have to dereference this pointer to view the data, and I am not sure how to do that.
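    A guess at what is going on, not a confirmed answer: fpchAPIMessage is a raw byte buffer, not a NUL-terminated string, so if the Call Library Function Node parameter is configured as a C string, the output stops at the first zero byte, which can easily be right after the 4-byte timestamp. Configuring that parameter as an array of U8 (with a minimum size at least as large as the buffer passed in) should return all 18 bytes. For reference, here is a sketch of how the returned bytes might be picked apart in C++, assuming the RP1210 J1939 read layout implied by the numbers above (4-byte timestamp, 3-byte PGN, priority, source and destination addresses, then the data bytes); check your adapter's RP1210 documentation for the exact layout:
    #include <cstdint>
    #include <cstring>
    struct J1939Read {
        std::uint32_t timestamp;   // first 4 bytes of the buffer
        std::uint32_t pgn;         // 3-byte PGN (byte order assumed, verify against the spec)
        std::uint8_t  priority;
        std::uint8_t  source;
        std::uint8_t  destination;
        const std::uint8_t* data;  // remaining payload bytes
        int dataLength;
    };
    // bytesRead is the count returned by the ReadMessage call.
    J1939Read parse(const std::uint8_t* buf, int bytesRead)
    {
        J1939Read m{};
        if (bytesRead < 10) return m;              // not a full header
        std::memcpy(&m.timestamp, buf, 4);
        m.pgn         = buf[4] | (buf[5] << 8) | (buf[6] << 16);
        m.priority    = buf[7];
        m.source      = buf[8];
        m.destination = buf[9];
        m.data        = buf + 10;                  // 10-byte header + 8 data bytes = 18 bytes read
        m.dataLength  = bytesRead - 10;
        return m;
    }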

  • TCP data between LabVIEW and C++

    Hi
    I am trying to establish a TCP connection between LabVIEW and a C++ program. The server is written in C++ while the client is implemented in LabVIEW. Although the connection is successfully established between server and client, they are unable to correctly understand the data sent/received between them. For example, if I want to send an int-type send_array from the server, I use the standard Winsock function "send" like this:
    send(AcceptSocket,(char*)send_array,129*4,0);
    but when the client in LabVIEW receives this array, it shows unexpected values. As the client, I used "simple data client.vi" with one modification: since the sent data size (129*4 bytes) was fixed, only one TCP Read was used.
    The same problem exists if I send data from the client to the server.
    Kindly help me.
    Best Regards

    It's probably a big/little endian problem. If you are using the Flatten/Unflatten From String functions, you can specify which byte order to use.
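    To illustrate the mismatch: LabVIEW's TCP/flatten functions default to big-endian (network byte order), while an int array sent raw from a C++ program on x86 is little-endian. One way to fix it on the server side is to convert each value before sending; a minimal sketch, reusing the send_array and AcceptSocket names from the post above:
    #include <winsock2.h>   // htonl
    // Convert 129 ints to network byte order (big-endian) before sending,
    // so the LabVIEW client can typecast/unflatten them directly.
    void sendArrayBigEndian(SOCKET AcceptSocket, const int* send_array)
    {
        u_long wire[129];
        for (int i = 0; i < 129; ++i)
            wire[i] = htonl(static_cast<u_long>(send_array[i]));
        send(AcceptSocket, reinterpret_cast<const char*>(wire), 129 * 4, 0);
    }
    The alternative is to leave the C++ side alone and set the byte-order input of Unflatten From String to little-endian on the LabVIEW side.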

  • Questions on Saving and mining data with Labview and DIAdem

    Hi,
    I am sampling two signals at a 200k sampling rate. I am trying to save the data to the hard disk and analyze it using DIAdem.
    If I save the data using the measurement file formats .tdm and .lvm, the file size is about 4 gigabytes for only 10 minutes of acquisition. It is very slow to process.
    I used the software Clampex and pCLAMP (Axon Instruments) before. At the same 200k sampling rate, also acquiring two signals, those programs save the data in .atf format and the size is only 400 MB for 10 minutes of acquisition.
    I wonder if there is also a good way to handle this situation using LabVIEW and DIAdem, and how to do it?

    Hi Jonathan,
    I tried the TDM binary file format. The file size is 800 MB per minute of acquisition and it takes a long time just to open these files. For my application, I have to take data for several hours, so I am looking for a way to reduce the size of the files.
    Is there any other file format that can reduce the size significantly and can be handled easily?
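    A rough back-of-the-envelope calculation (assuming the samples are stored as 8-byte doubles) shows where the size comes from and how much headroom there is:
    2 channels x 200,000 samples/s x 8 bytes = 3.2 MB/s, i.e. roughly 192 MB per minute or about 1.9 GB per 10 minutes as raw binary; an ASCII .lvm file is typically 2-3 times larger again.
    2 channels x 200,000 samples/s x 2 bytes = 0.8 MB/s, i.e. about 480 MB per 10 minutes if the samples are stored as raw 16-bit integers, which is presumably roughly what the ~400 MB .atf files are doing.
    So, if the DAQ device delivers 16-bit data, writing unscaled I16 values (or single-precision floats) instead of doubles, for example with the TDMS streaming functions available in newer LabVIEW versions, should shrink the files by roughly a factor of four while remaining directly loadable in DIAdem.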

  • Please Help!!! VIPM won't connect to labview and I need to figure out why

    I am trying to download the Arduino package from VIPM, and VIPM won't seem to connect to LabVIEW. I have included a picture of the error and two more of my settings. I am using the student evaluation of the LabVIEW software, but I'm not sure that that is what's causing the problems. Any help would be appreciated, as I would like to set up a nice GUI using LabVIEW and an Arduino as my DAQ device. I need this for a project for my professors, so I am rather desperate to get it up and running.
    Attachments:
    Labview Error.jpg 352 KB
    Labview Error II.jpg 141 KB
    Labview Error III.jpg 108 KB

    No, I tried using your settings and it's still not working. I went through every combination of the four items I have on the machine and export list. Still nothing. Do you think it's because I am using the student version?
    Attachments:
    New Labview settings, not working.jpg 123 KB
    New Labview settings, not working.png 267 KB

  • Can I use a camera for application in Labview and VBAI at the same time ?

    Dear all,
    I'm trying to save an AVI file with LabVIEW and run an image processing inspection with VBAI at the same time, on one machine.
    The error "Camera already in use" is displayed.
    My camera is a GigE camera and I work with IMAQdx. I've tested multicast mode, but it only works across several machines.
    How can I do this?
    Thanks for your help,
    Yoann B

    I'm not necessarily saying that. It's been a while since I've used VBAI, so I don't remember all of its capabilities, but if VBAI can do the inspection and the recording at the same time, you should be fine.
    The trick is that only one program can access the camera at a time. The application that opens it reserves the camera, making it unavailable to others.
    Chris
    Certified LabVIEW Architect
    Certified TestStand Architect
