Run LabVIEW without toolkits - Practice for CLD

I was wondering if there is a key combination or way to start LabVIEW without any toolkits and only load the core components. I wish to practice for the CLD, and I understand that only the core VIs will be loaded on the grader's computer.

As long as you don't plan to use any toolkit, you don't have to worry about that.  On another note, if you contact your local NI app engineer, he/she may be able to come over to your company and proctor the exam for you.  The NI app engineer will usually let you use your own computer as well (as long as you don't use any existing subroutines and templates).

Similar Messages

  • Do I need 2010 to practice for the CLD? I currently have 2009.

    Hi All,
    Well, I finally got round to doing my CLAD yesterday and passed, yippee! Also, by coincidence, I became an 'Active Participant' on this forum - like all my Christmases coming at once.
    I had to do the CLAD as I am planning on doing the CLD exam in a couple of weeks. My question is this: I have LabVIEW 2009, but I'm sure the test will be in 2010. Are there that many differences that I should download an evaluation version of 2010 to practice, or will I be OK using what I have?
    I'm just worried that I'll turn up and waste an hour navigating 2010. If the layout and palettes are similar to 2009 then I think I will be OK.
    Thanks in advance for your help.
    Lucither
    "Everything should be made as simple as possible but no simpler"

    Thanks everybody for your input. I am now more confused than before!
    My main concern was with the layout of 2010. I was worried it would be like going from Word 2003 to Word 2010; I still struggle to use the latest version of Word as I have always stuck with 2003 on my computer. From what I hear, this is not going to be a problem. I am reluctant to get 2010 just for the CLD as I have no intention of buying 2010. I'm sure if I downloaded the eval version I would like it, and knowing my level of will-power I would be forced to part with a lot of cash.
    Thanks again for your advice.
    Regards,
    Lucither.
    "Everything should be made as simple as possible but no simpler"

  • Best practice for upgrading task definition without deleting task instances

    What is the best practice for upgrading a task definition in a production system without deleting or terminating task instances?
    If I try to update a task definition with task instances running, I get the following error:
    Task definition 'My Task - Add User' may not be modified while there are active task instances
    Is there a best practice to handle this? I tried to force an update through the console, but that didn't work. I tried editing the task from the debug page and got the same error.

    1) Rename the original task definition.
    2) Upload the new task definition with the original name.
    3) Later, after all the running tasks have timed out, delete the old definition.
    E.g., if your task definition is "myWorkflow":
    1) Rename "myWorkflow" to "myWorkflow-old-2009-07-28"
    2) Upload the new task definition as "myWorkflow".
    Existing tasks will stay linked to the original (renamed) workflow definition.
    New tasks will use the new definition.
    As the previous poster notes, depending on the changes you are making, letting the old task definitions stay active could have bad side-effects and might be better avoided.

  • Best practice for running multiple sites on 1 CF install?

    Hi-
    I'm setting up a new hosting environment (Windows Server 2008 Standard 64-bit VPS configuration, MySQL, IIS 7, CF 9).
    Has anyone seen any docs, or can anyone suggest best practices for configuring multiple sites in this environment? At this point I'm thinking simple is best: one new site in IIS for each client (domain), pointed at CF.
    Given this environment, is anyone aware of any gotchas within the setup of CF 9 on IIS 7?
    Thank you in advance,
    Rich

    There's nothing wrong with that approach. You can run as many IIS sites as you like against a single CF install.
    As for installing CF on IIS 7, I recommend the following: install CF 9 without connecting it to IIS, then install the 9.0.1 upgrade and any hotfixes, then connect CF to IIS using the web server configuration utility. This will keep you from having to install the IIS 6 compatibility layer that's needed with CF 9 but not with CF 9.0.1.
    Dave Watts, CTO, Fig Leaf Software
    http://www.figleaf.com/
    http://training.figleaf.com/

  • Tips and best practices for translating C into LabVIEW? SERIOUS newbie...

    I need to translate a C function into LabVIEW.  This will be my *first* LabVIEW project.  I've been reading some tutorials, and I'm still struggling to get my brain out of "C/C++ mode" and learn the LabVIEW paradigms.
    Structurally, the function that I need to translate gets called from a while-loop and performs a bunch of mathematical calculations. 
    The basic layout is something like this (this obviously isn't the actual code, it just illustrates the general flow control and techniques that it uses).
    #include <math.h>   // for pow()

    struct Params
    {
        // About 20 int and float parameters
        int   someParam;
        float someOtherParam;
    };

    int CalculateMetrics(struct Params *pParams,
                         float input1, float input2 /* etc. */)
    {
        int   errorCode = 0;
        float metric1;
        float metric2;
        float metric3;

        // Do some math like:
        metric1 = input1 * (pParams->someParam - 5);
        metric2 = metric1 + (input2 / pParams->someOtherParam);
        // Tons more simple math
        // A couple of for-loops

        if (metric1 < metric2)
        {
            // manipulate metric1 somehow
        }
        else
        {
            // set some kind of error code
            errorCode = ...;
        }

        if (!errorCode)
        {
            metric3 = metric1 + pow(metric2, 3);
            // More math...
            // etc...
        }

        // update some external global metrics variables
        return errorCode;
    }
    I'm still too green to understand whether or not a function like this can translate cleanly from C to LabVIEW, or whether the LabVIEW version will have significant structural differences. 
    Are there any general tips or "best practices" for this kind of task?
    Here are some more specific questions:
    Most of the LabVIEW examples that I've seen (at least at the beginner level) seem to rely heavily on using front panel controls to provide inputs to functions.  How do I build a VI where the input arguments (input1, input2, etc.) come in as numbers and aren't tied to dials or buttons on the front panel?
    The structure of the C function seems to rely heavily on the use of stack variables like metric1 and metric2 in order to perform calculations.  It seems like creating temporary "stack" variables in LabVIEW is possible, but frowned upon.  Is it possible to keep this general structure in the LabVIEW VI without making the code a mess?
    Thanks guys!

    There are already a couple of good answers, but to add to #1:
    You're clearly looking for the equivalent of a typical C function. Any VI that doesn't require its front panel to be opened (user interaction) can be such a function.
    If the front panel is never opened, the controls are merely used to send data to the VI, much like (identical to) the parameters in a C function declaration. The indicators can/will be the return values.
    Choosing which controls and indicators send data into and out of a VI is almost too easy: click the VI's icon on the front panel (top right), show the connector, and click which control/indicator goes where. Done. That's your function's declaration.
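    To put the analogy in C terms, reusing the names from the original snippet (this is only an illustration of the mapping, not anything LabVIEW generates): the controls play the role of the parameters and the indicators play the role of the outputs, roughly like
    int CalculateMetrics(struct Params *pParams,      // control   (input)
                         float input1, float input2,  // controls  (inputs)
                         float *metric1,              // indicator (output)
                         float *metric2,              // indicator (output)
                         float *metric3);             // indicator (output)
    Wiring a control or indicator to a connector-pane terminal is the equivalent of adding it to that parameter list.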
    Basically, one function is one VI, although you might want to split it even further; don't create 3k x 3k pixel diagrams.
    Depending on the amount of calculation done in your if-thens, they might be subVIs of their own.
    /Y
    LabVIEW 8.2 - 2014
    "Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
    G# - Free award winning reference based OOP for LV

  • How to run PayPal without payment for test/demo products as a pass-through?

    I have a paid report on my website whereby users go to PayPal to pay for the report first, then my backend system provides the report.  From time to time I need to run non-paid reports as tests or demos.  Is there a way to set up a credit card on my account, or a process whereby I can run PayPal without payment, as a pass-through?

    What you're explaining is exactly what the sandbox is for: testing and demoing PayPal solutions and the events that are triggered when transactions are completed. It's a recommended practice anyway to keep a copy of your site on a test server somewhere which mirrors everything about your live server except that it works with the sandbox instead of the live PayPal site. That way future development can be done on your test site with the sandbox and not moved to your live site until it's all demoed and proven to be fully functional as expected. It's also a nice place to always have functional demos for any purpose at all.
    So yes, the sandbox will definitely suit your needs and would be the recommended way to go. With proper integration, the test/sandbox site and your live site can easily be switched back and forth using a flag in a config file or a URL parameter, for example. On that note, you could set that sort of thing up on your live site, too, so that everything works in the sandbox when this flag is provided. Then you wouldn't have to maintain separate servers. Again, though, it's recommended for developers to always keep separate test servers anyway.
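    As a sketch of that config-flag idea (shown in C-preprocessor style purely for illustration - your site will use whatever language it is actually built in, and apart from PayPal's published sandbox host name every identifier here is made up):
    // Hypothetical build-time switch between the PayPal sandbox and the live site.
    #define USE_PAYPAL_SANDBOX 1

    #if USE_PAYPAL_SANDBOX
    static const char *kPayPalHost = "https://www.sandbox.paypal.com";  // sandbox endpoint
    #else
    static const char *kPayPalHost = "https://www.paypal.com";          // live endpoint
    #endif
    Flipping that one flag (or reading the equivalent setting from a config file at startup) points the same code base at the sandbox or at the live site.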

  • Run a LabVIEW program without the need for a LabVIEW license - is it possible?

    Hi
    I wonder, is it possible to have a LabVIEW program that can be run without the need for a LabVIEW license?
    We can produce executable LabVIEW programs now, but they need LabVIEW licenses.
    Best regards
    s Farashi

    That does not make any sense. You have a license for the development system. You can create an exe and distribute as many copies as you want without further licenses. The only thing you have to distribute with the exe is the runtime engine. Where did you get the idea you needed a license for each exe?

  • What is the best practice for running a long report/query against an active database?

    We are using SQL Server 2012 EE but currently do not have the option to run queries on an R/O mirror, though that is my long-term goal. I am concerned I may still run into the issue below in that scenario as well, since the mirror would also be updating the data I am querying.
    I have a view that joins across several tables from two databases and is used by an invoicing program on existing data. Three of these tables are also actively updated by ongoing transactions. Running a report that used this view did not use to be a problem, but now our database is getting larger and we have run into some timeout problems for the live transactions coming in.
    First the report query was timing out, so I set the command timeout to 0 and reran the query, which pegged all 4 CPUs at 100% for 90 minutes, and so I finally killed it. Strangely, there were no problems with active transactions during that time, so I'm wondering if the query was really doing anything useful or somehow spinning and waiting. I reviewed the view and found a field I was joining on that was not indexed, so I created an index on that field and reran the report, which then finished in three minutes with all the CPUs busy but not at all pegged out. Same data queried both times. I figured the problem was solved. Of course, later, my boss ran a similar invoice report, with the same amount of data, and our live transactions started timing out 100% while his query was running. I did not get a chance to see the CPU usage during that time.
    I looked at the execution plan of the underlying view and added the suggested index, but that did not help. When I run just the view in SQL Server it does not seem to cause any problems and finishes in a couple of seconds. Perhaps something else is going on in the reporting tool using the view.
    My main question is: given that I have to use the live and active database, what is the proper way to run a long R/O query/report so that active transactions can still continue to update the tables that I am querying? sp_who2 did show transactions being blocked, so I guess a long query accessing the tables blocks live transactions accessing those same tables, but certainly I'm not the only one doing this. I am considering adding "with (nolock)", but am hoping there is a better standard practice, as that clause can return dirty data and I understand why.
    Thanks, Dave

    Hello
    You can change the DB isolation level to READ UNCOMMITTED:
    http://technet.microsoft.com/en-us/library/ms378149(v=sql.110).aspx
    or use WITH (NOLOCK).
    I do use the NOLOCK option for dirty reads to avoid locks on the tables.
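    For example (the table and column names below are made up for illustration, they are not from the original post), the hint is applied per table reference, while the isolation level applies to the whole session:
    -- Table hint on each table referenced by the report query (hypothetical names):
    SELECT i.InvoiceID, i.InvoiceTotal
    FROM dbo.Invoices AS i WITH (NOLOCK)
    JOIN dbo.InvoiceLines AS l WITH (NOLOCK)
        ON l.InvoiceID = i.InvoiceID;

    -- Or set the isolation level once for the reporting connection:
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
    Either way you get the dirty-read behavior described above, so it is only appropriate where approximate results are acceptable.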
    Javier Villegas | @javier_vill | http://sql-javier-villegas.blogspot.com/

  • Run LabVIEW as a different Windows user (for database connectivity)

    I have to connect to a corporate remote MSSQL database on the network, using the Database Connectivity Toolkit for LabVIEW.
    The DB authentication works by checking the Windows user (ADS) who accesses the database (not a database user/password, which is standard in the database VIs).
    Therefore I have to run LabVIEW as a defined Windows user, different from the user logged in on the PC.
    Is that possible to realize with LabVIEW?
    Other Windows tools allow you to define the Windows user/password which "runs" an application - how can I do that with LabVIEW?

    Hi Zav.  There is a Windows command, 'runas', which _may_ let you do what you want; I haven't tried what you need to do, but it has worked for other tasks for us.  Try 'runas /?' for the switches.
    You will have to build your LV program into an executable, then use runas to launch it.  Is there a reason you can't just log in to Windows as the required user?
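    As a rough sketch (the domain, account name, and path below are placeholders, not anything from this thread), the launch would look something like:
    runas /user:CORPDOMAIN\db_report_user "C:\MyApp\MyLabVIEWApp.exe"
    runas prompts for that user's password and then starts the executable under that account, so the database sees the expected Windows identity.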
    If you can get your DB admin to allow dbuser/password authentication, that would be a much better way to go.
    Matt

  • Best Backup Practice for Solaris 10 running Zones

    What is the best practice for backing up a Solaris 10 server with zones?
    I have Solaris 10 installed here on a V240 running 4 zones. All services run off the local zones. We use Veritas Backup Exec 9, and I have the Solaris agent. Should I back up the data on each of the zones as if they were their own systems, or should I only run the Backup Exec agent on the global zone?
    It would make sense to just back up from the global zone, as it is only one system to back up.
    We are working on our backup procedures now, and I want to make sure that we are doing this the best way possible.
    Any thoughts?

    Hey there,
    Have a look at this excellent guide; I think you may find what you need here. If not, drop an email back. To begin with, you need an extra disk (or disks), since you will mirror your data. This is sort of a prerequisite.
    Good luck,
    Pierre
    http://www.adminschoice.com/docs/solstice_disksuite.htm

  • Can I run a setup without a CD for a WRT54G v5?

    Can I run a setup without a CD for a WRT54G v5?

    You can configure the router by accessing the router's configuration page; you do not necessarily have to configure the router with the Setup CD.
    If you face any difficulties, let us know the name of your Internet Service Provider, and your router model and version number.

  • HT1414 After installing the latest version of iOS my iPhone 4s rapidly runs out of power; it also suddenly (fully charged) shuts down and restarts without me asking for it. I tried to reinstall the latest version of iOS but it doesn't help. Any advice?


    Back up and restore your software via iTunes. If the problem continues, restore as a NEW device. If this solves it, that means there is some corruption in your backup file. If the problem is still there, you should take it to the Genius Bar at an Apple Store for evaluation.

  • My stoplight.vi for CLD practice exam

    I am preparing for the CLD and looking for comments on my implementation of the "stoplight" CLD example question.  Any comments would be appreciated.  Kudos will be distributed liberally.
    The pdf for the example question is in this zip file:
    ftp://ftp.ni.com/pub/devzone/epd/2419.zip
    Attachments:
    CLD_Traffic_Light.zip 42 KB
    Sample CLD Exam - Traffic Light.pdf 350 KB

    It looks like your code does everything that the CLD exam wants. I see that you documented everything but the main VI; you might just want to explain what the purpose of the main VI is. I think your code is simple and easy to read. The state machine is simple and works well, and everything seems to flow nicely. You document well what is going on in each loop.
    I hope this was useful info.
    Tim
    Johnson Controls
    Holland Michigan

  • Best practices for file I/O within producer/consumer loops

    I'm looking to add file recording and playback functionality to a pre-existing data collection program.  The original program is based on a Moore-style state machine, to which I have added four additional states: Record Start, Record Stop, Playback Start, and Playback Stop.
    What I have done, and what has since been identified as poor programming practice, was to "initialize" (either create or load) the appropriate file within the state machine loop during the "Start" command (for record or playback functionality), and then provide the file reference as an indicator, which is linked to for the appropriate read or write operation (whether I'm playing back or recording).  The actual I/O occurs within the consumer loop (screenshots attached).
    This is my first LabVIEW project outside of tutorials or other small examples, so any advice and constructive criticisms are welcome, specifically with regard to file I/O and refnum routing (it gets a little hairy in the consumer loop)!
    I'm running LabVIEW 8.6 on Vista Business.
    Attachments:
    Playback Start.JPG 71 KB
    Consumer_Loop.JPG 122 KB
    File Playback subvi.JPG 14 KB

    jamoore84 wrote:
    Ben,
    Thanks for the suggestion.  I think it's a little outside my ken at this moment, but I'll look it over.  Despite any grievous coding transgressions, I have experienced some limited success with the current setup.  While the use of Action Engines/Functional Globals may constitute the best practice, I might revise my post to read "acceptable and/or easily absorbable practices" instead.
    Are there any other opinions on this?  Let me start by listing a problem and posing a question:
    Problem:  I am able to play back a file only once.  Subsequent attempts at file playback do not work.
    Thanks in advance,
    jimmy 
    Hi Jimmy,
    I don't give up easily.
    Let me try to explain the issue with race conditions with a contrived example, the "Command by Mail box" case.
    Imagine you had a job where you received your orders via an old-fashioned mail box. You never really saw your supervisor but relied on getting orders via the mail box. Now imagine the mail box could only hold one message, and any time a new message was inserted, the old one would fall into the trash.
    So you come to work each day, check your mail box, do what was ordered, and everyone is happy.
    The next day you come in and, without your knowing, you are assigned to do the work of two bosses. So as long as you check your mail box more often than the two of them assign work, everything is fine... until you take a day of vacation! You come back in the day after vacation, check your mail box, and work away until you catch hell for ignoring orders!?! Well, it turns out the order from boss 1 was replaced by the order from boss 2 while you were on vacation. Oh bother!
    How can we fix it?
    1) Expand the mail box so it holds more than one order. You just process them in the order they are received.
    2) Change the mail box to not accept a new order until you have removed the old one.
    Now back to reality!
    Local variables act like the funky mail box. The last message inserted overrides the previous one.
    Multiple variable writers are like multiple bosses.
    Queues operate like an expanded mail box, letting you handle each message in order.
    Action Engines operate like the "mutexed" mail box.
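    To make the analogy concrete in text form, here is a hypothetical C sketch of the idea only (in LabVIEW the "mailbox" would be a local variable and the fix would be a real queue or an Action Engine):
    #include <stdio.h>

    #define QUEUE_SIZE 8

    int main(void)
    {
        // The funky mail box: one slot, last write wins.
        int mailbox = 0;
        mailbox = 1;            // boss 1 drops off order #1
        mailbox = 2;            // boss 2 drops off order #2
        printf("mailbox holds order %d (order 1 fell in the trash)\n", mailbox);

        // The expanded mail box (queue): every order is kept, in arrival order.
        int queue[QUEUE_SIZE];
        int count = 0;
        queue[count++] = 1;     // boss 1's order is enqueued
        queue[count++] = 2;     // boss 2's order is enqueued
        for (int i = 0; i < count; i++)
            printf("processing order %d\n", queue[i]);

        return 0;
    }
    The single int behaves like a LabVIEW local variable with two writers; the array behaves like the queue recommended above, which is why no orders get lost.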
    Why I don't want to encourage you to "just patch up" what you have:
    All of the less-than-ideal solutions either over-sample (check e-mail twice as often as the bosses assign work; a waste of CPU and an exercise in futility if you are coding in a non-Real-Time environment like Windows) or use a mutex to control access to the shared resource (in this case, local variables).
    LV offers mutexes through semaphores (found on the synchronization palette), but...
    WHY WORK SO HARD?
    In my AE Nugget I explain that the execution of an AE is automatically protected by LV. So in the long run it will actually be easier to learn how to use the AE programming construct than it will be to learn how to solve the problem without it.
    So GO FOR IT! Take the lazy route and learn how to use the AE construct. Use the synchronization palette for Queues.
    Just trying to help,
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, or for conscientious computing in general.  I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptable Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to have a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning and will be there if/when you have a failure or loss of data.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel
