LabVIEW features and converting to C or C++ code

I am a new LabVIEW user, but many of my coworkers refuse to use LabVIEW on any new project.  Their reason is that it is very hard to do a design review when the application is gaining new features or bug fixes.  Is there any way we can convert LabVIEW code into LabWindows/CVI and/or C/C++ code?  Why is LabVIEW more popular in industry than LabWindows/CVI?  How do other companies do design or code reviews?

I think you confuse some things here:
a) Design review:
Design reviews should be part of the design phase in the development process. It is strongly suggested to perform this phase without the use of programming languages (the famous pen'n'paper phase!).
"Design" can incorporate flow charts, data charts or object-oriented design (OOD). All of those are concepts independent of any programming language.
You might be referring to "code reviews", which are done during the implementation phase and have to check whether the code matches
- style guides (code layout, documentation, ...)
- modularity
- desired design
They are done on code implemented in *any* programming language. LV is a programming language, just like C/C++. The basic rules for reviews hold for both, and reviews can be done on both.
b) Conversion of programming languages:
LV is a complete, self-contained programming language called "G". It differs from C/C++ in syntax and semantics. Porting G code to C code is possible (using an add-on module). BUT: Do you write C/C++ code just to convert it to .NET code later on? Most probably not. So why do this to G code?
c) Complexity of adding/changing features:
Again, this is a pen'n'paper task in the first instance. Jumping straight into implementing changes works for small projects, but large projects suffer badly from it. Simply make it a rule: implementation is only the small "routine work" within a project. Designing the modularization and algorithms is something that has to be done "offline" and independently of the programming language used.
LabVIEW is popular in academia and even in industry because it makes it very easy to implement certain design decisions (e.g. flow charts, since THIS essentially is LV code!). So the "routine work" of implementation gets shorter and faster.
Norbert 
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.

Similar Messages

  • Smart Software Engineer with LabVIEW experience (and acoustics a plus) needed in Boston, MA

    We are looking for a staff software engineer to join our company in Boston, MA (near downtown).  We have a 3000+ VI application that has been in continuous development by multiple software engineers (currently 4 engineers + 1 intern) for 15 years.  Every year we release a new version of the software with significant new features.  An engineer with our company needs to be more than just a LabVIEW hacker.  We need a software engineer who can go into our large application and modify it, sometimes in very fundamental ways, without breaking existing functionality, and who has an eye for how their changes impact the maintainability, scalability, reliability, and readability of our code.
    Candidates will likely be LabVIEW Architects, or have equivalent experience if they don't have formal certification.  We lean towards candidates who have Masters degrees in fields such as Electrical Engineering, Mechanical Engineering, and Computer Science. Interviews will be conducted over phone, web, and in person by a LabVIEW Architect, and candidates will need to be able to discuss topics such as the following:
    - coupling and cohesion in software design, and how this relates to design patterns such as action engines
    - software lifecycle models, state machines, parallel loop architectures, race conditions, data structures, type definitions, XControls
    - Object Oriented design
    - importance of documentation, importance and use of source code control
    - pseudo code and its usefulness as a design tool, some exercises will require users to read and write pseudo code to solve classic computer science problems
    - tradeoffs of various file formats in terms of flexibility for future software changes
    - FFT, Frequency Response, Amplitude/phase, RMS level, dB, noise, averaging, distortion, loudness, A-weighting
    Formal job ad is below:
    To be considered for this position, please send resumé and cover letter explaining why you are the ideal candidate for this job (in Word or PDF format only) to [email protected]. Please use the subject title Software Programmer.
    Programmer for Audio Test and Measurement software - Boston
    Listen, Inc. is the market leader in PC based electro-acoustic test and measurement systems for testing loudspeakers, microphones, telephones, audio electronics, hearing aids and other transducers. We have been in business for over 15 years and our continued growth has created an opportunity for a software engineer to join our programming team. This is an exciting opportunity to work on an industry leading electro-acoustic test and measurement system used by numerous Fortune 500 companies in the field of loudspeaker, microphone, headphone, telecommunications and audio electronic manufacturing.
     This position reports to the Software Manager. Duties include, but are not limited to:
    Programming in LabVIEW
    Designing and coding new Sound Measurement and Analysis software
    Improving, reviewing and de-bugging existing code
    Preparing internal and user technical documentation
    Testing code
    Interfacing with management, sales teams and customers to define tasks
     Required skills / education
    Bachelor’s degree (Masters preferred) in electrical engineering, mechanical engineering, computer science, physics, or similar subject
    Strong background (4+ years) in programming with 1+ years in LabVIEW.
    A methodical approach to coding, testing and documentation
    The ability to work well in a small team. A willingness to challenge and discuss your own and other people’s ideas.
    Experience in acoustic engineering is a plus.  Relevant topics include FFT, Frequency Response, Amplitude/phase, RMS level, dB, noise, averaging, distortion, loudness, A-weighting
    About Listen
    Listen has been in business for over 15 years and our suite of PC & sound card audio test & measurement products is the accepted standard in many blue-chip companies worldwide. We offer the spirit and flexibility of a small company, combined with stability and an excellent externally managed benefits package which includes competitive salary, healthcare, paid vacation, retirement plan and more.
    Applicants must have authorization to work in the US. We are unable to assist with visa / work permit applications.

    We're interviewing candidates, but this position is still available.

  • How to make the Adobe Acrobat feature to convert SAP pages to PDF available for multiple users connected to the same Citrix server

    Hi,
    In my previous endeavours to solve this business requirement, where multiple users should be able to use the Adobe Acrobat feature to convert SAP pages to PDF from inside SAP, I was told that it is not possible to do this for multiple users at the same time. However, I have found an article according to which it is possible. Could you check it and let me know if this article can be used for implementing the requirement stated above, as the link clearly indicates that Adobe Acrobat is supported on Citrix.
    Please find the link below, where it states that Adobe Acrobat is supported on Citrix for multiple users.
    http://www.adobe.com/devnet-docs/acrobatetk/tools/AdminGuide/citrix.html

    1. What is the Acrobat feature that you mean? There's nothing specific to SAP included with Acrobat.
    2. Yes, some Terminal Server configurations appear supported (check carefully). All users of the terminal server will need a license, so far as I know - total licenses = total individual people using.

  • Editing text after designing a new form and converting it into PDF

    I designed a form and converted it into PDF format for printing. The form has some formatting errors and needs to be edited. However, attempting to do so results in a message stating that it is a secured document and cannot be edited. This is in spite of the fact that I have not placed any security restriction on it or on the folder in which it is stored. Can anyone help?
    Moneish

    Hi,
    Just providing some information in addition to George's reply.
    The PDF created by FormsCentral, assuming you are collecting responses online, is configured to enable extended features in Adobe Reader so that respondents can save the form. To edit the PDF in Adobe Acrobat you must use the File > Save a Copy... menu item to create a copy of the document that is not Reader-extended.
    When you have finished editing the document you should remember to re-configure the PDF for fill-in and save within Adobe Reader.
    Regards,
    Brian

  • I want to create a PDF file - do I misunderstand the product? I thought I could create a PDF similar to an Excel or Word file, or do I have to create it there and convert to PDF? I bought PDF Pack for $89!

    Hi Brian,
    You can create it in Word or Excel and then convert it into PDF format.
    Here's a feature list that describes the benefits of PDF Pack: Convert Word to PDF, Convert PDF to Word & Merge PDFs | Adobe PDF Pack
    Please reply if you have any other questions or need any help.
    Regards,
    Rahul

  • How to parse XSD in ABAP and convert into models

    Hello Experts,
    I have a scenario where an XSD is available in ABAP, and I want to parse it and convert it into objects such as XSD Schema, XSD Complex Type, XSD Simple Type, etc. (that is, create a DOM model of the XSD).
    Are there some standard APIs which can provide this feature?
    Thanks & Regards,
    Arpit

    Hi Arpit,
    If my understanding of your requirement is correct, then the class CL_FP_XSD_FOR_ABAPTYPES should help you out.
    Let me know how this works out, or whether I am completely off from your requirement.
    Regards,
    Chen
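    For readers outside ABAP, the general idea - walk the XSD's DOM and collect the element, complexType and simpleType definitions into a small model - can be sketched in a few lines of Python (purely illustrative, not the SAP class above; the function name and model layout are made up):
        import xml.etree.ElementTree as ET

        XS = "{http://www.w3.org/2001/XMLSchema}"   # XML Schema namespace prefix

        def parse_xsd(path):
            """Build a minimal dictionary model of an XSD file."""
            root = ET.parse(path).getroot()
            return {
                "elements":      [e.get("name") for e in root.findall(XS + "element")],
                "complex_types": [c.get("name") for c in root.iter(XS + "complexType") if c.get("name")],
                "simple_types":  [s.get("name") for s in root.iter(XS + "simpleType") if s.get("name")],
            }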

  • After scanning my document and converting to Microsoft Word, the sizes of the characters are different

    After scanning my document and converting to Microsoft Word, the sizes of the characters are different and things like punctuation are distorted. How do I get uniformity like the original?

    Of course what lands in the Word file will differ from the viewed picture/image of text created by the scanner.
    (The output of all scanners is always an image file. For an image of textual content the best output file format is TIFF.)
    So you scan the hardcopy of text.
    The scanner output image (picture) is brought into PDF.
    At this point the only PDF page content that you can export to Word is the image (nope, no "text" just the image).
    Consequently you use Acrobat's OCR feature to do OCR of the image of text.
    With a decent paper source, proper resolution and a black and white image you'll get acceptable accuracy of recognition of the pictures of the characters.
    (the Optical Character Recognition)
    Acrobat's Searchable Image and Searchable Image (Exact) provide output that uses text rendering mode 3 (no fill, no stroke for the glyphs).
    So, invisible / hidden text.
    The third OCR method is ClearScan.
    You could play with each of the three to see what goes into a Word file.
    Might try export to RTF, DOC and DOCX as well.
    Anyway -- What is exported is the OCR output; Not the image of text.
    And, of course, the image of the text is not the imprint on the paper that was scanned.
    At each step of the way you have some deviation.
    Once you have the exported PDF content in a Word file you can use Word to cleanup as desired / needed.
    OR
    Prop up the hardcopy and transcribe to a Word file.
    Be well...

  • LabVIEW 2011, and Excel 2010: saving problems

    Hello to all! I am a Mechanical Engineering student who has used this software very little. Here is the problem: for a course called Biomechanics, my colleagues and I developed the design of a rheometer, which is controlled by LabVIEW 2011. Since we are still at the experimental stage, the whole experiment has always been simulated in LabVIEW. So at present LabVIEW not only controls the stepper motor through the frequency settings, but also simulates the output (specifically, the response of the cartilage to torsion, simulated with a second-order low-pass filter). All this was done by a colleague more experienced than me, who has now passed me the finished project; it works perfectly on his PC but not on mine (we have the same version of LabVIEW but different versions of Office). The program itself is fine (the "Run" arrow is not broken) and the simulation, including the creation of the sine graph (the response of the cartilage), runs perfectly. My colleague set up LabVIEW to save the data to an Excel sheet at the end of the simulation, asking the user where to put it (the Excel file must already exist; LabVIEW does not create it). The Excel file should contain two columns and a scatter plot (one column is the frequency f of the stepper motor, the second is the shear modulus G, which increases as the frequency increases, and the graph shows the trend of G as a function of f). As soon as I choose to save the data, Excel opens and for an instant the scatter plot and the data columns appear, but immediately afterwards the graph disappears and only the columns of data remain (so I have to make the graph "manually"), and LabVIEW gives me the following error:
    Error -2147023170 occurred at Property Node (arg 2) in NI_ReportGenerationToolkit.lvlib:Excel_Insert_Chart.vi->NI_Excel.lvclass:Excel Insert Graph.vi->NI_ReportGenerationToolkit.lvlib:Excel Easy Graph.vi->SaveExcelFile.vi->Progetto 2.0.vi
    This error code is undefined. Undefined errors might occur for a number of reasons. For example, no one has provided a description for the code, or you might have wired a number that is not an error code to the error code input.
    Additionally, undefined error codes might occur because the error relates to a third-party object, such as the operating system or ActiveX. For these third-party errors, you might be able to obtain a description of the error by searching the Web for the error code (-2147023170) or for its hexadecimal representation (0x800706BE).
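    (Incidentally, the decimal and hexadecimal forms are the same 32-bit value, so either can be used for a web search; a quick way to check the conversion, shown here in Python purely for illustration:)
        code = -2147023170
        print(hex(code & 0xFFFFFFFF))   # -> 0x800706be, the unsigned view of the same error code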
    What can I do?
    Thanks. Regards

    The LabVIEW Report Generator in LabVIEW 2010 and more recent versions does, indeed, work very well with Excel in Office 2010.  I helped someone about two weeks ago with a problem saving a series of measurements to a (new) Excel Workbook.  Your problem differs in two ways -- you want to save in an existing Excel file (why? Aren't you worried about overwriting data, or are you trying to add a new WorkSheet, or append to the end of an existing WorkSheet?) and making a Chart.  I've not tried charts, so don't have advice right now.
    Judging by your error message, it is the Chart feature that is giving you trouble.  Can you see how your code runs if you eliminate the Chart functions?  [You can do that by putting a "Diagram Disable" Structure around the Chart code, clicking on the top of the structure and wiring the wires through in the Enable case].
    Give that a try and let us know what happens.

  • LabVIEW Controls and Functions menus stopped working.

    Most of the menus no longer work. The top-level menu will pop up with a right click but the submenus do not. I cannot get to the submenu which contains a While Loop, for example. If I use the stickpin feature then I can navigate the menus fine.
    LabVIEW's main menu also no longer works entirely. Tools and Browse are nonfunctional. Sometimes if I restart LabVIEW the Tools menu will respond the first time I select it, but after that it no longer shows the submenu when I click it.
    Very unusual. Thanks in advance for any help.

    I figured it out. The only change I had made to my system was some video card drivers. I have 2 monitors and use Win2k Pro.
    I had selected an option to automagically center all popup windows. That was the problem. It was messing with LabVIEW whenever a menu/popup was shown. Submenus in LabVIEW must be considered or tagged as popups by the operating system. I disabled the "feature" and now everything is back to normal.

  • iTunes no longer recognizes midi files; how can I play and convert them?

    iTunes 10.3 recognised midi files, I could "move" them to the iTunes library and select them. then click on "Advanced" and "Convert to AAC" (or to MP3 etc).
    iTunes 10.5 will not recognise midi files.
    I think an older version of GarageBand could also read and convert midi files, but this no longer appears to be the case either in the latest version.
    How can I now convert midi files? Has Apple decided to simply ignore them?

    Thanks to Limnos, I now have some further things to try, but in the meantime I found I had been maligning GarageBand 11. Indeed its built-in Help system doesn't seem to mention Midi files at all, but I found out how to insert them from Apple's on-line support system at http://support.apple.com/kb/PH2009 and it works superbly well - one can even change the instruments on individual channels etc and then save the lot as an AAC (m4a) or mp3 etc file.
    I also loaded up an old version of iTunes, version 10.3, which still recognises Midi files and can convert them to AAC or mp3 but in so doing some volume seems to be lost.
    But the winner so far for me has been Audacity, free software. One needs to add a LAME encoder for mp3 output. I installed Soundflower, routed the playback of the midi file available in the Finder (in fact of the whole machine, while playing the midi file) and declared Soundflower as the input for Audacity. I could then use Audacity's effects menu to increase the volume throughout the track and save it. Then, using the Sound preferences of the machine, I switched back to the built-in speakers and was able to listen to a higher-volume version, playable in iTunes 10.5. More faithful to the original midi file than the GarageBand version.
    See instructions at http://ask.brothersoft.com/tags/convert-midi-to-mp3/ and in particular one of the links http://ask.brothersoft.com/how-to-convert-a-midi-to-a-mp3-using-audacity-26125.html
    This is for Windows but for Mac it's very similar - still Audacity. BUT the instructions in the last link don't seem to quite work, which is why I had to use Soundflower. I can import the midi file, I can select it, but then the export item on the file menu remains greyed out and unusable, so I must presumably be missing some plug-in. Also, Audacity keeps telling me it can't find various ffmpeg files even though I thought I had installed them correctly. I'm obviously missing something here, but at least I can now get my midi files into AAC or mp3 format again at last.
    For Soundflower see http://cycling74.com/soundflower-landing-page/ and http://kineme.net/forum/General/soundflowerforlion
    and there is a useful tutorial by Nowjobless on YouTube at
    http://www.youtube.com/watch?v=r3FGOIW08gA&feature=related
    Good luck to anyone else facing the same problems, it remains to be seen how to "convert" midi files to mp3 in Audacity without recourse to Soundflower.

  • Need Expert's Advice - How to use LabView Efficiently and to increase Readability

    My application is fairly complex. It is a real world testing applications that simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes queues, state machines, sub VI's, dynamically launched VI's, subpanels, semaphores, XML files, ini files, global variables, shared variables, physical analog and digital interfaces and industrial networking. Just about every technique and trick that LabView 2010 has to offer and the kitchen sink as well.
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires. Many of my state machines have a dozen or more wires just going from input to output, doing nothing, simply because one or two states in the machine need that variable. Yes, I could spend a lot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
    We have had a long discussion about the use or misuse of local variables in this forum and I don't want to repeat that here. I use them sparingly, where I think it is relatively safe to do so. I also hit a bug whenever I try to copy some code that contains one or more local variables. On pasting the code with local variables, the result is something other than what I expected, I am not sure what. I have to undo the paste and rebuild the code one object at a time.
    I am also having trouble using Variable Property Nodes. When I cut and paste them, they often lose their reference object and I have to go back into the code and redo the Link To on each one. That wastes a lot of time and effort.
    Creating subVIs is often not appropriate when the code makes many references to objects on the front panel. Some simple code will turn into a bunch of object references and dereferences, which also tends to take a lot of work to clean up and often does not help overall readability. I use subVIs when appropriate, but because of the interface overhead, not as often as I would like to. My application already has over 150 subVIs.
    The LabView Clean Up Diagram function often works poorly. It leaves way too much empty space between objects, making my diagrams 3 to 4 24" screens wide. That is way too much and difficult to navigate effectively. The Clean Up function puts objects in strange places relative to other objects used nearby. It does a poor job routing wires and often makes deciphering diagrams more difficult rather than easier.
    My troubleshooting strategies don't work well for large diagrams and complex applications. The diagrams are so complex that execution highlighting may take 20 minutes for a single pass. Probes help, but breakpoints aren't of much use, because single-stepping afterwards often takes you somewhere else in the same diagram. I can't follow the logic well doing this.
    Using structures, I may have Case structures nested 5 to 10 levels deep inside some Event Structure inside a While Loop. Difficult to work with and not very readable.
    All and all, I can make it work, but I am not happy about the end result.
    I am hoping to benefit from some expert advice from those that are experienced in producing large complex applications efficiently, debugging efficiently and producing readable diagrams that they are proud of.
    Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.

    I'm not an expert but I'm charged out as one at work.
    I am off today so I'll share some thoughts that may help, or possibly inspire others to chime in. I have tried to continually improve my code in those areas and would greatly welcome others sharing their approaches and insights.
    Note:
    I do refactoring services to help customers with this situation. What I will write does not represent what we do in a code review, since our final deliverable is a complete final design and that is beyond the scope of this reply.
    I'll comment on your points.
    dbaechtel wrote:
    My application is fairly complex. ...
    While watching Olympic figure skating competition slow-motion replays, I learned how subtleties in how the launching skate is planted while entering a jump can make the difference between a good jump and a bad one.
    In software, we plant our foot when we turn from design to development. I have to admit that there were a couple of times when I moved from design to development too early and found myself in a situation like the one you have described.
    How to know when design is done?
    Waterfall says "cross every 't' and dot every 'i'", while Agile says "code now, worry about design later", and Bottom-up says "the demo works, why bother designing" (please feel free to comment on these over-simplifications, gang).
    My answer is not much more helpful for those new to LabVIEW.
    My design work is done when my design diagrams are more complicated than the LabVIEW diagram they describe.
    dbaechtel wrote:
     simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes ...and the kitchen sink as well.
    Have you posted any design documents you have? These will help greatly in letting us understand your application. More on diagrams later.
    Anytime I see multiple "variations on a theme", I think LVOOP (LabVIEW OOP). I'll spare you the LVOOP sales pitch but will testify that once you get your first class cloned off and running as a sibling (or child) you'll appreciate how nice it is to be able to use LVOOP.
    Disclaimer:
    If you don't already have an OOP frame of mind, the learning curve will be steep.
    dbaechtel wrote:
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires....going from input to output, doing nothing,... spend alot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
     Full disclaimer:
    I used to be of the same opinion and even used performance arguments to make my point. I have since changed my mind.
    Let me illustrate (hopefully). This link (if it works for you, use the left-hand pane to navigate the hierarchy) shows an app I wrote about 10 years ago, when I was in my early days of routing wires. Even the "Main" VI started to suffer from too many wires, as this preview from that link shows.
    Clustering related data values using Type Definitions is the first method I would urge. This makes it easier to find the VIs that use the type def via browse relationships>>>callers. If I implement my code correctly, any problem I believe is associated with a particular piece of data that is a type def has to be in one of the VIs that use that type def, and is therefore easier to maintain.
    When I wrote "related data" I am referring to data normalization rules (which my wife knows and I picked up from her; I claim no expertise in this area), where only values that are used together are grouped. E.g. a cluster named File contains "Path" and "Refnum" but not "PhaseOfMoon". This works out nicely when first creating sub-VIs, since all of the data related to file operations is right there when I need it, and it leads into the next concept ...
    When I look at a value in a shift register on the diagram, taking up space, that is only used in a small sub-set of states, I consider using an Action Engine. This moves the wire from the current diagram into the Action Engine (AE) and cleans up the diagram. The AE brings with it built-in protection, so provided I keep all of the operations related to the type def inside the AE, I am protected when I start using multiple threads that need that data (trust me, it may not make a difference now, but end users are clever). So that extra wire is effectively encapsulated and abstracted away from the diagram you are looking at.
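    In text-language terms (just an analogy, not G code): a type definition is roughly a struct/record, and an Action Engine is roughly a module that owns one instance of that record and serializes every operation on it. A rough Python sketch, with made-up names:
        import threading
        from dataclasses import dataclass
        from typing import IO, Optional

        # Analogue of a type definition: only values that are used together are grouped.
        @dataclass
        class FileData:
            path: str
            refnum: Optional[IO] = None   # the open file handle ("Refnum" in LV terms)

        class FileActionEngine:
            """Analogue of an Action Engine / functional global: the data lives inside,
            and every action goes through one serialized entry point."""
            def __init__(self):
                self._data: Optional[FileData] = None
                self._lock = threading.Lock()   # stands in for the non-reentrant VI's built-in protection

            def do(self, action: str, arg: str = "") -> str:
                with self._lock:                # one caller at a time, like a non-reentrant VI
                    if action == "open":
                        self._data = FileData(path=arg, refnum=open(arg, "a+"))
                    elif action == "write":
                        self._data.refnum.write(arg)
                    elif action == "close":
                        self._data.refnum.close()
                        self._data = None
                    else:
                        raise ValueError("unknown action: " + action)
                    return arg

        # Usage: engine = FileActionEngine(); engine.do("open", "log.txt"); engine.do("write", "hi"); engine.do("close")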
    But I said earlier that I would not sell LVOOP so I'll show you what LVOOP based LV apps look like to contrast what I was doing ten years ago in that earlier link. This is what the top level VI looks like.
     And this is the Analysis mode of that app.
    I suppose I should not mention that LVOOP has wizards that automatically create the sub-VIs (accessors) that bundle/unbundle the clusters, should I?
    Continuing...
    dbaechtel wrote:
    We have had a long discussion about the use or misuse of Local variables...I also have a bug whenever I try and copy some code...
    If you can simplify the code and duplicate the bug, please do so. We can get it logged and fixed.
    dbaechtel wrote:
    I am also having trouble using trouble using Variable Property Nodes....
    That sounds like a usage issue. Posting code to illustrate the process will let us take a shot at figuring out what is happening.
     dbaechtel wrote:
    Creating subVIs is often not appropriate... My application already has over 150 sub VIs.
    "Back in the day..." LV would not even try to create a sub-VI that involved controls/indicators. I use sub-VIs to maintain a common GUI often but I do it on purpose and when I find myself creating a sub-VI that involves a control/indicator, I hit ctrl-z immediately! 
    I figure out a way around it (an AE?) and then do the sub-VI.
    Judging by your brief explanation, and assuming you do an LVOOP implementation, I would estimate that the app needs 750-1500 VIs.
     dbaechtel wrote:
    The LabView Clean Up Diagram function often works poorly.... 
    The clean-up works fine for how I use it. After throwing together "scratch code" and debugging the "rats nest", I'll hit clean-up as a first step. It does a good enough job on simple diagrams and in some cases inspires me to structure the diagram in a different way that I may not have thought about. If I don't like it, ctrl-z.
    Good design and modular implementation lead to smaller diagrams that just don't need three screens.
     dbaechtel wrote:
    My troubleshooting strategies don't work well for large diagrams and complex applications....Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.
    Smaller diagrams single-step faster since the sub-VIs run at full speed. I cringe thinking about a 3-screen diagram with multiple probes open (shiver!).
    Re: Nested structures
    Sub-VIs (wink, wink, nudge, nudge)
    If it works, you have proven the concept is possible. That is the first step in an application.
    I hope that gives you some ideas.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Interfacing a Basler camera with LabVIEW and converting image colours into temperature

    Hello,
    as the title suggests, I need to combine a Basler camera with LabVIEW (version 8.6 or higher) and convert the colours of the image into temperature signals. I have some thermocouples which measure the temperature of a black paper that changes colour with temperature, so I would like either to:
    1) define a calibration curve so that each colour corresponds to a temperature, or
    2) dynamically calibrate, so that LabVIEW reads the temperature from the tempscan and at the same time knows the colour and derives a calibration curve (of course we are speaking about a high number of temperature measurements: 24).
    I guess the first method should be straightforward. Since I'm new to LV as well as to this camera, I would like to know if you can give me any suggestions on how to implement the calibration curve and capture the photo. The second method, I think, is nice but quite difficult.
    Thank you very much
    Cheers,
    Antonio

    Hi Antonio,
    We have already spoken on the phone but I thought it would be a good idea to attach the very basic example code on here so that the community can see it.
    The code converts a colour into a decimal number. This number can then be used with a look-up table or calibration curve to get the respective temperature signal.
    Kind Regards
    Michael
    NIUK Application Engineer
    Attachments:
    Colour-to-Number.vi ‏12 KB
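    For readers who cannot open the attachment, the same idea - pack the colour into one decimal number, then look it up against a calibration curve - looks roughly like this in Python (illustrative only; the calibration points are invented):
        # Invented calibration pairs (colour number -> temperature in deg C), e.g. from the thermocouples.
        CAL = [(0x200000, 30.0), (0x802000, 60.0), (0xFF4000, 90.0)]

        def colour_to_number(r: int, g: int, b: int) -> int:
            """Pack an 8-bit-per-channel RGB colour into one 24-bit decimal number."""
            return (r << 16) | (g << 8) | b

        def number_to_temperature(value: int) -> float:
            """Linearly interpolate the colour number against the calibration curve."""
            if value <= CAL[0][0]:
                return CAL[0][1]
            if value >= CAL[-1][0]:
                return CAL[-1][1]
            for (x0, y0), (x1, y1) in zip(CAL, CAL[1:]):
                if x0 <= value <= x1:
                    return y0 + (y1 - y0) * (value - x0) / (x1 - x0)

        print(number_to_temperature(colour_to_number(0x80, 0x20, 0x00)))   # -> 60.0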

  • RTD Temperature Measurements using LabView 2013 and MyRio

    Hey everyone.  I am VERY new to LabVIEW programming and working with a myRIO.  I need to figure out how to measure the resistance of a 2-wire RTD to find a temperature utilizing the myRIO and LabVIEW.  I am pretty lost on how to do this.  Does anyone know some good resources for making the LabVIEW program off the tops of their heads?  I've figured out how to measure from specific pins, but I am not sure how to get it to constantly output a voltage from the output pins.
    Eventually, I would like to have it display the temperature as well as have it turn a heater on or off depending on that temperature, but that will come far later in this process. First things first: how do I take temperature measurements utilizing LabVIEW 2013 and a myRIO with a 2-wire RTD?
    Thanks so much!

    Hi JoshEpstein87,
    The myRIO can't acquire a change in resistance directly, so you'll need to somehow convert the change in resistance to a change in voltage. There are multiple ways to do this, but you'll need to build an external circuit and then read the voltage output with the myRIO. One example of a circuit that allows you to do this can be found here. To output a voltage on the analog output pins, you should just need to set the output voltage and then it will remain at that voltage until you change it or power cycle the myRIO.
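    To give a feel for the math once a circuit is in place (a minimal sketch, assuming a simple voltage divider with a known reference resistor and a PT100-style RTD; the component values below are made up):
        def rtd_temperature(v_out: float, v_supply: float = 5.0, r_ref: float = 1000.0) -> float:
            """Estimate temperature (deg C) from the divider voltage measured by the analog input.

            Assumed circuit: v_supply -- r_ref --+-- RTD -- GND, with v_out measured across the RTD.
            Uses the linearised PT100 relation R = 100 * (1 + 0.00385 * T).
            """
            r_rtd = r_ref * v_out / (v_supply - v_out)   # solve the divider for the RTD resistance
            return (r_rtd - 100.0) / 0.385               # invert the linear PT100 approximation

        # Example: 0.543 V across the RTD with a 5 V supply and a 1 kohm reference resistor
        print(round(rtd_temperature(0.543), 1))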
    To get started with LabVIEW and myRIO programming, see the following page:
    http://www.ni.com/myrio/setup/getting-started/
    There are some links to LabVIEW training as well as resources about RIO programming. I also highly recommend you check out the myRIO Community as there are example programs on there that you can take a look at to see how they are designed.
    Best Regards,
    Matthew B.
    Applications Engineer
    National Instruments

  • LabVIEW feature request: tunnel array concatenation

    I am requesting a LabVIEW feature that would be very convenient. The following situation is very common: I have a source of information, say an array of numbers, in a for-loop. When the for-loop has finished, I normally end up with a 2D-array (because of the index operation at the tunnel). Now, in many cases I would instead like to concatenate the arrays inside the loop, so that I end up with a 1D-array of numbers after the loop.
    It is easy to remedy the problem: use shift registers or feedback nodes together with the 'concatenate arrays' VI. An easier approach, from a programmer's point of view, would be to right-click on the loop tunnel and select an option that, instead of adding the data from each iteration into a new array dimension, would concatenate the arrays. The tunnel could have the + symbol instead of [].
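    In text-language terms the difference is "append a row" versus "concatenate onto the end"; a minimal Python sketch of the two outcomes (illustrative only, not G code):
        rows = [[1, 2], [3, 4], [5, 6]]       # per-iteration output of the loop body

        # Indexing tunnel: each iteration's array becomes a new row of a 2D array.
        indexed = []
        for r in rows:
            indexed.append(r)                 # -> [[1, 2], [3, 4], [5, 6]]

        # Shift register + 'concatenate arrays': one flat 1D array at the end.
        concatenated = []                     # the empty array constant that initialises the shift register
        for r in rows:
            concatenated = concatenated + r   # -> [1, 2, 3, 4, 5, 6]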
    I wish I could find a more convincing argument for this feature than just convenience. But the reason for this request is just because of convenience. I find myself using the feedback nodes and the 'concatenate array' VI again, and again, and again, and again, and again, and again, and again, and again, and again... and I always need to create empty array constants for the initial value. It feels really unnecessary.
    I think this feature would make LabVIEW an even more convenient programming environment.
    I am using LabVIEW 7.1.
    BR
    Patrick

    I agree this option would be great to have built in automatically.  It might also be nice to allow auto-indexing of 2D (or higher-dimensional) arrays going into for loops, where it would iterate through all dimensions and grab the (i,j) single elements, adding an i and j index inside the loop, instead of nesting loops inside each other.  I guess these features cater to the power users who want to save any amount of coding possible.
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA

  • Loading HD video from a Sony HC3 and converting to DV

    For the past year I have successfully converted HD to DV on my Sony HC3 and loaded it into iMovie 06 with excellent results. Now when I start up iMovie, select a new project and then the format as I always have (DV widescreen), DV reverts back to HD 1025 as soon as the Mac sees the camera. I have double-checked the camera settings and reset/formatted it. The only thing that has changed in the past 2 months is that I tried to load iLife 08, then read the instructions to find it was not compatible with my eMac. Thanks
    Dave

    Thanks again for the reply. This camera records either in HD or DV; by recording in HD you get much better quality video. However I do not have an HD TV, and DVDs are not HD, just SD, so HD cameras have a feature to convert the output to SD. This feature has worked well for the last year and we are pleased with the DVDs we have made. I keep the mini DV tapes so that when I win the lotto I'll treat myself to an HD TV, a Blu-ray HD DVD and a new Mac to go with them. Bye, Dave
