LabVIEW: degrading quality of editor / compiler

I have been working with LabVIEW since version 8.1, nearly six years ago. The functionality has been extended, but the quality underneath the surface of both the editor and the compiler is getting worse and worse.  Do others experience this?
Just some observations:
- The compiler has become a lot slower; it seems to be doing continual consistency checks on each run.
- Saving a bunch of slightly modified VIs and controls has become a lot slower; it seems to be doing a lot of continual consistency checks and preemptive/preventive corrections before saving.
If these checks were effective I could live with the mostly unnecessary, repetitive overhead.  But then I find that:
- Since moving to LabVIEW 2010, 2011 and now LabVIEW 2013, I find myself having to do two (or more) compile runs (Build .exe) of several minutes each to create a single .exe.  Each intermediate build ends with a vague, unspecific error message right at the end of the build.  As far as I have been able to determine, several of these errors have to do with corruption in the underlying database that represents the LabVIEW code.
- Even though it should have been fixed, I find that after re-linking a group of Property Nodes from one Control to a similar Control, the whole group of Property Nodes is linked to each other but not to the newly assigned Control.  This is just one example.
- Labels that are not in their default spot are still mishandled after conversions, re-linking Controls, etc.
Most of these issues come down to problems (or blatant corruption) in the underlying database where LabVIEW stores your code, introduced or mishandled by either the LabVIEW editor or its compiler.
Instead of gradually lessening, these issues have become worse and worse in recent LabVIEW versions.  Combined with all the minutes wasted by the editor and compiler trying to prevent them, I would rate this as a (very) poor state of affairs.
On a personal level, I am getting to the point where I will have to advise my manager to move to a different environment for all non-LabVIEW-essential coding.

Norbert_B wrote:
OK, "backwards" answer :
The term "Crosslinking" describes a situation where a component is shared between multiple projects. So obviously, you have crosslinking between at least three projects.
That being said, crosslinking does not need to be a problem; but as a shared resource it can easily create issues.
Updating your projects is a perfect example:
If you mass compile one project (e.g. "A") - which you obviously don't do yet - all components are updated.
After doing this, the other two projects ("B" and "C") won't open in the 'old' LV version, as the crosslinked components have already been recompiled for the new version.
There are three feasible options:
a) Keep crosslinking. Be aware of all the issues this can cause, but it can also be an advantage (changes don't need to be deployed to each project over and over).
b) Dissolve the crosslinking by putting the shared component into each project as a separate, project-specific copy. I recommend prepending a project abbreviation to the component during that process; otherwise, crosslinking can easily be re-introduced.
c) Separate Compiled Code from VI. This is an option NI added in LV 2010. It adds overhead to development, as each machine has to recompile the VIs after opening them (there is no compiled code in the VI file!). The advantage is that each compilation can be system specific and re-compiling doesn't modify the VI file.
It sounds as if you are using implicitly bound property nodes. Do you have an example VI where we can reproduce the issue you are seeing?
Yes, building an application can be tricky from time to time. Another source of errors I forgot to mention is conflicts in the project. Rule of thumb: never try a build while conflicts are still displayed in the project explorer. Do you already follow that rule?
Please try running the mass compile before working on the project in the new version. Opening the top-level VI(s) and saving is most often not sufficient. This recommendation is 'invalid' for your base libraries if you chose option c) mentioned above. Reason: the compiled code is not part of the VI file, so the compiler always has to recompile all VIs when they are first opened on the machine.
Norbert
Norbert,
I did a couple of mass compiles to tackle a few "Nonfatal insanity" errors I had (talk about "backwards"), which then mysteriously disappeared.
What I do is open each project separately and save any updates it wants.  This -should- do the same as a mass compile.
And this brings me to one of the main issues I have with LabVIEW:  I constantly have to avoid a straightforward solution because of shortcomings in LabVIEW.  The alternative is just as bad: I have to create examples of bugs to be used on this forum and go through all the effort of generating bug reports.  A process which has already cost several working days of my (and my company's) time so far, only to discover that none of these bug reports have been solved a year later, or are even mentioned in the "Known bugs" list.
I will have a look at your suggestions to see if somehow I can make this into a manageable working procedure.

Similar Messages

  • Since upgrading to LabVIEW 2013, every VI compiles every time I open it (including quick-drop).

    Hi all.  Since upgrading to LabVIEW 2013, every VI compiles every time I open it (including quick-drop).  This really slows things down!  Perhaps related, my system tells me I don't have permission/access to modify my LabVIEW ini.  Has anyone seen similar, and/or any hints towards a solution?  

    Jeff-P wrote:
    As a side note, LabVIEW.ini is a file that gets generated by LabVIEW when it is launched if the file does not exist. So if you are missing the 32-bit ini file, launch 32-bit LabVIEW and that file should be created.
    I thought that the message: "Perhaps related, my system tells me I don't have permission/access to modify my LabVIEW ini"  might indicate that the labview.ini file cannot even be created... strange...
    LabVIEW Champion. Do more with less code and in less time.

  • Is there a way to programmatically tell what version of LabVIEW a VI is compiled in?

    Is there a way to tell programmatically what version of LabVIEW a VI is compiled in?
    LabVIEW seems to know.  If you click on "List Unsaved Changes", it (sometimes?) tells you the version of LabVIEW that the VI is already saved under (i.e., 8.0 instead of 8.0.1).  So if LabVIEW knows, maybe they've provided a way for us lower-echelon folks (i.e., customers) to know too, eh?
    Is there a way to get this from, say, VI Properties (i.e., Property Node gets passed a ref to a VI & I select the right property name)?  I couldn't find any property name that has "version" or anything like that, but maybe what I'm looking for is using terms that are so different than what I'm expecting that I just can't tell.
    Or by some other method?  (I gvimmed a few .vi files, and see some stuff that looks like I *could* get the version by parsing/searching the file itself, but parsing the binary might give pretty unreliable results.)

    Just drop a Property Node on the diagram, select the property "Application...Version Number", and display the resulting string output.
    Message Edited by altenbach on 07-02-2006 11:33 AM
    LabVIEW Champion. Do more with less code and in less time.
    Attachments:
    version.png 9 KB

  • 6.01 Degraded Quality Of Files Compared To Adobe Reader

    I have a wide format scanner, an HP Designjet 4200(815mfp), that can save scans in .pdf format. When I click on the properties tab of one of these .pdf's, this is the info: Application: CTX PDF Producer: PDFlib 4.0.3(Win32) PDF Version: 1.3. I am using Photoshop 6.01(6.0 with patch installed, provided by Adobe as ps601up.exe) running under Windows 2000.
    When I open one of these .pdf files with Photoshop, a message appears at the bottom of the screen while the file loads: "rasterizing file". The image that then appears on the PC's monitor is greatly degraded from the scanned image; lines and text are jagged and blurry. Even if I make no edits whatsoever and print it immediately from Photoshop, the quality of the printed image is horrible, as if the resolution has vastly declined. If I open the same .pdf file with Reader, the image appears in the high quality that it ought to, and Reader prints it in high quality as well. I am definitely in Photoshop, not ImageReady, when this happens, and it is not affected by any change I make to the resolution in the "Rasterize Generic PDF Format" box that appears whenever I first open one of these files.
    Any help will be greatly appreciated, as the degraded quality of these images presented by Photoshop literally makes them unusable.

    I am replying to Mr. McCahill's & Mr. Levine's posts as one, since they address similar issues. I am constrained to use Windows 2000, as that is the system used by our LAN, so newer versions of Photoshop are not an option. The computer I'm using is a 2.80 GHz Pentium 4 with over 1 gigabyte of RAM. The fixed points of my problems are the computer, operating system, and scanner (along with its software, which dictates the type of .pdf file it can generate; it can also save scans as .tif or .cal files; for reasons that would be much too lengthy & off-topic here, .jpg is not an option even though it can save as those as well). I'm limited to software solutions, and I don't think this PC is really a big contributor to any of my woes.

  • Can the LabVIEW Digital Waveform Editor provide a tri-state option

    Hi,
    Could anybody who has used the LabVIEW Digital Waveform Editor let me know whether the package also supports tri-state operation?
    rags

    Currently the LabVIEW Digital Waveform Editor does not provide tri state functionality. 
    Minh Tran
    Applications Engineer
    National Instruments

  • Performance degradation with -g compiler option

    Hello
    Our measurement of a simple program compiled with and without the -g option shows a big performance difference.
    Machine:
    SunOS xxxxx 5.10 Generic_137137-09 sun4u sparc SUNW,Sun-Fire-V250
    Compiler:
    CC: Sun C++ 5.9 SunOS_sparc Patch 124863-08 2008/10/16
    #include "time.h"
    #include <iostream>

    int main(int argc, char **argv)
    {
        for (int i = 0; i < 60000; i++) {
            int *mass = new int[60000];
            for (int j = 0; j < 10000; j++) {
                mass[j] = j;
            }
            delete [] mass;
        }
        return 0;
    }

    Compilation and execution with -g:
    CC -g -o test_malloc_deb.x test_malloc.c
    ptime test_malloc_deb.x
    real 10.682
    user 10.388
    sys 0.023
    Without -g:
    CC -o test_malloc.x test_malloc.c
    ptime test_malloc.x
    real 2.446
    user 2.378
    sys 0.018
    As you can see performance degradation of "-g" is about 4 times.
    Our product is compiled with the -g option and, before shipment, it is stripped using the 'strip' utility.
    This gives us the possibility of opening customer core files with the non-stripped exe.
    But our tests show that stripping does not give the performance of an executable compiled without '-g'.
    So we are losing performance by using this compilation method.
    Is it expected behavior of compiler?
    Is there any way to have -g option "on" and not lose performance?

    In your original compile you don't use any optimisation flags, which tells the compiler to do minimal optimisation - you're basically telling the compiler that you are not interested in performance. Adding -g to this requests that you want maximal debug. So the compiler does even less optimisation, in order that the generated code more closely resembles the original source.
    If you are interested in debug, then -g with no optimisation flags gives you the most debuggable code.
    If you are interested in optimised code with debug, then try -O -g (or some other level of optimisation). The code will still be debuggable - you'll be able to map disassembly to lines of source, but some things may not be accessible.
    If you are using C++, then in SS12 -g will switch off front-end inlining, so again you'll get some performance hit. So use -g0 to get inlining and debug.
    HTH,
    Darryl.

  • LabVIEW classes and mass compile

    Just wondering if anyone out there has found problems mass compiling LabVIEW classes, as I am having what appear to be totally unresolvable issues - I get a list of VIs that say either BadSubVI or BadVI, all of which I have verified are error free.
    The only thing I can narrow it down to is the use of lvclasses, in particular the fact that I'm making extensive use of the inheritance functionality, and also a few override VIs.
    It wouldn't be such an issue except that I'm calling some of these VIs from TestStand, and I am unable to get a deployment to work without a successful mass compile.
    Appreciate any responses,
    David Clark
    CLA | CTA
    CLA Design Ltd
    Hampshire, England

    I'm not sure if it's the same problem, but I had a VI that would compile into an unrunnable exe. I found the problem by mass compiling the project, which came up with a bad VI and bad subVIs that used that VI. It turns out my problem was that I had two classes with the same name (even though they were in separate lvlibs), and renaming one corrected the problem. LabVIEW 8.5 gives a warning and renames the compiled VIs (so if you could try 8.5, that might help you find the problem quicker, assuming it's related). It seems to me that the namespaces provided by lvlibs go away when compiled into an exe or by the mass compiler (it may just affect lvclasses).
    Matt

  • Degraded quality of photo printing book in Aperture 3

    Hi,
    I've isolated an issue in the print book function of Aperture 3 that has driven me crazy over the last ten days:
    I've created a custom-size book to deliver to a local print service via the "save as pdf" function, but something went wrong, because the quality of the pdf isn't as good as the original photos.
    After days of trials I've found where the problem arises, so I'll try to explain step by step below:
    I've identified one critical photo that shows the problem well. I "printed to pdf" from the normal print photo function of Aperture and the result was correct:
    So I've created a book, put that photo on a page, and "printed to pdf" the book. The result was degraded, as you can see below (look at the dark areas, which have lost every detail):
    I've done the same operation in iPhoto, the "non Pro" minor App, and paradoxically the iPhoto PDF result was perfect!
    I've also noticed that the preview image of the print dialog box of Aperture shows the defect unlike the dialog box of iPhoto:
    Aperture:
    iPhoto:
    So finally it seems that something in the print engine for books in Aperture is broken and ruins the photos "before" they are printed, and apparently no setting helps.
    I've done these trials on a MacPro with Lion 10.7.5 and Aperture 3.4.3, a MacBook Pro with Lion 10.7.5 and Aperture 3.4.3, and a MacPro with Snow Leopard with Aperture 3.1 and 3.2, always with the same results.
    you can download the generated PDF here:
    pdf Aperture Photo:
    http://temp.paolorossi.me/ApertureImagePrint.pdf
    pdf Aperture book:
    http://temp.paolorossi.me/ApertureBookPrint.pdf
    pdf iPhoto book:
    http://temp.paolorossi.me/iPhotoBook.pdf

    I've done another test:
    I've created a sample image with 4 colour faded bars with values from 0 to 128 for RGB and gray, then I put it on a page in an Aperture book and an iPhoto book and printed both to PDF.
    Below you can see the results: iPhoto is good whereas Aperture is ruined.
    iPhoto results:
    Aperture results:
    I've also noticed that the Aperture result is influenced by the color profile chosen in the selector in the box, but no profile gives a good result; every profile returns a weird custom defect, even the "no profile" choice.

  • I am trying to call a function with LABVIEW developed in CCS compiler

    I used MPLAB (CCS compiler) to develop c code that is used to transmit and receive messages to and from an automotive display. It works great as a stand alone. I would like to call this function from LABVIEW but am having difficulty doing so. Does anyone have experience calling a MPLAB developed project from LABVIEW? Thanks in advance.
    Matt

    PTE wrote:
    I used MPLAB (CCS compiler) to develop c code that is used to transmit and receive messages to and from an automotive display. It works great as a stand alone. I would like to call this function from LABVIEW but am having difficulty doing so. Does anyone have experience calling a MPLAB developed project from LABVIEW? Thanks in advance.
    Matt
    MPLAB is for programming PICs, if I'm not mistaken. I'm not sure how you would call a function compiled in MPLAB from LabVIEW directly. MPLAB creates binary code for execution on PICs, and there is no LabVIEW that could run on a PIC. On the other hand, I didn't think you could create standard Windows DLLs, Mac shared libraries or similar in MPLAB at all.
    You will have to recompile your code with a C compiler that can create the standard shared-library format for the platform from which you want to call it in LabVIEW.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions
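    A minimal sketch of Rolf's suggestion (all names below are hypothetical, not from any real library): wrap the message-handling routine behind a plain C interface, compile it into a standard shared library, and bind to it from LabVIEW's Call Library Function Node.

```cpp
// display_msg.cc -- hypothetical wrapper sketch; build as a shared library,
// e.g.  g++ -shared -fPIC -o display_msg.so display_msg.cc   (Linux)
// then call send_display_message from LabVIEW's Call Library Function Node.
#include <cstring>

#ifdef _WIN32
#define DLLEXPORT __declspec(dllexport)
#else
#define DLLEXPORT
#endif

extern "C" {

// Copies a message into the caller-supplied transmit buffer; returns the
// number of bytes queued, or -1 on bad arguments. Plain C types (no C++
// classes, no exceptions across the boundary) keep the LabVIEW binding simple.
DLLEXPORT int send_display_message(const char *msg, char *out, int out_len)
{
    if (msg == 0 || out == 0 || out_len <= 0)
        return -1;
    int n = (int)std::strlen(msg);
    if (n >= out_len)
        n = out_len - 1;          // truncate to fit, leaving room for the NUL
    std::memcpy(out, msg, n);
    out[n] = '\0';
    return n;
}

} // extern "C"
```

    On Windows the same source can be built into a DLL (cl /LD display_msg.cc); the extern "C" block prevents C++ name mangling, so the exported symbol name matches what the Call Library Function Node expects.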

  • Premiere Degrading Quality (after rendering) of Animation Codec mov's and 4444 Codec mov's, which were made from SWF files, originally exported out of After Effects

    I am making a cartoon, which was created in FLASH,
    Then the SWF's were put into After Effects and exported out as MOV's. (I've tried both AppleProRes4444 and Animation codecs).
    Then I put those Mov's into Premiere. Everything looks crisp until I render. Once rendering is done, the video quality degrades. Here's a screen shot of Premiere:
    On the left is the source MOV (exported from AE), on the right is the timeline viewer (after rendering). The quality on the right is degraded. Help?
    If I use MP4 versions, exported out of Flash, through Media encoder, quality is not degraded.
    (Read on for additional info)
    It was easy and quick to create my MOV's out of AE, which is why I did it this way. I'm exporting to MOV's so I can edit quicker in Premiere. I find editing to sound in AE really difficult, and I don't edit in Flash (employees provide SWF files to me).
    Here are the sequence settings in Premiere (these settings were created by dragging an MOV into the sequence and choosing "change settings" to match clip settings).
    Help? Please?


  • Multiple people on I-Chat session degrades quality

    I am trying to have up to four people conferenced on iChat. When I add a third party, the video signal degrades badly.
    Has anyone got some input on how to improve the video quality?
    Thanks in advance for the help.

    Hi
    Have you all set the QuickTime streaming setting on all sides? Go to System Preferences > QuickTime > Streaming > Streaming Speed and set it to 1.5 Mbps (don't use Automatic).
    In iChat's preferences, click on Video and change the bandwidth limit to None.
    Restart iChat.
    Tony

  • Labview 2010 icon editor extremely slow

    Working on a medium-size project in LabVIEW 2010, out of the blue the icon editor slows down.  So slow that typing in one text character takes 20 seconds.  After closing the editor there is about a 3 to 5 second lag on every action I make in LabVIEW.
    Restarting LabVIEW sets things back to normal until I open the Icon Editor again.  Opening any subVI results in this problem, and it occurs in any project, or even if no project is open and I try to edit the icon of an Untitled 1 subVI.
    Has anyone else seen this behavior?  I really don't want to re-install LabVIEW 2010.
    Thanks
     

    I'd almost suggest using the old icon editor, editing and saving the VI, then trying the new icon editor again.  One issue could be with keeping track of all that layer information, with one of the layers having bad data or something.  Using the old icon editor and saving might save the icon as the flat image it is, and wipe away all the layer information the new icon editor uses.
    Of course, before there was an improved icon editor, developers made their own, and I'm curious to know if they still work, and if they would load faster.
    https://lavag.org/files/file/100-mark-ballas-icon-editor-v24-lv2010/
    https://lavag.org/files/file/91-improved-lv-2009-icon-editor/

  • LabVIEW crashes during mass compilation every time

    Hello All
    I have installed a new LabVIEW 7.1 from February on Win XP.
    Every time I try to mass compile LabVIEW, it crashes after a few files, and the only thing I can do is send a report to Microsoft.
    It looks like it crashes at the moment it cannot find a subVI in the directory structure.
    What is wrong with it?
    Thanks in advance.
    Pawel

    I hope you are not trying to mass compile the entire LabVIEW directory. It will definitely choke on several of the files it will find. If you have upgraded to 7.1.1, you should mass compile the vi.lib, instr.lib, user.lib and examples directories.
    You could also have an error with the Paths settings. If it is searching an invalid path to find a subVI, that might cause the problem.
    I suggest that you start small and try a single subdirectory like user.lib, then move on to instr.lib, then examples.
    Michael Munroe, ABCDEF
    Certified LabVIEW Developer, MCP
    Find and fix bad VI Properties with Property Inspector

  • Will importing iMovie 08 project into iMovie HD 6 degrade quality?

    Hello,
    Didn't find this in previous posts, so here goes...
    I worked on a few projects in iMovie 08 before learning that iMovie HD 6 had better picture quality, features, etc for my needs.
    I want to combine footage from iMovie 08 and iMovie HD 6 to send to iDVD to make 1 dvd movie.
    Can iDVD handle footage from multiple projects in different iMovie versions, or do I need to have everything in one version of iMovie first? If I import the 08 clips into HD 6, will it degrade the picture further? I no longer have some of the miniDV tapes the footage was recorded on.
    AND, if I do import 08 footage into HD 6, how do I do this to preserve the best quality I can?

    Thanks.
    What is the best video compression type to use for this purpose? I have it set to MPEG-4 right now, with same frame rate as original. I don't know a lot about compression types, but would DV/DVCPRO-NTSC work or be any better?
    What about audio settings? It is currently set to 16 bit, 48 KHz, Linear PCM... the audio wasn't great from the source (internal camera mic), but I'd like to preserve what is there!

  • JAVA editor/compiler for Mac

    I am taking a Java programming course, and I would like to download a Java editing program for use on my home computer (Mac OS 10.5). I use TextPad at school, and I would like a similar program that can compile and run Java applications/applets.

    If you want an IDE ...
    http://www.eclipse.org/
    Of course, you can use any text editor and the command-line to compile if you don't want an IDE.
    Eric
