Building exe file fails (not enough memory) in LV 2011

Hello,
I have a project with about 900 VIs. I'm using Win7 32-bit with 4 GB RAM (a 32-bit OS can address about 3.3 GB) and LabVIEW 2011 SP1.
When I try to build an executable, I always get the error "Not enough memory to complete the operation" (or something like that).
My RAM is full at that point.
Without building the exe I can run the project without any problems, using only half the RAM.
What can I do to generate the exe file on my system?
Are there any options I can set so that the application builder does not use so much memory?
Regards
Matthias

Hello
"how much memory do those 900+VIs take on disk????"
53 MB
"Ctrl+Run Button is a force compile for a single VI"
I used Ctrl + Shift + Run
I also tried the mass compilation tool itself and it doesn't change anything.
"If the source distribution also plows the memory, you want to split the distribution into several packages"
That sounds very complicated ...
Did I understand correctly? I have to create several source distributions (with the "directory" option, not "llb"), and in the build specification for my exe I can choose these source distributions?
Do I need to prepare my VIs, or can I create the source distribution directly with the "remove block diagram" option?
"I had that problem several times in a large project. I did a mass compile, saved all the VIs, then closed out of the project and LabVIEW. I opened up LabVIEW again and loaded only the main VI. I performed a build and it worked successfully."
I tried it and I still get the same error.
My customer needs this update because it is a critical one, and I can't build it.
Updating my machine to Win7 64-bit is also not a solution, because all our customers use Win7 32-bit and a 64-bit executable won't run on a 32-bit operating system.
Regards
Matthias

Similar Messages

  • Reproducible error when loading many reports (Load report failed - Not enough memory for operation)

    Environment:
    Win 7 SP 1
    Visual Studio Pro 2012 Update 4
    Crystal Reports for Visual Studio Service Pack 10 (13.0.10.1385)
    Report created in Crystal Reports XI Release 2 (11.5.8.826)
    Targeting x86 .NET 4.0
    Scenario:
    We have a program that runs and creates a large number of reports before the process ends. After many hours of running we'd get a "Load report failed / Not enough memory for operation" exception. I kept removing code and found I could reproduce it using just the report.Load call. I simplified the report to a completely blank one to make sure it was nothing specific to the report I was loading (opened Crystal Reports XI Release 2, Save As, "Blank.rpt"). I then created, loaded, and disposed of this report in a loop, and was able to cause the same exception after 32,764 iterations on my machine. I also tried .NET 3.5, with the same result. I added a counter to our main program, and it also went through 32,764 report loads before the same exception was thrown. The main program uses 15 or so different reports with a variable number of subreports in each.
    Sample Code to illustrate the problem:
    I did this as a WinForms project since our main program is using winforms.
    References added:
    CrystalDecisions.CrystalReports.Engine
    CrystalDecisions.ReportSource
    CrystalDecisions.Shared
    CrystalDecisions.Windows.Forms
    using System;
    using System.Windows.Forms;
    using CrystalDecisions.CrystalReports.Engine;

    namespace CrystalTest
    {
        public partial class Form1 : Form
        {
            public Form1()
            {
                InitializeComponent();
                int i = 0;
                try
                {
                    while (true)
                    {
                        i++;
                        ReportDocument report = new ReportDocument();
                        report.Load("Blank.rpt");
                        report.Close();
                        report.Dispose();
                    }
                }
                catch (Exception ex)
                {
                    MessageBox.Show(i.ToString() + ex.Message);
                }
            }
        }
    }
    Exception:
    CrystalDecisions.Shared.CrystalReportsException: Load report failed. ---> System.Runtime.InteropServices.COMException (0x80041004):
    Not enough memory for operation.
       at CrystalDecisions.ReportAppServer.ClientDoc.ReportClientDocumentClass.Open(Object& DocumentPath, Int32 Options)
       at CrystalDecisions.ReportAppServer.ReportClientDocumentWrapper.Open(Object& DocumentPath, Int32 Options)
       at CrystalDecisions.ReportAppServer.ReportClientDocumentWrapper.EnsureDocumentIsOpened()
       --- End of inner exception stack trace ---
       at CrystalDecisions.ReportAppServer.ReportClientDocumentWrapper.EnsureDocumentIsOpened()
       at CrystalDecisions.CrystalReports.Engine.ReportDocument.Load(String filename, OpenReportMethod openMethod, Int16 parentJob)
       at CrystalDecisions.CrystalReports.Engine.ReportDocument.Load(String filename)
       at CrystalTest.Form1..ctor() in c:\Test Projects\CrystalTest\CrystalTest\Form1.cs:line 27

    int = Int32. No, it's not the counter that's causing the problem; the max value of an Int32 is far, far larger than 32,764.
    I am disposing of and cleaning up the datasets in the main app. That is why I didn't include them in this test; they aren't relevant.
    I'm unsure why this test program would be considered completely irrelevant. It throws the same exception, at the same count, as the main program, and it does so in substantially fewer lines of code. I spent days running long tests to figure out exactly what I needed to make the problem appear so I could write a clean and precise post on these forums. I then created the test program to illustrate that.
    The "real" code as I said does stuff in sections and in a certain order.
    For each report I need to export based on rows in a table
    1. Creates a new Report Document
    2. Loads the report document with the report
    3. Creates a dataset of the data to display
    4. Calls SetDataSource
    5. Calls Report.ExportToDisk
    6. Disposes DataSets
    7. Closes/Disposes Reports
    To help isolate the problem I first took out the export to disk part (Step 5). The problem still occurred. I then took out everything related to our data. (Step 3, Step 4, Step 6). The problem still occurred. And yes I commented out this code in our main real program. This left me with:
    1. Creates a new Report Document
    2. Loads the report document with the report
    7. Closes/Disposes Reports
    At this point I had to prove it was not dependent on the report. This makes sure it's not a database connection or pulling too much data into the report. The most efficient test for this is a blank report.
    So my order of operations becomes...
    1. Creates a new Report Document
    2. Loads the report document with a blank report
    7. Closes/Disposes Reports
    So you'll see this is exactly why I wrote this test the way that I did.
    I've had a run where it errored on iteration 32,761. My last runs have errored on 32,764. I have had many runs over many weeks that all error with the same exception.
    There are no temp files left behind. With a test running you can see the temp files being added, but they are immediately removed.
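    One hedged observation (mine, not from the thread, but suggested by the stack trace itself): the overload shown there is ReportDocument.Load(String, OpenReportMethod, Int16 parentJob), and a signed 16-bit value tops out at 32,767 - suspiciously close to the observed failure points. A tiny Python sketch just to make the arithmetic explicit:

```python
# The stack trace shows an Int16 "parentJob" parameter on the Load overload.
# If an internal job ID counter is 16-bit, it would exhaust near 32767,
# which lines up with failures at iterations 32761-32764 (a few IDs may
# already be in use by the process itself).
INT16_MAX = 2**15 - 1
assert INT16_MAX == 32767
for observed in (32761, 32764):
    print(f"failed at {observed}: {INT16_MAX - observed} below Int16.MaxValue")
```

    If that guess is right, no amount of disposing would help; the counter is simply never reused within one process lifetime.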

  • Could not open the file. Not enough memory

    I have always used Fireworks CS4 normally, but today I experienced a bit of trouble. There is one PNG file that, when I click to open it, Fireworks loads but then a message pops up ("Could not open the file. Not enough memory"). After a bit of researching, I found out that this error happens when I bring to the stage one of the common library menu bars (menu bar 4). Does anyone know how to fix this?
    ps: I am able to open this file via File > Open > *.png. Please, I really need help. Is the library corrupted, maybe?

    Hi,
    It opened on mine (1.2 GB). I don't think it's a memory issue. I've used FW CS4 with 512 MB RAM and never had such a problem. (Of course it was very slow at the start, and I used to leave for a cup of coffee and come back when it was ready for work. That's why I upgraded to 1.2 GB.)
    Import this menu bar from the common library, save it again as a PNG, and then open it with Fireworks the way you did, to see if the same error occurs.
    Has this happened with other files as well?

  • I see message "Cannot open the file. Not enough memory to open the file"

    When I'm trying to open a file from adobe.com on my tablet, I see the message "Cannot open the file. Not enough memory to open the file". I've got 10 GB of free memory, so I don't know why...

    Which system are you running (is this information correct in your profile)?
    Are there complicating issues to this question such as you transferring libraries between computers?
    Have you upgraded iTunes recently?
    Troubleshooting iTunes installation on Mac OS X - http://support.apple.com/kb/HT2311  - read section on Opening iTunes for information about running multiple copies of iTunes.

  • Can't open large layered pdf files. Not enough memory

    I have a large 300 MB layered PDF file that, when I attempt to open it, eventually pops up the "not enough memory (RAM)" message.
    I am on a Windows XP Power Spec computer with 1.97 GB RAM. I have allocated 1106 MB for memory usage, 12 history states, and 8 cache levels.
    I have a 187 GB Seagate external hard drive as my main scratch disk, and a mirrored main drive C with 154 GB of free space as my second scratch disk.
    I also occasionally experience a canvas disappearance when using the polygonal lasso tool. I can be clicking along and all of a sudden the image disappears until I click again. Annoying, but not unworkable.
    Any ideas?

    I can open PSD and other file formats of equal size, but not PDFs. I switched the scratch disks around to see if that helped, but I'm still getting the same issues.

  • Export several Fuji X-Trans Files got "Not Enough Memory" Error

    When I tried to export several (say 15) Fuji X-Trans files at the same time, I got a "Not Enough Memory" error. This problem only occurred after upgrading to Lightroom 4.4. I can now only export a few (3-5) X-Trans files at the same time. Has anyone encountered this problem and does anyone know how to fix it?
    I am using Windows 7 32-bit with 4 GB memory, an SSD system drive, and a Radeon 7700 graphics card. For virtual memory, I set "no paging file" on the C drive and "system managed size" on another SSD drive.
    Thanks in advance.
    Wilson

    One might expect an export to be single-threaded, so the number of files exported wouldn't matter; but perhaps if you are very close to the limit, memory fragmentation comes into play, where LR requires contiguous blocks and there aren't any big enough. It's probably time to upgrade to a 64-bit Windows system and start getting used to the Metro interface, which isn't bad as long as your frequently used programs have tiles on the first screen.
    Go to Help / System Info… and see how much memory LR has available.  On my 32-bit system with 4GB of RAM installed LR only has 716.8MB available which is typical for 32-bit Windows applications:
    Built-in memory: 3327.0 MB
    Real memory available to Lightroom: 716.8 MB
    Real memory used by Lightroom: 160.5 MB (22.3%)
    Virtual memory used by Lightroom: 160.8 MB
    Memory cache size: 33.2 MB
    You could change your VM settings to allocate a multi-gigabyte minimum pagefile size to see if it makes any difference, in case LR is waiting on the VMM to allocate more and then gives up, but I'd expect LR requires actual RAM for processing images, not VM, which is much, much slower.
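    The fragmentation point above can be made concrete with a small illustrative sketch (the numbers are invented, not measured from Lightroom): an allocation can fail even though the total free memory exceeds the request, because the request needs one contiguous block.

```python
# Illustrative only: scattered free regions in a 32-bit address space.
free_blocks_mb = [300, 250, 200, 150, 100]   # hypothetical free regions (MB)
request_mb = 400                             # needs ONE contiguous block

total_free = sum(free_blocks_mb)
largest_block = max(free_blocks_mb)

assert total_free > request_mb       # plenty of memory "available" in total...
assert largest_block < request_mb    # ...but no single free block is big enough
print(f"{total_free} MB free in total, yet a {request_mb} MB request fails")
```

    This is why a 32-bit process can report "not enough memory" while Task Manager still shows free RAM: the limit is contiguous address space, not total bytes.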

  • File Error - FCP-6 - "not enough memory" ---DVDPro disk is slightly jerky..

    I have a macbook. 2GHz Intel Core Duo, 1 GB 667 DDr2 SDRAM. Final Cut Pro 6.0.1
    OS 10.4.11
    I have a project file - HD - that has worked fine up until now. And wouldn't you know it, it's just when I need to be able to burn to QuickTime that this occurs.
    There appears to be a continuing "file error: not enough memory" any time I open a sequence - now in any project.
    I have done a disk repair with Disk Utility. I zapped the PRAM. I've tried dragging the project files to my laptop desktop. Nothing works.
    I have a LaCie external drive with over 150 GB of space left. Another project was shot in HDV, and it seems this started to occur after working on that project.
    What else is there? Why is it doing this - of all things, now?
    Please advise - ASAP!?
    Also, one other small detail. If I have my sequence settings in FCP at DVCPRO-HD720p60, and I set my compressor file at 23.98fps - could that be responsible for a subtle jerking in the DVD Studio Pro Disk print? When I pull up the file in DVDpro - and play it on the simulator, the jerking is not present, only in the disk burn.

    The memory that FCP is referring to is RAM. If you're running with less than a couple of gigs, larger projects will report that there's not enough memory... a gig isn't really enough to run FCP anymore. Tiger and Leopard use a LOT of it up, so you probably need more RAM.
    The other thing that can use up RAM fast is graphics from Photoshop; if they have a blank layer in them, you'll get an out-of-memory error too. So you might check that out, but my gut says you need more RAM for your laptop.
    Jerry

  • Message "Could not save because there is not enough memory (RAM)"

    Running Mac OS 9.2.2. I suddenly started getting the error message "An error occurred saving the enclosure "file name". Not enough memory" when trying to save an attachment in Microsoft Outlook. The same thing happens with Adobe Photoshop when trying to save any new file - the message is "Could not save because there is not enough memory (RAM)".
    Already tried:
    During several freezes, my Mac suddenly stopped recognizing any extensions. When I used Microsoft Outlook and tried to download an attachment, I got a Type 3 error. So I created a new set in Extensions Manager and got rid of the Type 3 error, but got a new error message, "Not enough memory", when trying to save an attachment, or "Not enough RAM" when trying to save an EPS file in Adobe Photoshop.
    In Outlook's "Get Info" section I changed the memory size:
    Suggested Size: 7168 K
    Minimum Size: 16000 K
    Preferred Size: 16000 K
    In all mailboxes there are not that many messages, and they are not heavy at all; I checked them all before. And I rebuilt/compressed the Outlook database by holding Option while restarting Outlook...
    Adobe Photoshop started acting weird at the same time. Even if I want to open a file through the menu (File - Open), it gives me the same message.
    Who has any clue - please help!!!

    Hi, Mike -
    I have Outlook Express's Preferred memory allocation set to 45000K, and don't have any problems with attachments. Don't be afraid of increasing OE's Preferred memory allocation (nor IE's, either - I have that set to 65000K).
    How large is the Messages file used by the account you're using in Outlook? You can examine that file here - (hard drive) >> Documents >> Microsoft User Data >> Identities >> (account folder, either the default "Main Identity" or one you've created) >> Messages.
    It seems that the Messages file grows with each email received; but does not shrink when an email is deleted. Over time, especially when lots of attachments are involved, that file can get huge.
    One way to help it not get too big is to -
    • delete attachments after copying them to the desktop. In case you're not familiar with the easy way to copy an attachment to the desktop - select the email so it opens; click the small triangle next to the legend "Attachments" between the upper pane and the lower pane in Outlook's main window; drag the attachment from the list out onto the desktop. It will be extracted and copied to that location. Once that has been done you can delete the attachment from the email with which it arrived.
    • after purging your emails of unneeded emails and attachments, compact the file. To do that, quit Outlook if it is running. Then hold down the Option key and start up Outlook. A splash screen will appear asking if you want to compact its database files - click yes. Follow the prompts.
    Compacting OE's database can also fix some instances of a damaged database that can't be used.
    If you could answer a few more questions, perhaps we can offer additional suggestions -
    1) How much RAM do you have installed? That means the physical RAM, not counting Virtual Memory in use, if any.
    2) How full is your hard drive - how big is it, how much free space is left unused?
    3) When was the last time you rebuilt the desktop file?

  • iTunes fails to burn large audio books: The iTunes library file cannot be saved. There is not enough memory.

    Hi,
    I am trying to burn a large Audible audio book to a virtual CD, and iTunes 11.2.2 keeps crashing midway. After spending hours trying to get information off technical forums, here is a description and possible explanation of what is happening:
    The burn starts normally; iTunes creates 10 tracks of 8 min per CD.
    The burn progresses past 10 CDs (up to 30 CDs).
    Then a message box appears with: "The iTunes library cannot be saved. There is not enough memory".
    In the background the process keeps going for another 2 or 3 CDs, then stops.
    This is NOT a memory problem:
    My computer still has over 100 GB of HDD left.
    On my i7, only 4 of the 8 cores were used.
    The whole process didn't go over 3.5 GB of RAM (out of 8).
    I reinstalled iTunes.
    I re-imported the files.
    I raised the process priority to High.
    From the forums, this problem started after iTunes 7.x on x64 processors, whenever iTunes used more than about 1.5 GB of RAM.
    It does follow the behavior of a 32-bit system going out of memory.
    The way I see this: Apple has to either create a fix so the current iTunes reuses memory properly, or create a 64-bit update so it can use whatever RAM is available in the computer.
    Does anyone else have any other ideas?
    For those who want to play lawyer: I am not American. In my country it is legal to remove DRM from content that you legally own. The fact that the Audible software was made by drunk monkeys is reason enough.

    Well, I installed the latest iTunes this AM, works perfectly.  Although for some reason it did lock up after playing a song today.
    Horror stories?  That's a bit overly dramatic.
    No, it isn't.
    The issue you are experiencing is one that has occurred in multiple versions of iTunes, a simple search would have revealed this.
    A simple search was conducted, no one seemed to have an answer.  Do you always adopt such a haughty attitude?  Or was I just lucky.

  • Building an Executable - Not Enough Memory to complete this operation

    I am trying to build an executable from my project; it is quite a large program. During the build, within the 'initializing build' phase, I get the error message 'not enough memory to complete this operation'. I have a decent PC running XP with 4 GB of RAM. LabVIEW is using about 1.2 GB of RAM when the problem occurs.
    This is an application that I've been working on for some time, and I have experienced similar problems before. To get round it I added the /3GB switch to the boot.ini file. However, that no longer works; it complains of a problem loading a DLL, which seems to have only started happening since moving to LV2012.
    Does anybody have any experience with solving this problem please? It is a bit of a showstopper for me at the moment.
    Thank you,
    Martin
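    For readers unfamiliar with the /3GB switch Martin mentions: on 32-bit XP it is appended to the OS entry in C:\boot.ini. A sketch of a typical entry (the disk/partition numbers here are just an example; yours will differ):

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```

    Note that a process only benefits if its executable is linked large-address-aware, and /USERVA=2800 (or similar) can be combined with /3GB to leave the kernel more address space if driver problems appear.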

    Hi Martin,
    Adding the /3GB switch is a solution which should still work on your system.
    I'm a little confused - is the system still complaining about not having enough memory, or is it just the DLL problem?
    And is the error message occurring during the build, or when trying to run the .exe?
    The DLL in question is associated with DAQmx.
    If it is a runtime problem rather than a build problem, it may be that you haven't included the DAQmx driver on the target machine, in which case the answer lies in this document: http://www.ni.com/white-paper/5406/en
    If I've misunderstood, please clarify the exact issue which is currently occurring, and we can continue to troubleshoot.
    Ian S
    Applications Engineer CLD
    National Instruments UK&Ireland

  • HELP: ld: fatal: file /dev/zero: mmap anon failed: Not enough space

    Trying to build a debug version of our code using Solaris 10 and SunStudio 12, and we get this error:
    ld: fatal: file /dev/zero: mmap anon failed: Not enough space
    Any idea?
    The ld process grows to ~4GB then dies.
    The compile server has plenty of memory and swap available.
    Any idea? Our program is getting quite large, but I am sure there are larger ones out there.

    The only suggestion I have is to switch to the stabs debug format (the -xdebugformat=stabs compiler option) and not specify -xs, so that most of the debug info is stored in the .o files. Maybe it'll save some linker address space. Other than that, try asking on the linker forum:
    http://www.opensolaris.org/jive/forum.jspa?forumID=63
    I know that it's a hot topic, and you will probably get an answer as soon as someone from the linker team takes a look.
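    A minimal sketch of the suggested flags as a makefile fragment (file names are hypothetical; assumes Sun Studio 12's CC):

```makefile
# -g -xdebugformat=stabs: emit stabs debug info; without -xs it stays in
# the .o files, so ld maps less debug data into its address space at link time.
CXX      = CC
CXXFLAGS = -g -xdebugformat=stabs

app: big1.o big2.o
	$(CXX) -o app big1.o big2.o

%.o: %.cc
	$(CXX) $(CXXFLAGS) -c $< -o $@
```

    The idea is simply to shrink what the (32-bit) linker must map, since it dies around the ~4 GB mark.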

  • Fork Function Failed.  There is not enough memory available.

    Hello, I have gotten the following error with my Oracle Application Server 10g:
    Fork Function Failed. There is not enough memory available.
    I have set the ulimit to unlimited for all categories for the oracle user. I have set the number of processes per user to 2048. I have 4 GB of RAM and 6 GB of paging space.
    I have also noticed that once I start the opmn services, even if no one is executing any commands, the oracle user's process count keeps growing. I ran the following command to determine this: ps -ef | grep oracle | wc -l. I also monitored some of the processes for some time, and they are not terminating, so they never release physical memory or paging space.
    Can you please assist me with this problem?
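    A slightly more robust version of the process-count check described above (the user name "oracle" is taken from the post; adjust as needed). The bracket trick keeps grep from counting its own pipeline entry:

```shell
# Count processes whose ps line mentions "oracle"; '[o]racle' never matches
# the grep command itself. "|| true" keeps the pipeline's exit status clean
# when the count is zero (grep -c exits non-zero on no matches).
count=$(ps -ef | grep -c '[o]racle' || true)
echo "oracle-owned processes: ${count}"
```

    Running this in a loop (e.g. under watch) makes it easy to confirm whether the count really grows monotonically while opmn is idle.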

    user483766,
    You will probably want to file a TAR and work through this with Oracle Support.

  • Error msg - not enough memory available for this task - chm file

    Hi All
    I am using RoboHelp for HTML 8.0 and creating a CHM file. I am using context-sensitive help and the F1 key functionality. In the past I have created some map IDs with F1, and they have worked properly with our company's software application. Now I have a different database with our company and a larger help file, and it is not working. I am told that the database should work. I also have lots of memory. I am working off of my hard drive, which is where the .exe for the company application and my CHM file are. I can open my .chm file. When I go into the application and press F1 where I created the map IDs, I get the message "not enough memory available for this task" and also "topic not available. Quit one or more programs to increase memory."
    I have lots of memory, and my file is only 598 KB. I also only have RoboHelp and the application running. When I did a search, I found an article with a fix from 2004 from Adobe - Rb_46124 - but I'm not sure it's the right fix, since it is old and the steps don't match.
    Please help me.
    Thanks
    Caryn

    Hi there
    Actually, if you were to use FlashHelp output, you might find that the problems are compounded, as now you have the Flash Player and all its idiosyncrasies in the mix.
    Cheers... Rick
    Helpful and Handy Links
    RoboHelp Wish Form/Bug Reporting Form
    Begin learning RoboHelp HTML 7 or 8 within the day - $24.95!
    Adobe Certified RoboHelp HTML Training
    SorcerStone Blog
    RoboHelp eBooks

  • When opening files "Not enough memory to load TIFF file."

    The Mac just had a new build of OS X 10.6.8.
    Total memory is 4 GB, of which over 3 GB is free.
    Opening any image gives the memory error "Not enough memory to load TIFF file".

    Open the linked TIFF file(s) in Photoshop and re-save with LZW compression turned off (I believe). I know you can usually place an LZW TIFF and it works fine, but I believe I had this problem in the past, and turning LZW off / changing the RLE settings made Illustrator happy. Also check whether the image is in 8 bits/channel (Photoshop >> Image >> 8 Bits/Channel).

  • Photoshop CS2 Cannot read PNG files--"Not Enough Memory"

    Hi! I have a bit of a problem. I've scoured the interwebs for days but can't find a solution.
    My copy of PS CS2 was able to open PNGs in the past, but it just up and decided it doesn't want to anymore. I'm serious - I can't think of any major overhaul or anything I did to cause it, but now it won't read them correctly. It says there's "not enough memory", which doesn't really make sense when I have a MacBook Pro and nothing running in the background. On top of that, the particular files I've been trying to open are barely 100 KB. I've used canvases upwards of ten thousand square pixels in dimension with no problem.
    This happens with any PNG file. It's not an "incorrect extension" problem, because I made these files and just need to reopen them to edit things.
    Here are some screenshots that show what's been happening; nothing exciting, but just in case someone notices something I don't:
    That's the picture I want to open.
    And that's what pops up.
    :/ I'm a little dumbfounded. Halp?

    Now, this is strange.
    I restarted my computer and now the message has changed - to a simpler one, albeit still confusing.
    Sigh.
    I didn't do anything but restart my computer.
