Why does System Exec cause a memory leak?

Sorry if this has been posted before, but I can't quite figure out why this VI is causing a gigantic memory leak (losing nearly 500K of memory a second). It's a simple System Exec call, and I even trivialized everything to try to isolate the problem.
Does anyone see a huge problem here? I've tried calling up taskkill afterward as well, though that seemed to make little difference.
If you don't trust my trivial program (and you probably shouldn't), I've also included the source files for the program - it's a one-line C++ program that returns immediately.
Any insight is greatly appreciated.
Thanks!
Attachments:
Dilution.vi.zip (11 KB)

Try putting a wait - even a 0 ms wait - in your loop and see if it makes a difference.  Without a wait, LabVIEW will try to run that loop as fast as possible - possibly so fast that it never yields time to free resources.
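In text form, the advice above looks something like this minimal sketch (Java stands in for the LabVIEW diagram here; `hostname` is just a placeholder command, not the poster's executable):

```java
// Sketch of the loop above, in Java rather than LabVIEW.  Without the sleep,
// the loop spins flat out and the runtime may never get a chance to reclaim
// the handles each call leaves behind.
public class ExecLoop {
    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 3; i++) {
            Process p = new ProcessBuilder("hostname").start(); // placeholder command
            p.waitFor();      // equivalent of "wait until completion?" = TRUE
            p.destroy();      // release the process handle promptly
            Thread.sleep(1);  // even a tiny wait yields time to clean up
        }
        System.out.println("done");
    }
}
```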

Similar Messages

  • Whether our application caused the memory leak

    Is there a command to find out whether our application is causing a memory leak?
    We are using JDev 10g + JHeadstart (Release 9.0.5.1, BC4J + Struts + JSP) to build our enterprise application,
    but after the application has been running for some days on our Application Server (RedHat AMD64, with 8 GB RAM), memory consumption grows seriously in the production environment (one OC4J instance uses almost 2 GB of memory after running for 2 days), which forces us to restart the instance each day! We suspect
    our application suffers from a memory leak, so we want to know whether there is a command to find out whether our application is causing the leak,
    and which program is the main consumer of memory.

    Ting,
    Unfortunately there is no 'command' that will show you the location of a memory leak. First I would scrutinize your code for any obvious 'leaks'. Then, you should obtain some statistics about the usage of your system. One important aspect is how long a HttpSession usually lives. If you have thousands of users that stay online the entire day and never 'time out', and if you have users on the system 24 hours a day, then the sheer number of HttpSessions might be a problem.
    JHeadstart 9.0.x tends to have rather 'heavy' session objects. This can easily be solved by adding some actions to clear up DataObjects and DataObjectSets of previous 'Groups' when entering a new Group. A good place would be between the 'DynamicActionRouter' and the 'ActionRouters'. Just before entering the 'EmployeeActionRouter', you could remove all DataObjects and DataObjectSets that might be left on the Session by actions of the other ActionRouters.
    Also it would be interesting to see if the garbage collector can do its thing when the system is less busy. For instance, if your application has a smaller load during the weekend, what is the memory usage on Sunday at midnight compared to, say, Friday at noon. Has the memory load dropped consistently with the decreased number of online users, or does too much memory stay allocated? If so, then there's more going on than just HttpSession objects.
    If all this does not lead to a solution, I suggest using a profiling tool such as OptimizeIt to investigate the memory usage of the application in detail.
    Kind regards,
    Peter Ebell
    JHeadstart Team
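    The clean-up Peter describes can be sketched roughly like this (a plain HashMap stands in for the HttpSession, and the attribute names are invented for illustration - the real JHeadstart attribute names will differ):

    ```java
    import java.util.*;

    // Sketch: before entering a new group's ActionRouter, drop the previous
    // group's DataObjects/DataObjectSets from the session so the garbage
    // collector can reclaim them.  A HashMap stands in for the HttpSession.
    public class SessionCleanup {
        static void clearPreviousGroup(Map<String, Object> session, String groupPrefix) {
            // remove every attribute belonging to the old group
            session.keySet().removeIf(key -> key.startsWith(groupPrefix));
        }

        public static void main(String[] args) {
            Map<String, Object> session = new HashMap<>();
            session.put("DeptGroup.dataObjectSet", new Object()); // old group
            session.put("EmpGroup.dataObject", new Object());     // current group
            clearPreviousGroup(session, "DeptGroup.");
            System.out.println(session.size()); // only the current group remains
        }
    }
    ```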

  • Why does iTunes Match cause iTunes to crash?

    Why does iTunes Match cause iTunes to crash?

    My problem is similar.  It began right after I allowed an Adobe update to run.  (Adobe Flash Player version 10.0.45.2 is recorded in my Registry.) Now every single website I try to open triggers a window (see below) and repeats many times.
    It makes no difference whether I "Allow" or not.  I think it runs this window for every flash object on that website.  It is occurring while I enter this post!
    It makes using the Internet MOST challenging.  Surely this is causing major problems for everyone.
    I have IE8 on Windows Vista.
    Here's the text from the message:
    "A website wants to open web content using this program on your computer.
    This program will open outside of Protected Mode. Internet Explorer's Protected Mode helps protect your computer. If you do not trust this website, do not open this program.
    Name:  Adobe Flash Player
    Publisher: Adobe Systems Incorporated
    (Checkbox) Do not show me the warning for this program again.
    buttons:  Allow         Do Not Allow"
    Should I appeal to Microsoft or Adobe for a solution?  Is it possible to install the previous version?
    Thanks for any wisdom.

  • Can a long-living FileStream object cause an unmanaged memory leak?

    Hi,
    I am running an application in which one FileStream object lives for a long duration - its lifetime is roughly the lifetime of the application. Could this kind of object cause an unmanaged memory leak? When
    I look at the Private Bytes counter for my application, it is higher than the bytes-in-all-heaps counter, which suggests an unmanaged memory leak. Could such a FileStream object cause this behaviour?
    Thanks and Regards,
    Lucas

    What makes you think it's the file stream that leaks unmanaged memory? There could be other parts of your applications that do that.
    FileStream doesn't usually allocate unmanaged memory. Its internal buffer is a GC-heap-allocated byte[] array. The only unmanaged memory allocation that I know of in FileStream is an OVERLAPPED structure used for async I/O. If you use BeginRead/Write, make
    sure you call EndRead/Write correctly; otherwise that structure may be leaked.
    @Joel Engineer: "When a stream uses ASCII encoding under certain cases nulls are added to the data for block alignment."
    FileStream doesn't have anything to do with text encoding and I've told you that before. Next time when I run into a post that makes such a claim I'll simply delete it. This display of ignorance has lasted more than enough.

  • Find out whether our application caused the memory leak

    Is there a command to find out whether our application is causing a memory leak?
    We are using JDev 10g + BC4J + Struts + JSP to build our enterprise application,
    but after the application has been running for some days on our Application Server (RedHat AS3-86, with 4 GB RAM), the memory consumed is serious! We suspect
    our application suffers from a memory leak, so we want to know whether there is a command to find out whether our application is causing the leak,
    and which program is the main consumer of memory.

    The second scenario: as we perform some deployment activity on 10g, the memory usage chart shows a sharp consumption of about 1.5 GB, and the lost memory is mostly shown in the "Other" legend of the chart!
    Please give us some advice. Thanks in advance.

  • FB70: why does the system show the warning message "enter true account assignment"?

    In FB70, why does the system show the warning message "enter a true account assignment object with revenues"?
    I enter a profit center but I still get the warning message "enter a true account assignment object with revenues".
    What should I do?
    My system has CO-PA.

    Dear,
    Please check that the profit center you are entering in FB70 has a segment assigned.
    If it still gives the warning message, don't worry - press Enter and it will post the transaction.
    bsrao

  • Why does System Preferences crash every time I try to change the wallpaper?

    Why does System Preferences crash every time I try to change the wallpaper? Any way to fix or reset the app?

    We're sorry to hear that Firefox is crashing. In order to assist you better, please follow the steps below to provide us crash ID's to help us learn more about your crash.
    1. Enter about:crashes in the address bar (that's where you enter your website URL) and press Enter. You should now see a list of submitted crash reports.
    2. Copy the 5 most recent crash IDs that you see in the crash report window and paste them into your response here.
    Thank you for your cooperation!
    More information and further troubleshooting steps can be found in the [[Firefox crashes]] article.

  • Why does 3D Repoussé cause crashes and work so slowly?

    Why does 3D Repoussé cause crashes and work so slowly?   I get the pinwheel of death when I use this feature, and it works at a snail's pace.  I liked the 3D type effects in Illustrator.  Why couldn't Adobe have added all the Repoussé features to Illustrator instead of cramming all these new features into Photoshop?
    They're worthless if the program needs an hour to process every selection.

    What are your OS specs, your Video Card and your Preferences > Performance settings?
    Why does 3D Repoussé cause crashes and work so slowly?
    My guess would be that it has to create a whole bunch of polygons, and your computer setup may be insufficient for the task.

  • Why does Final Cut cause a hard system crash after upgrading computer's memory?

    2 x 2.8 GHz Quad-Core Intel Xeon Mac Pro that originally came with 2 GB of memory. We upgraded to 4 GB by adding two 1 GB RAM chips. Ever since, Final Cut Pro has been causing hard system crashes. Does anybody have an idea why this is happening?

    Installing memory is very specific:
    the correct type of DIMMs must be installed in the correct slots, dependent on the type of processor fitted.
    Install exactly as per the Mac Pro guidelines here:
    Mac Pro Memory DIMM Installation
    You may have to reload the web page a few times (Command-R) after the page warning for it to download correctly.
    If everything is per the instructions and you're still having trouble, the DIMMs are most likely faulty.

  • Why does my mac use virtual memory when I still have free physical memory?

    I have a 2011 i7 quad core mac, I was hoping it would scream. Most of the time it does. However when trying to edit within FCPX I get a very disappointing experience with many pauses and pin wheels if I don't close every single other program.
    I have 8 GB of physical memory, and when I'm experiencing these problems I see that I still have 1-2 GB of physical memory free or inactive. At the same time FCPX is only using 2 GB of memory. I happened to keep an eye on the VM page ins/outs and noticed them going up.
    Right now I'm doing some browsing and emailing, that's about it. The machine is sitting with over 4 GB of memory free or inactive, and yet the page ins/outs are still going up occasionally. It's currently at over 2 million page ins and over 1 million page outs.
    So with so much physical memory free, why is this happening? At the moment the Mac feels nice and responsive, but if I start trying to use FCPX I'll start to experience these slowdowns and stalls; whenever I see them, my main HDD is being accessed while the pinwheel is displayed. I mean, I get it: it's VM, the HDD is too full, a bit fragmented perhaps, it's stalling... but I've got gigabytes of memory sitting free or inactive. Why won't the OS use it?!
    Would my experience improve if I took the plunge and got 16 GB of memory instead of 8 GB?
    Thanks for your help!

    Because without virtual memory, managing computer RAM is a royal pain in the ...
    Virtual memory costs you nothing and gains you huge benefits, even if you do not notice it.
    What costs you is when you need more real RAM than is available and things are thrown out of RAM, either back to the original file they came from (read-only information) or pushed out to the swapfiles (/var/vm/*).  Then the system has to wait for slower disk access.  But even this is better than not being able to run the apps until you quit something else.
    (Speaking as someone who started his professional life working with 1" punched paper tape, 80-column cards, 7-track and 9-track mag tapes, 1 MB disks (you heard me right, 1 megabyte), etc., trust me when I tell you that virtual memory is a godsend to software development.)
    There are a lot of problems running a modern operating system without virtual memory.  For example, all the shared libraries and frameworks that provide services to an application would need to be compiled into the application, which means every application gets bigger, and instead of having a single copy of a shared library or framework, you would have dozens of copies wasting your RAM.
    Without virtual memory, you would be required to find a contiguous chunk of RAM to run your application.  Think of this like going out to dinner: by yourself, you can take any available table, but if you go to dinner with your extended family, you need a table for 10 to 15, and if you are going to dinner with your high school graduation class, you will need hundreds of seats all next to each other at a very large table.  In the latter situations you have to wait until the restaurant has enough contiguous space, which means you have to wait until other diners finish.  There may be lots of empty tables, but they are not together, and your group wants/needs to sit together.  Virtual memory allows gathering any 4K chunks of RAM, building a virtual memory map for all those scattered 4K chunks, and making them look like one big contiguous chunk of RAM, so you can run your application right away, no waiting.
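    That "scattered chunks stitched together" idea can be shown with a toy page-table lookup (the frame numbers below are arbitrary, chosen only for illustration):

    ```java
    // Toy illustration: four scattered 4K physical frames look like one
    // contiguous virtual address range once a page table maps them.
    public class PageTableToy {
        public static void main(String[] args) {
            int pageSize = 4096;
            // virtual page i -> physical frame (frames in no particular order)
            int[] pageTable = {7, 2, 9, 4};
            long virtualAddr = 2L * pageSize + 123;   // virtual page 2, offset 123
            int page = (int) (virtualAddr / pageSize);
            int offset = (int) (virtualAddr % pageSize);
            // translate: frame for that page, plus the same offset within it
            long physicalAddr = (long) pageTable[page] * pageSize + offset;
            System.out.println(physicalAddr);  // 9*4096 + 123 = 36987
        }
    }
    ```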
    Going back to shared libraries and frameworks: this code needs to have addresses resolved so branches go to the correct locations during execution, and it needs addresses resolved for where its program variables are located in RAM.  Using virtual memory, you can load a shared object into RAM once, then place it in everyone's virtual memory map at the exact same address.  This means everyone can use the exact same code, and since everyone is using it at the same address, it makes life much easier for the operating system (translation: less work, less wasted CPU time, faster execution).
    When a program wants to grow - for example, a web browser loading a web page (and its images) into RAM - it needs to allocate additional RAM.  In the contiguous RAM model, you need to get control of the RAM immediately following your program, but if that RAM is being used by someone else, you have to wait until that program goes away.
    Virtual memory provides protection from another program looking at and modifying your program's RAM.  Malware would just love for virtual memory to go away.
    You want virtual memory.  What you do not want is excessive paging activity.
    If you are concerned, you can launch Applications -> Utilities -> Terminal.  Once you have a terminal command prompt, enter the following command:
    sar -g 60 100
    which will tell you the number of 4K pages written to /var/vm/pagefile every minute for 100 minutes (modify the numbers to suit your tastes).  You can then go about your normal usage and come back later to see how much you have been using the pagefiles.  If you have mostly zeros and an occasional small burst, this is noise and not worth worrying about.  If you have sustained pageout activity with higher numbers, then you should either run fewer things at the same time, look for an application that is being greedy with its memory use (or has a memory leak), or get more RAM for your Mac if you need to do all those things at once.
    But do not complain about virtual memory.  Life would be much worse without it.  Then again if you have a better idea, write a research paper, and get operating system vendors (as well as hardware vendors) to implement your ideas.  I am serious, as I've seen many accepted computing ideas be overturned by good new ideas.

  • After 5 Releases, Why Doesn't Mozilla Care About Memory Leaks?

    I'm baffled as to why this is still a problem. After a few hours of use, my memory usage while using Firefox 5.0 is at 1.5GB. I've followed every FAQ, disabled every extension, done everything to stop the memory leaks when using this browser and I'm growing very tired.
    I have a Core i7 MacBook Pro with 4GB RAM and Firefox still manages to bring it to its knees. I've been a devoted Firefox user since 2002-3 and I've just about lost my faith in this browser.
    Chrome is a featureless, ugly, dinky browser that I hate to use, but it leaves Firefox in its dust performance-wise. Where is the happy medium? I don't get it.
    My favorite answer is always, "disable your extensions." Here are the problems with that:
    1. Without extensions, Firefox is nothing. I might as well use Chrome.
    2. It never seems to help, and when it does a little, it is difficult to figure out which extensions are doing the most damage. Why doesn't Firefox provide a way to look at which extensions are using the most memory?
    3. Firefox should lay that smack down on extensions that could potentially leak memory, and yet, nothing. It should at least steal memory back when it gets out of control, regardless of what extension is using it.
    4. Mozilla recommends some extensions that are supposed to help reduce memory usage, but none of them work on OS X.
    I'm exhausted. I shouldn't have to restart my browser a million times a day to get anything done. Where are the real solutions? How do years go by with problems like this still getting worse? Firefox 5 was supposed to be better at handling memory, but it's only gotten worse for me.
    When will the madness end? We don't want new features, we want performance! I've always loved this browser, but is it really a surprise that Chrome is taking over?
    To sum it up, if your browser is slower than Internet Explorer, you need to hurry up and fix the problem or pack it up and go home.

    My sentiments exactly!! I have all the same complaints and concerns, and I've also tried the solutions provided, to no avail.
    This is the only beef I have with Firefox, but it's a bad one, and I've been shopping around for a better browser. Chrome is the best alternative I've found, but it still isn't quite at parity yet.
    Please fix this issue or at least make an attempt at it to let your users know it's somewhat of a future priority.
    Attached a screen shot of memory usage after 1 hour, and this is the new FF 5 update.

  • Possible causes for memory leak in Java and Tomcat

    I would like to enquire about the typical mistakes made by programmers that cause memory leaks in Java and Tomcat.

    Please refer to the site below. It gives more points about memory leaks and how to rectify them.
    http://www.experts-exchange.com/Software/Server_Software/Application_Servers/Q_20981562.html?cid=336
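    For a concrete illustration, one of the most common leak patterns in long-running Java web applications is a static collection that only ever grows. The class, method, and numbers below are invented for the example:

    ```java
    import java.util.*;

    // Classic leak pattern: a static, ever-growing collection.  Entries are
    // added per request but never removed, so the GC can never reclaim them,
    // and the heap grows for as long as the webapp keeps serving requests.
    public class LeakyCache {
        private static final List<byte[]> CACHE = new ArrayList<>();

        static void handleRequest(int id) {
            CACHE.add(new byte[1024]);   // cached "result" that is never evicted
        }

        static int size() {
            return CACHE.size();
        }

        public static void main(String[] args) {
            for (int i = 0; i < 1000; i++) handleRequest(i);
            System.out.println(size());  // grows without bound: 1000 after 1000 requests
        }
    }
    ```

    Other frequent culprits in Tomcat deployments include unclosed JDBC connections/statements and listeners that are registered but never deregistered on webapp reload.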

  • How does System Exec VI identify Standard Error within cmd code?

    I am using the System Exec VI to control a USB-to-serial adaptor programming header. I have successfully written a .BAT file to call the CMD commands (the .exe I am running uses "-option" commands and the help file recommends doing so) and it functions perfectly. My only issue is identifying errors: the "standard error out" terminal on the System Exec VI never outputs anything. Yes, "wait until completion" is TRUE, and my standard output functions fine. I am curious how the System Exec VI identifies errors from the command prompt and why my errors are not showing up. For the time being I am using multiple Match Pattern string functions to identify the possible errors from my standard output, but I would like to simplify my code a bit and clean it up if at all possible. Not to mention there are most likely several other errors that could occur that I have not identified. Some examples of standard output errors I can get include:
    {C:\Documents and Settings\owner\Desktop\RACK LINK>C:\DCRABBIT_10.66\Utilities\clRFU.exe "" -s "0":115200 -v -vp+ -usb+
    .bin not found
    C:\Documents and Settings\owner\Desktop\RACK LINK>pause
    Press any key to continue . . . }
    or
    {C:\Documents and Settings\owner\Desktop\RACK LINK>C:\DCRABBIT_10.66\Utilities\clRFU.exe "C:\Documents and Settings\owner\Desktop\RACK LINK\Calibration_v030.bin" -s "4":115200 -v -vp+ -usb+
    Rabbit Field Utility v4.62
    Installing Calibration v0.3.0
    Sending Coldloader
    Error: No Rabbit Processor Detected.
    C:\Documents and Settings\owner\Desktop\RACK LINK>pause
    Press any key to continue . . . }

    I think you should use error handling in your batch programming; see this link: http://www.robvanderwoude.com/errorlevel.php
    CLA 2014
    CCVID 2014
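    To see why "standard error out" stays empty: a child process exposes errors three separate ways - its stdout text, its stderr text, and its exit code - and System Exec's "standard error out" only sees the second. Tools like the one quoted above print their error text to stdout, so nothing reaches stderr. The sketch below shows the three channels in Java rather than LabVIEW; the `sh -c` command line is an invented example and assumes a POSIX shell (the Windows equivalent would run under cmd /c):

    ```java
    import java.nio.charset.StandardCharsets;

    // Demonstrates the three error channels of a child process:
    // stdout text, stderr text, and the numeric exit code.
    public class StdErrDemo {
        public static void main(String[] args) throws Exception {
            // Invented command: writes to both streams, exits nonzero.
            Process p = new ProcessBuilder(
                    "sh", "-c", "echo OUT; echo ERR 1>&2; exit 3").start();
            String out = new String(p.getInputStream().readAllBytes(),
                    StandardCharsets.UTF_8).trim();   // -> standard output
            String err = new String(p.getErrorStream().readAllBytes(),
                    StandardCharsets.UTF_8).trim();   // -> standard error out
            int code = p.waitFor();                   // -> return code
            System.out.println(out + " " + err + " " + code); // OUT ERR 3
        }
    }
    ```

    If clRFU.exe writes its error messages to stdout and always exits 0 (the pause at the end of the .BAT suggests it may), then match-pattern scanning of standard output really is the only place the errors appear.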

  • Firefox 21.0 on OS X 10.8.4 still has a memory leak that occurs with no Add ons, 1 tab and leads to 3GB of memory usage in 10 minutes.

    While using Firefox 21.0 on OS X 10.8.4 with 4 GB RAM, Firefox memory usage marches up and up until, at 3 GB (on my iMac), Firefox hangs, is 100% non-responsive, and must be killed with Force Quit. It takes 10 minutes to occur.
    No Add Ons
    One tab

    John99,
    Unfortunately, the problem was WORSE in FF 22.0. I regressed to FF 21.0 and shut off automatic updates. I may even go back to FF 3.6, which is the last version where I didn't see this problem.
    Thank you for trying, but as a longstanding developer I know that pervasive memory leaks are a sign of a broken development process. Thank you for trying...

  • Does OutputStream.write() has a memory leak on Linux?

    I wrote a piece of Java code to create 500K small files (average 40 KB each) on CentOS. The original code is like this:
     package MyTest;
     import java.io.*;
     public class SimpleWriter {
      public static void main(String[] args) {
       String dir = args[0];
       int fileCount = Integer.parseInt(args[1]);
       String content="@#$% SDBSDGSDF ASGSDFFSAGDHFSDSAWE^@$^HNFSGQW%#@&$%^J#%@#^$#UHRGSDSDNDFE$T#@$UERDFASGWQR!@%!@^$#@YEGEQW%!@%!!GSDHWET!^";
       StringBuilder sb = new StringBuilder();
       int count = 40 * 1024 / content.length();
       int remainder = (40 * 1024) % content.length();
       for (int i = 0; i < count; i++)
        sb.append(content);
       if (remainder > 0)
        sb.append(content.substring(0, remainder));
       byte[] buf = sb.toString().getBytes();
       for (int j = 0; j < fileCount; j++) {
        String path = String.format("%s%sTestFile_%d.txt", dir, File.separator, j);
        try {
         BufferedOutputStream fs = new BufferedOutputStream(new FileOutputStream(path));
         fs.write(buf);
         fs.close();
        } catch (FileNotFoundException fe) {
         System.out.printf("Hit file not found exception %s", fe.getMessage());
        } catch (IOException ie) {
         System.out.printf("Hit IO exception %s", ie.getMessage());
        }
       }
      }
     }
    You can run this by issuing the following command:
      java -jar SimpleWriter.jar my_test_dir 500000
    I thought this was simple code, but then I realized that it was using up to 14 GB of memory.  I know that because when I used free -m to check the memory, the free memory kept dropping until my 15 GB VM had only 70 MB of free memory left.  I compiled this using Eclipse, against JDK 1.6 and then JDK 1.7. The result is the same.  The funny thing is that if I comment out fs.write() and just open and close the stream, the memory stabilizes at a certain point.  Once I put fs.write() back, the memory allocation goes wild.  500K 40 KB files is about 20 GB.  It seems Java's stream writer never deallocates its buffer during the operation.
    I first thought the Java GC didn't have time to clean up, but that makes no sense since I close the file stream for every file.  I even translated my code into C#, and running under Windows, the same code produced 500K 40 KB files with memory stable at a certain point, not taking 14 GB as under CentOS.  At least C#'s behavior is what I expected, but I could not believe Java would perform this way.  I asked colleagues who are experienced in Java.  They could not see anything wrong in the code, but could not explain why this happened.  And they admit nobody had tried to create 500K files in a loop without stopping.
    I also searched online and everybody says that the only thing need to pay attention to, is close the stream, which I did.
    Can anyone help me to figure out what's wrong?
    Can anybody also try this and tell me what you see?
    BTW, some people in the online community tried the code on Windows and it seemed to work fine.  I didn't try it on Windows; I only tried it on Linux, as I thought that's where people use Java.  So it seems this issue happens on Linux.
    I also did the following to limit the JVM heap, but it had no effect:
        java -Xmx2048m -jar SimpleWriter.jar my_test_dir 500000

    Good point. I actually ran cat /proc/meminfo and got the following.  Why do I have such a big Inactive(file) allocation?
    MemTotal:       15239492 kB
    MemFree:           83424 kB
    Buffers:           76012 kB
    Cached:         13920152 kB
    SwapCached:            0 kB
    Active:           391104 kB
    Inactive:       14181268 kB
    Active(anon):     288124 kB
    Inactive(anon):   288268 kB
    Active(file):     102980 kB
    Inactive(file): 13893000 kB
    Unevictable:           0 kB
    Mlocked:               0 kB
    SwapTotal:        500432 kB
    SwapFree:         500432 kB
    Dirty:           2568700 kB
    Writeback:          6064 kB
    AnonPages:        576292 kB
    Mapped:             9884 kB
    Shmem:               140 kB
    Slab:             472340 kB
    SReclaimable:     421692 kB
    SUnreclaim:        50648 kB
    KernelStack:        1192 kB
    PageTables:         4132 kB
    NFS_Unstable:          0 kB
    Bounce:                0 kB
    WritebackTmp:          0 kB
    CommitLimit:     8120176 kB
    Committed_AS:     835900 kB
    VmallocTotal:   34359738367 kB
    VmallocUsed:       38908 kB
    VmallocChunk:   34359687700 kB
    HardwareCorrupted:     0 kB
    AnonHugePages:         0 kB
    HugePages_Total:       0
    HugePages_Free:        0
    HugePages_Rsvd:        0
    HugePages_Surp:        0
    Hugepagesize:       2048 kB
    DirectMap4k:    15728640 kB
    DirectMap2M:           0 kB
    BTW, my test used up all the memory again, and free -m shows this:
                 total       used       free     shared    buffers     cached
    Mem:         14882      14805         76          0         80      13583
    -/+ buffers/cache:       1142      13740
    Swap:          488          0        488
    Any hint what's going on?
