Memory leakage when using Ini-file VIs

I'm using the Configuration File VIs to read and write data to different .ini files. The files contain both standard keys and clusters written as a segment using the OpenG toolkit. Instead of opening the files and keeping them in the memory of the Config VIs, I'm just using them to read and write, decode and encode...the references are all closed using Close Config Data.vi. The problem is that even though I immediately close the config data, the application keeps grabbing more and more memory...every time a configuration file is opened, read from or written to and then closed, anywhere from 4K to 50K of additional memory has been allocated by the application (this is a stripped-down application that only deals with the config files, so there are no other sources for the memory leak).
Has anyone else experienced this? How can you repeatedly open and close config files like this without it continuously allocating more memory?
Attached is a copy of the VIs; the directory structure must be kept intact if the ini file is to be read correctly.
I've been staring so hard at this the whole day that I might just be overlooking something obvious...
In the full application the init and write operations are only done when the user reconfigures the system, which may happen a couple of times per month...so the memory leak would not cause a problem right away, but it would not be healthy to leave it there...
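For reference, the failing pattern boils down to a tight open/read/write/close loop with no references held between iterations. A rough sketch of that test in Python rather than LabVIEW (configparser, the file name and the counter key are stand-ins for the attached VIs, and the resource module is Unix-only), just to show what is being exercised:

    import configparser
    import resource

    def cycle(path):
        # Open, read, update and close the file on every call, holding no
        # references between calls -- the same pattern as the config VIs.
        cfg = configparser.ConfigParser()
        cfg.read(path)
        if not cfg.has_section("Settings"):
            cfg.add_section("Settings")
        count = cfg.getint("Settings", "counter", fallback=0)
        cfg.set("Settings", "counter", str(count + 1))
        with open(path, "w") as f:
            cfg.write(f)

    for i in range(1000):
        cycle("demo.ini")
        if i % 100 == 0:
            # Peak resident set size (KB on Linux, bytes on macOS); a value
            # that climbs on every report would indicate a leak in the loop.
            print(i, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)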
MTO
Attachments:
Memory_Leak_Demo.zip (1391 KB)

Could you post a 6.1 version?
LV7 is still about two weeks away for me.
Does the problem show up in 6.1?
I ran across an error while writing to an FP output that was not configured, which would cause a "drop of memory" to leak every time the VI performed the write. The leak did not show up in the profiler, but Windows would show the memory footprint growing continually as long as the writes continued. The workaround was "don't do that!".
I bring this up because I found and reported this just prior to the LV7 release, and the feature may still be present in LV7. I also believe that Jean-Pierre used a "write and check" method to deal with the unknown data types of complex data structures.
If you just read, does it leak?
If you just use simple data types, does it leak?
Is the ini file growing?
I really appreciate the effort you have been putting into the Dev-Exchange, Mads! I wish I could do more to help.
Keep us posted.
Ben
Ben Rayner
I am currently active on.. MainStream Preppers
Rayner's Ridge is under construction

Similar Messages

  • 3 questions: lack-of-memory warning when using Word ("idisc not working"); "Files not being backed up to Time Capsule"; Mac Mail prompts for a password but none work - J

    3 questions:
    1. Message today warning of a lack of memory when using Word (files in Documents), something about "idisc not working".
    2. Message a week ago: "Files not being backed up to Time Capsule".
    3. When using Mac Mail I'm prompted for a password, but none work.
    Thanks - J

    Thanks Allan for your quick response to my amateur questions.
    Allan: I'm running Mac OS X Version 10.6.8. The processor is a 2.4 GHz Intel Core i5 and memory is 4 GB of 1067 MHz DDR3.
    I just "Updated Software" as prompted.
    Thanks for helping me!    - John Garrett
    PS.
    Hardware Overview:
      Model Name:          MacBook Pro
      Model Identifier:          MacBookPro6,2
      Processor Name:          Intel Core i5
      Processor Speed:          2.4 GHz
      Number Of Processors:          1
      Total Number Of Cores:          2
      L2 Cache (per core):          256 KB
      L3 Cache:          3 MB
      Memory:          4 GB
      Processor Interconnect Speed:          4.8 GT/s
      Boot ROM Version:          MBP61.0057.B0C
      SMC Version (system):          1.58f17
      Serial Number (system):          W8*****AGU
      Hardware UUID:          *****
      Sudden Motion Sensor:
      State:          Enabled

  • Memory problems when using alias channel strips (3.0.2)

    I'm having memory issues when using alias channel strips. What is happening is I create one instance of the plugin, say Massive, then make a bunch of other aliases on different patches. Looking at the memory usage, every time I add an alias it increases the memory usage by an amount the same as the original plugin, i.e. add 3 aliases and you get 3 times the memory usage. See attached photos.
    Original plugin instance: ~80 MB memory usage
    Adding 3 alias channel strips: ~330 MB memory usage
    By comparison, here is the usage after adding 3 brand-new instances of the plugin: ~160 MB memory usage
    So it uses less memory to have 4 separate instances of the plugin than to use 3 aliases of the original. This doesn't make sense, since the whole point of an alias is to save memory by only loading one instance of the plugin.
    Anyone have any thoughts?
    Jon

    What's interesting is that if I start with a blank concert, add a Massive to a patch and then make an alias, the memory is as you say. But if I make a duplicate of the initial patch and then make aliases of that, the memory usage is lower.
    Are you seeing the same with the built-in plug-ins?

  • High Virtual memory usage when using Pages 2.0.2

    Hey there,
    I was just wondering whether there have been any other reports of unusually high memory usage when using Pages 2.0.2, specifically virtual memory. I am running iWork 06 on the Mac listed below and Pages has been running really slowly recently. I checked Activity Monitor and Pages is using hardly any physical memory but loads of virtual memory, so much so that the page outs are almost as high as the page ins (roughly 51500 page ins / 51000 page outs).
    Any known problems, solutions or comments for this problem? Thanks in advance

    I don't know if this is specifically what you're seeing, but all Cocoa applications, such as Pages, have an effectively infinite Undo. If you have any document that you've been working on for a long time without closing, that could be responsible for a large amount of memory usage.
    While it's good practice to save on a regular basis, if you're making large amounts of changes it's also a good idea to close and reopen your document every once in a while, simply to clear the undo history. I've heard of some people seeing sluggish behavior after working on a document for several days, which cleared up when the document was closed and reopened.
    Titanium PowerBook   Mac OS X (10.4.8)  

  • Error when using <%@ include file="/test.jsp"%>

    Using Nitrox version 2.1 M3 (build 419 06022005)
    with JDK version 1.5.0_03 and Tomcat 5.5.9
    This is the error :cry: when using <%@ include file="/test.jsp"%> in a JSP:
    Severity     2
    The type java.lang.Object cannot be resolved. It is indirectly referenced from required .class files (test.jsp)
    It used to work in previous versions of Nitrox. I can't use Struts Tiles because the tiles content is dynamic (run time) and I need it to be static (at compile time).
    Have you encountered this problem? What is the fix? Nitrox bug?
    I need your help please,
    Alberto

    M7,
    I found the problem :wink:. In the Java Build Path, having the default (ALL) does not pick up the contents of the package. I had to use "Add Multiple" and include all the folders and subfolders in my packages. After that I added *.java and *.properties to select all the files. Now it is working. I assume this is a bug in Nitrox; the default ALL should include all the files in the path.
    Thanks,
    Alberto

  • Use configuration file VIs to set AppFont in INI

    I am using the following code in an attempt to set AppFont, SystemFont, and DialogFont in an executable's .ini file all to Segoe UI 15. I was disappointed to discover that the configuration file VIs don't seem to write the key correctly. When I use this code, I get the following in the .ini file:
    AppFont = ""Segoe UI" 15"
    SystemFont = ""Segoe UI" 15"
    DialogFont = ""Segoe UI" 15"
    What I really need is:
    AppFont = "Segoe UI" 15
    SystemFont = "Segoe UI" 15
    DialogFont = "Segoe UI" 15
    In other words, the configuration file VIs add an extra set of quotation marks. With this extra set of quotation marks, the executable ignored these settings. The "write raw string?" input didn't seem to affect this behavior. 
    Does anyone know of a way to get the configuration file VIs to write this key/value pair correctly, or do I need to write extra code to either remove the quotations or do the whole thing myself? It seems like the configuration file parsing/editing VIs that NI provides should be able to parse and edit NI-provided configuration files...

    There is no way to do this with the current config file API. That "write raw string" input only pertains to escaping certain characters. We also have an internal API for writing data specifically to the LabVIEW.ini file, but it has the same problem with extra quotes.  For now, you'll need to either refrain from using the config file VIs, or add some post-processing code to go in and remove the extra quotes.
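    If you go the post-processing route, the cleanup can be as small as rewriting the three font lines after the config VIs have run. A minimal sketch in Python (the key names match the post above; the .ini path is a placeholder):

        # Strip the outer quotes the config file VIs wrap around the value,
        # turning  AppFont = ""Segoe UI" 15"  into  AppFont = "Segoe UI" 15 .
        KEYS = ("AppFont", "SystemFont", "DialogFont")

        def fix_font_keys(path):
            out = []
            with open(path) as f:
                for line in f:
                    key, sep, value = line.partition("=")
                    if sep and key.strip() in KEYS:
                        v = value.strip()
                        if v.startswith('"') and v.endswith('"'):
                            v = v[1:-1]          # drop the extra outer quotes
                        line = "%s= %s\n" % (key, v)
                    out.append(line)
            with open(path, "w") as f:
                f.writelines(out)

        fix_font_keys("MyApp.ini")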
    Darren Nattinger, CLA
    LabVIEW Artisan and Nugget Penman

  • "Memory effect" when loading .xls file information using PropertyLoader

    I have a TestStand 3.1 application in which the sequence starts off loading a number of configuration settings embedded in different .xls files using the PropertyLoader.
    Unfortunately, TestStand sometimes loads a previously used .xls file (same name, located elsewhere) rather than the one it is supposed to. In particular, if a .xls file is missing, TestStand will often (always?) load a previously used file with the same name but located elsewhere. VERY inconvenient when testing...!
    Is there any way to remove this unfortunate "memory effect"?

    Where are your sequence files located? If the .xls files are relative to them, you might want to use a more fully specified relative path to the files (for example: bin\config\filename.xls) rather than just filename.xls. Then be careful to remove search directories (especially recursive ones) that you might have added to find these files. It's very easy to get into trouble with recursive search directories, or by adding too many search directories when you have lots of files with the same names; by instead using paths relative to the sequence file, you can avoid the need to add search directories in many cases, as in the sketch below.
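    The idea, sketched in Python outside TestStand just to show the path arithmetic (the sequence-file path and file names are made up for illustration):

        import os

        # Resolve the .xls against the sequence file's own directory instead
        # of relying on PropertyLoader search directories.
        sequence_file = r"C:\TestStand\Sequences\MainSequence.seq"
        config = os.path.join(os.path.dirname(sequence_file),
                              "bin", "config", "limits.xls")
        if not os.path.exists(config):
            # Fail loudly instead of silently picking up a same-named file
            # found somewhere else.
            raise FileNotFoundError(config)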
    Hope this helps,
    -Doug

  • Using config file VIs and keep getting dequeue error

    So I am using the config file VIs, and I made some of my own VIs with additional functions for the program I'm working on. When I use the config file VIs on their own, everything seems to work fine, but when I use them from one of my custom VIs to another I keep getting a dequeue error in the "Save and Close INI.vi" VI. Below is how I have it set up right now... basically as simple as I can make it using my VIs. I attached a zip file with my subVIs and also the config file I was using, if anyone finds time to look.
    I've been trying to figure this out for a while and I just can't seem to find any reason why I would get a queueing error. I even dug down into the config file VIs, and I'm really not sure why, but it seems there is nothing in the queue.
    Thanks in advance.
    PS - don't mind the random notes, and I know the disable is "messy", but I was just pulling it down and out of the way of what I was paying attention to.
    Attachments:
    INI VIs.zip (93 KB)
    config.ini (1 KB)

    It's unlike me to just give up on a problem, so I kept looking at it, and it turned out that my config file had a section label with no keys or values under it. This is what caused all my problems. Everything seems to be working 100% now.
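    If anyone else runs into this, the condition is easy to screen for before a file ever reaches the config VIs: scan it for section labels with nothing under them. A quick Python sketch (reading the config.ini attached above; the sample output is illustrative):

        import configparser

        def empty_sections(path):
            # Report [section] labels that contain no key=value pairs -- the
            # condition that triggered the dequeue error described above.
            cfg = configparser.ConfigParser()
            cfg.read(path)
            return [name for name in cfg.sections() if not cfg.items(name)]

        print(empty_sections("config.ini"))   # e.g. ['EmptySection']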

  • Need help with internal HD memory problems when using Premiere Pro?

    When using PP I keep losing memory on my HD.
    This seems strange to me, since I have everything, all my video and audio files, on external HDs.
    Each time I make a new project I end up with less space on my internal HD.
    Information related to these projects is somehow remaining on my internal HD.
    Anyone got any ideas about what I might be doing wrong?
    Dimitrije

    Premiere will slowly compile various files to help the project along, and the default place is usually your internal hard drive. Make sure your scratch disks are pointed to an external hard drive if that is what you want; also, make sure the Media Cache Files are being created on your external drive as well (and not the default location, which is on your local drive).
    Premiere Preferences > Media
    Media Cache Files & Media Cache Database should be changed to an external disk if you don't want them created on your local disk. There are many tutorials and explanations about all of these aspects of Premiere on these forums and from other sources. Hope that helps!

  • Bug report & possible patch: Wrong memory allocation when using BerkeleyDB in concurrent processes

    When using a BerkeleyDB shared environment from parallel processes, the processes get an "out of memory" error even when there is plenty of free memory available, with possible database corruption as a result.
    A typical use case where this bug manifests is when BerkeleyDB is used by rpm, which is installing an rpm package into a custom location, or calls another rpm instance during the installation process.
    The bug seems to originate in the env/env_region.c file (the listing below is from BDB 4.7.25, although the culprit code is the same in newer versions too):
    330     /*
    331      * Allocate room for REGION structures plus overhead.
    332      *
    333      * XXX
    334      * Overhead is so high because encryption passwds, replication vote
    335      * arrays and the thread control block table are all stored in the
    336      * base environment region.  This is a bug, at the least replication
    337      * should have its own region.
    338      *
    339      * Allocate space for thread info blocks.  Max is only advisory,
    340      * so we allocate 25% more.
    341      */
    342     memset(&tregion, 0, sizeof(tregion));
    343     nregions = __memp_max_regions(env) + 10;
    344     size = nregions * sizeof(REGION);
    345     size += dbenv->passwd_len;
    346     size += (dbenv->thr_max + dbenv->thr_max / 4) *
    347         __env_alloc_size(sizeof(DB_THREAD_INFO));
    348     size += env->thr_nbucket * __env_alloc_size(sizeof(DB_HASHTAB));
    349     size += 16 * 1024;
    350     tregion.size = size;
    Usage from rpm's perspective:
    Line 346 calculates how much memory we need for the DB_THREAD_INFO structures. We allocate a DB_THREAD_INFO structure for every process calling the db4 library. We don't deallocate these structures, but when the number of processes is greater than dbenv->thr_max we try to reuse a structure from a process that is already dead (or no longer uses db4). But the DB_THREAD_INFOs live in hash buckets, and we can reuse a DB_THREAD_INFO only if it is in the same hash bucket as the new DB_THREAD_INFO. So line 346 should contain:
    346     size += env->thr_nbucket * (dbenv->thr_max + dbenv->thr_max / 4) *
    347         __env_alloc_size(sizeof(DB_THREAD_INFO));
    Why didn't we encounter this problem earlier? There are some magic reserves, as you can see on line 349, and some additional space is created by aligning to blocks. But if we have two processes running at the same time, and these processes end up in the same hash bucket, and we repeat this process enough times to fill all the hash buckets with two DB_THREAD_INFOs each, then we have 2 * env->thr_nbucket (37) = 74 DB_THREAD_INFOs, which is much more than dbenv->thr_max (8) + dbenv->thr_max (8) / 4 = 10; add the allocation from dbc_put, and we are out of memory.
    And how do we create two processes that end up in the same hash bucket? We can start one process (rpm -i) and then, in a scriptlet, start many processes (rpm -q ...) in a loop; one of them will land in the same hash bucket as the first process (rpm -i).
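    To put rough numbers on the above (thr_max = 8 and thr_nbucket = 37, as quoted), the mismatch between what line 346 reserves and what the buckets can demand is easy to check in a few lines of Python:

        # Worked arithmetic for the report above, counting DB_THREAD_INFO
        # slots rather than bytes.
        thr_max = 8
        thr_nbucket = 37

        reserved = thr_max + thr_max // 4               # 10 slots (line 346)
        worst_case = 2 * thr_nbucket                    # 74 slots: two per bucket
        print(reserved, worst_case)                     # -> 10 74

        # The proposed fix scales the reservation by the bucket count:
        fixed = thr_nbucket * (thr_max + thr_max // 4)  # 370 slots
        print(fixed >= worst_case)                      # -> True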
    I would like to know your opinion on this issue, and if the proposed fix would be acceptable.
    Thanks in advance for answers.

    The attached patch for db-4.7 makes two changes:
      it allows enough room for each bucket to have the configured number of threads, and
      it initializes env->thr_nbuckets, which previously had not been initialized.
    Please let us know how it works for you.
    Regards,
    Charles

  • InDesign CS5 'out of memory' error when using preflight

    I have been regularly getting an 'out of memory' error when I choose to use my bespoke preflight profile.
    I have 4 GB of RAM and run InDesign CS5 on OS 10.6.8.
    Does anyone know a workaround?
    As soon as I select anything other than the basic default profile, I get the beach ball from hell for 10 minutes, then it kindly lets me know that I am out of memory, sends a crash report to Adobe and then asks if I want to relaunch. I'm stuck in a vicious circle. I must have sent my 4th crash report by now and had no feedback from anyone at Adobe.

    I have replaced my preferences, but still the problem persists. I tried switching my view from typical display to fast display before I selected a profile; I thought this might give me the extra memory I needed to avoid the inevitable crash. I learnt that 2 files were indeed RGB instead of CMYK before it crashed again. So I switched them to CMYK and tried again, selected my bespoke profile, but yet again it crashed. I think the problem lies with the file, not InDesign, as I have tried the same profile on a different file and the program doesn't crash and runs as it should. So if in future I need to use said crashing file again, I will first need to try Peter's isolate fix method. Otherwise I'll never be able to progress to a successful PDF.

  • How to set "Files of type" when using a "File Browse" item.

    Apex 4.0.2
    Internet Explorer 7 +
    I have a "File Browse" item on a page and need to limit the types of files displayed to just text (.txt) files. How can this be done? Currently, the "Files of type" list shows "All Files (*.*)", "Pictures (*.gif,*.png)" and "HTML (*.htm,*.html)". Ideally, I would like to not have the "Files of type" list at all and have the user limited to text files. However, adding "Text files (*.txt)" to the "Files of type" list would be OK.
    thanks,
    William

    Thought I'd do a bit of research after seeing Scott's wonderful ideas.
    So it turns out IE made the file item read-only from version 8, for security reasons. Read more: http://blogs.msdn.com/b/ie/archive/2008/07/02/ie8-security-part-v-comprehensive-protection.aspx
    File Upload Control
    Historically, the HTML File Upload Control (<input type=file>) has been the source of a significant number of information disclosure vulnerabilities. To resolve these issues, two changes were made to the behavior of the control.
    To block attacks that rely on “stealing” keystrokes to surreptitiously trick the user into typing a local file path into the control, the File Path edit box is now read-only. The user must explicitly select a file for upload using the File Browse dialog.
    Additionally, the "Include local directory path when uploading files" URLAction has been set to "Disable" for the Internet Zone. This change prevents leakage of potentially sensitive local file-system information to the Internet. For instance, rather than submitting the full path C:\users\ericlaw\documents\secret\image.png, Internet Explorer 8 will now submit only the filename image.png.
    As for resetting the actual item, the suggestions I found were to replace the element. So instead of using $s, I just replace the element with a copy of itself, causing it to re-initialise:
    var htmldb_delete_message='"DELETE_CONFIRM_MSG"';
    function fileCheck(el){
        if(el.value){
            var validFile = false;
            var validExtensions = ["csv"];
            var filename = el.value;
            var fileExtIndex = filename.lastIndexOf(".");
            var fileExt = filename.substring(fileExtIndex + 1);
            for(var i = 0; i < validExtensions.length; i++){
                if(validExtensions[i] == fileExt){
                    validFile = true;
                    break;
                }
            }
            if(!validFile || fileExtIndex == -1){
                alert("Invalid Extension. Permitted files must end with: " + validExtensions.toString());
                // IE8+ makes the file input read-only, so clear it by
                // replacing the element with a fresh copy of itself.
                var htmlContents = el.outerHTML || new XMLSerializer().serializeToString(el);
                $('#P16_BINARY').replaceWith(htmlContents);
            }
        }
    }
    (obviously, replace what you need to, to suit your page - I prefer Scott's idea of passing the supported file types into the function, so I would just pass in an array instead; but this is just for demonstration)
    with an onchange="fileCheck(this)" in the element attributes.
    On a slightly unrelated note, I found out IE doesn't support the wonderful indexOf function on arrays, which checks for the existence of a value in an array. Sucks.
    Edited by: trent
    Ah well, jQuery is there, maybe i should use that for searching arrays in the future.
    http://api.jquery.com/jQuery.inArray/
    Edited by: trent
    Forgot a demo link, for csv files: http://apex.oracle.com/pls/apex/f?p=45448:16
    Edited by: trent
    Modify function. Didn't work in Firefox

  • My Application gets Error 37 when using Serial Port VIs.

    My application, developed with LV 6.1, works fine. Then I updated to LV 7.1, and now the executable of my application gets error 37 when using the old serial port functions. The "serpdrv" file is located in the application directory and is from 02/2002. The application runs well in the development environment after I placed the "serpdrv" file in the LabVIEW directory. I tried to put a line with "serialDevices=..." in the .ini file of my application, but this did not bring an improvement. I also tried to rebuild the EXE with this line in the "labview.ini" file, but nothing changed.
    Does anybody have another idea ?
    Many thanks for help
    Thomas

    >Did you actually replace the old serial functions that come with 7.1 with the functions from 6.1?
    Yes. E.g. "open serial driver.vi" uses "open device" with "serpdrv" as the device string.
    I had the same problem running my application as VIs. After copying serpdrv into the LV 7.1 directory, everything worked fine.
    I don't understand why the application running as an .exe doesn't find the serpdrv when it is in the same directory...
    Nevertheless many thanks for your answer
    Thomas

  • Are there any memory restrictions when using Invoke-Command?

    Hi, I'm using the Invoke-PSCommandSample runbook to run a batch file inside a VM.
    The batch file inside the VM runs a Java program.
    The batch file works fine, when I run it manually in the VM.
    However, when I use the Invoke-PSCommandSample runbook to run the batch file, I get the following error:
    Error occurred during initialization of VM
    Could not reserve enough space for object heap
    errorlevel=1
    Press any key to continue . . .
    Does anybody know if there are any memory restrictions when invoking commands inside a VM via runbook?
    Thanks in advance.

    Hi Joe, I'll give some more background information. I'm doing load testing with JMeter in Azure and I want to automate the task. This is my runbook:
    workflow Invoke-JMeter {
        $Cmd = "& 'C:\Program Files (x86)\apache-jmeter-2.11\bin\jmeter-runbook.bat'"
        $Cred = Get-AutomationPSCredential -Name "[email protected]"
        Invoke-PSCommandSample -AzureSubscriptionName "mysubscription" -ServiceName "myservice" -VMName "mymachine" -VMCredentialName "myuser" -PSCommand $Cmd -AzureOrgIdCredential $Cred
    }
    This is my batch file inside the VM:
    set JAVA_HOME=C:\Program Files\Java\jdk1.7.0_71
    set PATH=%JAVA_HOME%\bin;%PATH%
    %COMSPEC% /c "%~dp0jmeter.bat" -n -t build-web-test-plan.jmx -l build-web-test-plan.jtl -j build-web-test-plan.log
    Initially I tried to run JMeter with "-Xms2048m -Xmx2048m". As that didn't work, I lowered the memory allocation, but even with "-Xms128m -Xmx128m" it does not work. I have tried with local PowerShell ISE as you suggested, but I'm running into certificate issues, which I'm currently having a look at. Here's my local script:
    Add-AzureAccount
    Select-AzureSubscription -SubscriptionName "mysubscription"
    $Uri = Get-AzureWinRMUri -ServiceName "myservice" -Name "mymachine"
    $Credential = Get-Credential
    $Cmd = "& 'C:\Program Files (x86)\apache-jmeter-2.11\bin\jmeter-runbook.bat'"
    Invoke-Command -ConnectionUri $Uri -Credential $Credential -ScriptBlock {
        Invoke-Expression $Args[0]
    } -Args $Cmd
    With this, I get the following error (translated from German):
    [myservice.cloudapp.net] Connecting to the remote server "myservice.cloudapp.net" failed with the following error: the server certificate on the destination computer (myservice.cloudapp.net:666) has the following errors:
    The SSL certificate is signed by an unknown certificate authority. For more information, see the "about_Remote_Troubleshooting" Help topic.
        + CategoryInfo          : OpenError: (myservice.cloudapp.net:String) [], PSRemotingTransportException
        + FullyQualifiedErrorId : 12175,PSSessionStateBroken

  • "Out of memory" message when rendering - MP4 files imported

    I imported MP4 files into FCP 7 and I get an "out of memory" message when I try to render a sequence. I don't think it's the size of the files, because I often work with much larger files without any problem. I even saved the MP4s as MOV files with QT and imported those, and still it gives me the "out of memory" message. VERY ANNOYING. There seems to be something in those original MP4 files that FCP doesn't like. Any idea how to get around this?

    Not to be rude or insult your intelligence, but what format is the timeline in? Not the video files (it's great that you put them in PR422, but if your timeline's native format is something other than PR422, Final Cut is going to try to render files in that format...).
    And again, please define "huge"... How big a hard drive are you using, and how much RAM do you have? I checked your profile for any further information but found none. These are key things to help figure out your problem.
