Out of memory issues with PSE 8

I am using PSE 8 on a Dell desktop computer with an Intel(R) Core(TM) i7 920 CPU at 2.67 GHz and 8 GB of RAM. My operating system is Windows 7, 64-bit.
My problem is that I get out of memory or insufficient RAM messages from some tools in the PSE Editor when my memory utilization reaches only 37 to 38%. In other words, even though my computer is telling me I have almost 4 GB of memory free, PSE says it does not have enough memory to complete the operation. It looks to me as if PSE is only using 4 GB of my 8 GB of RAM. Is this true, and what do I need to do to allow PSE to use all of my available RAM?

Thanks, that answers what the problem is, but it doesn't necessarily give me a solution. I like working with 8 to 10 pictures (files) in the editor tray at a time. I make whatever changes each one needs and then group 4 or 5 of them into an 8.5 x 11 collage. Each picture in the collage is a separate layer, and each individual picture may have multiple layers of its own. I print the collage on 8.5 x 11 photo paper and then put the page in a photo album. I like the pictures in different sizes, orientations and sometimes shapes, so the album and multiple-picture options offered in PSE are not much help. My process eats a lot of memory, which I mistakenly thought my 8 GB of RAM would solve.
Anyway, now that I know the limitations, I can adjust the process to avoid the memory issue, and hopefully a future version of Elements will support 64-bit.
I am wondering whether I need to look at other programs, or whether I am missing a PSE function that would make this chore easier.

Similar Messages

  • CC 2014 Progs leave me plagued with out of memory issues which the previous versions don't exhibit.

    The new CC 2014 suite of programs seems rather memory hungry. I am plagued with out of memory issues trying to use them, while the old CC programs work just fine. Is there a new minimum memory spec now? For now I am forced to use the old versions, as the new ones are just unusable... some 'upgrade'!
    Phil

    Me too! It seems whenever I run more than one CC app I get out of memory errors. I have Windows 7 with 32 GB of RAM. I only have this problem with CC 2014, not CS6.

  • Issue with PSE 9 Editor

    Running PSE 9 on a new iMac with 8 GB of memory and a 1 TB hard drive. When trying to save an edited file, the Editor consistently becomes unresponsive when I try to close the saved file, requiring that I force the Editor to quit. Not only will the Editor not close the saved file, but the Editor itself cannot be closed normally.

    I appreciate the help; I found out how to change the paper type and it seems to work great. The remaining question: when I've finished editing an image and attempt to "Save" or "Save As", the system shows a box saying "Adobe Photoshop Elements 9 quit unexpectedly". If I go to "Reopen", it reopens PSE at the Organizer level and hasn't saved the image. Also at this level, "Save" is grayed out and unavailable. I sure appreciate someone who seems to know a lot more than the "Techs" at Adobe. Don
    Date: Thu, 7 Apr 2011 20:40:38 -0600
    From: [email protected]
    To: [email protected]
    Subject: Issue with PSE 9 Editor
    The second thing, opening the print dialog box does not open the same one as Windows, and allows no selection of paper type.
    That is just a difference between Macs and Windows. On a Mac, you choose the paper and print quality in the OS X system print window that opens after you click the Print button in the PSE window:
    http://forums.adobe.com/servlet/JiveServlet/showImage/64219/ScreenSnapz.jpg
    If you only have the little stub window, click the triangle to the right of the printer's name to expand the dialog box to see all the menus.

  • Out of memory Issues

    Hi,
    The WebLogic version is 10.3 and the database is Oracle.
    Our environment has 4 machines: one machine hosts the Admin server plus 4 managed servers, and each of the remaining 3 machines hosts 4 managed servers.
    Each managed server has 2 GB of memory.
    The connection pools are set up with an initial capacity of 0 and a maximum capacity of 15.
    Our applications are developed on Pega. Currently we are getting out of memory issues, and the F5 node sends alerts like:
    SEVERITY: Error
    Alert(432526): Trap received from ttnny-cse-f5node1: bigipServiceDown -- Bindings: sysUpTimeInstance = 1589988172, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status down., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 15:01 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432524): Trap received from ttnny-cse-f5node2: bigipServiceDown -- Bindings: sysUpTimeInstance = 1589982333, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status down., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 14:59 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432527): Trap received from ttnny-cse-f5node1: bigipServiceUp -- Bindings: sysUpTimeInstance = 1589988572, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status up., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 15:01 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432525): Trap received from ttnny-cse-f5node2: bigipServiceUp -- Bindings: sysUpTimeInstance = 1589982733, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status up., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 14:59 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    When we checked at that time, the server was up and running with some Pega exceptions; the JVM shows 10% and after some time it goes to 30%.
    The alert below confirms the JVM is down, so at this point we restart the server.
    SEVERITY: Alert
    Alert(432565): Threshold triggered -- ttappapp01's 8003's Port Availability: 0.00 Percent < 100 Percent averaged over 1.00 minutes (Fri. 02/12/2010 17:15 America/New_York - Fri. 02/12/2010 17:15 America/New_York)
    SEVERITY: Alert
    Alert(432564): Threshold triggered -- ttappapp01's 8003's Port Availability: 0.00 Percent != 100 Percent averaged over 1.00 minutes (Fri. 02/12/2010 17:15 America/New_York - Fri. 02/12/2010 17:15 America/New_York)
    We took a thread dump and a heap dump at that time. Can anyone please give some suggestions as to why the server is going out of memory?
    1. Is there any issue with the connection pools?
    2. Please give suggestions on the design.
    Thanks,
    Raj.

    Hi Raj,
    Did you check the System.out and WebLogic managed server logs?
    You should also check the GC logs to see whether there is a memory problem or not.
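    If GC logging is not already enabled, one way to turn it on is to add a few HotSpot options to each managed server's start script. This is only a sketch assuming a Sun/HotSpot JVM (flag names differ on JRockit and other JVMs), and the paths are placeholders:

        -verbose:gc -Xloggc:/path/to/gc.log -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
        -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/path/to/dumps

    With these set, the GC log shows whether the heap fills up steadily before the failure, and a heap dump is written automatically when an OutOfMemoryError is thrown, which you can then analyze alongside the thread dump.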

  • Lightroom 3.2 out of memory issues

    I had been using the beta version of Lightroom 3 without issues. Once I installed the shipping version, I get out of memory messages all the time. I first noticed this when I went to export some images. I can get this message when I export just one image, or part way through a set of images (this weekend it made it through 4 of 30 images before it died). If I restart Lightroom, it's hit or miss whether I can proceed or not. I've even tried restarting the box with only Lightroom running and still get the out of memory issue.
    I've also had problems printing. I go to print an image and it looks like it will print, but nothing does. This does not generate an error message; it just doesn't do anything. So far restarting Lightroom seems to fix this problem.
    When I'm in the Develop module and click on an image to see it 1:1, at times the image is out of focus. If I click on another image and then go back to the original, it might be in focus.
    I have no idea if any of this is related, but I thought I'd throw it out there. I've been using Lightroom since version 1.0 and have had very good luck with the program. It is getting very frustrating trying to get anything done. I searched through the forum, but the memory issues I found were with older versions. I'd be very grateful if anyone could point me in the right direction.
    Ken
    System:
    i7 860
    4 GB memory
    XP SP3

    Hi,
    You can get the HeapDump Analyzer for analyzing IBM AIX heap dumps from the links below:
    http://www.alphaworks.ibm.com/tech/heapanalyzer
    http://www-1.ibm.com/support/docview.wss?uid=swg21190608
    Prerequisites for obtaining a heap dump:
    1. Add -XX:+HeapDumpOnOutOfMemoryError to the Java options of the server (see notes 710146 and 1053604) to get a heap dump automatically when an OutOfMemoryError occurs.
    2. You can also generate heap dumps on request: add -XX:+HeapDumpOnCtrlBreak to the Java options of the server (see note 710146), then send the SIGQUIT signal to the jlaunch process representing the server, e.g. using kill -3 <jlaunch pid> (see note 710154).
    The heap dump will be written to the output file java_pid<pid>.hprof.<millitime> in the /usr/sap/<SID>/<instance>/j2ee/cluster/server<N> directory.
    Both parameters can also be set together to get the benefit of both approaches.
    Regards,
    Sandeep.

  • Has anyone experienced iMac memory issues with OneDrive (previously SkyDrive)? It uses 5 GB of my 8 GB of memory!

    I recently started experiencing memory issues with my iMac due to OneDrive using up to 5 GB of my iMac's memory (8 GB total). Every time I start up my iMac, Activity Monitor shows OneDrive using around 4-5 GB of memory for no apparent reason. OneDrive previously seemed to work fine; I've only noticed this for about 2 weeks. It's really annoying, as the iMac freezes up regularly, and I need OneDrive to synchronise my files across different devices.
    Has anyone experienced this too and are there any known fixes?

    Yes, this is happening to me too, since the last upgrade. Eventually MacOS starts insisting I "Force Quit" all my applications as I have run out of application memory.
    OneDrive doesn't appear in that list because it's not an application... but Activity Monitor clearly shows it's the culprit, taking up 5 GB+ of RAM.
    I've raised the issue on Microsoft's forums:
    http://answers.microsoft.com/en-us/onedrive/forum/sdperformance-sdother/onedrive-for-macos-memory-leak/fe04cf60-949f-47b6-b886-10539b5c5cc3?tm=1397636232934

  • Memory issue with Podcast !!!

    Hi,
    I have several issues with the Podcast App.
    - I cannot sync podcasts that I have downloaded on my laptop using iTunes to the iPhone;
    - when I download some podcasts directly on the iPhone, I cannot play them; I still have to stream them;
    - I have huge memory issues, and I think the downloaded versions of the podcasts might be somewhere on the iPhone, but I cannot access or delete them (out of 16 GB, I have more than 4.5 GB used for 'Other' that is not related to anything...).
    I have tried deleting and reinstalling the app, but this does not change anything...
    Any help?
    Rgds

    Quote
    I have an MSI 975X Platinum rev 2 motherboard with an E6300 processor overclocked to 2.24 GHz.
    If your CPU operates @2.24 GHz, it means that you increased FSB frequency to about 348 MHz.
    To make sure there is no misunderstanding here, let me point out the following:
    Memory frequency is ALWAYS linked to FSB frequency by the FSB/DRAM ratio you can select in the BIOS. If you select 1:1, for example, the BIOS will show that the memory frequency is 533 MHz. This value, however, only applies to the default FSB clock speed of 266 MHz:
    266 MHz x 1 = 266 MHz (or 533 MHz effective DDR2-frequency).
    If your system is overclocked, what counts is only the FSB/DRAM ratio, not the memory speed displayed in the BIOS. That means, if you set the FSB/DRAM ratio to 1:1 and the FSB frequency to 348 MHz, your RAM will operate @:
    348 MHz x 1 = 348 MHz (or 696 MHz effective DDR2-frequency).
    The main question is:
    What are you talking about when you say, you can't make your RAM operate @800 MHz?
    The BIOS does not offer a proper divider to get your RAM to 800 MHz @348 FSB clock speed to begin with.
    You have the following choices if you overclock your system:
    FSB = 300 MHz + FSB/DRAM ratio = 1:1.33
    300 MHz x 1.33 ~ 400 MHz (or 800 MHz effective DDR2-frequency)
    FSB=320 MHz + FSB/DRAM ratio = 1:1.25
    320 MHz x 1.25 = 400 MHz (or 800 MHz effective DDR2-frequency)
    FSB=400 MHz + FSB/DRAM ratio = 1:1
    400 MHz x 1 = 400 MHz (or 800 MHz effective DDR2-frequency)
    Use CPU-Z to monitor the DRAM frequency that is actually set if you are overclocking.
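    Just to make the arithmetic above concrete, here is a small, purely illustrative Java sketch (not from the original reply) that computes the DRAM clock and effective DDR2 frequency for the three FSB/ratio combinations listed:

        // Illustrative only: effective DDR2 frequency = FSB clock x FSB/DRAM ratio x 2 (double data rate)
        public class DramFrequency {
            public static void main(String[] args) {
                double[][] settings = { {300, 1.33}, {320, 1.25}, {400, 1.0} };
                for (double[] s : settings) {
                    double dram = s[0] * s[1];
                    System.out.printf("FSB %.0f MHz, ratio 1:%.2f -> DRAM ~%.0f MHz (~DDR2-%.0f)%n",
                            s[0], s[1], dram, dram * 2);
                }
            }
        }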

  • Result Set Causing out of memory issue

    Hi,
    I am having trouble fixing a memory issue caused by a ResultSet. I am using JDK 1.5 and SQL Server 2000 as the backend. When I execute a statement, the result set returns a minimum of 400,000 records; I have to go through each record one by one, apply some business logic and update the rows, and after updating around 1000 rows my application goes out of memory. Here is the original code:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    I am planning to fix the code in this way:
    Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
            ResultSet.CONCUR_UPDATABLE);
    stmt.setFetchSize(50);
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    But one of my colleagues told me that the setFetchSize() method does not work with the SQL Server 2000 driver.
    So please suggest how to fix this issue. I am sure there is a way to do this; I am just not aware of it.
    Thanks for your help in advance.
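    One commonly suggested alternative, regardless of whether setFetchSize() is honored by the driver, is to page through the table in bounded chunks keyed on the primary key, so the driver never has to buffer 400,000 rows at once. The sketch below is only an illustration: the table and column names mirror the placeholder query in the post, and "id" is an assumed numeric key column that is not in the original code:

        import java.sql.*;

        // Sketch only: process rows in chunks of 1000 using keyset paging (SQL Server's TOP).
        static void processInChunks(Connection con) throws SQLException {
            PreparedStatement ps = con.prepareStatement(
                    "select top 1000 * from tablename where field = 'done' and id > ? order by id");
            long lastId = 0;
            int fetched;
            do {
                ps.setLong(1, lastId);
                ResultSet rs = ps.executeQuery();
                fetched = 0;
                while (rs.next()) {
                    lastId = rs.getLong("id");    // remember where this chunk ended
                    // apply the business logic / update for this row here
                    fetched++;
                }
                rs.close();
            } while (fetched > 0);                // stop once a chunk comes back empty
            ps.close();
        }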

    Here is the full code. The TeamConnect and TopLink APIs are being used. The code has already been developed and works for 2-3 hours before it fails; I just have to fix the memory issue. Please suggest something:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // vo is the value object obtained from the rs row by row
        if (updateInfo(vo, user)) {
            logger.info("updated : " + rs.getString("number_string"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();
    private boolean updateInfo(CostCenter vo, YNUser tcUser) {
              boolean updated;
              UnitOfWork unitOfWork;
              updated = false;
              unitOfWork = null;
              List projList_m = null;
              try {
                   logger.info("Before vo.getId() HERE i AM" + vo.getId());
                   unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
                   ExpressionBuilder expressionBuilder = new ExpressionBuilder();
                   Expression ex1 = expressionBuilder.get("application")
                             .get("projObjectDefinition").get("uniqueCode").equal(
                                       "TABLE-NAME");
                   Expression ex2 = expressionBuilder.get("primaryKey")
                             .equal(vo.getPrimaryKey());// primaryKey;
                   Expression finalExpression = ex1.and(ex2);
                   ReadAllQuery projectQuery = new ReadAllQuery(FQUtility
                             .classForEntityName("EntryTable"), finalExpression);
                   List projList = (List) unitOfWork.executeQuery(projectQuery);
                   logger.info("list value1" + projList.size());
                   TNProject project_hist = (TNProject) projList.get(0); // primary key
                   // value
                   logger.info("vo.getId1()" + vo.getId());
                   BNDetail detail = project_hist.getDetailForKey("TABLE-NAME");
                   project_hist.setNumberString(project_hist.getNumberString());
                   project_hist.setName(project_hist.getName());
                   String strNumberString = project_hist.getNumberString();
                   TNHistory history = FNHistFactory.createHistory(project_hist,
                             "Proj Update");
                   history.addDetail("HIST_TABLE-NAME");
                   history.setDefaultCategory("HIST_TABLE-NAME");
                   BNDetail histDetail = history.getDetailForKey("HIST_TABLE-NAME");
                   String strName = project_hist.getName();
                   unitOfWork.registerNewObject(histDetail);
                   setDetailCCGSHistFields(strNumberString, strName, detail,
                             histDetail);
                   logger.info("No Issue");
                   TNProject project = (TNProject) projList.get(0);
                   project.setName(vo.getName());
                   logger.info("vo.getName()" + vo.getName());
                   project.setNumberString(vo.getId());
                   BNDetail detailObj = project.getDetailForKey("TABLE-NAME"); // required
                   setDetailFields(vo, detailObj);//this method gets the value from vo and sets in the detail_up object
                   FNClientSessionManager.commit(unitOfWork);
                   updated = true;
                   unitOfWork.release();
          } catch (Exception e) {
               logger.warn("update: caused exception, " + e.getMessage());
               unitOfWork.release();
          }
          return updated;
     }
    Now I have tried to change the code a little bit. I added the following lines:
                        updated = true;
                     FNClientSessionManager.release(unitOfWork);
                     project_hist=null;
                     detail=null;
                     history=null;
                     project=null;
                     detailObj=null;
                        unitOfWork.release();
                        unitOfWork=null;
                     expressionBuilder=null;
                     ex1=null;
                     ex2=null;
                     finalExpression=null;
    I also added code to request the garbage collector after every 5th update:
    if (updateInfo(vo, user)) {
        logger.info("project update : " + rs.getString("number_string"));
        projCount++;
        // call the garbage collector every 5th record update
        if (projCount % 5 == 0) {
            System.gc();
            logger.debug("Called Garbage Collector on " + projCount + "th update");
        }
    }
    But now the code won't even update a single record. So please look into the code and suggest something so that I can stop banging my head against the wall.
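    For what it's worth, one structural issue visible in updateInfo() above is that the UnitOfWork is released on two separate paths (after the commit and again in the catch block) and not at all if the commit itself throws. Below is a sketch of the same method skeleton with the release moved into a finally block; it reuses the names from the post (FNClientSessionManager, UnitOfWork, logger) and is only an outline under those assumptions, not a tested fix:

        private boolean updateInfo(CostCenter vo, YNUser tcUser) {
            boolean updated = false;
            UnitOfWork unitOfWork = null;
            try {
                unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
                // ... build the expressions, run the ReadAllQuery and apply the
                //     project/history changes exactly as in the original method ...
                FNClientSessionManager.commit(unitOfWork);
                updated = true;
            } catch (Exception e) {
                logger.warn("update: caused exception, " + e.getMessage());
            } finally {
                if (unitOfWork != null) {
                    unitOfWork.release();   // always hand the unit of work back, even on failure
                }
            }
            return updated;
        }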

  • 'Out Of Memory' issue and overly large Project File...

    Hi everyone, my issue is a common one, but I have yet to find the reason why after trying many suggestions.
    I am working in Final Cut Pro 6 on an edit 35 minutes long. It's a holiday video edit of a snowboarding trip filmed on a GoPro HD and a small handy Sony Handycam. All rushes are transcoded to ProRes 422. I have cut the edit and am doing the final touches on the colour using Synthetic Aperture's Color Finesse 3. My project file starts off at a mere 17 MB before colour grading/correction, which from what I can tell seems pretty normal. By the time I get to around half way, my project file has grown to over 70 MB, FCP becomes slow and unstable, then I start to get crashes, and reloads of the project take around 15 minutes or more. This seems quite a jump to me once grading starts.
    Originally I had around 6 sequences of footage (one for each day of shooting on holiday), where each contained all the usable clips I could possibly use for the edit, plus a master sequence where I would paste my chosen clips for trimming and editing to music (all music in AIFF).
    I have deleted all folders in the browser that aren't in use now, and the only sequence I have now is the master, nothing else at all! No motion graphics but some text at the start of the edit, nothing animated. No stills, but I do have some freeze frames, though I'd say no more than 20 over the whole 35 minutes. I have deleted unused render files and trashed preferences.
    I did see somewhere that someone suggested splitting the edit into two different projects? There wasn't a clear explanation of this.
    I'm running....
    iMac 24 inch - early 2009
    2.66 GHz Intel Core 2 Duo
    8GB 1067 DDR3
    NVIDIA GeForce 9400 256 MB
    OS X 10.8.2
    Hope this is all you need to help me get this edit completed and exported off. Any more info you need just ask!
    Thanks in advance.
    GK

    Hi Nick
    I have already successfully rendered 20 minutes of the project with no issues. As the project file has grown, it has become troublesome: beach-balling when pasting the Color Finesse attributes to the next clip, for instance.
    When I attempt to render now, it instantly fails with an "Out Of Memory" error and eventually crashes FCP.
    Thanks.

  • Hash Table Infrastructure ran out of memory Issue

    I am getting an ORA-32690: Hash Table Infrastructure ran out of memory error while executing an Informatica mapping against an Oracle database (test environment).
    The partition creation is as shown below (excerpt from the table DDL):
    TABLESPACE MAIN_LARGE_DATA1
    PARTITION BY LIST (MKTCD)
    (
    PARTITION AAM VALUES ('AAM')
    TABLESPACE MAIN_LARGE_DATA1,
    PARTITION AHT VALUES ('AHT')
    TABLESPACE MAIN_LARGE_DATA1,
    PARTITION GIM VALUES ('GIM')
    TABLESPACE MAIN_LARGE_DATA1,
    PARTITION CNS VALUES ('CNS')
    TABLESPACE MAIN_LARGE_DATA1,
    PARTITION AOBE VALUES ('AOBE')
    TABLESPACE MAIN_LARGE_DATA1,
    PARTITION DBM VALUES ('DBM')
    TABLESPACE MAIN_LARGE_DATA1
    )
    Could you please provide me with a solution to this problem asap?

    SQL statement and execution plan? Is there a server-side trace file created for the session?
    From the brief description, it sounds like bug 6471770. See Metalink for details. The workaround for this particular bug is either to disable hash group-by by setting "_gby_hash_aggregation_enabled" to FALSE (using an ALTER SESSION statement), or to use a NO_USE_HASH_AGGREGATION hint.
    Suggest you research this problem on Metalink (aka MyOracleSupport at https://support.oracle.com)
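    If it helps to see that workaround spelled out, here is a minimal JDBC-style sketch of applying the session parameter before running the offending statement. It is illustrative only: the connection and query are placeholders, and in an Informatica mapping the equivalent change would normally go into the session's pre-SQL or the generated SQL rather than Java code:

        import java.sql.*;

        // Sketch: disable hash group-by for this session, then run the problematic query.
        static void runWithoutHashAggregation(Connection con, String problemQuery) throws SQLException {
            Statement stmt = con.createStatement();
            stmt.execute("ALTER SESSION SET \"_gby_hash_aggregation_enabled\" = FALSE");
            ResultSet rs = stmt.executeQuery(problemQuery);
            while (rs.next()) {
                // consume the result set
            }
            rs.close();
            stmt.close();
        }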

  • IDCS4 V6.0 memory issue with preflight

    When I create a custom preflight profile and run it, I encounter a memory issue. The hard disk starts running indefinitely and after a while InDesign crashes. I have tried, step by step, to make the profile less demanding (I should try the reverse approach), but instead of crashing, InDesign eventually reports a memory error and may crash later. Apple's Activity Monitor shows that sometimes InDesign requires all the available memory (1.5 GB).
    Have any of you ever tried to set up a custom preflight profile? The basic one is really too tolerant.
    Thank you.
    I'm on an iMac 3.06 GHz with 2 GB.

    I've spent quite a bit of time working with custom preflight profiles on my MacBook Pro, 2 GB memory, Mac OS X 10.5.5. I have never run into a memory error. In fact, I've never run into a memory error in InDesign at all. Memory errors can occur because of defective fonts. Have you tried with different fonts? Have you tried on a different computer?
    Yes, the [Basic] preflight checks only for missing fonts, missing or modified graphics, and overset text. You need to work with custom preflight profiles depending on your particular workflow.

  • TV-Out and analogue issues with NX6800GT TD256

    Hi,
    I bought this card about two months ago and am having issues with it...
    Firstly, on Sunday the analogue output for the monitor stopped working correctly. All of the pictures coming through had a kind of green overlay (so text, even at the BIOS boot screen, that would ordinarily be white looked kind of green). I tested with another monitor and it was the same. I tried putting it on DVI, and it worked properly.
    Now I want to use TV-Out, but when I connect the TV-Out to the card while I have DVI plugged in, I can't get the TV to display at the same time as a monitor plugged into the DVI connector (it works when the monitor is plugged into the HD-15, though again it looks green).
    Anyone have any suggestions for what I should do? Can the card display both tv-out and DVI at the same time?
    Andrew

    Anyone have any suggestions about this?

  • Out of Memory issue in crystal report 2008 SP1

    Hi ALL,
    I am facing a serious issue in Crystal Reports 2008 SP1.
    When I click Page Setup in Crystal Reports 2008, it prompts "Out of Memory".
    Because of this I am not able to see the default printer in the Page Setup.
    Please give suggestions to resolve this issue.
    Thanks and Regards,
    Vinay

    Hi Ed,
    What printer are you using as your default printer?
    What happens if you change your default printer to Microsoft's generic print driver? This is only a test, to rule out the printer as the cause.
    Also, go into Page Setup and check on Dissociate Printer..... and see if that fixes the issue.
    Also, include your OS version and patch level, the status of DEP, with anti-virus turned off (disconnect from your network while doing this test) and Windows Firewall or any third-party firewall disabled, and close all other running software.
    Thanks again
    Don

  • Out of memory Error with jdk 1.6

    Hello,
    I have a Swing application launched on the client with the help of Java Web Start. The application works fine on JRE 1.4 and JRE 1.5. The heap sizes are:
    initial-heap-size="5m" max-heap-size="24m"
    But when I run this using JRE 1.6.0_05-b13, I am getting an out of memory error (Java heap space), and I see the memory usage growing rapidly, which I didn't notice with the other JRE versions (1.4 and 1.5).
    Does anyone have any idea on this?
    Thanks in advance,
    MR.

    Thanks for your response, Peter. During my continued testing I found that the error happens on JDK 1.5 as well. I have increased the min-heap-size to 24 MB and the max-heap-size to 64 MB, but I still see the out of memory error. The interesting thing is that the heap never grows beyond the 24 MB minimum, and there is also a lot of free memory:
    Memory: 24,448K Free: 12,714K (52%) ... completed.
    The OutOfMemoryError is triggered from the reader thread, which does the job of reading the data from the InputStream. One thing is that we are continuously pushing more data onto the output stream from the other end. Is there any limitation on how much data an InputStream can hold?
    Please throw some light on this.
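    As a side note, a plain InputStream does not itself buffer an unbounded amount of data; heap usage normally grows because the reader accumulates what it reads (for example into a ByteArrayOutputStream or a growing collection) faster than it is consumed. Purely as an illustration, and assuming a hypothetical handleChunk() callback that is not part of the original code, a reader loop that processes data in fixed-size chunks instead of accumulating it looks roughly like this:

        import java.io.IOException;
        import java.io.InputStream;

        // Sketch only: handleChunk() stands in for the application's own message handling.
        static void drain(InputStream in) throws IOException {
            byte[] buffer = new byte[8192];           // fixed-size buffer, so memory use stays flat
            int read;
            while ((read = in.read(buffer)) != -1) {
                handleChunk(buffer, read);            // process and discard each chunk immediately
            }
        }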

  • Out of memory problem with threads

    I have a class B that implements Runnable. If in class A I do:
        while (true) {
            B b = new B();
            b.run();
        }
    it can create 10,000 instances of B with no problem. Since b is scoped to the loop body, the old instance is cleaned up by the garbage collector on every iteration.
    But if I do:
        while (true) {
            B b = new B();
            Thread t = new Thread(b);
            t.start();
        }
    after around 400 loops I get an out of memory error.
    If I add a finalize() method to B, I see it is never called before the out of memory error happens. This means the Thread instances are not being cleaned up by the garbage collector.
    Does anyone know why this is happening and what I can do to solve it?

    An object is collectable if no path exists to it. One set of paths is from the stack frames of all live threads. As each thread effectively holds a reference to itself, it is not surprising that the Thread instances aren't collected until the threads all end. Your rapid loop is obviously creating threads faster than they can complete (you don't show us what they are doing), so running out of memory is guaranteed.
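    A common way to keep the number of live threads bounded in this situation, not mentioned in the reply above, is to hand the Runnables to a fixed-size pool instead of starting a new Thread on every iteration. A minimal sketch (the pool size of 8 is arbitrary):

        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        // Sketch: at most 8 worker threads exist at any time; extra tasks wait in the queue.
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int i = 0; i < 10000; i++) {
            pool.submit(new B());
        }
        pool.shutdown();   // no new tasks; workers finish the queued ones and then exit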

Maybe you are looking for

  • How To Download a File into your desktop application

    Hello, I am attempting to have an AIR application use a webservice XMLCollection when online, but what I would like to do is also provide an AIR application for download. I think that the AIR application is using the files installed locally, what I w

  • My iPad mini asks for my iCloud password a few times a day.

    When I press the home button, there is a warning 'Verify iCloud Password' on the lock screen; after I slide to unlock the iPad, I'm asked to enter my Apple ID password. Mostly I press cancel, but I did enter my password a few times and it still keep a

  • Sending same text twice

    Why is the same text being sent twice to the same person? How do I fix this? Curve 9360

  • Premiere CC crash while playing and zooming timeline

    Premiere CC now crashes regularly while playing back the timeline: I zoom in and out on the timeline using my +/- hotkeys and Premiere CC will crash. I hate to say it, but Premiere is starting to become as unstable as my prior copy of Sony Vegas Pro

  • Safari cannot open the page because it could not load any data

    I had an odd thing happen with Safari today: I get a "Safari cannot open the page because it could not load any data" message when going to a saved site with a bookmark, but I can get the page to load by going to Google. It doesn't happen for the same site