ACR 8.1 "Out of Memory" issue when processing over 70 - 24mb files

Good Morning Folks,
I've recently upgraded my computer to Win 7 and then to PS CS6 Extended. Now when I process RAW files in ACR 8.1, I get an "Out of Memory" error if I do more than 70 to 80 24 MB files. It's also running really slowly: it takes nearly 50 seconds to process one 24 MB file, where PS CS3 with ACR 4.x took only 17 seconds for the same size file.
Computer specs: 3.40 GHz CPU, 4 GB RAM (the max for the motherboard), 120 GB "C" drive, 40 GB scratch drive, Win 7.
What I have done is make sure the scratch drive is listed first for cache in both PS and ACR, and in ACR I set the cache to nearly 500,000. (I think these numbers are correct; I'm not at the offending computer now.)
So the questions are: do I need to change any specific settings? Do PS CS6 and ACR 8.1 require more RAM?
Any suggestions would be helpful.
Thanks,
Captain 1854

When getting a new computer, get a big enough C drive to hold all the programs and still have at least 50% free for future expansion (you should always keep at least 15% free).
It's best to have a separate internal HD for scratch, so the computer can read the program and write to scratch at the same time.
Get a mid-range video card with 1 GB of VRAM.
Win 7 is better than Win 8.

Similar Messages

  • Out of memory issue when executing wlappc ant task

    I am using WebLogic 8.1 SP1 and compiling a fairly large EAR file with an Ant task. It always throws an out of memory error when wlappc invokes the compiler to compile the JSP files.
    According to BEA's documentation <CR104610>, I put the runtimeflags with the "-J-ms256m -J-mx512m" option into the wlappc tag, and Ant seems to recognize the option, but it didn't work. I tried every possible memory size to get rid of this problem, but the process still failed with the error message:
    The system is out of resources.
    Consult the following stack trace for details.
    java.lang.OutOfMemoryError
    - with nested exception:
    [Compilation errors : ]
    My computer has 1 GB of memory, so it shouldn't be a hardware problem.
    Does anyone have an idea on this?
    Thanks in advance,
    Jacky

    Hi,
    Ant seems to use its own JVM, so try specifying the Ant javac options for the memory parameters at runtime.
    The Ant options are memoryInitialSize and memoryMaximumSize.
    http://ant.apache.org/manual/index.html
    Hope this helps.
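    For reference, here is a minimal, illustrative build.xml fragment showing those two attributes on Ant's standard <javac> task. The source and destination paths are placeholders, and the heap sizes simply mirror the ones tried in the original post; whether wlappc itself honours them is a separate question.

    <!-- Sketch only: fork the compiler into its own JVM and give it explicit heap limits. -->
    <javac srcdir="src" destdir="build/classes"
           fork="true"
           memoryInitialSize="256m"
           memoryMaximumSize="512m"/>

    Note that these attributes only take effect when fork="true". If the out of memory error is coming from Ant's own JVM rather than the forked compiler, the heap can instead be raised via the ANT_OPTS environment variable.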

  • Out of memory error when opening an .xlsx file in Numbers v1.6.2 on iPad 2

    The file sizes are too small (38 KB) for this to be a true "out of memory" issue. Any suggestions? I didn't have any problems before updating Numbers to the current version.
    I haven't downloaded the new iOS 6 operating system. Is that the problem?

    Ended up being a problem with the Excel file we were using. Recreated the file and all was ok.

  • TopLink causes out of memory issue when millions of records need to be updated

    Hello everyone,
    I am using TopLink 9.0.4 in a batch process. The batch process reads from a temp table (the temp table has millions of records, one month's worth of data, which need to be updated). The database being used is SQL Server 2005. Below is a snippet of the code. It works for 6-7 hours and then crashes due to running out of memory:
    ExpressionBuilder expressionBuilder = new ExpressionBuilder();
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // vo is the value object obtained from the rs, row by row
        if (updateInfo(vo, user, expressionBuilder)) {
            logger.info("updated : " + rs.getString("col_name"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();

    private boolean updateInfo(ProjectVO vo, YNUser tcUser, ExpressionBuilder expressionBuilder) {
        boolean updated = false;
        try {
            // ... TopLink update logic ...
            updated = true;
        } catch (Exception e) {
            logger.warn("update: caused exception, " + e.getMessage());
        }
        return updated;
    }
    Edited by: user8981696 on Jan 14, 2010 1:00 PM

    Thanks for your reply.
    Please find below the answers to your suggestions/concerns:
    You seem to be using raw JDBC to select all of the records in a single result set, not sure if this may be causing a memory issue. You could try paging through the results instead.
    Ans: I have modified the code to fetch 1000 records at a time, and I am getting the ResultSet by using a PreparedStatement instead of a regular Statement object (see the paging sketch below).
    What type of caching are you using?
    Ans: No caching is being used. If you have some thoughts on caching, please suggest or post some sample code. Again, there is no app server being used; it's just a regular Java process (a batch process), so I don't know how to do caching in a simple Java process.
    You may also wish to try the latest 9.0.4 patch release, or try the 10.1.3 version, or the latest EclipseLink 2.0 release.
    Ans: Where can I find the latest 9.0.4 patch release?
    Any help/suggestion is really appreciated!
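    For what it's worth, here is a minimal keyed-paging sketch in plain JDBC (not TopLink) of the "1000 records at a time" approach mentioned above. It assumes the temp table has an increasing key column; "id" and "temp_table" are placeholder names, not taken from the original post.

    import java.sql.*;

    public class PagedUpdater {
        private static final int PAGE_SIZE = 1000;

        public static void process(Connection con) throws SQLException {
            long lastId = 0;
            boolean more = true;
            while (more) {
                more = false;
                // SELECT TOP works on SQL Server 2005; only one page is held in memory at a time.
                try (PreparedStatement ps = con.prepareStatement(
                        "SELECT TOP " + PAGE_SIZE + " id, col_name FROM temp_table "
                        + "WHERE field = 'done' AND id > ? ORDER BY id")) {
                    ps.setLong(1, lastId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            more = true;
                            lastId = rs.getLong("id");
                            // ... build the value object for this row and update it here ...
                        }
                    }
                }
            }
        }
    }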

  • Out of memory error when an Applet transfers a file of size 600 MB

    Hi,
    I have an Applet which transfers files from the client machine to the server using streams.
    I have a servlet which reads the data sent from the Applet.
    This works fine when I transfer files of size 10 MB.
    When I chose a file of 600 MB, the following error is thrown on the Applet side after some time:
    java.lang.OutOfMemoryError: Java heap space
         at java.util.Arrays.copyOf(Unknown Source)
         at java.io.ByteArrayOutputStream.write(Unknown Source)
         at sun.net.www.http.PosterOutputStream.write(Unknown Source)
         at java.io.DataOutputStream.write(Unknown Source)
         at Apple.FileUploadApplet.actionPerformed(FileUploadApplet.java:53)
    Here is my Applet code:
    if (ae.getSource() == jbutton) {
        JFileChooser jfc = new JFileChooser();
        jfc.showOpenDialog(null);
        File f = jfc.getSelectedFile();
        try {
            FileInputStream in = new FileInputStream(f);
            byte[] buf = new byte[1024];
            int bytesread = 0;
            String toservlet = "http://9.122.18.115:8080/FileTransfer/FileUpload?"
                    + URLEncoder.encode("name") + "="
                    + URLEncoder.encode(f.getName());
            URL servleturl = new URL(toservlet);
            URLConnection servletconnection = servleturl.openConnection();
            // servletconnection.setRequestMethod("GET");
            servletconnection.setRequestProperty("Content-type",
                    "application/octet-stream");
            servletconnection.setDoInput(true);
            servletconnection.setDoOutput(true);
            servletconnection.setUseCaches(false);
            servletconnection.setDefaultUseCaches(false);
            DataOutputStream out = new DataOutputStream(servletconnection.getOutputStream());
            while ((bytesread = in.read(buf)) > -1) {
                out.write(buf, 0, bytesread);
                // out.flush(); // tried this to flush the data but didn't work
                System.out.println("writing data");
            }
            out.flush();
            out.close();
            in.close();
            DataInputStream inputFromClient = new DataInputStream(servletconnection.getInputStream());
            // get what you want from the servlet
            inputFromClient.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    My Servlet code:
    public void doPost(HttpServletRequest req, HttpServletResponse res) {
        ServletContext sc = this.getServletContext();
        try {
            String path = "C:\\Downloads\\";
            String fileName = req.getParameter("name");
            System.out.println("name=" + fileName);
            File yourFile = new File(path + fileName);
            System.out.println("name=" + yourFile);
            FileOutputStream toFile = new FileOutputStream(yourFile);
            DataInputStream fromClient = new DataInputStream(req.getInputStream());
            byte[] buff = new byte[1024];
            int cnt = 0;
            int k = 0;
            long st, et;
            st = System.currentTimeMillis();
            while ((cnt = fromClient.read(buff)) > -1) {
                toFile.write(buff, 0, cnt);
                // System.out.println("writing data==" + k++);
            }
            et = System.currentTimeMillis();
            toFile.flush();
            toFile.close();
            fromClient.close();
            int tt = (int) ((et - st) / 1000);
            System.out.println("total time for " + fileName + " Download =" + tt + " secs");
            System.out.println("total time for " + fileName + " Download =" + (tt / 60) + " mins");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Please help me out...

    Yeah, it's working fine. Thank you very much.
    But it would be better to provide the chunk length rather than using zero, so the file uploads faster.
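    For anyone else who hits this: the stack trace above (ByteArrayOutputStream.write inside PosterOutputStream) shows the whole request body being buffered in memory before it is sent, which is what exhausts the heap on a 600 MB file. Here is a minimal sketch of the chunked-streaming fix the reply refers to; the URL and the 64 KB chunk size are just placeholders.

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedUpload {
        public static void upload(File f, String servletUrl) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(servletUrl).openConnection();
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-type", "application/octet-stream");
            // Stream the body in fixed-size chunks instead of buffering it all;
            // passing 0 would let the runtime pick a default chunk size.
            conn.setChunkedStreamingMode(64 * 1024);

            byte[] buf = new byte[8192];
            int read;
            try (InputStream in = new FileInputStream(f);
                 OutputStream out = conn.getOutputStream()) {
                while ((read = in.read(buf)) > -1) {
                    out.write(buf, 0, read);
                }
            }
            conn.getInputStream().close(); // read/close the response so the request completes
        }
    }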

  • Out of memory Issues

    Hi,
    WebLogic version is 10.3, DB is Oracle.
    Our environment has 4 servers: one server runs the Admin server plus 4 managed servers, and the remaining 3 servers each run 4 managed servers.
    Each managed server has 2 GB of memory.
    Connection pools are set up with initial capacity 0 and maximum capacity 15.
    Our applications are developed on Pega. Currently we are getting out of memory issues, and the F5 node sends alerts like:
    SEVERITY: Error
    Alert(432526): Trap received from ttnny-cse-f5node1: bigipServiceDown -- Bindings: sysUpTimeInstance = 1589988172, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status down., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 15:01 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432524): Trap received from ttnny-cse-f5node2: bigipServiceDown -- Bindings: sysUpTimeInstance = 1589982333, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status down., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 14:59 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432527): Trap received from ttnny-cse-f5node1: bigipServiceUp -- Bindings: sysUpTimeInstance = 1589988572, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status up., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 15:01 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    SEVERITY: Error
    Alert(432525): Trap received from ttnny-cse-f5node2: bigipServiceUp -- Bindings: sysUpTimeInstance = 1589982733, bigipNotifyObjMsg = Pool member 172.22.110.45:8002 monitor status up., bigipNotifyObjNode = 172.22.110.45, bigipNotifyObjPort = 8002 (Fri. 02/12/2010 14:59 America/New_York - Sat. 02/13/2010 15:59 America/New_York)
    When we checked at that time, the server was up and running with some Pega exceptions; the JVM shows 10% and after some time it goes to 30%.
    The alert below confirms the JVM is down, so at that point we restart the server.
    SEVERITY: Alert
    Alert(432565): Threshold triggered -- ttappapp01's 8003's Port Availability: 0.00 Percent < 100 Percent averaged over 1.00 minutes (Fri. 02/12/2010 17:15 America/New_York - Fri. 02/12/2010 17:15 America/New_York)
    SEVERITY: Alert
    Alert(432564): Threshold triggered -- ttappapp01's 8003's Port Availability: 0.00 Percent != 100 Percent averaged over 1.00 minutes (Fri. 02/12/2010 17:15 America/New_York - Fri. 02/12/2010 17:15 America/New_York)
    We took a thread dump and a heap dump at that time. Can anyone please give some suggestions on why the server is going out of memory?
    1. Any issue with the connection pools?
    2. Please give suggestions on the design.
    Thanks,
    Raj.

    Hi Raj,
    Did you check the System.out and WebLogic managed server logs?
    You also have to check the GC logs to see whether or not there is a memory problem.
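    As an illustration only: if the managed servers run on Sun HotSpot, GC logging and an automatic heap dump on OOM can be enabled with java options along these lines (the log file name is a placeholder; JRockit, which often ships with WebLogic 10.3, uses -Xverbose:memory instead):

    -verbose:gc -Xloggc:gc.log -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError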

  • Lightroom 3.2 out of memory issues

    I had been using the beta version of Lightroom 3 without issues. Once I installed the shipping version, I get out of memory messages all the time. I first noticed this when I went to export some images. I can get this message when I export just one image or part way through a set of images (this weekend it made it through 4 of 30 images before it died). If I restart Lightroom, it's hit or miss whether I can proceed. I've even tried restarting the box with only Lightroom running and I still get the out of memory issue.
    I've also had problems printing. I go to print an image and it looks like it will print, but nothing does. This doesn't generate an error message; it just doesn't do anything. So far restarting Lightroom seems to fix this problem.
    When I'm in the Develop module and click on an image to see it 1:1, at times the image is out of focus. If I click on another image and then go back to the original, it might be in focus.
    I have no idea if any of this is related, but I thought I'd throw it out there. I've been using Lightroom since version 1.0 and have had very good luck with the program. It is getting very frustrating trying to get anything done. I searched through the forum, but the memory issues I found were with older versions. I'd be very grateful if anyone could point me in the right direction.
    Ken
    System:
    i7 860
    4 GB memory
    XP SP3

    Hi,
    You can get the HeapDump Analyzer for analyzing IBM AIX heapdumps from the links below:
    http://www.alphaworks.ibm.com/tech/heapanalyzer
    http://www-1.ibm.com/support/docview.wss?uid=swg21190608
    Prerequisites for obtaining a heapdump:
    1. Add -XX:+HeapDumpOnOutOfMemoryError to the java options of the server (see notes 710146 and 1053604) to get a heap dump automatically when the error occurs.
    2. You can also generate heapdumps on request:
    Add -XX:+HeapDumpOnCtrlBreak to the java options of the server (see note 710146).
    Send the SIGQUIT signal to the jlaunch process representing the server, e.g. using kill -3 <jlaunch pid> (see note 710154).
    The heap dump will be written to the output file java_pid<pid>.hprof.<millitime> in the /usr/sap/<SID>/<instance>/j2ee/cluster/server<N> directory.
    Both parameters can also be set together to get the benefit of both approaches.
    Regards,
    Sandeep.
    Edited by: Sandeep Sehgal on Mar 25, 2008 6:51 PM

  • Out of memory error when sending messages

    Hello!
    I have OS X 10.4.10 and I have an email account setup (IMAP)in mail.
    Lately when I go to send a message, I get an "Out of Memory" error that it says is coming from the mail server, but I know for a fact that it's not a server issue (the mail account works fine on a PC). If I try to send it again, it goes through just fine.
    Any ideas?

    You are building an HTML table with 1500 rows? That is definitely a lot. And I assume you are also putting some styling on that table...
    These days programmers resort to AJAX techniques (http://en.wikipedia.org/wiki/AJAX) for things like that.
    You can, for example, send the table data as a list of comma separated values, and then use JavaScript in the browser to create a partial table with options to browse through the different pages of data.
    If you need to show all that data on one single page, you can still try sending just the data as CSV or XML and then building the table in the browser via a JavaScript loop. Use CSS for the styling so you don't overload the HTML with styling information.
    Marcos Broc
    Hi,
    I have a bit of a problem. I'm trying to generate quite a large JSP file (a table with about 1500 rows). The problem is that I get an out of memory error while the JSP is being generated, and it is definitely because the JSP, or rather the HTML generated from it, is too large. Now I was under the impression that the JSP page was flushed now and again, but I tried setting all the different flush options and I have also tried to manually flush the page. None of this made any difference, except for displaying a white page instead of an error page, since the header was already sent. This suggests to me that the page is being cached internally by the server and then sent to the client. So is there anything I can do about this?
    Regards
    Hertz

  • Out of Memory Error when Saving SequenceFile generated using API

    I am currently developing an application that creates a SequenceFile using .NET 4.0, C# (2010), Windows 7 Pro x64, Dual Core 2.5 GHz, 3.0 GB memory. Using the API EngineClass, I create a complete SequenceFile based on user/file inputs and then save it to disk. When I'm done, I save using SequenceFile.Save() and then release using ReleaseSequenceFileEx().
    For smaller files there is no problem. For larger files, I sporadically get the following error:
    System.Runtime.InteropServices.COMException (0xFFFFBD98): Out of memory.
    When the file does save, it is ~2.8 MB. Breaking the file up is not an option. I am not looping; this is one shot. I've tried shutting down VS2010 and restarting the computer. It is still inconsistent about saving.
    I also had to set Embed Interop Types to False per guidance from NI. They said that it is unstable in .NET 4.0.
    Questions
    1. Is there a file limit size to the Save() API function?
    2. Is there a workaround to using the Save()?
    Best Regards

    I am using C#/.NET. I called them in my save method, immediately prior to the actual save of the sequence file:
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
    GC.WaitForPendingFinalizers();
    try
    {
        CurrentSequenceFile.Save(outputPath);
    }
    catch (Exception error)
    {
        FireLogEvent(ParserLogType.Parsing_Error, string.Format("{0}", error));
    }
    Factory.TSEngine.ReleaseSequenceFileEx(CurrentSequenceFile, ReleaseSeqFileOptions.ReleaseSeqFile_UnloadFile);
    Hope this helps

  • I'm getting an error Out of memory message when I try to render - There's lots of memory

    I'm getting an "Out of memory" error message when I try to render. There's lots of memory and I've never gotten this before. Any ideas?

    Thanks. I did go through and change all the profiles to Apple RGB (there are several to choose from), but I'm still having problems. I think it has to do with corrupt images. I'm backing up and starting over. I've worked with images large and small for years and never had this problem. Thanks.

  • [N95 8GB]Out of Memory error when opening 15MB pdf...

    What is the file size limit when opening a PDF file? I have a 15 MB PDF file that causes an 'Out of memory' error when I open it with the PDF Reader that comes with the phone.
    Thanks.

    Hi,
    It seems that the problem is file specific, so could you please share the file with me? I am sending you a private message.
    Regards,
    Anoop

  • Indesign cs5 'out of memory' error when using preflight

    I have been regularly getting an 'out of memory' error when I choose to use my bespoke preflight profile.
    I have 4 GB of RAM and run InDesign CS5 on OS X 10.6.8.
    Does anyone know a work around?
    As soon as I switch from the basic default profile, I get the beach ball from hell for 10 minutes, then it kindly lets me know that I am out of memory, sends a crash report to Adobe and then asks if I want to relaunch. I'm stuck in a vicious circle. I must have sent my 4th crash report by now and have had no feedback from anyone at Adobe.

    I have replaced my preferences, but still the problem persists. I tried switching my view from Typical Display to Fast Display before selecting a profile; I thought this might give me the extra memory I needed to avoid the inevitable crash. I learnt that 2 files were indeed RGB instead of CMYK before it crashed again. So I switched them to CMYK and tried again, selected my bespoke profile, but yet again it crashed. I think the problem lies with the file, not InDesign, as I have tried the same profile on a different file and the program doesn't crash and runs as it should. So if in future I need to use the crashing file again, I will first need to try Peter's isolate-and-fix method. Otherwise I'll never be able to get to a successful PDF.

  • Out of memory errors when reporting on checked out files

    My Dreamweaver CC version is up to date. I haven't been able to complete a report to find checked out files. I can see it running at a speed of about 2 files per second. We have a few thousand files on the web server, but I didn't have a problem with CS5.5. I can't understand why it takes so long to simply search for .LCK files. I can search for them using Agent Ransack in a matter of seconds. I've seen some other out of memory issues posted here too, so hopefully Adobe can get this fixed and release the next update soon.
    (I haven't seen any improvement in performance with the 64-bit re-engineering.)

    Hi AdobeJobie,
    We would like to investigate your case. If you have a purchased version of DW, send me your Adobe ID, location, and contact details (email, phone number) over a private message. Click my picture and use the message option.
    Thanks,
    Preran

  • CC 2014 Progs leave me plagued with out of memory issues which the previous versions don't exhibit.

    The new CC 2014 suite of programs seems rather memory hungry. I am plagued with out of memory issues trying to use them; the old CC programs work just fine! Is there a new minimum memory spec now? For now I am forced to use the old versions, as the new ones are just unusable... some 'upgrade'!
    Phil

    Me too! It seems whenever I run more than one CC app I get out of memory errors. I have Win 7 with 32 GB RAM. I only have this problem with CC 2014, not CS6.

  • How do I get rid of a memory issue when exporting in iMovie

    How do I get rid of a memory issue when exporting in iMovie?

    Delete the content that is filling it up: text messages, iMessages, SMS, emails, Safari cache; essentially all data for all of the built-in apps.
