CS6 Photoshop has out of memory error on 10.8.1

CS6 Photoshop gives a "not enough memory" error when opening a new file on a 24" 2008 iMac (3.06 GHz, 4 GB RAM) running 10.8.1. Can I run CS6 on a 32-bit machine?

4 GB of RAM is not nearly enough to run both OS X Mountain Lion and Photoshop CS6.
OS X by itself can use 3 to 4 GB of RAM to run smoothly, leaving 1 GB or less for running Photoshop.
Photoshop is notoriously CPU, GPU, RAM and hard-drive intensive.
Your iMac can take a maximum of 6 GB of RAM.
Install the additional 2 GB of RAM; hopefully this will be enough to run PS adequately.
You can purchase reliable RAM online from either Crucial or OWC.
Here are the specs for your iMac's memory:
Memory Slots: 2 - 200-pin PC2-6400 (800 MHz) DDR2 SO-DIMM
Here is the procedure for installing the additional RAM yourself:
http://support.apple.com/kb/HT1423
Good Luck!

Similar Messages

  • Photoshop CS6 has out of memory errors under Mac OS X 10.7.4

    Has anyone who has a 2008 MacBook Pro with the NVIDIA GeForce 8600M GT 512 MB graphics card tried running Photoshop CS6 under Mac OS X 10.7.4?
    On my MacBook Pro, PS CS6 launches; however, any action results in an out of memory error.  The CS6 versions of After Effects, Premiere Pro and Illustrator all launch and run as expected under 10.7.4.
    If I switch to my Mac OS X 10.6.8 startup volume, Photoshop CS6 runs as expected.
    I have support cases open with Adobe, NVidia and Apple.  Adobe thinks I need to update the display drivers.  NVidia says display drivers must be provided by Apple.  Apple says that the display drivers are current.  The Apple support agent said that he can escalate the case if the issue can be reproduced on more than one computer.
    I'd love to keep my MacBook Pro running another year or so without having to reboot in 10.6.8 every time I need to use Photoshop.  I'd stay with 10.6.8, but of course iCloud requires 10.7.4.
    I have not tried 10.8 yet, but it's on my troubleshooting list.
    Also, PS CS6 runs fine under Mac OS X 10.7.4 on my 2010 iMac.
    Thanks in advance for any feedback.
    - Warren

    It looks like it was indeed the display driver for the NVIDIA GeForce 8600 GT 512 MB graphics card inside my 2008 MacBook Pro.
    It seems that this driver only gets installed if you upgrade from Mac OS X 10.6.7 to 10.7.4 while the startup drive is connected to the MacBook Pro itself.  I had updated while the startup drive was connected to my 2010 iMac. Accordingly, I could have reinstalled Photoshop 100 times without it ever launching as expected.  Having used the external startup drive with three other Apple computers (all 2010 or newer machines) with Photoshop CS6 launching as expected, I had just assumed (mistakenly) that it would work as expected with my 2008 computer.
    So, this was simple enough to resolve by upgrading the OS on the external startup drive while having it connected to the MacBook Pro.   In hindsight, this makes perfect sense.
    -Warren

  • Photoshop out of memory error

    Hi, I am running in a mixed environment with an Open Directory master for authentication on a Snow Leopard (OS X 10.6.8) server, with both Lion and Snow Leopard client machines. Because Adobe products were crashing with non-mobile accounts, my users are using external hard drives formatted Mac OS Extended (Journaled) with their mobile account set up on that hard drive. This year it has been touch and go with login issues, but we are working through that. My problem is that when a user uses a Lion machine (10.7.4) with Photoshop Extended and then moves to our study lab, which runs Snow Leopard, the Photoshop workspaces are corrupted and they cannot open any photos or settings; they get an "Out of memory" error. However, when they go back to the Lion machine and reset their workspace, they can use the software without issues. Anyone else having these issues? I've tried chflags nohidden on the library in Snow Leopard to view settings, and even deleted all Photoshop settings in App data and preferences, and I still cannot access Photoshop on the SL machine.
    Thanks

    Thanks Kurt for the reply. I'll give more info. All machines have the latest updates for both CS6 and the current version of the OS, either 10.6.8 or 10.7.4.
    The only thing on the external hard drive is the user's home folder and their data. I have to have permissions enabled so their home preferences and WGM settings work correctly. BTW, all accounts have admin rights to the machine; I have WGM settings preventing access to machine settings and can re-image if I get corrupted machines.
    PS is installed on each machine, not the external hard drive.
    All machines have the same volume name for the internal boot drive, which is set as the PS scratch drive.
    I thought this issue had to do with the memory, and it may still be so.
    However, when a clean profile is connected to our towers with Lion, which have 12 GB RAM and 1024 MB video memory, the settings are at 70%, which is around 8 GB.
    When I take the same clean profile to our other lab of iMacs, which have 8 GB RAM and 512 MB video memory, PS adjusts the performance RAM accordingly, which is around 5.5 GB at 70%.
    I then take that same external drive to the problem iMacs (early 2009 and 2008), which have 4 GB RAM and 256 MB video memory and are running Lion; PS adjusts to 2.4 GB.
    Now, if I put that same drive on the same model 2009 or 2008 iMac running Snow Leopard, coming from the same model Lion iMac, PS opens fine.
    However, after moving from one of the other larger Lion machines and then back to this Snow Leopard machine, the profile gets corrupted, the workspace is corrupted and I cannot reset it. Also, I am unable to access any settings; I get the "Could not complete the operation because there is not enough memory (RAM)" error.
    Now, when I go back to a same-model Lion machine with the same minimum memory I get the same error; however, when I go to a larger Lion machine all I get is that the color profile cannot sync, and the workspace is still corrupted and not showing, but I can reset it.
    I then set the performance size to match that of the lower model's 70%, and I still get the error when I go back to the lower-end Lion or Snow Leopard machine.
    I tried clearing PS preferences by opening with Command+Option+Shift and deleting the PS preferences; the issue is still there.
    I then removed all PS settings in ~/Library/ and it is still present.
    I had to re-create the profile altogether to get this to work. As long as I don't connect to a low-end Snow Leopard machine things seem to be going well, and PS readjusts according to the machine. Note: when I set the performance level to a low setting, say 2.5 GB as on the early iMacs, and plug into another machine, PS adjusts to the current machine's memory availability and does not keep that lower setting.
    I have a copy of console error message below.
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10aa114d0 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x106d65010 of class NSBundle autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x11db6aa60 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10a98bc40 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371a73e0 of class NSCFData autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371f7320 of class NSCFData autoreleased with no pool in place - just leaking
    I have 12 Snow Leopard machines that are early 2008 and 2009 iMacs with no way of upgrading them to Lion, and I am not ready to put Mountain Lion into production until I can upgrade my OD masters to Mountain Lion.
    I suspect it's not the RAM settings that are affected, but rather the video RAM that is not adjusting from machine to machine. Is it possible to upgrade the 2009 20" iMac video cards and get proper firmware support?
    Any help is appreciated.

  • CS6 Running out of memory Error while rendering

    Hey everyone,
    I recently upgraded to After Effects CS6 thinking it would be a great investment. So far, I have been horribly wrong!
    Here's my computer's specs: (this is a huge upgrade from the computer I was running 5.5 on without any problems other than it being slow)
    Intel i7-3615QM 2.3GHz Ivy Bridge processor
    8GB GDDR5 RAM
    NVIDIA GeForce 650M w/ 2GB GDDR5 dedicated RAM
    Here's what's happening:
    I've been doing recording for my job at 1080p in front of a green screen. I record 16-minute sections. I use Keylight to replace the background with a still picture, and then export the video at half resolution, with audio, as H.264 (.mp4) files. Normally I would set up the 16-minute sections to render in the queue, but it kept hitting the "out of memory" error shortly after starting the second one in line. So then I started just doing one at a time... and now it won't even make it through one!
    I have noticed that even after rendering is done, AE is still using 6 gigs of RAM! Usually I can purge the RAM, but now even doing that doesn't change anything - I have to close the program to get my RAM back. I have tried enlarging my page file, and I've tried rendering in different resolutions and file types - I feel like I've tried everything. I've looked all over the internet and I haven't found any real solutions (just talk from 2008 about "secret" menus that I can't get to work).
    Please help!!
    Thank you!!
    Dominic

    I wouldn't mind getting more RAM, except that the computer I used to do this very same work with (using 5.5) only had 6 GB of RAM and a pathetic processor and GPU in comparison.
    I've tried doing the same editing in Premiere Pro, but it just doesn't have enough tools. Changing my output format hasn't helped either, unfortunately. I think the program has a memory leak.
    I really think I might just go back to 5.5. I called support today and it took forever just to get the poor girl helping me to understand that my name has an "m" in it. She wasn't really able to understand the problem. As far as I can tell, CS6 has been a 100% downgrade.
    Thank you both for the help!
    Dominic

  • Photoshop keeps getting an out of memory error after installing Premiere Pro

    I just upgraded my CS to CC. Yesterday I installed Photoshop and did my work without any problem, but today, after installing Premiere and After Effects, I keep getting an out of memory error while I'm working, even though I don't have any other application running except Photoshop alone. The file I'm working on is a small file, an iPhone Plus-size interface. Basically, I can open the file and add a blur effect, and when I try to type text Photoshop tells me that my system is out of memory. Restarting Photoshop gives the same problem, and restarting my computer gives the same problem, i.e. I do one thing and the next gives "not enough memory".
    I don't think my system is slow, as it is a workstation with dual processors, 12 GB of RAM, Windows 7 64-bit, and a graphics card with 1 GB of dedicated memory.
    I uninstalled Premiere and After Effects and suddenly the problem went away. Photoshop works as normal. I didn't have the time to reinstall Premiere again but will try to do it tonight or tomorrow.
    Anyone experience such problem before?

    When you get that error, leave the error showing and use something that can show you how much free disk space is left on the Photoshop scratch disk. It may be a problem with scratch storage space, not RAM. I see Photoshop use much more scratch space than RAM; I have seen Photoshop using less than 10 GB of RAM on my machine, leaving 30 GB of RAM unused, while using over 100 GB of scratch space.
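    If you want a quick way to read that number while the error is still on screen, any free-space readout will do. Purely as an illustrative sketch (the path below is a placeholder for whatever volume the Photoshop scratch disk lives on, not something from this thread), a few lines of Java can print it:

    import java.io.File;

    public class ScratchSpaceCheck {
        public static void main(String[] args) {
            // Placeholder path - point this at the volume Photoshop uses for scratch.
            File scratchVolume = new File(args.length > 0 ? args[0] : "C:\\");
            long freeGB = scratchVolume.getUsableSpace() / (1024L * 1024L * 1024L);
            System.out.println("Usable space on " + scratchVolume + ": " + freeGB + " GB");
        }
    }

    If that number is in the single digits of GB while Photoshop is erroring, scratch space rather than RAM is the likely culprit.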

  • Out of Memory error, though nothing has changed?

    I have two large projects, 65mb each, which I have had open in FCP 7 with no problems for months.
    Then I left them closed for a month and worked on other projects.
    Now I can't open either of those two earlier projects without receiving the Out of Memory error (I'm trying to open each one individually). There are many things to look for when this error appears, as found all over this forum and others. So here is my checklist of facts:
    Neither project contains any PSD files or any sequences with text files in them.
    One of those projects contains no sequences whatsoever, only clips.
    All media for these projects is Apple AVC-Intra 100M 1280x720.
    The RAID drive containing the media for these projects has not been used since the last time the projects were open and working fine.
    The edit system has not changed in any way:
    - FCP 7.0.3
    - Mac Pro
    - OS 10.6.6
    - 2x2.66 Quad Core
    - 6 GB RAM
    - LaCie 8TB RAID, with 1.7TB free
    All RAM is functioning fine.
    I've trashed FCP preferences and received the same results anyway.
    Again, both projects were working fine on this same computer, same version of FCP, even with BOTH projects open at once.
    Also, I've been working with other projects on this same system with no problems, some nearly as large as these two.
    So... none of the obvious culprits are at work here to cause this error, unless I'm missing something.
    Anyone have any ideas? This is a bit scary, as these are documentary projects which are only going to get bigger, and I've done projects bigger than this several times without problems. Hopefully I'm missing something simple...
    Thanks.

    I think I found the culprit, and it's an obvious one:
    My original big project, 65mb, had a bin that was created by an assistant before I began work on the project and I never saw it. That bin has sequence strings for every tape/card loaded into the project, so it had about 65 rather long sequences. Once I found it, I moved that bin to its own project and deleted it from the original project.
    Now the original project opens, and I can open multiple projects at once, no Out of Memory error.
    So it's the bin full of sequences that did it. If I had seen that bin before I wouldn't have had to run this up the flagpole here. But at least it reinforces that word of advice that has appeared in many previous posts about this error: Don't overload a project with long sequences. Break up the sequences into smaller projects and archive them.
    And my own addition to that advice: If you're an assistant setting up a project for an editor, make sure your editor knows what sequences you've tucked away in bins hidden inside bins inside bins, etc.
    All good to go! (Until the next error... stand by...)

  • Acrobat XI Pro "Out of Memory" Error.

    We just received a new Dell T7600 workstation (Win7, 64-bit, 64GB RAM, 8TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection, which included Acrobat Pro X. Each time we open a PDF larger than roughly 4MB, the program returns an "out of memory" error. After running updates and uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. The same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files and we've paid for the Master Collection, so I'd rather not use a freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is really good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.

  • More on Out of Memory Error

    I recently purchased the CS3 Master Suite. Am running Encore 3.0.1.008. Have a dual core with 2 gigs, Vista.
    Built a medium-size movie with 70 clips, 14 timelines, and 14 menus. Everything linked nicely; Check Project finds no errors. Projected size of the result on DVD: 3.003 GB.
    Invoke build. Transcoder begins. Fails on second clip. Out of Memory Error.
    Checked forums. Noted the existence of these problems going back many months. Tried various suggestions. Cleared media cache. Rebooted, reentered, restarted. Same result.
    Also came across a warning about a 2 GB memory limit. However, note that many, many programs are written with input and output sizes much larger than resident memory. This is a CS101 design issue.
    Because I was obligated to make a release to a client base with high expectations, I scaled back the project to an embarrassingly small size.
    I would call it a demo size as opposed to an acceptable project size.
    Of course, I needed to start from scratch. All attempts to prune the original project to smaller and smaller sizes failed (even with media cache clearing, rebooting). A project with 1 menu and 2 video clips did build.
    My conclusion from this experience is that Encore was designed and tested to meet alpha-release demo standards. There is a well-deserved expectation with other Adobe products (Photoshop, Dreamweaver, etc.) that production standards can be achieved.
    My forum reading suggests that Encore has been in this crippled state for months, perhaps years. My question is, when can we expect a fix? There are plenty of software engineers out there who could bring this up to Photoshop standards.
    I spent $2500 on the Master Suite with the expectation that I could do production work, not serve as unpaid QA for Adobe.

    When I try to build a DVD this error appears
    The instruction at 0x007fa94c referenced memory at 0x00000000. The memory could not be read
    Click on OK to terminate the program
    Click on CANCEL to debug the program
    What can I do to fix this problem or to build my DVD?

  • HELP out of memory error message

    I'm using FCP 7 with 10.6 on a MacBook Pro.
    I just finished editing a 16-minute photo montage with about 100 stills. The pictures all have movement.
    I used some of the pictures in Photoshop and AE.
    When I try to render I'm getting an "out of memory" error. All of the footage is on my 1 TB external hard drive, with 450 GB not used. I've done more layered ones before with no issues.
    I tried to render individual clips and several clips. It worked fine for a while. The render files are about 7 GB, so it's not much. I also tried to output the whole sequence instead of rendering, and I'm still getting the "out of memory" error. I need to output by tomorrow.
    Thanks,
    Michael

    I have done some Google searching, and FCP 7 is a 32-bit application. So my guess is that whatever you are currently doing has FCP 7 wanting to use more than 4 gigabytes of virtual address space, and it cannot expand any further because it is a 32-bit application.
    This is my best guess as to why you are getting "out of memory" errors.
    I am not an FCP user, so I cannot suggest any tricks to get this to work. The only other suggestion is to get a newer version of FCP that is 64-bit capable, but that might also require upgrading to a newer version of Mac OS X, so going down the upgrade road can get involved.

  • Out of memory error adobe elements 3.0

    When I try to open Adobe Photoshop Elements 3.0 I get an out of memory error. The application won't even open. I upgraded my RAM from 2 GB to 3 GB but it has made no difference.

    stacey_r79 wrote:
    Hi, just wondering if you had any success in retrieving or opening your Organizer? I'm having the same problem and I have tried everything; I even rang Adobe and they told me there were no technicians able to help because Adobe Photoshop Elements 3.0 was too old. If anyone has a solution please let me know. I have 5 years' worth of photos in this software, and I just want to be able to open the Organizer to burn all my pics to CD; then I will update my software. PLEASE HELP!!!!!!!!
    First of all, your photos are not "in this software"; they are in folders on your hard drive, and PSE merely points to them. If >>you personally<< didn't delete them, or reformat your HDD, then they are still there. The default file path for your photos is this: "My Documents" > "My Photos" > "Adobe", and then a couple of different folders depending on photo type, for example, "Scanned Photos".
    If you are using the default location, you can simply copy and paste your pictures to an external HDD, or burn them to CD or DVD with Windows. I am not recommending this, since it would not allow you to carry over information created by the Organizer, and the photos would have to be reimported and retagged, but your files would be backed up and safe, which is the most important thing.
    There really isn't a need for all these histrionics. Even if you uninstall PSE, your photos will still be exactly where they were. Even if you uninstall PSE, its catalog information will still be intact, and a later version will pick it up and "convert it" to its own file system.
    Support for PSE-3 has obviously ended; it's more than 5 years and 5 versions old. Just as a topic of conversation, my personal favorite version of PSE is PSE-5. IMO, it's the fastest at importing and the most reliable, yet it has a lot more bells and whistles than PSE-3. If you can find a usable copy, I'd grab it, especially for a computer running XP.
    As to the out of memory issue, there are several things that can affect this, for example: not enough RAM is allotted to PSE (this is independent of the amount of RAM actually installed in the computer), the scratch disc area is full or insufficient, or the preferences file is corrupt.
    Everybody is complaining about not being helped in this thread, yet no one has posted back as to whether they have tried Barbara's suggestions, or even whether they understand enough about the Windows file system to begin to help themselves.

  • Fireworks Out of Memory Error

    I guess this will be nothing more than a rant about my disappointment with this software.
    I recently introduced Fireworks into my web design workflow. I hop around PS, Ai, and Fw quite a bit. I am running a Quad-Core, 64-bit, 8GB Windows beast. I shouldn't be having this "out of memory error" issue 3-4 times a day. But I do. I searched the web for a solution, but there is none, except the awkward one: limit the number of open files and restart every hour. Not acceptable.
    I'm truly disappointed in the lack of support or feedback that Adobe has addressed with this issue.
    If I could, I would switch software suites. This is truly an enormous oversight that has no foreseeable solution. I love Fw and Ps, but unfortunately I have lost faith in Adobe.
    I can't imagine the woes that CS 6 will produce.
    Unhappily yours.
    If there is a solution, what is it?

    My understanding is that data is collected from the bug list but, generally, the engineers don't respond back. Based on the consistency (number of similar bug reports) of a problem, it gets the attention of the dev team, but then it has to be given a priority. It's unlikely the FW team has the numbers or resources that, say, the Photoshop team has. So very likely it comes down to picking and choosing. What battles can be won? What is feasible to improve on, while still adding useful new features in the time span allowed? Not easy decisions and, regardless, not everyone will be happy with the choices.
    Based on CS5, I can see that Adobe is committed to improving performance and memory management in FW. FW CS5 was a vast improvement over CS4 in those areas. Perfect? No. But much better.
    For a detailed look at some of these improvements, check out these two articles:
    http://www.adobe.com/devnet/logged_in/bbowman_fwcs5.html
    http://www.smashingmagazine.com/2010/05/22/adobe-fireworks-is-it-worth-switching-to-cs5/

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (only 3.25GB recognized, as we are a 32-bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?


  • RoboHelp 9 gives an out of memory error and crashes when I try to import or link a Frame 10 file or

    I have Tech Suite 3. If I start a new RoboHelp project and try to import or link Frame files, RoboHelp tries for a while, then I get an Out of Memory error and the program crashes.
    I opened one of the sample projects and was able to link to one of my frame files without any problem, so it seems to be an issue with creating something new.
    Any suggestions?

    It happens when I create a new project and then try to import or link frame docs to make up the content. It starts scanning, then crashes. I did get it to the conversion setting page once, but no further.
    It does not happen if I open one of the supplied example projects and link a file. But then it doesn't let me choose, during import, any style mapping. And I can't delete the sample project folder.
    Twice now, when I tried to import (not link, but import), it has told me that my .fm file could not be opened, and told me to verify that Frame is installed (it is) and that the file is a valid Frame file (it is).
    The docs and project are in separate folders on my C: drive.

  • Out of Memory Error While deploying as EAR file

    Hi,
    I was trying to deploy an EAR file of size 63 MB, which in turn contains about 60 EJB JARs. No WARs. application.xml has all the entries for the JARs. While I am deploying, it gives an Out of Memory Error. Is there any way to work around this problem? I am using my own hand-written Java application which uses the SunONE deployment APIs for deployment. Can you please tell me how to tackle this problem? I am running my application through a batch file which uses JDK 1.4.
    Please help me regarding this issue.

    You can set the initial heap size and maximum heap size for the JVM, either in the app-server admin console or in one of your scripts. You can look up the exact syntax.
    I had this error yesterday. I too had run out of memory (150 MB). You simply need to allocate more to the app server.
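    For what it's worth, here is a minimal illustration of those heap flags, assuming the deployment is kicked off with a plain java command from your batch file (the class name, JAR name, and values below are placeholders, not from this thread; most app servers expose the same settings in their own start scripts or admin console):

    rem Example only: give the deployer JVM a 256 MB initial / 1024 MB maximum heap.
    java -Xms256m -Xmx1024m -cp deploy-tool.jar com.example.MyDeployer myapp.ear

    -Xms sets the initial heap size and -Xmx the maximum heap size for that JVM instance; a 63 MB EAR with 60 EJB JARs can easily need more than the default maximum during deployment.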

  • Uploading large files from applet to servlet throws out of memory error

    I have a Java applet that needs to upload files from a client machine to a web server using a servlet. The problem I am having is that, in the current scheme, files larger than 17-20 MB throw an out of memory error. Is there any way we can get around this problem? I will post the client and server side code for reference.
    Client Side Code:
    import java.io.*;
    import java.net.*;

    // This class is a client that enables transfer of files from client to server.
    // It connects to a servlet running on the server and transmits the file.
    public class fileTransferClient {

        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        // This method transfers the prescribed file to the server.
        // If the destination directory is "", it transfers the file to the default incoming directory.
        // 11-21-02 Changes: this method now has a new parameter that references the item
        // being transferred in the import list.
        public static String transferFile(String srcFileName, String destFileName,
                                          String destDir, int itemID) {
            if (destDir.equals(""))
                destDir = "E:\\FTP\\incoming\\";

            // Get the fully qualified filename and the mere filename.
            String fqfn = srcFileName;
            String fname = fqfn.substring(fqfn.lastIndexOf(File.separator) + 1);

            try {
                // importTable importer = jbInit.getImportTable();

                // Create the file to be uploaded and a connection to the servlet.
                File fileToUpload = new File(fqfn);
                long fileSize = fileToUpload.length();

                // Get the last-modified time of this file; it is sent to the servlet as a header.
                long lastMod = fileToUpload.lastModified();
                String strLastMod = String.valueOf(lastMod);

                URL serverURL = new URL(webadminApplet.strServletURL);
                URLConnection serverCon = serverURL.openConnection();

                // Connection setup.
                serverCon.setDoInput(true);
                serverCon.setDoOutput(true);

                // Don't use a cached version of the URL connection.
                serverCon.setUseCaches(false);
                serverCon.setDefaultUseCaches(false);

                // Set headers and their values.
                serverCon.setRequestProperty("Content-Type", "application/octet-stream");
                serverCon.setRequestProperty("Content-Length", Long.toString(fileToUpload.length()));
                serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
                serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);

                if (webadminApplet.DEBUG) System.out.println("Connection with FTP server established");

                // Create a file stream to read the file and an output stream to write the data.
                FileInputStream fis = new FileInputStream(fileToUpload);
                OutputStream os = serverCon.getOutputStream();
                try {
                    // Transfer the file in 4K chunks.
                    byte[] buffer = new byte[4096];
                    long byteCnt = 0;
                    int newPercent = 0;
                    int oldPercent = 0;
                    while (true) {
                        int bytes = fis.read(buffer);
                        if (bytes < 0) break;     // end of file reached
                        byteCnt += bytes;

                        // 11-21-02: if itemID is greater than -1 this is an import file transfer,
                        // otherwise this is a header graphic file transfer.
                        if (itemID > -1) {
                            newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
                            int diff = newPercent - oldPercent;
                            if (newPercent == 0 || diff >= 20) {
                                oldPercent = newPercent;
                                jbInit.getImportTable().displayFileTransferStatus(itemID, newPercent);
                            }
                        }

                        os.write(buffer, 0, bytes);
                        os.flush();
                        if (webadminApplet.DEBUG) System.out.println("No of bytes sent: " + byteCnt);
                    }
                } finally {
                    // Close the related streams.
                    os.close();
                    fis.close();
                }
                if (webadminApplet.DEBUG) System.out.println("File Transmission complete");

                // Find out what the servlet has to say in response.
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(serverCon.getInputStream()));
                try {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        if (webadminApplet.DEBUG) System.out.println(line);
                    }
                } finally {
                    // Close the reader stream from the servlet.
                    reader.close();
                }
            } // end of the big try block.
            catch (Exception e) {
                System.out.println("Exception during file transfer:\n" + e);
                e.printStackTrace();
                return "FTP failed. See Java Console for Errors.";
            } // end of catch block.

            return "File: " + fname + " successfully transferred.";
        } // end of method transferFile().
    } // end of class fileTransferClient
    Server side code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    import java.net.*;

    // This servlet class acts as an FTP server to enable transfer of files from the client side.
    public class FtpServerServlet extends HttpServlet {

        String ftpDir = "D:\\pub\\FTP\\";

        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            doPost(req, resp);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // ### for now enable overwrite by default.
            boolean overwrite = true;

            // Get the fileName for this transmission.
            String fileName = req.getHeader(FILENAME_HEADER);

            // Also get the last-modified time of this file.
            String strLastMod = req.getHeader(FILELASTMOD_HEADER);

            String message = "Filename: " + fileName + " saved successfully.";
            int status = HttpServletResponse.SC_OK;

            System.out.println("fileName from client: " + fileName);

            // If the filename is not specified, complain.
            if (fileName == null) {
                message = "Filename not specified";
                status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
            } else {
                // Open the file about to be transferred.
                File uploadedFile = new File(fileName);

                // Check if the file already exists - and overwrite if necessary.
                if (uploadedFile.exists() && overwrite) {
                    // Delete the file.
                    uploadedFile.delete();
                }

                // Ensure the directory is writable and a new file may be created.
                if (!uploadedFile.createNewFile()) {
                    message = "Unable to create file on server. FTP failed.";
                    status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
                } else {
                    // Get the necessary streams for file creation.
                    FileOutputStream fos = new FileOutputStream(uploadedFile);
                    InputStream is = req.getInputStream();
                    try {
                        // Create a buffer. 4K!
                        byte[] buffer = new byte[4096];

                        // Read from the input stream and write to the file stream.
                        int byteCnt = 0;
                        while (true) {
                            int bytes = is.read(buffer);
                            if (bytes < 0) break;
                            byteCnt += bytes;
                            fos.write(buffer, 0, bytes);
                            // Flush the stream.
                            fos.flush();
                        }
                    } // end of try block.
                    finally {
                        is.close();
                        fos.close();

                        // Set the last-modified date for this file.
                        uploadedFile.setLastModified(Long.parseLong(strLastMod));
                    } // end of finally block.
                } // end - the new file may be created on server.
            } // end - we have a valid filename.

            // Set response headers.
            resp.setContentType("text/plain");
            resp.setStatus(status);
            if (status != HttpServletResponse.SC_OK) {
                getServletContext().log("ERROR: " + message);
            }

            // Get the output stream and send the message back to the client.
            PrintWriter out = resp.getWriter();
            out.println(message);
        } // end of doPost().
    } // end of class FtpServerServlet

    OK - the problem you describe is definitely what's giving you grief.
    The workaround is to use a socket connection and send your own request headers, with the content length filled in. You may have to multipart MIME encode the stream on its way out as well (I'm not sure about that...).
    You can use the following:
    http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
    on your server to get a feel for the format that the request headers need to take.
    - Kevin
    I get the out of memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: basically, it won't know the content length until all the data has been written to the output stream, so it uses an in-memory buffer to store the data, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way that the buffer might be flushed after a certain amount of the file has been uploaded? Do you have any ideas?
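    A side note on that buffering behavior: on Java 5 and later, HttpURLConnection can be told to stream the request body instead of buffering it, which avoids the in-memory copy entirely. The lines below are only an illustrative sketch written against the client code posted above (serverURL and fileToUpload are the variables from that code), not something from this thread:

    // Stream the upload instead of letting URLConnection buffer it in memory (Java 5+).
    HttpURLConnection conn = (HttpURLConnection) serverURL.openConnection();
    conn.setDoOutput(true);
    conn.setRequestMethod("POST");
    // Either declare the exact body length up front...
    conn.setFixedLengthStreamingMode((int) fileToUpload.length());
    // ...or, if the length is not known in advance, use chunked transfer encoding instead:
    // conn.setChunkedStreamingMode(4096);
    conn.setRequestProperty("Content-Type", "application/octet-stream");
    // Then copy the file to conn.getOutputStream() in 4 KB chunks exactly as before.

    With either streaming mode set, the connection writes data straight to the socket as it arrives, so the 17-20 MB ceiling tied to the in-memory buffer should disappear; the servlet side needs no changes.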
