Out of Memory error after upgrading to Reader X

We have several PDF documents that work fine in Reader 9.  But after upgrading to Reader X, the files will not open; they report an Out of Memory error and a blank document opens.  Removing Reader X and re-installing Reader 9 corrects the issue. This has been reproduced on 3 different PCs running Windows XP and Windows 7.
Any suggestions?

Just to throw in my 2 cents... Adobe has known about the out of memory bug at least since 01/12/2011, because that is when I reported the problem to them.  I even had an escalated support case #181995624 and bug #2800823.  Our problem comes from an EFI Fiery E7000 attached to our Lanier LD460c copier.  Any PDFs made by that copier will not open in Acrobat X, although they open just fine in lower versions of Acrobat.  Our only workaround is to keep Acrobat 9 on office computers; alternatively, you can open the offending PDF in Acrobat 9 or earlier, print it to PDF, and then it will open in Acrobat X!  They acknowledged that this was a bug; see my email chain below.  This was the last update I received from Adobe. Very frustrating...
From:
Sent: Wednesday, February 09, 2011 9:12 AM
To:
Cc:
Subject: RE: Case 181995624; Unable to open PDF
Hi Phil,
We do not yet have this information or a time estimate from our Engineering team.
Regards,
Neeraj
From:
Sent: Monday, February 07, 2011 8:19 PM
To:
Cc:
Subject: RE: Case 181995624; Unable to open PDF
Next major release as in the next patch for Acrobat X, or as in Acrobat 11?
From:
Sent: Saturday, February 05, 2011 4:31 AM
To:
Cc:
Subject: Case 181995624; Unable to open PDF
Hi Phil,
You can get back to us with the Bug Number provided to you earlier, and based on it we will give you updates from our Engineering team. However, the update for now is that the fix is planned for our next major release.
Regards,
Neeraj
From:
Sent: Thursday, February 03, 2011 1:33 AM
To:
Subject: RE: Case 181995624; Unable to open PDF
Can you send me a link to where I can find information on the bug?
From:
Sent: Tuesday, February 01, 2011 10:14 AM
To:
Cc:
Subject: Case 181995624; Unable to open PDF
Hi Phil,
Hope you are doing well.
I have researched your issue and found that it is a bug in Acrobat X. I have logged it on your behalf so that our Engineering team can look into it. Please use Bug Number #2800823 for any future updates on this. I am closing this case on your behalf.
Have a nice day.
Regards,
Neeraj
From:
Sent: Tuesday, February 01, 2011 12:22 AM
To:
Cc:
Subject: RE: Case 181995624; Unable to open PDF
Any updates on this case?
From:
Sent: Friday, January 14, 2011 2:03 PM
To:
Cc:
Subject: RE: Case 181995624; Unable to open PDF
The EFI Fiery E-7000 Controller version is 1.2.0, and it handles the scanning and printing functionality of our Lanier LD160c copier.  I have attached two sample files.  One is a 1-page scan from our copier.  The other is a combined PDF that I just created in Acrobat 9.  The first two pages of the combined PDF consist of a webpage that I printed using Acrobat 9, and the scan from the copier is on the 3rd page.  In Acrobat X, once you get to the 3rd page you will receive the Out of Memory error.  It will open just fine in previous versions of Acrobat, though.
From:
Sent: Friday, January 14, 2011 11:52 AM
To:
Cc:
Subject: Case 181995624; Unable to open PDF
Hi Phil,
Thanks for the information.
I have the PDF file you provided and am able to reproduce the behavior. I tried to call you at 214-303-1500 but got voicemail.
Please let me know when you will be available so that I can call you and continue further with this issue.
Regards,
Neeraj
From:
Sent: Thursday, January 13, 2011 6:57 AM
To:
Cc:
Subject: Re: Case 181995624; Unable to open PDF
It is a walk-up copier and we scan to email.  The EFI Fiery E7000 controller handles PDF conversion for the copier, but yes, it has the latest firmware.  The bottom line is that we have 3 or 4 years' worth of PDFs created from that copier rendered useless by Acrobat X.  They open fine in previous versions of Acrobat.  Did you get the test PDF file when this case was created?
-- Sent from my Palm Pre
On Jan 12, 2011 6:12 PM, Acrobat Support <[email protected]> wrote:
Hi Philip,
Thank you for choosing Adobe Support. We have received your concern: we see that you are facing an issue with opening PDF files in Acrobat X that were created from a Lanier (Ricoh) LD160c copier. A technical support case, 181995624, has been created for your reference. In order to assist you further, we would like to have the following information:
• Are you using the latest scanner driver?
• What is the exact scanning workflow?
Regards
Acrobat Support

Similar Messages

  • Photoshop keeps getting an out of memory error after installing Premiere Pro

    I just upgraded my CS to CC. Yesterday I installed Photoshop and did my work without any problem, but today, after installing Premiere and After Effects, I keep getting an out of memory error while I'm working, even though I don't have any other application running except Photoshop alone. The file I'm working on is a small file, an iPhone Plus-size interface. Basically I can open the file and add a blur effect, but when I try to type text, Photoshop tells me that my system is out of memory. Restarting Photoshop gives the same problem; restarting my computer gives the same problem, i.e. I can do one thing and the next will not have enough memory.
    I don't think my system is slow, as it is a workstation with dual processors and 12 GB of RAM, Windows 7 64-bit, and 1 GB of dedicated memory for the graphics card.
    I uninstalled Premiere and After Effects and suddenly the problem went away. Photoshop works as per normal. I didn't have time to reinstall Premiere yet, but will try to do it tonight or tomorrow.
    Anyone experienced such a problem before?

    When you get that error, leave the error showing and use something that can show you how much free disk space is left on Photoshop's scratch disk.  It may be a problem with scratch storage space, not RAM storage space.  I see Photoshop use much more scratch space than RAM.  I have seen Photoshop using less than 10GB of RAM on my machine, leaving 30GB of free RAM unused, while using over 100GB of scratch space.

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder, and you have two choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is really good at letting you know you can't have it all, because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows, or use Linux.
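    Before emptying anything, it can help to confirm where the TEMP directory actually points and how much usable space it has. A minimal, generic check (a standalone sketch, not part of any Adobe tooling):

```java
import java.io.File;

// Standalone sketch: report the JVM's view of the temp directory
// and the usable space remaining on that volume.
public class TempCheck {
    public static void main(String[] args) {
        File tmp = new File(System.getProperty("java.io.tmpdir"));
        System.out.println("TEMP dir:  " + tmp.getAbsolutePath());
        System.out.println("usable MB: " + tmp.getUsableSpace() / (1024 * 1024));
    }
}
```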

  • Error after upgrading to Reader 10.0.1.

    I get this popup again after installing to a different directory, and I had some help from Adobe Support this morning.  The error is: AcroRd32.exe - The exception unknown software exception (0xc0000710) occurred in the application at location 0x77a97e1d.  Click OK to terminate the program.

    In the Previous iTunes Libraries folder should be a number of dated iTunes Library files. Take the most recent of these and copy it into the iTunes folder. Rename *iTunes Library* as *iTunes Library (Failed)* and then rename the restored file as *iTunes Library*. *CONNECT YOUR EXTERNAL DRIVE*. Start iTunes. iTunes should redo the library upgrade process and connect to your files.
    tt2

  • WebLogic generating out of memory error when command-line run doesn't

    Hello,
    I am just beginning to use WebLogic. I have a jar file which runs fine from the command line with the options:
    "C:\Program Files (x86)\Java\jre6\bin\java" -jar -Xms1024m -Xmx1024m Report.jar
    It connects to Oracle, selects some data (around 500k records), and writes it to a CSV.
    When I run the same jar from within a web application (I mean, obviously, a servlet calling the jar's main method), the webapp generates an
    out of memory error after only 80k records.
    I tried changing the server startup arguments from the console (Server Start, then Arguments) and then restarting.
    I just wrote the same thing there: -Xms1024m -Xmx1024m
    I guess I am missing something. Please share your answers.
    Environment :
    Win2k8, weblogic 10gR3, jdk 1.5. The application is installed as a service.
    Thanks,
    Neetesh
    Edited by: user13312817 on 5 Dec, 2011 12:15 AM
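    Without seeing the code this is only a guess, but a frequent cause of this symptom (fine standalone, OOM inside the container) is collecting all rows in memory before writing. A minimal sketch of row-at-a-time streaming to the CSV, with a hypothetical query and a small JDBC fetch size:

```java
import java.io.BufferedWriter;
import java.io.Writer;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hypothetical sketch: stream rows straight from the ResultSet to the CSV
// instead of accumulating them, so memory use stays flat regardless of
// row count. setFetchSize() asks the driver to prefetch rows in batches.
public class CsvExport {

    // Join fields into one CSV line (no quoting, for brevity).
    public static String toCsvLine(String[] fields) {
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) line.append(',');
            line.append(fields[i]);
        }
        return line.toString();
    }

    public static long export(Connection conn, String sql, Writer out) throws Exception {
        long rows = 0;
        PreparedStatement ps = conn.prepareStatement(sql);
        ps.setFetchSize(500); // fetch the cursor in batches, not all at once
        ResultSet rs = ps.executeQuery();
        int cols = rs.getMetaData().getColumnCount();
        BufferedWriter w = new BufferedWriter(out);
        while (rs.next()) {
            String[] fields = new String[cols];
            for (int i = 1; i <= cols; i++) fields[i - 1] = rs.getString(i);
            w.write(toCsvLine(fields));
            w.newLine();
            rows++;
        }
        w.flush();
        rs.close();
        ps.close();
        return rows;
    }
}
```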

    If you are not using NodeManager, then I don't think those server settings actually control anything. If you are just exploring WLS, then I suspect you are simply starting an AdminServer in a basic domain and don't have a cluster/NodeManager-based environment. I could be wrong, please correct as needed!
    If you are simply starting your WebLogic Server instance using the startWebLogic.sh|cmd script, you can set an environment variable in the command shell that will be picked up and used when the server starts:
    set USER_MEM_ARGS=-Xmx1024m -Xms1024m
    (apply the equivalent *nix syntax as appropriate)
    Then use the startWebLogic.sh|cmd script to start the server and test your application.
    It may very well be the case that your particular application consumes more than 1GB of heap, so you may need more. Remember that you now have a server environment running over your "main" class, so there is bound to be more memory used, which could be just sneaking your heap use over 1GB, for example.
    -steve-
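    To confirm whether the server JVM actually picked up the -Xmx setting, it can help to log the effective heap limits from inside the deployed code. A minimal standalone sketch (the same calls work from a servlet):

```java
// Standalone sketch: print the JVM's effective heap limits so you can
// verify whether -Xmx1024m was actually applied to the running JVM.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap MB:     " + rt.maxMemory() / (1024 * 1024));
        System.out.println("current heap MB: " + rt.totalMemory() / (1024 * 1024));
    }
}
```

    If the logged max heap is far below 1024 MB, the argument never reached this JVM, and the place where it is set (NodeManager vs. start script) is the thing to fix.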

  • Getting 'Out of memory' error while opening the file. I have tried several versions of Adobe Reader (7.0, 9.0, XI). It is breaking PDF-to-TIFF conversion. Please provide a solution ASAP

    Hello All,
    I am getting an 'Out of memory' error while opening the file. I have tried several versions of Adobe Reader (7.0, 9.0, XI).
    Also, it is breaking PDF-to-TIFF conversion. Please provide a solution ASAP.

    I am using Adobe Reader XI. When I open the PDF it gives an "Out of memory" error; after scrolling, the PDF gives another alert, "Insufficient data for an image". After clicking through both alerts it loads the full data of the PDF. It is not happening with all PDFs; a couple of PDFs are showing this issue. Because of this error my software is not able to print these PDFs to TIFF. My OS is Windows 7 x64. I tried it on Windows Server 2012 R2 and XP; the same issue occurs there.
    It has become a critical issue for my production.

  • Oracle Service Bus For loop getting out of memory error

    I have a business service based on a JCA adapter that fetches an undetermined number of records from a database.  I then need to upload those to another system using a web service designed by an external party.  This web service will only accept up to x records per call.
    The process:
    for each object in the JCA response
        insert object into the service callout request body
        if object index = number of objects in the JCA response
                or object index = next batch index
            invoke the service callout
            append the service callout response to a total response object (XQuery transform)
            increase the next batch index by the batch size
            reset the service callout request to an empty body
        end if
    end for
    replace body with the total response object.
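    For reference, the loop above can be sketched in plain Java (hypothetical names and a stubbed callout; the real flow uses an OSB service callout and an XQuery transform):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the batching loop: collect records into fixed-size batches and
// invoke the upload once per batch, so no single request has to carry
// every record at once. callService() stands in for the OSB callout.
public class BatchUploader {

    static List<String> callService(List<String> batch) {
        List<String> responses = new ArrayList<>();
        for (String r : batch) responses.add("ok:" + r); // stubbed upload
        return responses;
    }

    public static List<String> uploadInBatches(List<String> records, int batchSize) {
        List<String> totalResponse = new ArrayList<>();
        List<String> batch = new ArrayList<>();
        for (String record : records) {
            batch.add(record);
            if (batch.size() == batchSize) {
                totalResponse.addAll(callService(batch));
                batch.clear(); // reset the callout body
            }
        }
        // flush the final partial batch, if any
        if (!batch.isEmpty()) totalResponse.addAll(callService(batch));
        return totalResponse;
    }
}
```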
    If I use a data set that only has 5 records and a batch size of 2, the process works fine.
    If I use a data set with 89 records and a batch size of 2, I get the below out of memory error after about 10 service callouts.
    The quantity of data in the objects is pretty small, less than 1 kB for each JCA object.
    Server Name:
    AdminServer
    Log Name:
    ServerLog
    Message:
    Failed to process response message for service ProxyService Sa/Proxy Services/DataSync:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray:
    [C, size 67108880 java.lang.OutOfMemoryError: allocLargeObjectOrArray:
    [C, size 67108880 at org.apache.xmlbeans.impl.store.Saver$TextSaver.resize(Saver.java:1700)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.preEmit(Saver.java:1303) at
    org.apache.xmlbeans.impl.store.Saver$TextSaver.emit(Saver.java:1234)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitXmlns(Saver.java:1003)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitNamespacesHelper(Saver.java:1021)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitElement(Saver.java:972)
    at org.apache.xmlbeans.impl.store.Saver.processElement(Saver.java:476)
    at org.apache.xmlbeans.impl.store.Saver.process(Saver.java:307)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.saveToString(Saver.java:1864)
    at org.apache.xmlbeans.impl.store.Cursor._xmlText(Cursor.java:546)
    at org.apache.xmlbeans.impl.store.Cursor.xmlText(Cursor.java:2436)
    at org.apache.xmlbeans.impl.values.XmlObjectBase.xmlText(XmlObjectBase.java:1500)
    at com.bea.wli.sb.test.service.ServiceTracer.getXmlData(ServiceTracer.java:968)
    at com.bea.wli.sb.test.service.ServiceTracer.addDataType(ServiceTracer.java:944)
    at com.bea.wli.sb.test.service.ServiceTracer.addDataType(ServiceTracer.java:924)
    at com.bea.wli.sb.test.service.ServiceTracer.addContextChanges(ServiceTracer.java:814)
    at com.bea.wli.sb.test.service.ServiceTracer.traceExit(ServiceTracer.java:398)
    at com.bea.wli.sb.pipeline.debug.DebuggerTracingStep.traceExit(DebuggerTracingStep.java:156)
    at com.bea.wli.sb.pipeline.PipelineContextImpl.exitComponent(PipelineContextImpl.java:1292)
    at com.bea.wli.sb.pipeline.MessageProcessor.finishProcessing(MessageProcessor.java:371)
    at com.bea.wli.sb.pipeline.RouterCallback.onReceiveResponse(RouterCallback.java:108)
    at com.bea.wli.sb.pipeline.RouterCallback.run(RouterCallback.java:183)
    at weblogic.work.ContextWrap.run(ContextWrap.java:41)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:545)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256) at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
    Subsystem:
    OSB Kernel
    Message ID:
    BEA-382005
    It appears to be the service callout that is the problem (it calls another OSB service that logs in and performs the data upload to the external service), because if I change the batch size up to 100, the loop will load all 89 records into the callout request and execute it fine.  If I have a small batch size, then I run out of memory.
    Are there some settings I need to change?  Is there a better way in OSB (less memory-intensive than a service callout in a for loop)?
    Thanks.

    Hi,
    Could you please let me know if you got rid of this issue, as we are also facing the same problem.
    Thanks,
    SV

  • Error opening a PDF after upgrading Adobe Reader

    A client of mine can't open some PDFs after upgrading Adobe Reader. He gets the message:
    "There was an error processing a page. Invalid ColorSpace."
    What can he do? He works with Windows.

    Hi,
    I'd like to share the discussion, as I am confronted with the same or a similar problem. Since last Saturday, the PDF attachments I send with my e-mail messages show a blank PDF document upon opening. To make sure it really was the case, I sent a message with a PDF attached to myself: bingo! I have recently updated to Adobe Reader 9.4.1. Using WinXP SP3 and MS Outlook 2003 SP3.
    Any suggestions?

  • Photoshop out of memory error

    Hi, I am running in a mixed environment with an Open Directory master for authentication on a Snow Leopard OS 10.6.8 server, with both Lion and Snow Leopard client machines. Because Adobe products crash with non-mobile accounts, my users are using external hard drives formatted Mac OS Extended (Journaled), with their mobile account set up on that hard drive. This year it has been touch and go on login issues, but we are working through that. My problem is that when a user uses a Lion machine (10.7.4) with Photoshop Extended and then moves to our study lab, which runs Snow Leopard, the Photoshop workspaces are corrupted and they cannot open any photos or settings; they get an "Out of memory" error. However, when they go back to the Lion machine, after resetting their workspace they can use the software without issues. Anyone else having these issues? I've tried chflags nohidden on the Library in Snow Leopard to view settings, and even deleted all Photoshop settings in App data and preferences, and still cannot access Photoshop on the SL machine.
    Thanks

    Thanks Kurt for the reply. I'll give more info. All machines have latest updates for both CS6 and current version of OS either 10.6.8 or 10.7.4.
    The only thing on the external hard drive is the user's home folder and their Data, I have to have permissions enabled so their home preferences and WGM settings work correctly. BTW all accounts have admin rights to machine, I have WGM settings preventing access to machine settings and can re-image if i get corrupted machines.
    PS is installed on each machine not the  External Hard Drive.
    All machines have the same Volume name for the internal boot drive, which is set as the PS scratch drive.
    I thought this issue was to do with the memory and may still be so.
    However when a clean profile is connected to our towers with Lion which has 12 gb ram and 1024 MB Video Memory, the settings are at 70% which is around 8 gb.
    When i take same clean profile to our other lab of iMacs which has 8 gb ram. and 512 mb video memory, PS adjusts the performance ram accordingly which is around 5.5 gb ram at 70%.
    I then take that same external drive to the problem iMac's (early 2009 and 2008) which has 4gb ram 256 video memory and is running Lion, PS adjust to 2.4 gb ram.
    Now i put that same drive on the same model, 2009 or 2008 iMac that has Snow Leopard running on it from same model Lion iMac,PS opens fine.
    However after moving from one of the other larger Lion machines and then back to this Snow Leopard machine, the profile gets corrupted, the workspace is corrupted and cannot reset it. Also I am unable to access any settings I get the "Could not complete the operation because there is not enough memory (RAM)" error.
    Now when I go back to a same model Lion machine with same minimum memory i get the same error, however when I go to a larger Lion machine all I get is Color profile cannot sync and the workspace is still corrupted and not showing but I can reset it.
    I then move the performance size to match that of the lower model’s 70%, I still get the error when I go back to the lower end Lion or Snow Leopard machine.
    I tried clearing PS preferences by launching with Command+Option+Shift and deleting the PS preferences; the issue is still there.
    I then removed all PS settings under ~/Library/, and it is still present.
    I had to re-create the profile altogether to get this to work. As long as I don't connect to a low-end Snow Leopard machine things seem to go well, and PS readjusts according to the machine. Note: when I set the performance level to a low setting, say 2.5 GB as on the early iMacs, and plug into another machine, PS adjusts to the current machine's memory availability and does not keep that lower setting.
    I have a copy of console error message below.
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10aa114d0 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x106d65010 of class NSBundle autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x11db6aa60 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10a98bc40 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371a73e0 of class NSCFData autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371f7320 of class NSCFData autoreleased with no pool in place - just leaking
    I have 12 Snow Leopard machines that are early 2008 and 2009 iMacs with no way of upgrading to Lion, and I am not ready to put Mountain Lion into production until I can upgrade my OD masters to Mountain Lion.
    I suspect it's not the RAM settings that are affected; rather, the video RAM is not adjusting from machine to machine. Is it possible to upgrade the 2009 20" iMac video cards and get proper firmware support?
    any help is appreciated.

  • FRM-40024 out of memory error

    I am trying to upgrade Forms 10g to 11g.
    After installing WebLogic 10.3.2 and Fusion Middleware 11.1.1.2.0, I compiled the old forms to the new version.
    There were no compile errors, and the .fmx file was generated successfully.
    I ran the form on my local computer, and there was no problem.
    So I uploaded the .fmx to the WAS server, and when I opened the form screen from the menu, I got an error:
    "FRM-40024: out of memory error"
    What is the problem?

    Hi user13763783
    The error is direct and straightforward...
    Error Message: FRM-40024: Out of memory.
    Error Cause: Internal error. Your computer does not have enough memory to run the form.
    Action: The designer might be able to modify the form so that it will run.
    If that is not feasible, your installation must make more memory available, either by modifying the operating system parameters or by adding more memory to the computer.
    Level: >25  Type: Error
    Hope this helps...
    Regards,
    Abdetu...

  • API out of memory error

    Hi all,
    We are experiencing problems when running a report many times (20). They are basically notices that we need to send to customers, and we end up getting an out of memory error. I did a search on this forum, and it seems there is a solution, but only for Enterprise:
    XML Publisher 5.0 and higher can make use of a temporary directory in case there is not sufficient memory available. This can be achieved by defining the system-temp-dir property in the XML Publisher configuration file called xdo.cfg (which needs to be created manually). Refer to note 295036.1, Step 9 for detailed instructions. An example of xdo.cfg:
    <property name="system-temp-dir">/u01/oracle/app/ERPcomn/temp</property>
    <property name="xslt-scalable">true</property>
    How can I set this file, or manually set the above two parameters, if we are using the APIs in a web service?
    The code basically runs out of memory when running the second-to-last line of this snippet, which generates part of the report (PDF output):
    // Generate data file based on the Data Template
    DataProcessor dataProcessor = new DataProcessor();
    dataProcessor.setDataTemplate(dataTemplateName);
    dataProcessor.setParameters(inParams);
    dataProcessor.setConnection(conn);
    dataProcessor.setOutput(dataFileName);
    dataProcessor.processData(); // runs out of memory after 20
    generatePDFFromRTF(dataFileName, reportName, disablePrint);
    Any info would be highly appreciated
    Thanks

    You have to upgrade to the latest patch release (5.6.3+).
    There are many enhancements in this version which will help you avoid these common errors, and that would save a lot of time.

  • Acrobat XI Pro "Out of Memory" Error.

    We just received a new Dell T7600 workstation (Win7, 64-bit, 64GB RAM, 8TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection, which included Acrobat X Pro. Each time we open a PDF larger than roughly 4MB, the program returns an "out of memory" error. After running updates and uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. The same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files, and we've paid for the Master Collection, so I'd rather not use a freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder, and you have two choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is really good at letting you know you can't have it all, because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows, or use Linux.

  • Time machine fails backing up with a "backup disk not found" error after upgrade to Lion 10.7.2

    Time Machine fails backing up with a "backup disk not found" error after upgrading to Lion 10.7.2. I've erased and reformatted the drive, reconnected Time Machine (it saw it), and began a new backup. The backup starts again but fails before completing with a "backup drive not found" error. I used the 10.7.2 combo update. Any help on how to fix this problem would be appreciated.

    Arghhhhhhhhhhhhhhhhhhh!!!!!!
    After waiting since October for a fix, I have now upgraded the firmware on the Time Capsule and installed the new Airport Utility, released yesterday, and the situation is even worse.
    Up until now, the Airport Utility had an option to disconnect all drives manually and the Time Capsule would then work until the next reboot – a temporary (?) work-around.
    Now that option does not exist in the new-look Airport Utility. And the Airport Utility installation can’t be rolled back and the old utility won’t restore.
    The sparsebundle is still not accessible. Has anyone worked out a work-around in the new environment yet?
    I have another Time Machine backup working fine to a trusty old LaCie drive, so I erased the one on my Time Capsule to see if that works. I have renamed the Capsule and the Time Capsule drive.  But building another full backup will take at least two days, and I shall be away from tomorrow and am reluctant to leave the Capsule and computer up and running for a week. And I'll bet the sparsebundle will still be nowhere to be found.
    How can Apple screw up so badly when they are to become the richest company in the entire world and – soon – will have more cash in the bank than the entire United States? Can’t they afford someone who really can sort this out? Especially when a Firewire connected hard drive – my trusty and simple LaCie – works fine.
    Words, almost, fail me.

  • Uploading large files from applet to servlet throws out of memory error

    I have a Java applet that needs to upload files from a client machine
    to a web server using a servlet. The problem I am having is that, in
    the current scheme, files larger than 17-20MB throw an out of memory
    error. Is there any way we can get around this problem? I will post
    the client and server side code for reference.
    Client Side Code:
    import java.io.*;
    import java.net.*;

    // This class is a client that enables transfer of files from the client
    // to the server. It connects to a servlet running on the server and
    // transmits the file.
    // (webadminApplet and jbInit are other classes in this applet, not shown.)
    public class fileTransferClient {

        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        // This method transfers the prescribed file to the server.
        // If the destination directory is "", it uses the default incoming directory.
        // 11-21-02 Changes: this method now has a new parameter that references
        // the item that is being transferred in the import list.
        public static String transferFile(String srcFileName, String destFileName,
                                          String destDir, int itemID) {
            if (destDir.equals(""))
                destDir = "E:\\FTP\\incoming\\";
            // Get the fully qualified filename and the mere filename.
            String fqfn = srcFileName;
            String fname = fqfn.substring(fqfn.lastIndexOf(File.separator) + 1);
            try {
                // Create the file to be uploaded and a connection to the servlet.
                File fileToUpload = new File(fqfn);
                long fileSize = fileToUpload.length();
                // The last-modified time of the file is sent to the servlet as a header.
                String strLastMod = String.valueOf(fileToUpload.lastModified());

                URL serverURL = new URL(webadminApplet.strServletURL);
                URLConnection serverCon = serverURL.openConnection();
                // Connection setup.
                serverCon.setDoInput(true);
                serverCon.setDoOutput(true);
                // Don't use a cached version of the URL connection.
                serverCon.setUseCaches(false);
                serverCon.setDefaultUseCaches(false);
                // Set headers and their values.
                serverCon.setRequestProperty("Content-Type", "application/octet-stream");
                serverCon.setRequestProperty("Content-Length",
                                             Long.toString(fileToUpload.length()));
                serverCon.setRequestProperty(FILENAME_HEADER, destDir + destFileName);
                serverCon.setRequestProperty(FILELASTMOD_HEADER, strLastMod);
                if (webadminApplet.DEBUG)
                    System.out.println("Connection with FTP server established");

                // Create a file input stream and an output stream for the file data.
                FileInputStream fis = new FileInputStream(fileToUpload);
                OutputStream os = serverCon.getOutputStream();
                try {
                    // Transfer the file in 4K chunks.
                    byte[] buffer = new byte[4096];
                    long byteCnt = 0;
                    int newPercent = 0;
                    int oldPercent = 0;
                    while (true) {
                        int bytes = fis.read(buffer);
                        if (bytes < 0) break;
                        byteCnt += bytes;
                        // 11-21-02: if itemID is greater than -1 this is an import
                        // file transfer; otherwise it is a header graphic transfer.
                        if (itemID > -1) {
                            newPercent = (int) ((double) byteCnt / (double) fileSize * 100.0);
                            int diff = newPercent - oldPercent;
                            if (newPercent == 0 || diff >= 20) {
                                oldPercent = newPercent;
                                jbInit.getImportTable().displayFileTransferStatus(itemID,
                                                                                  newPercent);
                            }
                        }
                        os.write(buffer, 0, bytes);
                    }
                    os.flush();
                    if (webadminApplet.DEBUG)
                        System.out.println("No of bytes sent: " + byteCnt);
                } finally {
                    // Close the related streams.
                    os.close();
                    fis.close();
                }
                if (webadminApplet.DEBUG)
                    System.out.println("File transmission complete");

                // Find out what the servlet has to say in response.
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(serverCon.getInputStream()));
                try {
                    String line;
                    while ((line = reader.readLine()) != null)
                        if (webadminApplet.DEBUG) System.out.println(line);
                } finally {
                    // Close the reader stream from the servlet.
                    reader.close();
                }
            } // end of the big try block.
            catch (Exception e) {
                System.out.println("Exception during file transfer:\n" + e);
                e.printStackTrace();
                return "FTP failed. See Java Console for errors.";
            }
            return "File: " + fname + " successfully transferred.";
        } // end of method transferFile()
    } // end of class fileTransferClient
    Server side code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.util.*;
    import java.net.*;

    // This servlet class acts as an FTP server to enable transfer of files
    // from the client side.
    public class FtpServerServlet extends HttpServlet {
        String ftpDir = "D:\\pub\\FTP\\";
        private static final String FILENAME_HEADER = "fileName";
        private static final String FILELASTMOD_HEADER = "fileLastMod";

        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            doPost(req, resp);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // ### For now, enable overwrite by default.
            boolean overwrite = true;
            // Get the filename for this transmission.
            String fileName = req.getHeader(FILENAME_HEADER);
            // Also get the last-modified time of this file.
            String strLastMod = req.getHeader(FILELASTMOD_HEADER);
            String message = "Filename: " + fileName + " saved successfully.";
            int status = HttpServletResponse.SC_OK;
            System.out.println("fileName from client: " + fileName);
            // If the filename is not specified, complain.
            if (fileName == null) {
                message = "Filename not specified";
                status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
            } else {
                // Open the file about to be transferred.
                File uploadedFile = new File(fileName);
                // Check whether the file already exists - and overwrite if necessary.
                if (uploadedFile.exists() && overwrite)
                    uploadedFile.delete();
                // Ensure the directory is writable and a new file can be created.
                if (!uploadedFile.createNewFile()) {
                    message = "Unable to create file on server. FTP failed.";
                    status = HttpServletResponse.SC_INTERNAL_SERVER_ERROR;
                } else {
                    // Get the necessary streams for file creation.
                    FileOutputStream fos = new FileOutputStream(uploadedFile);
                    InputStream is = req.getInputStream();
                    try {
                        // Read from the request stream and write to the file in 4K chunks.
                        byte[] buffer = new byte[4096];
                        int byteCnt = 0;
                        while (true) {
                            int bytes = is.read(buffer);
                            if (bytes < 0) break;
                            byteCnt += bytes;
                            fos.write(buffer, 0, bytes);
                        }
                        // Flush the stream.
                        fos.flush();
                    } // end of try block.
                    finally {
                        is.close();
                        fos.close();
                        // Set the last-modified date for this file.
                        uploadedFile.setLastModified(Long.parseLong(strLastMod));
                    } // end of finally block.
                } // end - the new file may be created on server.
            } // end - we have a valid filename.
            // Set the response headers.
            resp.setContentType("text/plain");
            resp.setStatus(status);
            if (status != HttpServletResponse.SC_OK)
                getServletContext().log("ERROR: " + message);
            // Get the output stream and send the message.
            PrintWriter out = resp.getWriter();
            out.println(message);
        } // end of doPost().
    } // end of class FtpServerServlet

    OK - the problem you describe is definitely what's giving you grief.
    The workaround is to open a socket connection yourself and send your own request headers, with the content length filled in. You may also have to multipart MIME-encode the stream on its way out (I'm not sure about that...).
    You can use the following:
    http://porsche.cis.udel.edu:8080/cis479/lectures/slides-04/slide-02.html
    on your server to get a feel for the format the request headers need to take.
    - Kevin
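
    The raw-socket approach Kevin describes could look something like the sketch below. It writes a minimal HTTP POST by hand with the Content-Length header filled in from the file size, then streams the file to the socket in 4K chunks so nothing is buffered in memory. The host, port, and servlet path are placeholders, and it uses HTTP/1.0 to keep the exchange simple; a real client would also need to parse the status line of the response.

    ```java
    import java.io.*;
    import java.net.Socket;

    public class RawSocketUpload {
        // Build the raw HTTP request headers. Because we know the file size up
        // front, Content-Length can be set exactly - this is what URLConnection
        // could not do without buffering the whole body first.
        static String buildHeaders(String host, String path, long contentLength) {
            return "POST " + path + " HTTP/1.0\r\n"
                 + "Host: " + host + "\r\n"
                 + "Content-Type: application/octet-stream\r\n"
                 + "Content-Length: " + contentLength + "\r\n"
                 + "\r\n";
        }

        static void upload(String host, int port, String path, File file)
                throws IOException {
            try (Socket sock = new Socket(host, port);
                 OutputStream os = sock.getOutputStream();
                 FileInputStream fis = new FileInputStream(file)) {
                // Headers first, then the body streamed in 4K chunks.
                os.write(buildHeaders(host, path, file.length()).getBytes("ISO-8859-1"));
                byte[] buf = new byte[4096];
                int n;
                while ((n = fis.read(buf)) >= 0)
                    os.write(buf, 0, n);   // each chunk goes straight to the socket
                os.flush();
            }
        }
    }
    ```

    Because the socket's output stream is written directly, memory use stays bounded by the 4K buffer regardless of file size.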
    I get the Out of Memory error on the client side. I was told that this might be a bug in the URLConnection class implementation: it doesn't know the content length until all the data has been written to the output stream, so it stores the data in an in-memory buffer, which causes the memory issues. Do you think there might be a workaround of any kind, or maybe a way the buffer could be flushed after a certain amount of the file has been uploaded? Do you have any ideas?
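
    For what it's worth, later JDKs added a direct fix for exactly this buffering behavior: HttpURLConnection.setFixedLengthStreamingMode (and setChunkedStreamingMode for unknown lengths) tells the connection the body length up front, so it streams the body to the server instead of buffering it all in memory. A minimal sketch, assuming the same servlet endpoint as the code above (the URL is a placeholder):

    ```java
    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingUpload {
        public static void upload(File file, String servletUrl) throws IOException {
            HttpURLConnection con =
                (HttpURLConnection) new URL(servletUrl).openConnection();
            con.setRequestMethod("POST");
            con.setDoOutput(true);
            // Declare the body length up front so the connection streams it
            // directly instead of buffering the whole file in memory.
            // (The long overload requires Java 7+; the int overload is Java 5+.)
            con.setFixedLengthStreamingMode(file.length());
            con.setRequestProperty("Content-Type", "application/octet-stream");
            try (FileInputStream fis = new FileInputStream(file);
                 OutputStream os = con.getOutputStream()) {
                byte[] buffer = new byte[4096];
                int n;
                while ((n = fis.read(buffer)) >= 0)
                    os.write(buffer, 0, n);
            }
            // Reading the response code completes the exchange.
            int status = con.getResponseCode();
            if (status != HttpURLConnection.HTTP_OK)
                throw new IOException("Upload failed with HTTP status " + status);
        }
    }
    ```

    With streaming mode enabled, writing more bytes than the declared length throws an exception immediately, so the declared size must match the file exactly.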

  • Photoshop CS6 has out of memory errors under Mac OS X 10.7.4

    Has anyone who has a 2008 MacBook Pro with the NVIDIA GeForce 8600M GT 512 MB graphics card tried running Photoshop CS6 under Mac OS X 10.7.4?
    On my MacBook Pro, PS CS6 launches; however, any action results in an out of memory error.  The CS6 versions of After Effects, Premiere Pro and Illustrator all launch and run as expected under 10.7.4.
    If I switch to my Mac OS X 10.6.8 startup volume, Photoshop CS6 runs as expected.
    I have support cases open with Adobe, NVidia and Apple.  Adobe thinks I need to update the display drivers.  NVidia says display drivers must be provided by Apple.  Apple says that the display drivers are current.  The Apple support agent said that he can escalate the case if the issue can be reproduced on more than one computer.
    I'd love to keep my MacBook Pro running another year or so without having to reboot in 10.6.8 every time I need to use Photoshop.  I'd stay with 10.6.8, but of course iCloud requires 10.7.4.
    I have not tried 10.8 yet, but it's on my troubleshooting list.
    Also, PS CS6 runs fine under Mac OS X 10.7.4 on my 2010 iMac.
    Thanks in advance for any feedback.
    - Warren

    It looks like it was indeed the display driver for the NVIDIA GeForce 8600 GT 512 MB graphics card inside my 2008 MacBook Pro.
    It seems that this driver only gets installed if you upgrade from Mac OS X 10.6.7 to 10.7.4 while the startup drive is connected to the MacBook Pro itself.  I had updated while the startup drive was connected to my 2010 iMac. Accordingly, I could have reinstalled Photoshop 100 times without it ever launching as expected.  Having used the external startup drive with three other Apple computers (all 2010 or newer machines) with Photoshop CS6 launching as expected, I had just assumed (mistakenly) that it would work as expected with my 2008 computer.
    So, this was simple enough to resolve by upgrading the OS on the external startup drive while it was connected to the MacBook Pro. In hindsight, this makes perfect sense.
    -Warren
