Will 64-bit Office fix an out of memory error?

I've been troubleshooting an out of memory error in Excel 2010 for some time. I've read quite a few articles on forums and on MS sites (including here.) I find many hits but none seems to offer a solution that works. One idea that seems to show
up often is that 64-bit Office versions will have a LOT MORE memory to work with than 32-bit versions. I'm running a 64-bit version of Windows 7, so I'm considering giving Office 2010 64-bit a try. However, I also find a lot of caveats in those articles that
concern me. On the other hand most of those articles are 2 to 3 years old, so I'm wondering if most of those issues have been dealt with. For example, VBA issues; third-party add-ins that do not (did not) support 64-bit; ActiveX and COM issues.
Sorry to be overly verbose. My questions pretty much come down to this:
1. Is 64-bit likely to solve my out of memory problem?
2. What issues are still unresolved in 64-bit Excel with Windows 7?
TIA,
Phil

If you are an Excel power user working with huge amounts of data, then you would benefit from 64-bit Office's ability to use more memory.
MS recommends 32-bit Office as the default installation on both 32-bit and 64-bit Windows, mainly because of compatibility with existing 32-bit controls, add-ins, and VBA.
This is not really something that can be resolved on the Office side; it depends on whether you are using any 32-bit controls, add-ins, or VBA. Those existing 32-bit controls, add-ins, and VBA need an update to adapt to 64-bit Office.
If all of your controls, add-ins, and VBA are 64-bit, or are designed to work with 64-bit Office, then you are good.
Bhasker Timilsina (ManTechs Inc)

Similar Messages

  • Getting out of memory errors in Indesign 5.5. What can I do to fix it?

    Getting out of memory errors in Indesign 5.5. What can I do to fix it?

    Tell your dumb friend to pay you for a new phone, as he damaged it. You cannot get help here for a phone that has been taken apart, as it is not user serviceable. Your dumb friend also voided your warranty and, even if the warranty were expired, Apple will never touch that phone.
    Time to get smarter friends.

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is really good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.

  • Out of memory error - from parsing a "fixed width file"

    This may be fairly simple for someone out there, but I am trying to write a simple program that can go through a "fixed width" flat txt file and parse it to be comma-delimited.
    I use an XML file with data dictionary specifications to do the work. I do this because there are over 430 fields that need to be parsed from a fixed-width file with close to 250,000 lines. I can read the XML file fine to get the width dimensions, but when I try to apply the parsing instructions, I get an out of memory error.
    I am hoping it is an error with the code and not the large files. If it is the latter, does anyone out there know some techniques for getting at this data?
    Here is the code:
       import java.io.*;
       import org.w3c.dom.Document;
       import org.w3c.dom.*;
       import javax.xml.parsers.DocumentBuilderFactory;
       import javax.xml.parsers.DocumentBuilder;
       import org.xml.sax.SAXException;
       import org.xml.sax.SAXParseException;

       public class FixedWidthConverter{
          String[] fieldNameArray;
          String[] fieldTypeArray;
          String[] fieldSizeArray;

          public static void main(String args []){
             FixedWidthConverter fwc = new FixedWidthConverter();
             fwc.go();
             fwc.loadFixedWidthFile();
             //System.exit (0);
          }//end of main

          public void go(){
             try {
                DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
                DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
                Document doc = docBuilder.parse(new File("files/dic.xml"));
                //normalize text representation
                doc.getDocumentElement().normalize();
                System.out.println("Root element of the doc is " +
                     doc.getDocumentElement().getNodeName());
                NodeList listOfFields = doc.getElementsByTagName("FIELD");
                int totalFields = listOfFields.getLength();
                System.out.println("Total no of fields : " + totalFields);
                String[] fldNameArray = new String[totalFields];
                String[] fldTypeArray = new String[totalFields];
                String[] fldSizeArray = new String[totalFields];
                for(int s = 0; s < listOfFields.getLength(); s++){
                   Node firstFieldNode = listOfFields.item(s);
                   if(firstFieldNode.getNodeType() == Node.ELEMENT_NODE){
                      Element firstFieldElement = (Element)firstFieldNode;
                      NodeList firstFieldNMList = firstFieldElement.getElementsByTagName("FIELD_NM");
                      Element firstFieldNMElement = (Element)firstFieldNMList.item(0);
                      NodeList textFNList = firstFieldNMElement.getChildNodes();
                      //System.out.println("Field Name : " + ((Node)textFNList.item(0)).getNodeValue().trim());
                      //loads values into an array
                      //fldNameArray[s] = ((Node)textFNList.item(0)).getNodeValue().trim();
                      NodeList typeList = firstFieldElement.getElementsByTagName("TYPE");
                      Element typeElement = (Element)typeList.item(0);
                      NodeList textTypList = typeElement.getChildNodes();
                      //System.out.println("Field Type : " + ((Node)textTypList.item(0)).getNodeValue().trim());
                      //loads values into an array
                      //fldTypeArray[s] = ((Node)textTypList.item(0)).getNodeValue().trim();
                      NodeList sizeList = firstFieldElement.getElementsByTagName("SIZE");
                      Element sizeElement = (Element)sizeList.item(0);
                      NodeList textSizeList = sizeElement.getChildNodes();
                      //System.out.println("Field Size : " + ((Node)textSizeList.item(0)).getNodeValue().trim());
                      //loads the field widths into an array
                      fldSizeArray[s] = ((Node)textSizeList.item(0)).getNodeValue().trim();
                   }//end of if clause
                }//end of for loop with s var
                //setFldNameArray(fldNameArray);
                //setFldTypeArray(fldTypeArray);
                setFldSizeArray(fldSizeArray);
             }
             catch (SAXParseException err) {
                System.out.println("** Parsing error" + ", line "
                   + err.getLineNumber() + ", uri " + err.getSystemId());
                System.out.println(" " + err.getMessage());
             }
             catch (SAXException e) {
                Exception x = e.getException();
                ((x == null) ? e : x).printStackTrace();
             }
             catch (Throwable t) {
                t.printStackTrace();
             }
          }//end go()
           public void setFldNameArray(String[] s){
             fieldNameArray = s;
          }//end setFldNameArray
           public void setFldTypeArray(String[] s){
             fieldTypeArray = s;
          }//end setFldTypeArray
           public void setFldSizeArray(String[] s){
             fieldSizeArray = s;
          }//end setFldSizeArray
           public String[] getFldNameArray(){
             return fieldNameArray;
          }//end setFldNameArray
           public String[] getFldTypeArray(){
             return fieldTypeArray;
          }//end setFldTypeArray
           public String[] getFldSizeArray(){
             return fieldSizeArray;
          }//end setFldSizeArray 
          public int getNumLines(){
             int countLines = 0;
             try {
                //File must be in same directory and be the name of the string below
                BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
                String str;
                while ((str = in.readLine()) != null) {
                   countLines++;
                }
                in.close();
             }
             catch (IOException e) {}
             return countLines;
          }//end of getNumLines
          public void loadFixedWidthFile(){
             int c = getNumLines();
             int i = 0;
             String[] lineProcessed = new String[c];
             try {
                //File must be in same directory and be the name of the string below
                BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
                String str;
                while ((str = in.readLine()) != null) {
                   //System.out.println(str.length());
                   lineProcessed[i] = parseThatLine(str);
                   i++;
                }
                in.close();
             }
             catch (IOException e) {}
             //write out the lineProcessed[] array to another file
             writeThatFile(lineProcessed);
          }//end loadFixedWidthFile()

          public void writeThatFile(String[] s){
             try {
                BufferedWriter out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
                for(int i = 0; i < s.length - 1; i++){
                   out.write(s[i]); // write each parsed line (was out.write(s), which does not compile)
                }//end for loop
                out.close();
             }
             catch (IOException e) {}
          }//end writeThatFile
          public String parseThatLine(String s){
             int start = 0;
             int end = 0;
             String parsedLine = "";
             int numChars = getFldSizeArray().length;
             //Print number of fields for testing
             //System.out.println(numChars);
             String[] oArray = getFldSizeArray();
             //String chars = oArray[0];
             //System.out.println(chars.length());
             for(int i = 0; i < numChars - 1; i++){
                if(i == 0){
                   start = 0;
                   end = end + Integer.parseInt(oArray[i]) - 1;
                }
                else {
                   start = end;
                   end = end + Integer.parseInt(oArray[i]);
                }
                parsedLine = parsedLine + s.substring(start, end) + "~";
             }//end for loop
             return parsedLine;
          }//End of parseThatLine
       }//end of class FixedWidthConverter
    I have tried to eliminate as many arrays as I can, thinking that was chewing up the memory, but to no avail.
    Any thoughts or ideas?
    Message was edited by:
    SaipanMan2005

    You should not keep a String array of all the lines of the file read.
    Instead, for each line read, parse it, then write the parsed line to the other file:

       public void loadFixedWidthFile() {
          BufferedReader in = null;
          BufferedWriter out = null;
          try {
             //File must be in same directory and be the name of the string below
             in = new BufferedReader(new FileReader("files/FLAT.txt"));
             out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
             String str;
             while ((str = in.readLine()) != null) {
                //System.out.println(str.length());
                str = parseThatLine(str);
                //write out the parsed str to another file
                out.write(str);
             }
          }
          catch (IOException e) {
             e.printStackTrace(); // At least print the exception - never swallow an exception
          }
          finally { // Use a finally block to be sure of closing the files even when an exception occurs
             try { in.close(); }
             catch (Exception e) {}
             try { out.close(); }
             catch (Exception e) {}
          }
       }//end loadFixedWidthFile()

    Regards

  • "Out of memory" error using SmartView v11.1.1.3.500, MS Excel 2007 & MS Win7 Prof SP1 (all 32-bit)

    Hi All,
    A user is regularly experiencing "Out of memory" error messages while retrieving large MS Excel 2007 worksheets (ad-hoc analysis; approx 700 rows by 13 columns) connected to a Planning cube via 32-bit SmartView v11.1.1.3.500 (Build 008) on a 32-bit MS Windows 7 Prof (SP1) computer with 4GB of RAM. The same user is reporting experiencing a number of other issues (eg, TCP-related time-out, unable to connect to the APS, SmartView add-in disappearing, etc) at the same time.
    I could not locate any specific KB document from the My Oracle Support website which addressed these specific issues all at once but from various posts out there, the recommendations to address similar issues were as follows:
    Tick the Options > Display > Reduce Excel file size option;
    Tick the Options > Display > Improve metadata storage option;
    Rebuild the MS Excel workbook from scratch;
    Delete all temp files located in the C:\Users\USER NAME\AppData\Local\Temp directory;
    Disable auto-recovery for MS Excel;
    Add misc TCP-related registry entries (eg, TcpTimedWaitDelay, MaxUserPort, MaxFreeTcbs, etc) on both the client PC and server;
    Adjust MS Windows for best performance;
    Increase the page file by 25%-50% more than the physical amount of RAM installed on the client PC;
    Relocate the page file to a different drive as compared to the drive where MS Windows is installed on the client PC;
    On top of the above, are there any other recommendations anyone else would like to share to address the "Out of memory" issue?
    Many thanks in advance,
    JBM

    Monitor the Full GCs in the GC log and see if there is a gradual increase in the heap retained after every Full GC over a period. If there is, and it is approaching the maximum specified heap over time, there may be a slow leak from the application or from native libraries.
    Also, please check whether you have any pattern or request which might be triggering the OOM all of a sudden.
    If it is a memory leak, the best way to investigate is to capture JRA recordings at regular intervals and monitor the top 10 objects to see which one is growing and consuming a greater percentage of heap over time.
    You can also capture this with the print_object_summary and print_memusage options of the jrcmd command over a period of time.
    Hope this helps.
    - Tarun
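    A minimal, tool-agnostic sketch of the "watch retained heap over time" idea (my own illustration, not from the original reply; the class name and interval are arbitrary): run something like this in a spare thread inside the suspected JVM and see whether the used-after-GC figure keeps climbing toward -Xmx.

       // Illustrative heap-trend logger: a steadily growing "used after GC" value over
       // hours points to a slow leak rather than a one-off spike.
       public class HeapTrendLogger implements Runnable {
          private final long intervalMillis;

          public HeapTrendLogger(long intervalMillis) {
             this.intervalMillis = intervalMillis;
          }

          public void run() {
             Runtime rt = Runtime.getRuntime();
             while (!Thread.currentThread().isInterrupted()) {
                System.gc(); // request a full GC so the reading approximates retained (live) heap
                long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
                long maxMb = rt.maxMemory() / (1024 * 1024);
                System.out.println(new java.util.Date() + " used after GC: " + usedMb
                      + " MB of " + maxMb + " MB max");
                try {
                   Thread.sleep(intervalMillis);
                } catch (InterruptedException e) {
                   Thread.currentThread().interrupt();
                }
             }
          }

          public static void main(String[] args) {
             new Thread(new HeapTrendLogger(60000L)).start(); // log once a minute
          }
       }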

  • "out of memory error" will more memory help?

    I seem to be reading mixed reviews. I have an employee doing a really simple editing task with FCP 4.5 on an iMac we have that is 1.8 GHz with 512 MB RAM. She keeps getting "Out of Memory" errors in FCP 4.5. So we're buying 2 GB of RAM. Now I'm reading in other forums that the error may still continue. Any thoughts?

    Aside from the lack of ram - 512 MB is really the minimum for the OS - out of memory messages can also come about when using CMYK color space graphics in FCP. All images should be in the RGB color space.
    This usually doesn't appear as an issue unless you are using images that were intended for offset printing or someone inadvertently changed the color space from RGB.
    Good luck.
    x

  • Microsoft office showing out of memory

    Microsoft Office for Mac is displaying an out of memory error
    when trying to load Word.
    I have 2 GB of memory and have emptied the Trash
    and rebooted. Any suggestions?
    Thanks

    Apparently this problem can occur if the preference setting Word > Preferences > Save > Save AutoRecovery Info is set to an interval shorter than 10 minutes...
    Bob

  • Out of Memory error after upgrading to Reader X

    We have several PDF documents that work fine on Reader 9.  But after upgrading to Reader X, the files will not open.  Instead, they report an Out Of Memory error and a blank document opens instead.  Removing Reader X and re-installing Reader 9 corrects the issue. This has been reproduced on 3 different PCs running both Windows XP and Windows 7.
    Any suggestions?

    Just to throw in my 2 cents... Adobe has known about the out of memory bug at least since 01/12/2011, because that is when I reported the problem to them.  I even had an escalated support case #181995624 and bug #2800823.  Our problem comes from an EFI Fiery E7000 attached to our Lanier LD460c copier.  Any PDFs made from that copier will not open in Acrobat X, although they will open in lower versions of Acrobat just fine.  Our only workaround is to keep Acrobat 9 on office computers, or you can open the offending PDF in Acrobat 9 or earlier and print it to PDF, and then it will open in Acrobat X!!!  They acknowledged that this was a bug; see my email chain below.  This was the last update I received from Adobe, very frustrating...
    From:
    Sent: Wednesday, February 09, 2011 9:12 AM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Hi Phil,
    We do not have this information or estimate time by our Engineering team yet.
    Regards,
    Neeraj
    From:
    Sent: Monday, February 07, 2011 8:19 PM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Next major release as in the next patch for Acrobat X, or as in Acrobat 11?
    From:
    Sent: Saturday, February 05, 2011 4:31 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    You can get back to us with the Bug Number provided you earlier based on which we will give you the update by our Engineering team. However, the update for now is that it is decided to fix in our next major release.
    Regards,
    Neeraj
    From:
    Sent: Thursday, February 03, 2011 1:33 AM
    To:
    Subject: RE: Case 181995624; Unable to open PDF
    Can you send me a link to where I can find information on the bug?
    From:
    Sent: Tuesday, February 01, 2011 10:14 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    Hope you are doing well.
    I have researched on your issue and found that it is a BUG with Acrobat X. I have logged it on your behalf so that our Engineering team can have a look on that. Please find the Bug Number #2800823 for any update on this in future. I am closing this case on your behalf.
    Have a nice day.
    Regards,
    Neeraj
    From:
    Sent: Tuesday, February 01, 2011 12:22 AM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Any updates on this case?
    From:
    Sent: Friday, January 14, 2011 2:03 PM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    The EFI Fiery E-7000 Controller version is 1.2.0 and it handles the scanning and printing functionality of our Lanier LD160c copier.  I have attached two sample files.  One is a 1 page scan from our copier.  The other is a combined pdf that I just created in Acrobat 9.  The first two pages of the combined pdf consists of a webpage that I printed using Acrobat 9 and then the scan from the copier is on the 3rd page.  In Acrobat X, once you get to the 3rd page you will receive the Out of Memory error.  It will open in previous versions of Acrobat just fine though.
    From:
    Sent: Friday, January 14, 2011 11:52 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    Thanks for the information.
    I have the PDF file provided by you and able to reproduce the behavior. I tried to call you at 214-303-1500 but got voice mail.
    Please let me know when you will be available so that I could call you and continue further with this issue.
    Regards,
    Neeraj
    From:
    Sent: Thursday, January 13, 2011 6:57 AM
    To:
    Cc:
    Subject: Re: Case 181995624; Unable to open PDF
    It is a walk up copier and we scan to email.  The EFI Fiery controller E7000 handles pdf conversion for the copier, but yes it has the latest firmware.  The bottom line is that we have 3 or 4 years worth of pdfs created from that copier rendered useless by Acrobat X.  They open fine in previous versions of Acrobat.  Did you get the test pdf file when this case was created?
    -- Sent from my Palm Pre
    On Jan 12, 2011 6:12 PM, Acrobat Support <[email protected]> wrote:
    Hi Philip,
    Thank you for choosing Adobe Support we have got your concern, we see that you are facing an issue with opening PDF files in Acrobat X, created from Lanier (Ricoh) LD160c copier. A technical support case 181995624 has been created for your reference. In order to assist you further. We would like to have the following information.
    ·         Are you using the latest scanner driver ?
    ·         What is the exact scanning workflow ?
    Regards
    Acrobat Support

  • Out of Memory error on some PDF's, not all PDF's.

    Hi there,
    I have read most of the posts in the forums that relate to 'Out of Memory' issues with PDF's and I have to say that there is still no solution that I have found.
    I have tried reinstalling Adobe Reader, Flash Player and tried clearing my Temp Files. None of these fixed the issue.
    The PDF's that receive this memory error are downloaded off a CMS website that has many offices and logins. Only one office is experiencing this out of memory issue so we know that it is not an issue with generating the PDF's on the CMS website, otherwise all the offices and logins would have this issue, since they use the same system.
    I am an admin of that CMS website and even I receive the same out of memory issue when I try and view the PDF's from that office, as the customer does.
    When we open these PDF's with another PDF reading program, the PDF's open fine. So that tells me that the issue lies with Adobe Reader and not the PDF file itself.
    These PDF files that are receiving the error are about 1 MB-4 MB, so they are not large.
    Could you please tell me the solution to this out of memory error as we may lose a customer because your product is producing an error that does not seem to have been fixed after years of being reported by your customers.
    Thanks.

    As an Adobe Reader XI user, if I may put my two cents in, that "Out of Memory" problem is not with the Adobe Reader XI application, which was installed on our computers! I think, the problem is with how the PDF documents, which we were trying to open, were generated (or regenerated).
    I remember I was able to open my old credit card statement online without any problem and now I am not able to open the same old statement (out of memory) because it could have been regenerated! In fact, I could not open any of my credit card statements (old or new) on this specific credit card web site, which makes me believe those statements were regenerated or something with some new Adobe software.
    As others mentioned, even I'm able to view other PDF documents without any problem.

  • Thread: Out of memory error

    I'm running a program that reads a file of more than 20k lines. Each line of this file is processed by a thread. Sometimes the program halts with an out of memory error.
    What can I do to solve this problem? Is it better to change the logic of this program and, instead of starting a new thread for each line, use a few threads that are each responsible for processing multiple lines?
    The current Java version is 1.4.2.13. Can upgrading it to the most recent version solve the problem?
    I'm running it with the following configuration:
    "-Xms1024m -Xmx2048m -XX urvivorRatio=10"
    Thanks a lot,
    Marcos.

    Indeed this design is flawed. It is an old system that came into my hands and now I have to fix some bugs in it.
    What is the difference between starting thousands of threads and having this backport of concurrent take care of it? I mean, isn't the JVM responsible for managing both cases?
    Using the backport of concurrent and defining a limited-size thread pool, what will happen if the number of lines in the file is greater than the size of this pool?
    Do you have an example, or a link to where I can get a good example, of how to use it?
    Thanks for helping!
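    For what it's worth, here is a minimal sketch of the thread-pool approach using the standard java.util.concurrent API (which the concurrent backport mirrors under a different package name); the file name, pool size, and process() body are placeholders. If the file has more lines than the pool has threads, the surplus lines simply wait as queued tasks instead of each getting its own thread and stack.

       import java.io.BufferedReader;
       import java.io.FileReader;
       import java.io.IOException;
       import java.util.concurrent.ExecutorService;
       import java.util.concurrent.Executors;
       import java.util.concurrent.TimeUnit;

       // Illustrative sketch: process a large file with a bounded number of worker threads
       // instead of spawning one thread per line.
       public class LineProcessor {

          public static void main(String[] args) throws IOException, InterruptedException {
             ExecutorService pool = Executors.newFixedThreadPool(8); // 8 workers; tune to your hardware

             BufferedReader in = new BufferedReader(new FileReader("lines.txt")); // placeholder file name
             try {
                String line;
                while ((line = in.readLine()) != null) {
                   final String current = line;
                   // Each line becomes a small task; extra tasks wait in the pool's queue,
                   // so at most 8 worker threads are ever alive.
                   pool.execute(new Runnable() {
                      public void run() {
                         process(current);
                      }
                   });
                }
             } finally {
                in.close();
             }

             pool.shutdown();                          // stop accepting new tasks
             pool.awaitTermination(1, TimeUnit.HOURS); // wait for queued tasks to finish
          }

          private static void process(String line) {
             // placeholder for the real per-line work
             System.out.println("processed " + line.length() + " chars");
          }
       }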

  • Why do I get a Track out of memory error while running open loop frequency response?

    MatrixX Build 61mx1411: I get a "Track out of memory" error when I run the Open Loop Frequency Response from the MatrixX pull down tools. What can I do to prevent this? We are running on an HP B1000 with 768 MB of RAM under HP-UX 10.2.

    In the old days of Mx say Version 5 and prior the user actually selected the amount of memory that would be allocated. Depending on the size of the model etc. you would have to allocate memory. In version 6.0 and going forward there is no need for the user to manually allocate the memory.
    Build {rstack=50000,istack=200000,sstack=50000,cstack=500000}
    If this is a command in a script file that you are running and the error is resulting from that then I would try commenting out everything after the letter d in the word build and then starting it back up.
    i.e. only use Build
    I don't believe that there is a way to manually allocate the initial SystemBuild Stack size.
    I believe initially the stack size is set to 10010.
    However, one way you can manually set the initial SystemBuild stack size is to create a large StateSpace as soon as you start up SystemBuild. This will prevent piecemeal reallocs while using SystemBuild.
    You can create a new SuperBlock in SystemBuild, drop down a StateSpace Block with 199 inputs, 199 outputs, and 1 state, and enter ones(200,200) as the StateSpace Matrix without any problems. This would resize this internal stack to at least 40000.
    You really should not have to do this, but if it helps then you might think about doing it in your startup.ms file: you could use SBA or load the file, then delete the SuperBlock and begin working.
    "Bob" gave me this little tidbit.
    Please let me know if any of this is of use.
    Garrett
    Garrett Thurston
    [email protected]
    Phone: 781.993.5540

  • Indesign cs5 'out of memory' error when using preflight

    I have been regularly getting an 'out of memory' error when I choose to use my bespoke preflight profile.
    I have 4 GB of RAM and run InDesign CS5 on OS X 10.6.8.
    Does anyone know a workaround?
    As soon as I select from the basic default profile, I get the beach ball from hell for 10 minutes, then it kindly lets me know that I am out of memory, sends a crash report to Adobe and then asks if I want to relaunch. I'm stuck in a vicious circle. I must have sent my 4th crash report by now and have had no feedback from anyone at Adobe.

    I have replaced my preferences, but still the problem persists. I have tried switching my view from typical display to fast display before I selected a profile. I thought this might give me the extra memory I needed to avoid the inevitable crash. I learnt that 2 files were indeed RGB instead of CMYK before it crashed again. So I switched them to CMYK and tried again, selected my bespoke profile, but yet again it crashed. I think the problem lies with the file, not InDesign, as I have tried the same profile on a different file and the program doesn't crash and runs as it should. So if in future I need to use said crashing file again, firstly I will need to try Peter's isolate fix method. Otherwise I'll never be able to progress to a successful PDF.

  • Thread Count and Out Of Memory Error

    Hi,
    I was wondering if setting the ThreadPoolSize to a value which is too low can
    cause an out of memory error. My thought is that when it is time for Weblogic
    to begin garbage collection, if it does not get a thread fast enough it is possible
    that memory would be used up before the garbage collection takes place.
    I am asking this because I am trying to track down the cause of an out-of-memory
    occurrence, while at the same time I believe I need to raise the ThreadPoolSize.
    Thanks,
    Mark

    Oops ...

    > I was wondering if setting the ThreadPoolSize to a value which is too low can cause an out of memory error.

    No, but the opposite can be true.

    > My thought is that when it is time for Weblogic to begin garbage collection, if it does not get a thread fast enough it is possible that memory would be used up before the garbage collection takes place.

    Weblogic doesn't do GC ... that's the JVM, and if it needs a thread it will not be using one of Weblogic's execute threads.

    > I am asking this because I am trying to track down the cause of an out-of-memory occurrence

    It could be configuration (new vs. old heap for example), but it is probably just data that you are holding on to or native stuff (e.g. type 2 JDBC driver objects) that you aren't cleaning up correctly.

    > while at the same time I believe I need to raise the ThreadPoolSize.

    Wait until you fix the memory issue.
    Peace,
    Cameron Purdy
    Tangosol, Inc.
    Clustering Weblogic? You're either using Coherence, or you should be!
    Download a Tangosol Coherence eval today at http://www.tangosol.com/
    "Mark Glatzer" <[email protected]> wrote in message
    news:[email protected]..
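    To illustrate the "not cleaning up correctly" point, here is a minimal, hypothetical JDBC sketch (class name, query, and DataSource wiring are made up): driver-side objects, especially with native type 2 drivers, are only released once the ResultSet, Statement, and Connection are explicitly closed, so closing them in a finally block keeps them from accumulating until memory (or a native resource) runs out.

       import java.sql.Connection;
       import java.sql.ResultSet;
       import java.sql.SQLException;
       import java.sql.Statement;
       import javax.sql.DataSource;

       // Hypothetical sketch: always release JDBC resources in finally so that
       // driver-side (possibly native) objects are not retained between requests.
       public class CustomerDao {

          private final DataSource dataSource; // e.g. obtained from a JNDI lookup in the app server

          public CustomerDao(DataSource dataSource) {
             this.dataSource = dataSource;
          }

          public int countCustomers() throws SQLException {
             Connection con = null;
             Statement stmt = null;
             ResultSet rs = null;
             try {
                con = dataSource.getConnection();
                stmt = con.createStatement();
                rs = stmt.executeQuery("SELECT COUNT(*) FROM customer"); // placeholder query
                rs.next();
                return rs.getInt(1);
             } finally {
                // Close in reverse order; swallow close failures so the original exception wins.
                if (rs != null)   try { rs.close(); }   catch (SQLException ignored) {}
                if (stmt != null) try { stmt.close(); } catch (SQLException ignored) {}
                if (con != null)  try { con.close(); }  catch (SQLException ignored) {}
             }
          }
       }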

  • InDesign - out of memory error

    Hi All,
    One of our clients is experiencing an issue whilst using InDesign CS4 version 6.0.5; they are constantly interrupted by an “Out of Memory” error while using InDesign.
    They have been experiencing this issue for some time now and we have performed memory upgrades and also upgraded to version 6.0.5 (which was meant to solve the issue) several months ago and yet they are still getting this issue.
    The issue that they are experiencing is the following error that was said to have been resolved in the 6.0.5 update.
    Document containing hundreds of text frames and custom baseline grids takes a long time to open and
    causes an “out of memory” error, followed by an unexpected quit. [2253219]
    Has there been another hotfix or a known resolution for this issue? Anyone else experiencing this?
    Cheers,
    Allan

    Hi Scott,
    Their system has the following specs;
    OS:     Windows Vista Business 32-Bit
    RAM:  3GB DDR2
    CPU:   E8400 @ 3.00 GHz (Dual Core)
    Graphics: Onboard Intel G33/G31
    Regards,
    Allan

  • Out of memory error, possibly corrupted fs?

    I just got my Tour, and I love it. However, I ran into a weird problem.
    The desktop manager never shows BB Messenger as installed, then it tries to uninstall it. If I check the checkbox, it installs over it and breaks it.
    I was able to fix Messenger by reinstalling over the old one from the app store OTA.
    Now vilingo won't run; it throws an out of memory error. I've tried removing it and reinstalling it every way I can think of.
    Also, sometimes it takes 20+ minutes to reboot.
    The crux of the matter is that I could easily exchange it at this point. Do you think I should, or should I wipe it? Any other suggestions?
    Thanks,
    Chance

    No, not corrupted. You are out of memory..... just like a computer that is bogged down with too much stuff, your phone can only handle so much. The reason your apps keep disabling is most likely that it is archiving them to make room for everything else you've got going. The biggest memory muncher is most likely pics or videos... as few as 5 or 6 pics can throw your phone off.  The other thing you might be noticing is missing call logs, messages, or emails.... again, the phone is automatically clearing them out of memory (possibly before you've even had a chance to view them) to make room for everything else you've saved to it.
    As far as startup times, again... MEMORY. The more files the BlackBerry has to scan through when it starts up (and it scans EVERY file EVERY time), the longer it's going to take.
    The solution for all of these issues is to get a memory card and use your desktop software to transfer all of the pic, video, and ringtone content to it. You can transfer on the phone itself, but it's a time-consuming process.  After this is done, pull your battery... startup still takes about 10 min on the Tour, but you should be OK.  The phone will automatically save media to the SD card once it's installed.
