Elements 2.0 Memory Error

OK...I know this is an old version but it generally has every feature I need for my home video conversions.
I recently started getting an Add Media Error "There is not enough memory to add requested media."
I'm just trying to add about 20 pictures to make a slideshow, which I've done many times, so they are just small .jpg files.
I have 8GB free space on my hard drive and 200GB free on my external.  Also, 1GB RAM.
The only recent change I made to my computer was to add a wireless router.
Any advice?
Thank you!!

I agree with Steve, and maybe am a bit more stringent on my free space: I recommend 50GB of defragmented free space as a minimum. There are many files, beyond just the Exported file size, that PrE needs to work correctly - much more.
Also, there are several aspects of "memory." One is physical RAM; another is the OS's Page File (Windows' Virtual Memory). Both are very important: the first is physical chips, and the second is HDD space that Windows will use as if it were RAM.
This article offers tips on getting your computer set up and ready for an NLE session. Even on my workstation with 4GB RAM (3+GB switch) and many TBs of HDD space, I use those tips for every session.
Good luck,
Hunt

Similar Messages

  • Out of memory error adobe elements 3.0

When I try to open Adobe Photoshop Elements 3.0, I get an out of memory error.  The application won't even open.  I upgraded my RAM from 2GB to 3GB, but it has made no difference.

    stacey_r79 wrote:
    Hi, just wondering if you had any success in retreiving or opening your Organizer?? Im having the same problem and i have tried everything, i even rang Adobe and they told me there were no technitions able to help because Adobe Photoshop Elements 3.0 was too old. If anyone has a solution please let me know, i have 5 years worth of photos in this software, i just want to be able to open the organizer to burn all my pics to CD then i will update my software. PLEASE HELP!!!!!!!!
First of all, your photos are not "in this software"; they are in folders on your hard drive, and PSE merely points to them. If >>you personally<< didn't delete them, or reformat your HDD, then they are still there. The default file path for your photos is this: "My Documents" > "My Photos" > "Adobe", and then a couple of different folders depending on photo type, for example, "Scanned Photos".
If you are using the default location, you can simply copy and paste your pictures to an external HDD, or burn them to CD or DVD with Windows. I am not recommending this, since it would not carry over information created by the Organizer, and the photos would have to be reimported and retagged, but your files would be backed up and safe, the most important thing.
There really isn't a need for all these histrionics. Even if you uninstall PSE, your photos will still be exactly where they were.  Even if you uninstall PSE, its catalog information will still be intact, and a later version will pick it up and "convert it" to its own file system.
Support for PSE-3 has obviously ended; it's more than 5 years and 5 versions old. Just as a topic of conversation, my personal favorite version of PSE is PSE-5.  IMO, it's the fastest importing and the most reliable, yet it has a lot more bells and whistles than PSE-3. If you can find a usable copy, I'd grab it, especially for a computer running XP.
As to the out-of-memory issue, there are several things that can cause it, for example: not enough RAM is allotted to PSE (this is independent of the amount of RAM actually installed in the computer); the scratch disc area is full or insufficient; or the preferences file is corrupt.
Everybody is complaining about not being helped in this thread, yet no one has posted back as to whether they have tried Barbara's suggestions, or even whether they understand enough about the Windows file system to be able to begin to help themselves.

  • Out of memory error - from parsing a "fixed width file"

This may be fairly simple for someone out there, but I am trying to write a simple program that can go through a "fixed width" flat txt file and parse it to be comma-delimited.
I use an xml file with data dictionary specifications to do the work. I do this because there are over 430 fields that need to be parsed from a fixed-width file with close to 250,000 lines. I can read the xml file fine to get the width dimensions, but when I try to apply the parsing instructions, I get an out of memory error.
I am hoping it is an error in my code and not the large files. If it is the latter, does anyone out there know some techniques for getting at this data?
Here is the code:
    import java.io.*;
    import org.w3c.dom.*;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.DocumentBuilder;
    import org.xml.sax.SAXException;
    import org.xml.sax.SAXParseException;

    public class FixedWidthConverter {
       String[] fieldNameArray;
       String[] fieldTypeArray;
       String[] fieldSizeArray;

       public static void main(String[] args) {
          FixedWidthConverter fwc = new FixedWidthConverter();
          fwc.go();
          fwc.loadFixedWidthFile();
          //System.exit(0);
       } //end of main

       public void go() {
          try {
             DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
             DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
             Document doc = docBuilder.parse(new File("files/dic.xml"));
             //normalize text representation
             doc.getDocumentElement().normalize();
             System.out.println("Root element of the doc is "
                   + doc.getDocumentElement().getNodeName());
             NodeList listOfFields = doc.getElementsByTagName("FIELD");
             int totalFields = listOfFields.getLength();
             System.out.println("Total no of fields : " + totalFields);
             String[] fldNameArray = new String[totalFields];
             String[] fldTypeArray = new String[totalFields];
             String[] fldSizeArray = new String[totalFields];
             for (int s = 0; s < listOfFields.getLength(); s++) {
                Node firstFieldNode = listOfFields.item(s);
                if (firstFieldNode.getNodeType() == Node.ELEMENT_NODE) {
                   Element firstFieldElement = (Element) firstFieldNode;
                   NodeList firstFieldNMList = firstFieldElement.getElementsByTagName("FIELD_NM");
                   Element firstFieldNMElement = (Element) firstFieldNMList.item(0);
                   NodeList textFNList = firstFieldNMElement.getChildNodes();
                   //loads values into an array (currently unused)
                   //fldNameArray[s] = textFNList.item(0).getNodeValue().trim();
                   NodeList typeList = firstFieldElement.getElementsByTagName("TYPE");
                   Element typeElement = (Element) typeList.item(0);
                   NodeList textTypList = typeElement.getChildNodes();
                   //loads values into an array (currently unused)
                   //fldTypeArray[s] = textTypList.item(0).getNodeValue().trim();
                   NodeList sizeList = firstFieldElement.getElementsByTagName("SIZE");
                   Element sizeElement = (Element) sizeList.item(0);
                   NodeList textSizeList = sizeElement.getChildNodes();
                   //loads the field width into an array
                   fldSizeArray[s] = textSizeList.item(0).getNodeValue().trim();
                } //end of if clause
             } //end of for loop with s var
             //setFldNameArray(fldNameArray);
             //setFldTypeArray(fldTypeArray);
             setFldSizeArray(fldSizeArray);
          } catch (SAXParseException err) {
             System.out.println("** Parsing error" + ", line "
                   + err.getLineNumber() + ", uri " + err.getSystemId());
             System.out.println(" " + err.getMessage());
          } catch (SAXException e) {
             Exception x = e.getException();
             ((x == null) ? e : x).printStackTrace();
          } catch (Throwable t) {
             t.printStackTrace();
          }
       } //end go()

       public void setFldNameArray(String[] s) {
          fieldNameArray = s;
       } //end setFldNameArray

       public void setFldTypeArray(String[] s) {
          fieldTypeArray = s;
       } //end setFldTypeArray

       public void setFldSizeArray(String[] s) {
          fieldSizeArray = s;
       } //end setFldSizeArray

       public String[] getFldNameArray() {
          return fieldNameArray;
       } //end getFldNameArray

       public String[] getFldTypeArray() {
          return fieldTypeArray;
       } //end getFldTypeArray

       public String[] getFldSizeArray() {
          return fieldSizeArray;
       } //end getFldSizeArray

       public int getNumLines() {
          int countLines = 0;
          try {
             //File must be in same directory and be the name of the string below
             BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
             while (in.readLine() != null) {
                countLines++;
             }
             in.close();
          } catch (IOException e) {}
          return countLines;
       } //end of getNumLines

       public void loadFixedWidthFile() {
          int c = getNumLines();
          int i = 0;
          String[] lineProcessed = new String[c];
          try {
             //File must be in same directory and be the name of the string below
             BufferedReader in = new BufferedReader(new FileReader("files/FLAT.txt"));
             String str;
             while ((str = in.readLine()) != null) {
                lineProcessed[i] = parseThatLine(str);
                i++;
             }
             in.close();
          } catch (IOException e) {}
          //write out the lineProcessed[] array to another file
          writeThatFile(lineProcessed);
       } //end loadFixedWidthFile()

       public void writeThatFile(String[] s) {
          try {
             BufferedWriter out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
             for (int i = 0; i < s.length - 1; i++) {
                out.write(s[i]);
                out.newLine();
             } //end for loop
             out.close();
          } catch (IOException e) {}
       } //end writeThatFile

       public String parseThatLine(String s) {
          int start = 0;
          int end = 0;
          String parsedLine = "";
          String[] oArray = getFldSizeArray();
          int numChars = oArray.length;
          for (int i = 0; i < numChars - 1; i++) {
             if (i == 0) {
                start = 0;
                end = end + Integer.parseInt(oArray[i]) - 1;
             } else {
                start = end;
                end = end + Integer.parseInt(oArray[i]);
             }
             parsedLine = parsedLine + s.substring(start, end) + "~";
          } //end for loop
          return parsedLine;
       } //End of parseThatLine
    } //end of class FixedWidthConverter
    I have tried to eliminate as many arrays as I can, thinking that was chewing up the memory, but to no avail.
    Any thoughts or ideas?
    Message was edited by:
    SaipanMan2005
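A rough sense of scale for that array: Java stores strings as 2-byte chars, and holding every parsed line in memory at once adds up fast. The ~900-characters-per-line figure below is purely an assumption for illustration (the real width depends on the 430 field sizes), and JVMs of that era defaulted to a 64 MB maximum heap:

```java
public class HeapEstimate {
    public static void main(String[] args) {
        long lines = 250_000;          // rows in the flat file (from the post)
        long bytesPerLine = 2 * 900;   // assume ~900 chars per parsed line; Java chars are 2 bytes
        long total = lines * bytesPerLine;
        System.out.println(total / (1024 * 1024) + " MiB just for the parsed lines");
        // ~429 MiB -- far beyond an old 64 MiB default heap, hence OutOfMemoryError
    }
}
```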

    You should not keep a String array of all the lines of the file read.
    Instead, for each line read, parse it, then write the parsed line to the other file:

       public void loadFixedWidthFile() {
          BufferedReader in = null;
          BufferedWriter out = null;
          try {
             //File must be in same directory and be the name of the string below
             in = new BufferedReader(new FileReader("files/FLAT.txt"));
             out = new BufferedWriter(new FileWriter("files/outfilename.txt"));
             String str;
             while ((str = in.readLine()) != null) {
                str = parseThatLine(str);
                //write out the parsed str to another file
                out.write(str);
             }
          } catch (IOException e) {
             e.printStackTrace(); // At least print the exception - never swallow an exception
          } finally { // Use a finally block to be sure of closing the files even when an exception occurs
             try { in.close(); } catch (Exception e) {}
             try { out.close(); } catch (Exception e) {}
          }
       } //end loadFixedWidthFile()

    Regards
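As a side note, on Java 7 or later the same stream-one-line-at-a-time approach can be written with try-with-resources, which closes both files automatically even on error. This is only a sketch: the file names match the hypothetical ones above, and parseLine is a simplified stand-in for parseThatLine that takes the column widths directly:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamingConverter {

    // Split one fixed-width line into fields using the given column widths,
    // joining them with "~" (a simplified stand-in for parseThatLine).
    static String parseLine(String line, int[] widths) {
        StringBuilder sb = new StringBuilder();
        int start = 0;
        for (int w : widths) {
            sb.append(line, start, start + w).append('~');
            start += w;
        }
        return sb.toString();
    }

    // Stream the input file line by line; only one line is in memory at a time.
    static void convert(Path in, Path out, int[] widths) throws IOException {
        try (BufferedReader r = Files.newBufferedReader(in);
             BufferedWriter w = Files.newBufferedWriter(out)) {
            String line;
            while ((line = r.readLine()) != null) {
                w.write(parseLine(line, widths));
                w.newLine();
            }
        } // both streams are closed here, even if an exception was thrown
    }

    public static void main(String[] args) throws IOException {
        Path in = Files.createTempFile("FLAT", ".txt");
        Path out = Files.createTempFile("out", ".txt");
        Files.write(in, java.util.List.of("AAA1234XY", "BBB5678ZQ"));
        convert(in, out, new int[] {3, 4, 2});
        for (String s : Files.readAllLines(out)) {
            System.out.println(s);   // AAA~1234~XY~ then BBB~5678~ZQ~
        }
    }
}
```

Because only the current line is held in memory, the number of lines in the file no longer affects heap usage.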

  • Facing memory error during shifting from 3d stem graph to 3d surface graph in labview 2010

    hello all,
earlier I was using a 3D Stem graph in LabVIEW 2010. Now we want to move to a Surface plot for the same data. When I try the Surface graph, it shows a memory error.
I have 10240 data points.
I was using the vector method in the stem graph, so for the Surface graph I thought I have to make a matrix of 10240x10240 with its diagonal elements as my Z co-ordinates.
Please help me out here.
    thanks & regards, 
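For a sense of why the dense-matrix approach fails: a 10240x10240 array of double-precision values needs about 800 MiB in a single contiguous block, before LabVIEW makes any working copies, which is more than a 32-bit process can normally allocate. A quick sketch of the arithmetic (Java here purely for illustration; the sizes are language-independent):

```java
public class SurfaceMatrixSize {
    public static void main(String[] args) {
        long n = 10240;             // points per axis
        long cells = n * n;         // elements in a dense n x n matrix
        long bytes = cells * 8;     // 8 bytes per double-precision value
        System.out.println(cells + " cells -> " + (bytes / (1024 * 1024)) + " MiB");
        // 104857600 cells -> 800 MiB, before any working copies are made
    }
}
```

A sparse representation, or the X/Y/Z vector form the stem graph already uses, avoids allocating the full matrix.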

    hello mikeporter,
thanks for the reply. I am attaching my VI. It has some errors also; please look into it.
X and Y are my co-ordinate pairs. I will be having values for these and have to update for the same.
    Attachments:
    Untitled 2_surface.vi ‏184 KB

  • Not enough memory error when passing an array to a chart

    Hello.
I am trying to pass some data from a reading loop to a display loop using queues and charts.
In order to save on redraws, I am reading the queue just once in a while, flushing it, and passing the resulting array to the Chart. It is actually an array of clusters of three values.
Everything works fine until I actually want to display the chart. I can pass the data and see it using an indicator without any problems.
When I wire the data to the chart, it randomly shows: "not enough memory to complete this operation".
The fact is, the array being passed to the chart is of random size.
The queue size is good for about 20 min. LabVIEW shows an error after a few seconds. After clicking OK it prints out the results correctly, and then a second or two later, the same error.
    What am I missing here? Thanks
    LV 2012 f3 I7 2.4GHz 16GB RAM
    Attachments:
    TestChart.vi ‏17 KB

    Not sure if it helps or not but here is the error.
I put about 200 elements/s in the queue. That is why, instead of reading it 200 times/s, I just want to read it once in a while.
I am not sure what is a bigger burden: copying one 500-element array twice a second, or 200 calls/s to the Queue.
The max size of the queue is not important here. I can set it to something more reasonable and will get the same error.
What will happen to my CPU usage if I put the Chart in the for loop and make 500 calls to it? And then half a second later another 500 calls?
As I said, at this point I do not know how much history I need, but all I want is to be able to see a few seconds of it. If I lose the rest, it is OK for now.
Let's say that all I care about is seeing it close to real time with a history of 5s. That would mean a queue buffer size of 1000.
Back to the main problem: why do I see the memory error window when I know clearly that I have plenty of memory left.
    Attachments:
    Error.jpg ‏18 KB
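The flush-and-read pattern described above (a producer enqueues continuously while the consumer drains the whole queue a couple of times per second and passes one batch to the display) can be sketched outside LabVIEW as well. A minimal Java sketch, with a hypothetical producer standing in for the 200-elements/s acquisition loop:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BatchDrain {

    // One flush-and-read: take everything currently queued in a single call,
    // leaving the queue empty, and return it as one batch for the display.
    static List<Double> drainBatch(BlockingQueue<Double> queue) {
        List<Double> batch = new ArrayList<>();
        queue.drainTo(batch);
        return batch;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Double> queue = new LinkedBlockingQueue<>();

        // Producer thread: stands in for the acquisition loop putting in
        // ~200 elements/s; here it just enqueues 1000 samples.
        Thread producer = new Thread(() -> {
            for (int i = 0; i < 1000; i++) {
                queue.offer(i * 0.5);
            }
        });
        producer.start();
        producer.join();

        // Consumer: one drain instead of 1000 single dequeues.
        List<Double> batch = drainBatch(queue);
        System.out.println("drained " + batch.size() + " elements in one call");
    }
}
```

The trade-off the post asks about is exactly this: one array copy per drain versus one queue call per element; draining in batches keeps both the call count and the redraw rate low.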

  • Premier Elements has encountered an error...

While working on a project using Premiere Elements 7.0 with the Vista Ultimate OS, I get the following error:
    "Premier Elements has encountered an error: [..\..\Src\Markers\Markers.cpp-187]"
    Please can anyone tell me how to fix this error?

    Thanks for all your help.  I solved my problem, at least for now.  I used Adobe PRE to do the conversion.  I was able to edit each .wmv file in a project all by itself, normalize the audio, and then save it to a dv-avi file. 
(It's interesting that the dv-avi file created by Adobe PRE was much larger than the original .wmv file -- it went from 18 MB to 66 MB.)   I then imported the dv-avi files into my project.  I still get the error if I try to click on Clip for those 2 clips, even as dv-avi files.  So I suspect that I am still having some space or memory issues.  But I no longer need to edit them, since I already normalized the audio.  So hopefully I will be able to create the final DVD for the project when it is time to do that.  Hopefully it will not give me further errors. 
    Steve mentioned optimizing Vista for video editing.  I would still like to know how to do that.  I often get the message "Adobe Premiere Elements is running low on system memory.  Please save what you are doing and proceed with caution."  And then I have to close the program and open it again.  So I suspect there are some things I could do to improve my editing experience, besides acquiring more disk space.  If you can give me some specific info about that, I would appreciate it.
    Marsha
    Marsha Moskowitz
    email: [email protected]
    Website: www.marshamosk.com
    CafePress Shop: www.cafepress.com/mmphotography
    Theater-Film-Dining Blog: www.marshamosk.com/blog
    Photo Gallery: www.marshamosk.com/photogallery
    Date: Sat, 19 Mar 2011 12:16:30 -0600
    From: [email protected]
    To: [email protected]
    Subject: Premier Elements has encountered an error...
    Marsha,
    I read about GSPOT, but I still don't know how to convert .wmv files to dv-avi type II. Can I do this within GSPOT?
    Both G-Spot and MediaInfo are file info utilities, and are not converters. They can furnish a bit of info, regarding installed CODEC's on the system, but that is about the only other function, besides the file info, that they perform.
    I also read your article on resizing photos, and I'm sorry to say, I got totally lost.  Is there really a significant difference between photos that are 1024x768 (as most of mine are), and 1000x750?
    There is very little difference in the dimensions that you cite. There is very little internal Scaling required (though I do Scale in PS to exactly what I need), and the extra pixels that are being handled are minimal, and until one got into the 1000's of such images, there should be zero noticeable slowdown. I think that we can safely rule out overly-large stills, as a contributing factor here.
    It appears that Neale has posted some instructions for your conversion program. I hope that they solve all the issues.
    Good luck,
    Hunt

  • P6 Out of Memory Error in Citrix Environment

    We are using P6 V7 in a Citrix Environment. One of our users is currently experiencing an "Out of Memory" error when he is linking activities. He has also experienced the problem when deleting or copying WBS Elements (containing up to 600 activities).
    The only time that I have seen this error is when trying to import more than 2000 or so Activities from XLS at a time.
    The project that he is working in currently has around 10000 activities.
    Has anyone else had this problem? And, if so, found a workable solution?

    Does it only affect the one user? Can somebody else perform the same functions without error? If so, his user preferences are probably corrupt. Reset them using the following SQL:
    Update userdata set user_data = null where topic_name = 'pm_settings' and user_id in (select user_id from users where user_name = '<username>');
    where <username> is the user's P6 username.

  • Out of memory error while rendering

    Trying to render a 1 minute comp and part way through it tanks out with a "7::66 Unable to allocate enough memory" error. I watch the RAM usage indicator at the bottom of the render screen and it just gradually climbs until it hits 100% and crashes.
    System is a 2013 Mac Pro running latest version of Yosemite and latest version of CC. I have 64GB of RAM installed so at this point I'm going to disregard AE's advice that I bump this up.
    Also using Element 2 in the comp so that may be a culprit.
    Anyhow, if anyone has any advice on how to troubleshoot this I would greatly appreciate it.

    When you say latest version of CC, do you mean version 12.2.1.5?
    What is in your comp besides the layer with Element on it?
    Have you used a minute-long comp with Element 2 in the past?
    What are the rest of your system specs?
    Have you tried the "secret" preferences setting to purge after a certain number of frames?
    Another suggestion would be to render an image sequence. Then, if it crashes, you can just pick up the render from where it left off.

  • Possible "Out of memory" error  during XSLT ?

    Hi ,
    I am working on 11gR1.
    In my project I am reading a file in batches of ten thousand messages.
    The file is getting read and archived and I can see expected number of instances getting created in the console.
But nothing useful is visible inside the instance, as the link for the BPEL process is not appearing.
(I have kept the audit level at production, but even in this case, at least the link should appear.)
    When I checked the logs , it indicated that transaction was rolled back due to out of memory error.
    Just before this error, there is a reference to the xsl file which I am using :
    [2010-12-13T08:42:33.994-05:00] [soa_server1] [NOTIFICATION] [] [oracle.soa.bpel.engine.xml] [tid: pool-5-thread-3] [userId: xxxx] [ecid: 0000InVxneH5AhCmvCECVH1D1XvN00002J,0:6:100000005] [APP: soa-infra] [composite_name: xxxx] [component_name: xxxx] [component_instance_id: 560005] [composite_instance_id: 570005] registered the bpel uri resolver [File-based Repository]oramds:/deployed-composites/xxxx_rev1.0/ base uri xsl/ABCD.xsl
    [2010-12-13T08:46:12.900-05:00] [soa_server1] [ERROR] [] [oracle.soa.mediator.dispatch.db] [tid: oracle.integration.platform.blocks.executor.WorkManagerExecutor$1@e01a3a] [userId: <anonymous>] [ecid: 0000InVuNCt5AhCmvCECVH1D1XvN000005,0] [APP: soa-infra] DBContainerIdManager:run() failed with error.Rolling back the txn[[
    java.lang.OutOfMemoryError
My question is, is there any limit on how much payload Oracle's XSLT parser can handle in one go?
Is decreasing the batch size the only possible solution for this?
    Please share your valuable inputs ,
    Ketan
Is there any limit on how many elements the XSLT parser can handle?
I am reading a file in batches of 10 thousand messages per file. (Each record has some 6-8 fields.)
The file is getting picked up, but the instance does not show anything.

    > I'm getting out of memory errro during system copy import for Dual stack system (ABAP & JAVA).
    >
    > FJS-00003  out of memory (in script NW_Doublestack_CI|ind|ind|ind|ind, line 6293
    > 6: ???)
    Is this a 32bit instance? How much memory do you have (physically) in that machine?
    Markus

  • A FIX for error message: When I try to open Snood (it's a game) I get this message.  Not enough memory {Error # :: 0, in sound.cp@line 101  Can you help?

    After years of playing Snood, w/o problems, I started getting this error message, on my iMac, OS 10.5.8,
    with 4 GB of memory when opening Snood:  Not enough memory {Error # :: 0, in sound.cp@line 101
    My MacBook Pro w. Mac OS 10.6.8 did not have this problem.
    Initially I thought that Snood raised its minimum requirement to Mac OS 10.6.
    I had several correspondences with Snood. Their tech support is great. Quick and thorough responses.
    They thought the issue was in Mac's system preferences/ Sound. It was.
    I didn't realize that my sound input and output devices were gone.
    The fix was resetting the PRAM. I found this advice on MacFixIt.com.
    MacFixIt help with volume:   http://reviews.cnet.com/8301-13727_7-10415659-263.html
    Resetting the PRAM is on Apple support:   http://support.apple.com/kb/HT1379
    My sound (music!) is back, along with Snood. So glad I reset the PRAM before reinstalling the OS software!
    Thank you to Snood, MacFixIt and Apple.
    Happy new year all!

    Good work, nice post/tip, thanks!

  • Acrobat XI Pro "Out of Memory" Error.

We just received a new Dell T7600 workstation (Win7, 64-bit, 64GB RAM, 8TB disk storage, and more processors than you can shake a stick at). We installed the Adobe CS6 Master Collection, which had Acrobat Pro X. Each time we open a PDF larger than roughly 4MB, the program returns an "out of memory" error. After running updates and uninstalling and reinstalling (several times), I bought a copy of Acrobat XI Pro hoping this would solve the problem. The same problem still exists upon opening the larger PDFs. Our business depends on opening very large PDF files and we've paid for the Master Collection, so I'd rather not use a freeware PDF reader. Any help, thoughts, and/or suggestions are greatly appreciated.
    Regards,
    Chris

As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have 2 choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is real good at letting you know you can't have it all for use, because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows, or use Linux.

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?


  • Memory Error Neo2 Pt 4*512 DDR 400 or even 333

Ok, this board has started giving me a lot of problems. 1 week ago all was fine, and suddenly I started getting BSODs. Last Friday the computer shut off after a BSOD, while I was just browsing the internet.
After that it failed to start and gave a BSOD on every boot.
Then it started automatically again, and worked well.
On Sat the system wouldn't start and gave long beeps, indicating a memory error.
4*512 Ballistix, default SPD.
So I reset my BIOS and upgraded the BIOS to 1.A Ballistix edition, as syar suggested.
The computer ran well / rebooted well.
But when I did a cold start Sunday morning I got long beeps again. Even resetting the BIOS didn't help. Got long beeps.
I took 1 module out and the system started well. I went into the BIOS and manually set the bus to 333 rather than 400, default SPD timings.
The computer started well. I put the 4th module back in, and all worked well till I did a cold boot today, and back to continuous beeps.
Now the only way the computer works is if I just use 2 memory modules.
I tried manual voltage, manual timings, SPD timings, timings on Crucial's site, voltage on Crucial's site. No luck.
Currently I am on 2*512 in DIMM 1-2, default SPD and voltage, and the system seems to work.
What's the problem? Is my memory bad? I am not too sure of that, because I tried running 2 modules at a time from the 4 matched pairs, and they seem to work.
Only all 4 together won't work, and 3 in also gives problems.
Is this motherboard messed up?
Because since day 1, less than 1 month ago, this mobo has been giving me problems with the X2 dual core, and I am really thinking of switching to a different platform.

    https://mywebspace.wisc.edu/phora/web/cpuz.htm
Alright, I updated the BIOS to the latest beta, modded by side effect.
I put the 4 modules back in, and they are running now:
2-2-2-7 DDR 400 2.8V
though the HT is 3x, unlike all previous times when it was 5x... anything to do with the probs?
Anyway, I still don't know how long this config will last, for I have had successful bootups in the past, but they end up failing later.

  • RoboHelp 9 gives an out of memory error and crashes when I try to import or link a Frame 10 file or

    I have Tech Suite 3. If I start a new RoboHelp project and try to import or link Frame files, RoboHelp tries for a while, then I get an Out of Memory error and the program crashes.
    I opened one of the sample projects and was able to link to one of my frame files without any problem, so it seems to be an issue with creating something new.
    Any suggestions?

    It happens when I create a new project and then try to import or link frame docs to make up the content. It starts scanning, then crashes. I did get it to the conversion setting page once, but no further.
    It does not happen if I open one of the supplied example projects and link a file. But then it doesn't let me choose, during import, any style mapping. And I can't delete the sample project fold
    Twice now it has told me when I tried to import (not link, but import) that my .fm file could not be opened, and told me to verify that Frame is installed (it is) and that the file is a valid frame file (it is).
    The docs and project are in separate folders on my C: drive.

  • Low memory errors.

When I try to play Destination Treasure Island, I get a low memory error message. Is this bad? Could my iPad be bad? If I reboot the iPad, it starts and plays, but I haven't played for hours at a stretch yet either.
    Should I be concerned?
    Thanks.

Multi-tasking happens in 4.2; it's not something that can be switched on/off (which not everyone likes). The only thing that you can do is go into the task bar (double-click the home button) and close other apps manually (press and hold one of them until they start to shake, then press the '-' in their top left corner; press home, or touch the non-taskbar part of the screen when finished closing them). The task bar shows recently used apps as well as those that are running, so even if an app appears there it doesn't necessarily mean that it is active.
