Out of memory error, possibly corrupted fs?

I just got my Tour, and I love it. However, I ran into a weird problem.
The desktop manager never shows BlackBerry Messenger as installed, then it tries to uninstall it. If I check the checkbox, it installs over it and breaks it.
I was able to fix Messenger by reinstalling over the old one from the app store OTA.
Now vilingo won't run; it throws an out of memory error. I've tried removing it and reinstalling it every way I can think of.
Also, sometimes it takes 20+ minutes to reboot.
The crux of the matter is that I could easily exchange it at this point. Do you think I should, or should I wipe it? Any other suggestions?
Thanks,
Chance

No, not corrupted. You are out of memory... just like a computer that is bogged down with too much stuff, your phone can only handle so much. The reason your apps keep disabling is most likely that it is archiving them to make room for everything else you've got going. The biggest memory muncher is most likely pics or videos... as few as 5 or 6 pics can throw your phone off. The other thing you might be noticing is missing call logs, messages, or emails... again, the phone is automatically clearing them out of memory (possibly before you've even had a chance to view them) to make room for everything else you've saved to it.
As far as startup times, again... MEMORY. The more files the BlackBerry has to scan through when it starts up (and it scans EVERY file EVERY time), the longer it's going to take.
The solution for all of these issues is to get a memory card and use your desktop software to transfer all of the pic, video, and ringtone content to it. You can transfer on the phone itself, but it's a time-consuming process. After this is done, pull your battery... startup still takes about 10 minutes on the Tour, but you should be OK. The phone will automatically save media to the SD card once it's installed.

Similar Messages

  • Possible "Out of memory" error during XSLT?

    Hi,
    I am working on 11gR1.
    In my project I am reading a file in batches of ten thousand messages.
    The file is getting read and archived, and I can see the expected number of instances getting created in the console.
    But nothing useful is visible inside the instance, as the link for the BPEL process is not appearing.
    (I have kept the audit level at production, but even in this case at least the link should appear.)
    When I checked the logs, they indicated that the transaction was rolled back due to an out of memory error.
    Just before this error, there is a reference to the XSL file which I am using:
    [2010-12-13T08:42:33.994-05:00] [soa_server1] [NOTIFICATION] [] [oracle.soa.bpel.engine.xml] [tid: pool-5-thread-3] [userId: xxxx] [ecid: 0000InVxneH5AhCmvCECVH1D1XvN00002J,0:6:100000005] [APP: soa-infra] [composite_name: xxxx] [component_name: xxxx] [component_instance_id: 560005] [composite_instance_id: 570005] registered the bpel uri resolver [File-based Repository]oramds:/deployed-composites/xxxx_rev1.0/ base uri xsl/ABCD.xsl
    [2010-12-13T08:46:12.900-05:00] [soa_server1] [ERROR] [] [oracle.soa.mediator.dispatch.db] [tid: oracle.integration.platform.blocks.executor.WorkManagerExecutor$1@e01a3a] [userId: <anonymous>] [ecid: 0000InVuNCt5AhCmvCECVH1D1XvN000005,0] [APP: soa-infra] DBContainerIdManager:run() failed with error.Rolling back the txn[[
    java.lang.OutOfMemoryError
    My question is: is there any limit on how much payload Oracle's XSLT parser can handle in one go?
    Is decreasing the batch size the only possible solution for this?
    Please share your valuable inputs,
    Ketan
    Is there any limit on how many elements the XSLT parser can handle?
    I am reading a file in batches of 10 thousand messages per file (each record has some 6-8 fields).
    The file is getting picked up, but the instance does not show anything.
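    Not an answer to the parser-limit question, but the two levers that usually matter here are the batch size and the SOA server heap. A hedged sketch of the latter, assuming an 11gR1 domain where the managed server memory arguments come from $DOMAIN_HOME/bin/setSOADomainEnv.sh (variable names as in a stock install; the values below are illustrative, not a recommendation):
    # $DOMAIN_HOME/bin/setSOADomainEnv.sh -- illustrative values only
    DEFAULT_MEM_ARGS="-Xms1024m -Xmx2048m"
    PORT_MEM_ARGS="-Xms1024m -Xmx2048m"
    Restart the SOA managed server afterwards; if the OOM persists, reducing the batch size (or splitting the XSLT into smaller transforms) is the more reliable fix.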

    > I'm getting an out of memory error during system copy import for a dual stack system (ABAP & JAVA).
    >
    > FJS-00003  out of memory (in script NW_Doublestack_CI|ind|ind|ind|ind, line 6293
    > 6: ???)
    Is this a 32bit instance? How much memory do you have (physically) in that machine?
    Markus

  • Photoshop out of memory error

    Hi, I am running in a mixed environment with an Open Directory master for authentication on a Snow Leopard 10.6.8 server, with both Lion and Snow Leopard client machines. My users, due to Adobe products crashing with non-mobile accounts, are using external hard drives formatted OS Extended (Journaled) with their mobile account set up on that hard drive. This year it has been touch and go on login issues, but we are working through that. My problem is when a user uses a Lion machine (10.7.4) and Photoshop Extended and then moves to our study lab, which has Snow Leopard: the Photoshop workspaces are corrupted and they cannot open any photos or settings; they get an "Out of memory" error. However, when they go back to the Lion machine, after resetting their workspace they can use the software without issues. Anyone else having these issues? I've tried chflags nohidden on the library in Snow Leopard to view settings, and even deleted all Photoshop settings in App data and preferences, and still cannot access Photoshop on the SL machine.
    Thanks

    Thanks Kurt for the reply. I'll give more info. All machines have the latest updates for both CS6 and the current version of the OS, either 10.6.8 or 10.7.4.
    The only thing on the external hard drive is the user's home folder and their data. I have to have permissions enabled so their home preferences and WGM settings work correctly. BTW, all accounts have admin rights to the machine; I have WGM settings preventing access to machine settings and can re-image if I get corrupted machines.
    PS is installed on each machine, not the external hard drive.
    All machines have the same volume name for the internal boot drive, which is set as the PS scratch drive.
    I thought this issue had to do with memory, and it may still be so.
    However, when a clean profile is connected to our towers with Lion, which have 12 GB RAM and 1024 MB video memory, the settings are at 70%, which is around 8 GB.
    When I take the same clean profile to our other lab of iMacs, which have 8 GB RAM and 512 MB video memory, PS adjusts the performance RAM accordingly, to around 5.5 GB at 70%.
    I then take that same external drive to the problem iMacs (early 2009 and 2008), which have 4 GB RAM and 256 MB video memory and are running Lion; PS adjusts to 2.4 GB RAM.
    Now I put that same drive on the same model 2009 or 2008 iMac running Snow Leopard, coming from the same model Lion iMac, and PS opens fine.
    However, after moving from one of the other larger Lion machines and then back to this Snow Leopard machine, the profile gets corrupted: the workspace is corrupted and I cannot reset it. Also, I am unable to access any settings; I get the "Could not complete the operation because there is not enough memory (RAM)" error.
    Now when I go back to a same-model Lion machine with the same minimum memory I get the same error; however, when I go to a larger Lion machine, all I get is that the color profile cannot sync, and the workspace is still corrupted and not showing, but I can reset it.
    I then move the performance size to match the lower model's 70%; I still get the error when I go back to the lower-end Lion or Snow Leopard machine.
    I tried clearing PS preferences by opening with command+option+shift and deleting the PS preferences; the issue is still there.
    I then removed all PS ~/Library/ settings, and it is still present.
    I had to re-create the profile altogether to get this to work. As long as I don't connect to a low-end Snow Leopard machine, things seem to be going well and PS readjusts according to the machine. Note: when I set the performance level to a low setting, say 2.5 GB as on the early iMacs, and plug into another machine, PS adjusts to the current machine's memory availability and does not keep that lower setting.
    I have a copy of the console error messages below.
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10aa114d0 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x106d65010 of class NSBundle autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x11db6aa60 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x10a98bc40 of class NSCFString autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371a73e0 of class NSCFData autoreleased with no pool in place - just leaking
    9/18/12 2:31:51 PM Adobe Photoshop CS6[254] *** __NSAutoreleaseNoPool(): Object 0x1371f7320 of class NSCFData autoreleased with no pool in place - just leaking
    I have 12 Snow Leopard machines that are early 2008 and 2009 iMacs with no way of upgrading to Lion, and I am not ready to put Mountain Lion into production until I can upgrade my OD masters to Mountain Lion.
    I suspect it's not the RAM settings that are affected, but that the video RAM is not adjusting from machine to machine. Is it possible to upgrade the 2009 20" iMac video cards and get proper firmware support?
    Any help is appreciated.

  • Final Cut Pro 6 - 'Out of Memory/Error Code'

    Hey All,
    I'm in the final stages of editing a documentary and I keep running into an 'Out of Memory' error code in Final Cut Pro 6.0 when trying to export a reference file. The finished film is 81 minutes long and contains 19 sequences. I have all 19 sequences nested in a main sequence of a separate FCP project file, so I'm not wasting any memory on clips or additional backups of sequences. This nested sequence is the only sequence that exists in this 'other' project file. All of the photos used are under 4K resolution and are RGB. Even the lower thirds are animation QTs with alphas. Any thoughts would be greatly appreciated.
    Nick
    System Specs:
    Quad-core 2 GHz MacBook Pro
    4 GB DDR3 memory
    Mac OS X 10.7.2 (Lion)
    FCP v6.0.6

    First troubleshooting step
    https://discussions.apple.com/docs/DOC-2491
    If that doesn't solve the problem, you may have a corrupt sequence.
    Try creating a new sequence with the same settings and copying the contents over from the problematic sequence.
    If that doesn't work, mark an in and an out for the first half of the sequence and do an export. If that works, mark an in and an out for the second half and export. Keep narrowing down the sections till you find the problematic clip.

  • Adobe X Pro 10.1.10 Out of Memory error message

    Hello One, Hello All
    Since updating our Adobe X Pro machines to version 10.1.10, we occasionally receive an "Out of Memory" error message which, depending on what we are doing, may force us to shut down all Adobe windows and re-open Adobe, or simply click OK and continue working with the PDF document like nothing is wrong. There seems to be no rhyme or reason to when or why it occurs. I do not see any warnings or errors in the Windows Event Log. It happens with files we have created, and also with files received from internal and external sources through email. It is affecting our high-end machines: Lenovo X Series and W Series laptops with Windows 8.1, 3rd/4th gen i5 CPUs, 8+ GB RAM, and 128+ GB SSDs. We have all system and Windows updates applied. We have Trend Micro and Malwarebytes real-time protection, and system scans do not find any malware.
    I have found a few other recent threads on the Adobe forum related to this error message, but the responses are weak at best with no definitive fix. The system and user temp folders are less than 100 MB each, so that cannot be the issue. When the error occurs I check Task Manager, and system utilization, including memory, is well below 100%.
    We did not have this issue prior to version update 10.1.10.
    Really hoping Adobe can step in to help here and hoping boilerplate responses are not used.

    Hi wayne103,
    We released a new security update yesterday, v10.1.11.
    Please update to this version and check the performance.
    I have seen this error message occur while opening a 3rd-party-created PDF that has a corrupted file structure.
    Please let me know if the issue occurs for specific files or at random.
    Also let me know the PDF producer for these PDFs.
    Regards,
    Rave

  • E71 keeps giving out of memory error; I can't find...

    My Nokia E71 (it's black, so it may be an E71x, I guess) keeps giving me out of memory errors. I use my phone for nothing but voice calls, text messages, and emails. I have cleared all call logs, deleted all emails except 5 (they have no attachments) and cleared deleted emails, and deleted all text messages. I went into the file manager and made sure there are no images, videos, etc. on the phone. I set up options to use the memory card for emails. I've turned the phone off, waited 15 or 20 seconds, and turned it back on.
    I did all this a couple of weeks ago. Ever since then, every couple of days I get these out of memory errors, and the phone memory shows 119 MB used and something like 300 KB free. I'm at the point where every few emails that come in, I have to delete all emails, erase all text messages, etc., or the phone is just too low on memory to operate.
    What else can I try? I don't have any additional applications installed. I've even tried to delete built-in apps such as RealPlayer but it didn't seem like that was possible.
    This is getting really annoying. Please help!

    @EricHarmon
    Although all user data would be deleted, have you tried resetting the device to its "out of box" state by keying in *#7370# followed by 12345 (the default Nokia lock code, unless altered by the user)?
    Happy to have helped forum with a Support Ratio = 42.5

  • Large Pdf using XML XSL - Out of Memory Error

    Hi Friends.
    I am trying to generate a PDF from XML, XSL and FO in Java. It works fine if the PDF to be generated is small.
    But if the PDF to be generated is big, then it throws an "Out of Memory" error. Can someone please give me some pointers about the possible reasons for this error? Thanks for your help.
    RM
    Code:
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    import org.xml.sax.InputSource;
    import org.xml.sax.XMLReader;
    import org.apache.fop.apps.Driver;
    import org.apache.fop.apps.Version;
    import org.apache.fop.apps.XSLTInputHandler;
    import org.apache.fop.messaging.MessageHandler;
    import org.apache.avalon.framework.logger.ConsoleLogger;
    import org.apache.avalon.framework.logger.Logger;

    public class PdfServlet extends HttpServlet {
        public static final String FO_REQUEST_PARAM = "fo";
        public static final String XML_REQUEST_PARAM = "xml";
        public static final String XSL_REQUEST_PARAM = "xsl";

        Logger log = null;
        Com_BUtil myBu = new Com_BUtil(); // project-specific helper class

        public void doGet(HttpServletRequest request,
                          HttpServletResponse response) throws ServletException {
            if (log == null) {
                log = new ConsoleLogger(ConsoleLogger.LEVEL_WARN);
                MessageHandler.setScreenLogger(log);
            }
            try {
                String xmlParam = myBu.getConfigVal("filePath") + "/"
                        + request.getParameter(XML_REQUEST_PARAM);
                String xslParam = myBu.SERVERROOT + "/jsp/servlet/"
                        + request.getParameter(XSL_REQUEST_PARAM) + ".xsl";
                if ((xmlParam != null) && (xslParam != null)) {
                    XSLTInputHandler input =
                            new XSLTInputHandler(new File(xmlParam), new File(xslParam));
                    renderXML(input, response);
                } else {
                    PrintWriter out = response.getWriter();
                    out.println("<html><head><title>Error</title></head>\n"
                            + "<body><h1>PdfServlet Error</h1><h3>No 'xml'/'xsl' "
                            + "request params given.</h3></body></html>");
                }
            } catch (ServletException ex) {
                throw ex;
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        public void renderXML(XSLTInputHandler input,
                              HttpServletResponse response) throws ServletException {
            try {
                // The whole PDF is rendered into this in-memory buffer first,
                // then copied into the response -- two full copies on the heap.
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                response.setContentType("application/pdf");
                Driver driver = new Driver();
                driver.setLogger(log);
                driver.setRenderer(Driver.RENDER_PDF);
                driver.setOutputStream(out);
                driver.render(input.getParser(), input.getInputSource());
                byte[] content = out.toByteArray();
                response.setContentLength(content.length);
                response.getOutputStream().write(content);
                response.getOutputStream().flush();
            } catch (Exception ex) {
                throw new ServletException(ex);
            }
        }

        /**
         * Creates a SAX parser, using the value of org.xml.sax.parser,
         * defaulting to org.apache.xerces.parsers.SAXParser.
         * @return the created SAX parser
         */
        static XMLReader createParser() throws ServletException {
            String parserClassName = System.getProperty("org.xml.sax.parser");
            if (parserClassName == null) {
                parserClassName = "org.apache.xerces.parsers.SAXParser";
            }
            try {
                return (XMLReader) Class.forName(parserClassName).newInstance();
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }
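    One structural note on the servlet above: renderXML builds the entire PDF in a ByteArrayOutputStream and then copies it into the response, so the heap must briefly hold the whole document twice on top of FOP's own working set. Below is a sketch of a leaner drop-in variant against the same pre-1.0 Driver API, assuming your clients can live without a Content-Length header:
    // Streams the PDF straight to the client instead of buffering it in memory.
    // Note: FOP still builds its formatting-object tree in memory, so very large
    // documents can still OOM -- this only removes the byte[] copies.
    public void renderXMLStreaming(XSLTInputHandler input,
                                   HttpServletResponse response) throws ServletException {
        try {
            response.setContentType("application/pdf");
            Driver driver = new Driver();
            driver.setLogger(log);
            driver.setRenderer(Driver.RENDER_PDF);
            driver.setOutputStream(response.getOutputStream()); // no intermediate buffer
            driver.render(input.getParser(), input.getInputSource());
            response.getOutputStream().flush();
        } catch (Exception ex) {
            throw new ServletException(ex);
        }
    }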

    Hi,
    I did try that initially. After executing the command I get this message.
    C:\>java -Xms128M -Xmx256M
    Usage: java [-options] class [args...]
    (to execute a class)
    or java -jar [-options] jarfile [args...]
    (to execute a jar file)
    where options include:
    -cp -classpath <directories and zip/jar files separated by ;>
    set search path for application classes and resources
    -D<name>=<value>
    set a system property
    -verbose[:class|gc|jni]
    enable verbose output
    -version print product version and exit
    -showversion print product version and continue
    -? -help print this help message
    -X print help on non-standard options
    Thanks for your help.
    RM
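    That usage listing just means the JVM was given options but nothing to run: -Xms/-Xmx must be followed by a class or jar, e.g. java -Xms128M -Xmx256M SomeMainClass (SomeMainClass being a stand-in name). And since PdfServlet runs inside a servlet container, the flags belong on the container's JVM rather than on an ad-hoc java call; for Tomcat, for example, something like:
    # Hypothetical: give the servlet container a bigger heap (Tomcat shown).
    export CATALINA_OPTS="-Xms128M -Xmx256M"
    Other containers have an equivalent JVM-options setting in their startup scripts or admin console.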

  • Scaling images and Out of memory error

    Hi all,
    Does anyone know why this code throws an out of memory error?
    It loads an image (2048x1166 pixels) and stores it in bufi1 (a BufferedImage). After that:
    1. Rescale bufi1 to bufi13A (x3).
    2. Rescale bufi1 to bufi12 (x2).
    3. Rescale bufi1 to bufi13B (x3).
    At 3, the code throws an OOME. Why?
    Thanks in advance!
    import java.io.*;
    import javax.imageio.*;
    import java.awt.geom.*;
    import java.awt.image.*;

    public class TestScalePercent {
      public static void main(String[] args) {
        TestScalePercent tsp = new TestScalePercent();
        BufferedImage bufi1 = null;
        try {
          bufi1 = ImageIO.read(new File("foo.png")); // 2048x1166 pixels
        } catch (Exception e) {
          e.printStackTrace();
        }
        BufferedImage bufi13A = tsp.scale(bufi1, 3, 3); // --> OK
        BufferedImage bufi12  = tsp.scale(bufi1, 2, 2); // --> OK
        BufferedImage bufi13B = tsp.scale(bufi1, 3, 3); // --> OOM error!
      }

      public BufferedImage scale(BufferedImage bufiSource, double scaleX, double scaleY) {
        AffineTransform tx = new AffineTransform();
        tx.scale(scaleX, scaleY);
        AffineTransformOp op = new AffineTransformOp(tx, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
        BufferedImage bufiop = op.filter(bufiSource, null); // --> creates the OOM error...
        // Note: filtering a second time scales the already-scaled image again,
        // so a "3x" call actually allocates a 3x image AND a 9x image.
        return op.filter(bufiop, null);
      }
    }

    How much memory does your machine have?
    That image is quite large. At 32-bit color the original image takes up roughly 9.5 MB of memory by itself, and each 3x-scaled copy is nine times that, around 86 MB. Then you are trying to create three other versions of it.
    It isn't lying to you; it is indeed probably running out of memory to hold the images.

    OK. Now I'm wondering: is it possible to free memory between bufi13A - bufi12, and bufi12 - bufi13B? I've tried to invoke the garbage collector but nothing happens...
    Thanks!
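    Two things help here. First, scale() above calls op.filter twice, so a "3x" call actually allocates both a 3x and a 9x copy (at 2048x1166 and 32-bit color, roughly 86 MB and 774 MB). Second, dropping the reference to a result you no longer need before the next scale lets the GC reclaim it. A minimal sketch along those lines (the output file names are made up for illustration):
    import java.io.*;
    import javax.imageio.*;
    import java.awt.geom.*;
    import java.awt.image.*;

    public class TestScaleSequential {
      // Single-pass scale: one filter call, one scaled copy.
      static BufferedImage scaleOnce(BufferedImage src, double sx, double sy) {
        AffineTransform tx = new AffineTransform();
        tx.scale(sx, sy);
        AffineTransformOp op =
            new AffineTransformOp(tx, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
        return op.filter(src, null);
      }

      public static void main(String[] args) throws IOException {
        BufferedImage bufi1 = ImageIO.read(new File("foo.png")); // 2048x1166

        BufferedImage scaled = scaleOnce(bufi1, 3, 3);
        ImageIO.write(scaled, "png", new File("foo_x3a.png"));
        scaled = null; // drop the ~86 MB raster so the GC can reclaim it

        scaled = scaleOnce(bufi1, 2, 2);
        ImageIO.write(scaled, "png", new File("foo_x2.png"));
        scaled = null;

        scaled = scaleOnce(bufi1, 3, 3); // now has headroom
        ImageIO.write(scaled, "png", new File("foo_x3b.png"));
      }
    }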

  • Out Of Memory Error: unable to create new native thread

    Hi Experts,
    The details are given below -
    1. Sun Java System Application Server 7
    2. Java 1.4.2_05
    3. Solaris 9 (OS)
    4. RAM 8 GB
    There are 3-4 applications deployed on this Sun Server 7. Sometimes we get an "Out Of Memory" error while displaying a JSP. Then we restart the Sun Server, and the problem is resolved for some time.
    This is all happening in a production environment. We cannot restart the Sun Server again and again in production.
    We have also set the Java parameters to -Xms3584M and -Xmx3584M, i.e., around 3.5 GB.
    If we change this parameter to anything less or more than 3584M, our site goes down.
    Please help us out as soon as possible.

    How do you expect anyone to give you a sensible answer to this? What server are you using to execute the JSPs? Tomcat? Is the code in the JSP causing the out of memory, or is it a server-related issue (unlikely...)?
    Drill down to the core of the problem before posting...
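    One concrete pointer for this particular message: "unable to create new native thread" is usually not heap exhaustion. Thread stacks live outside the Java heap, and if this is a 32-bit JVM (likely, given the ~3.5 GB ceiling you're hitting), a 3584M heap leaves almost no address space for stacks, so thread creation fails. The usual levers, as a hedged starting point only (the exact numbers need load testing on your app):
    # Illustrative only: shrink the heap and/or the per-thread stack size
    # so native memory has room for thread stacks.
    -Xms2048m -Xmx2048m -Xss256k
    If the server really needs ~3.5 GB of heap, a 64-bit JVM is the cleaner way out.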

  • ORA-27102: out of memory error on Solaris 10 SPARC zone install of Oracle 10

    All,
    I'm stuck! I've tried for days now and can't get this working.
    I'm getting the classic "ORA-27102: out of memory" error. However, my memory settings seem fine.
    I'm running Solaris 10 in a zone, while installing Oracle 10.2.0.1.
    I've changed the max-memory to 6 GB, but I still get this error message.
    Could it be possible that this error message means something else entirely?
    Thank you for your assistance!
    Anne

    Hi V,
    Thanks for the response.
    I fixed the problem. Turns out it was because the physical memory for the box is 16 GB, but my max-shm-memory was only at 6 GB. I upped it to 8 GB, and everything worked.
    I'm sure you were going to tell me to do just that! I found another post that explained it.
    Thanks!
    Anne
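    For anyone who hits this later: on Solaris 10 the shared memory cap is a resource control on the project the database runs under. A sketch, assuming Oracle runs in a project named user.oracle (the project name and the 8 GB value are examples, not prescriptions):
    # Persistent change to the project definition:
    projmod -sK "project.max-shm-memory=(privileged,8G,deny)" user.oracle
    # Or adjust a live project without editing /etc/project:
    prctl -n project.max-shm-memory -v 8gb -r -i project user.oracle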

  • Another version of 'out of memory' error

    Hi all
    I have a colleague who is getting the following error message.
    As discussed...when I attempt to render a clip on the timeline, once it
    gets to the end of the rendering process and attempts to play the clip, an
    'out of memory' error message box appears on the screen.
    Then when I click on 'OK' to close this box, the canvas window turns to red
    and the following message displays in the canvas window...'Out of memory.
    Free up memory and window will redraw.'
    He is using HDV footage captured with the "HDV" codec [not the intermediate one], and editing it into HDV 1080i50 sequences.
    He has a G5 DP 2 GHz machine running Tiger 10.4.2 and 2 GB of RAM.
    He has approx 80 GB free space on the Mac HD and approx 400 GB on an external LaCie HD. He is running only FCP HD 5 at any one time.
    I have sourced some previous posts which speak of corrupt graphics, clips, and sequences, and trashing FCP prefs.
    Does anyone have any other suggestions for him?
    [He is quite new to Macs, and FCP especially.]
    I am going to send him an email to subscribe and create an account for this forum; his name is AGebhardt.

    Hello,
    I had the same issue last night, when a render (only 15 min, so not THAT big) completed and would not play, and the canvas turned red and said I was out of memory.
    This is different from the general 'Out of memory' error warning, which I have seen happen. Some of the answers in linked threads seem to be pointing to this other situation.
    I have plenty of space and plenty of RAM and was only running iTunes. I quit iTunes and it worked, almost to my disappointment, because in the past I have had many apps working at a time with no problems.
    I would be pretty bummed if I could only have FCP open all of a sudden.
    I will try going through my clips to check for corruption as suggested, just to be on the safe side, but I have a question:
    What good is it to throw out your render files if you have already checked that you have enough storage space? I can see the good if a file is corrupt, but with every project a new render folder is created, and unless there is a limit on these folders that I'm unaware of, I can't see the sense in trashing the folder.
    Am I missing something?
    Thanks,
    Jesse
    733 G4    

  • Out of memory error - large project

    I'm consulting on a feature doc edit, and the primary editor (Avid guy) is having serious problems accessing anything from the original project.
    It's an hour-and-15-minute show, with probably close to 100 hours of footage.
    The box is a dual 2.3 GHz G5 with 1.5 GB of RAM, and the media is on two G-Tech drives: a G-RAID and a G-Drive. There is plenty of headroom on both (now), and the system drive is brand new, having been replaced after the original died; there's nothing loaded on it but FC Studio. The FCP version is 5.1.4. The project file is well over 100 MB.
    We started getting Out of Memory errors with this large project, and I checked all of the usual suspects: CMYK graphics, hard drive space, sufficient RAM... all checked out okay, except possibly the less-than-ideal amount of RAM.
    I copied the important sequences and a couple of select bins to a new project, and everything seems workable for now. The project is still 90 MB, and I've suggested breaking it up into separate projects and working on it as reels, but editing and trimming work efficiently at the moment. However, the other editor has gotten to a point where he can't even open bins in the old, big project. He keeps getting the OOM error whenever he tries to do anything.
    I have no similar problems opening the same project on my G5, which is essentially identical except that I have 2.5 GB RAM (1 GB extra). Can this difference in RAM really make this big a deal? Is there something else I'm missing? Why can't this editor access even bins from the other project?
    G4   Mac OS X (10.2.x)  

    Shane's spot on.
    What I often do with large projects is pare down, just as you have done. But 90 MB out of 100 is not a big pare-down by any stretch. In the new copy, throw away EVERYTHING that's outdated: old sequences are the big culprit. Also toss any render files and re-render.
    Remember that, to be effective, FCP keeps EVERYTHING in RAM, so that it can instantly access anything in your project. The more there is to keep track of, the slower you get.

  • Out of Memory Error, Buffer Sizes, and Loop Rates on RT and FPGA

    I'm attempting to use an FPGA to collect data and then transfer that data to an RT system via a DMA FIFO, and then stream it from my RT system over to my host PC. I have the basics working, but I'm having two problems...
    The first is more of a nuisance. I keep receiving an Out of Memory error. This is a more recent development, but unfortunately I haven't been able to track down what is causing it, and it seems to be independent of my FIFO sizes. While not my main concern, if someone was able to figure out why, I would really appreciate it.
    Second, I'm struggling with overflows. My FPGA is running on a 100 MHz clock and my RT system simply cannot seem to keep up. I'm really only looking at recording 4 seconds of data, but it seems that no matter what I do I can't escape the problem without making my FIFO size huge and running out of memory (this was before I always got the Out of Memory error). Is there some kind of tip or trick I'm missing? I know I can set my FPGA to a slower clock, but the clock speed is an important aspect of my application.
    I've attached a simplified version of my code that contains both problems. Thanks in advance for taking a look and helping me out; I appreciate any feedback!

    David-A wrote:
    The 7965 can't stream to a host controller faster than 800 MB/s. If you need 1.6 GB/s of streaming, you'll need to get a 797x FPGA module.
    I believe the application calls for 1.6 GB over 4 s, so 400 MB/s, which should be within the capabilities of the 7965.
    I was going to say something similar about streaming over Ethernet. I agree that it's going to be necessary to find a way to buffer most if not all of the data between the RT target and the FPGA target. There may be some benefit to starting to send data over the network, but the buffer on the host is still going to need to be quite large. Making use of the BRAMs and RAM on the 7965 is an interesting possibility.
    As a more out-there idea, what about replacing the disk in your 8133 with an SSD? I'm not entirely sure what kind of SATA connection is in the 8133, and you'd have to experiment to be sure, but I think 400 MB/s or close to that should be quite possible. You could write all the data to disk and then send it over the network from disk once the acquisition is complete. http://digital.ni.com/public.nsf/allkb/F3090563C2FC9DC686256CCD007451A8 has some information on using SSDs with PXI controllers.
    Sebastian

  • Thread Count and Out Of Memory Error

    Hi,
    I was wondering if setting the ThreadPoolSize to a value which is too low can
    cause an out of memory error. My thought is that when it is time for Weblogic
    to begin garbage collection, if it does not get a thread fast enough it is possible
    that memory would be used up before the garbage collection takes place.
    I am asking this because I am trying to track down the cause of an out-of-memory
    occurrence, while at the same time I believe I need to raise the ThreadPoolSize.
    Thanks,
    Mark

    Oops ...

    > I was wondering if setting the ThreadPoolSize to a value which is too
    > low can cause an out of memory error.

    No, but the opposite can be true.

    > My thought is that when it is time for Weblogic to begin garbage
    > collection, if it does not get a thread fast enough it is possible
    > that memory would be used up before the garbage collection takes place.

    Weblogic doesn't do GC ... that's the JVM, and if it needs a thread it will
    not be using one of Weblogic's execute threads.

    > I am asking this because I am trying to track down the cause of an
    > out-of-memory occurrence

    It could be configuration (new vs. old heap, for example), but it is probably
    just data that you are holding on to or native stuff (e.g. type 2 JDBC
    driver objects) that you aren't cleaning up correctly.

    > while at the same time I believe I need to raise the ThreadPoolSize.

    Wait until you fix the memory issue.

    Peace,
    Cameron Purdy
    Tangosol, Inc.
    Clustering Weblogic? You're either using Coherence, or you should be!
    Download a Tangosol Coherence eval today at http://www.tangosol.com/

    "Mark Glatzer" <[email protected]> wrote in message
    news:[email protected]..

  • P6 Out of Memory Error in Citrix Environment

    We are using P6 V7 in a Citrix Environment. One of our users is currently experiencing an "Out of Memory" error when he is linking activities. He has also experienced the problem when deleting or copying WBS Elements (containing up to 600 activities).
    The only time that I have seen this error is when trying to import more than 2000 or so Activities from XLS at a time.
    The project that he is working in currently has around 10000 activities.
    Has anyone else had this problem? And, if so, found a workable solution?

    Does it only affect the one user? Can somebody else perform the same functions without error? If so, his user preferences are probably corrupt. Reset them using the following SQL:
    UPDATE userdata SET user_data = NULL WHERE topic_name = 'pm_settings' AND user_id IN (SELECT user_id FROM users WHERE user_name = '<username>');
    where <username> is the user's P6 username.
