Memory errors after 3 hours recording

I need to record data on 36 channels over a period of three days. I would like the data in 30-second chunks. I am having two problems with this. I have set up my log files to record in 30-second chunks, but after 3-4 hours of recording I start getting memory errors. It doesn't matter whether I am recording 30 seconds every 5 minutes or continuous data in 30-second chunks. Also, it doesn't seem to matter whether I store the data on a relatively small (13 GB) internal solid-state drive or a much larger (2 TB) external hard drive. Is there some kind of memory buildup that is overflowing after 3 hours?
As a related question, when I am logging to the external hard drive, the recording status in the recording options tab still shows the disk information for my internal hard drive.
Also, if I'm allowed another question, I am trying to use a Save to ASCII function to produce a backup to a different location. Is there any way I can alter the file size so the files are larger than the amount of data to be read in?

Hello Roger,
Quick question: what sampling rate are you using, and on what card? Also, what are your start and stop conditions in SE? Did you try continuous logging? Does your system slow down after a while when logging data? I'm not quite sure why this is happening; I believe the RAM or hard disk is full after 3 hours of data logging.
-lab

Similar Messages

  • Photoshop keeps on getting out of memory error after installing Premier Pro

    I just upgraded my CS to CC. Yesterday I installed Photoshop and did my work without any problem, but today, after installing Premiere and After Effects, I keep getting an out-of-memory error while I'm working, even though I don't have any other application running except Photoshop alone. The file I'm working on is a small file, an iPhone Plus-size interface. Basically, I can open the file and add a blur effect, but when I try to type text Photoshop tells me that my system is out of memory. Restarting Photoshop gives the same problem; restarting my computer gives the same problem, i.e., I do one thing and the next gives "not enough memory."
    I don't think my system is slow, as it is a workstation with dual processors and 12 GB of RAM, Windows 7 64-bit, and 1 GB of dedicated memory on the graphics card.
    I uninstalled Premiere and After Effects and suddenly the problem went away. Photoshop works as per normal. I didn't have the time to reinstall Premiere again but will try to do it tonight or tomorrow.
    Anyone experience such problem before?

    When you get that error, leave the error showing and use something that can show you how much free disk space is left on Photoshop's scratch disk. It may be a problem with scratch storage space, not RAM storage space. I see Photoshop use much more scratch space than RAM: I have seen Photoshop using less than 10 GB of RAM on my machine, leaving 30 GB of free RAM on my system unused, while using over 100 GB of scratch space.
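    As a side note, that free-space check can be scripted rather than eyeballed; a minimal Java sketch (the `"."` path is a placeholder for wherever the scratch disk is mounted, not anything from this thread):

    ```java
    import java.io.File;

    public class ScratchSpaceCheck {
        // Report the usable space on the volume containing the given path.
        static long usableBytes(String path) {
            return new File(path).getUsableSpace();
        }

        public static void main(String[] args) {
            long free = usableBytes(".");
            System.out.println("Usable space: " + (free / (1024 * 1024)) + " MB");
        }
    }
    ```

    Point `usableBytes` at the scratch volume's root to see how close it is to full while the error dialog is still up.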

  • Getting virtual memory error when fetching records from database

    HI,
    I am using Oracle as the database with the Oracle JDBC driver. I have simple code with a select statement, but the problem is that the ResultSet dies while fetching the data, i.e. 5,50,000 rows, and it gives me a memory error because it's storing it all in memory. One approach I have found in the old threads is the batch method, fetching a few rows at a time, but can you tell me how to implement it in my code? I am pasting my code below.
    The overall functionality of my code is that it retrieves data from the database and generates an XML file that is validated against a DTD.
    //My Code
    public class Invoicef3 implements ExtractF3 {
         final String queryString =
              "select * from hsbc_f3_statement "
                   + "order by bill_no, duplicate, invoice_address1, invoice_address2, "
                   + "invoice_address3, invoice_address4, invoice_address5, invoice_address6, "
                   + "main_section, order_1, page, section, product_category, "
                   + "sub_sect_1, order_2, sub_sect_2, child_product, sub_sect_3, "
                   + "account, line, entry_date, currency, tier";

         public ArrayList process() {
              Connection con = null;
              Statement stmt = null;
              ResultSet rset = null;
              ArrayList arr1 = null;
              try {
                   con = ConnectionManager.getConnection();
                   stmt = con.createStatement();
                   rset = stmt.executeQuery(queryString);
                   arr1 = new ArrayList();
                   while (rset.next()) {
                        arr1.add(
                             new F3StatementExtract(
                                  rset.getString(1),
                                  rset.getInt(2),
                                  rset.getString(3),
                                  rset.getInt(4),
                                  rset.getInt(5),
                                  rset.getString(6),
                                  rset.getInt(7),
                                  rset.getString(8),
                                  rset.getInt(9),
                                  rset.getString(10),
                                  rset.getInt(11)));
                   }
                   rset.close();
                   stmt.close();
              } catch (SQLException e) {
                   e.printStackTrace();
              } finally {
                   ConnectionManager.close(rset);
                   ConnectionManager.close(stmt);
                   ConnectionManager.close(con);
              }
              return arr1;
         }
    }

    The problem is that you are fetching and processing all the rows for the query, which the VM cannot handle given the heap space available. The points you could think over are:
    * Allocate more heap memory (this would help only to a limited extent)
    * Try to process only a few records at a time instead of all of them (there is actually no need to process all the records at a time. Try processing records in lots of say 1000)
    * Avoid selecting all the columns [SELECT *] from the table, if all of them are not going to be used.
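    The second suggestion, processing records in lots, can be sketched generically. In the sketch below an `Iterator` stands in for the ResultSet so the idea is testable without a database (in real JDBC code, `Statement.setFetchSize` additionally hints the driver to stream rows instead of buffering them all); all names here are illustrative, not from the thread:

    ```java
    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;
    import java.util.function.Consumer;

    public class LotProcessor {
        // Drain the row source in lots of `lotSize`, handing each lot to
        // `handler` and discarding it before reading the next, so at most
        // one lot is ever held in memory.
        static <T> int processInLots(Iterator<T> rows, int lotSize, Consumer<List<T>> handler) {
            int lots = 0;
            List<T> lot = new ArrayList<>(lotSize);
            while (rows.hasNext()) {
                lot.add(rows.next());
                if (lot.size() == lotSize || !rows.hasNext()) {
                    handler.accept(lot);              // process this lot (e.g. write XML)
                    lot = new ArrayList<>(lotSize);   // release the previous lot
                    lots++;
                }
            }
            return lots;
        }
    }
    ```

    The handler would write each lot straight out to the XML file, so the full 5,50,000-row result never sits in one ArrayList.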
    One slight change I have made in the code is that I am using two queries now: one fetches all the bills, and the second fetches all the data for the relevant bill.
    //My Code
    public class Invoicef3 implements ExtractF3 {
         /* Query to get the distinct bill numbers */
         final String queryString1 =
              "select distinct(bill_no) from hsbc_print_bills";
         /* Query to get the statement details for one bill number */
         final String queryString =
              "select * from hsbc_f3_statement where bill_no='";

         public ArrayList process() {
              Connection con = null;
              Statement stmt = null;
              ResultSet rset = null;
              ArrayList arr1 = null;
              ArrayList arr2 = null;
              try {
                   con = ConnectionManager.getConnection();
                   stmt = con.createStatement();
                   rset = stmt.executeQuery(queryString1);
                   arr1 = new ArrayList();
                   while (rset.next()) {
                        arr1.add(new F3BillExtract(rset.getString(1))); // collecting the Bill_No's
                   }
                   System.out.print(arr1.size());
                   rset.close();
                   stmt.close();
                   for (int i = 0; i < arr1.size(); i++) {
                        stmt = con.createStatement();
                        rset =
                             stmt.executeQuery(
                                  queryString
                                       + ((F3BillExtract) arr1.get(i)).getBill_No()
                                       + "'");
                        arr2 = new ArrayList();
                        /* Fetching the statement details of the particular Bill_No */
                        while (rset.next()) {
                             arr2.add(
                                  new F3StatementExtract(
                                       rset.getString(1),
                                       rset.getInt(2),
                                       rset.getString(3),
                                       rset.getInt(4),
                                       rset.getInt(5),
                                       rset.getString(6),
                                       rset.getInt(7),
                                       rset.getString(8),
                                       rset.getInt(9),
                                       rset.getString(10),
                                       rset.getInt(11),
                                       rset.getString(12),
                                       rset.getFloat(13),
                                       rset.getDate(14),
                                       rset.getString(15),
                                       rset.getInt(16),
                                       rset.getString(17),
                                       rset.getString(18),
                                       rset.getString(19),
                                       rset.getString(20),
                                       rset.getString(21),
                                       rset.getString(22),
                                       rset.getString(23),
                                       rset.getString(24),
                                       rset.getString(25),
                                       rset.getString(26),
                                       rset.getString(27),
                                       rset.getString(28),
                                       rset.getString(29),
                                       rset.getString(30),
                                       rset.getDate(31),
                                       rset.getDate(32),
                                       rset.getDate(33),
                                       rset.getDate(34),
                                       rset.getString(35),
                                       rset.getString(36),
                                       rset.getString(37),
                                       rset.getString(38),
                                       rset.getString(39),
                                       rset.getString(40),
                                       rset.getFloat(41),
                                       rset.getFloat(42),
                                       rset.getFloat(43),
                                       rset.getInt(44),
                                       rset.getFloat(45),
                                       rset.getString(46),
                                       rset.getString(47)));
                        }
                        rset.close();
                        stmt.close();
                        ((F3BillExtract) arr1.get(i)).setArr(arr2);
                   }
              } catch (SQLException e) {
                   e.printStackTrace();
              } finally {
                   ConnectionManager.close(rset);
                   ConnectionManager.close(stmt);
                   ConnectionManager.close(con);
              }
              return arr1;
         }
    }

  • Out of Memory error after upgrading to Reader X

    We have several PDF documents that work fine on Reader 9.  But after upgrading to Reader X, the files will not open.  Instead, they report an Out Of Memory error and a blank document opens instead.  Removing Reader X and re-installing Reader 9 corrects the issue. This has been reproduced on 3 different PCs running both Windows XP and Windows 7.
    Any suggestions?

    Just to throw in my 2 cents... Adobe has known about the out-of-memory bug at least since 01/12/2011, because that is when I reported the problem to them. I even had an escalated support case #181995624 and bug #2800823. Our problem comes from an EFI Fiery E7000 attached to our Lanier LD460c copier. Any PDFs made from that copier will not open in Acrobat X, although they open just fine in lower versions of Acrobat. Our only workaround is to keep Acrobat 9 on office computers; alternatively, you can open the offending PDF in Acrobat 9 or earlier and print it to PDF, and then it will open in Acrobat X! They acknowledged that this was a bug; see my email chain below. This was the last update I received from Adobe. Very frustrating...
    From:
    Sent: Wednesday, February 09, 2011 9:12 AM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Hi Phil,
    We do not have this information or a time estimate from our Engineering team yet.
    Regards,
    Neeraj
    From:
    Sent: Monday, February 07, 2011 8:19 PM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Next major release as in the next patch for Acrobat X, or as in Acrobat 11?
    From:
    Sent: Saturday, February 05, 2011 4:31 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    You can get back to us with the Bug Number provided to you earlier, based on which we will give you the update from our Engineering team. However, the update for now is that it has been decided to fix this in our next major release.
    Regards,
    Neeraj
    From:
    Sent: Thursday, February 03, 2011 1:33 AM
    To:
    Subject: RE: Case 181995624; Unable to open PDF
    Can you send me a link to where I can find information on the bug?
    From:
    Sent: Tuesday, February 01, 2011 10:14 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    Hope you are doing well.
    I have researched your issue and found that it is a BUG in Acrobat X. I have logged it on your behalf so that our Engineering team can have a look at it. Please use the Bug Number #2800823 for any update on this in the future. I am closing this case on your behalf.
    Have a nice day.
    Regards,
    Neeraj
    From:
    Sent: Tuesday, February 01, 2011 12:22 AM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    Any updates on this case?
    From:
    Sent: Friday, January 14, 2011 2:03 PM
    To:
    Cc:
    Subject: RE: Case 181995624; Unable to open PDF
    The EFI Fiery E-7000 Controller version is 1.2.0 and it handles the scanning and printing functionality of our Lanier LD160c copier.  I have attached two sample files.  One is a 1 page scan from our copier.  The other is a combined pdf that I just created in Acrobat 9.  The first two pages of the combined pdf consists of a webpage that I printed using Acrobat 9 and then the scan from the copier is on the 3rd page.  In Acrobat X, once you get to the 3rd page you will receive the Out of Memory error.  It will open in previous versions of Acrobat just fine though.
    From:
    Sent: Friday, January 14, 2011 11:52 AM
    To:
    Cc:
    Subject: Case 181995624; Unable to open PDF
    Hi Phil,
    Thanks for the information.
    I have the PDF file provided by you and am able to reproduce the behavior. I tried to call you at 214-303-1500 but got voicemail.
    Please let me know when you will be available so that I could call you and continue further with this issue.
    Regards,
    Neeraj
    From:
    Sent: Thursday, January 13, 2011 6:57 AM
    To:
    Cc:
    Subject: Re: Case 181995624; Unable to open PDF
    It is a walk up copier and we scan to email.  The EFI Fiery controller E7000 handles pdf conversion for the copier, but yes it has the latest firmware.  The bottom line is that we have 3 or 4 years worth of pdfs created from that copier rendered useless by Acrobat X.  They open fine in previous versions of Acrobat.  Did you get the test pdf file when this case was created?
    -- Sent from my Palm Pre
    On Jan 12, 2011 6:12 PM, Acrobat Support <[email protected]> wrote:
    Hi Philip,
    Thank you for choosing Adobe Support. We have noted your concern: you are facing an issue opening PDF files in Acrobat X that were created from a Lanier (Ricoh) LD160c copier. A technical support case, 181995624, has been created for your reference. In order to assist you further, we would like the following information:
    ·         Are you using the latest scanner driver ?
    ·         What is the exact scanning workflow ?
    Regards
    Acrobat Support

  • Acrobat XI Pro "Out of Memory" error after Office 2010 install

    Good Afternoon,
    We recently pushed Office 2010 to our users and are now getting reports of previous installs of Adobe Acrobat XI Pro no longer working but throwing "Out of Memory" errors.
    We are in a Windows XP environment. All machines are HP 8440p/6930p/6910 with the same Service pack level (3) and all up to date on security patches.
    All machines are running Office 2010 SP1.
    All machines have 2GB or 4GB of RAM (Only 3.25GB recognized as we are a 32bit OS environment).
    All machines have adequate free space (ranging from 50gb to 200gb of free space).
    All machines are set to 4096mb initial page file size with 8192mb maximum page file size.
    All machines with Acrobat XI Pro *DO NOT* have Reader XI installed alongside. If Reader is installed, it is Reader 10.1 or higher.
    The following troubleshooting steps have been taken:
    Verify page file size (4096mb - 8192mb).
    Deleted local user and Windows temp files (%temp% and c:\WINDOWS\Temp both emptied).
    Repair on Adobe Acrobat XI Pro install. No change.
    Uninstall Acrobat Pro XI, reboot, re-install. No change.
    Uninstall Acrobat Pro XI Pro along with *ALL* other Adobe applications presently installed (Flash Player, Air), delete all Adobe folders and files found in a full search of the C drive, delete all orphaned Registry entries for all Adobe products, re-empty all temp folders, reboot.
    Re-install Adobe Acrobat XI Pro. No change.
    Disable enhanced security in Acrobat XI Pro. No change.
    Renamed Acrobat XI's plug_ins folder to plug_ins.old.
    You *can* get Acrobat to open once this is done but when you attempt to edit a file or enter data into a form, you get the message, "The "Updater" plug-in has been removed. Please re-install Acrobat to continue viewing the current file."
    A repair on the Office 2010 install and re-installing Office 2010 also had no effect.
    At this point, short of re-imaging the machines (which is *not* an option), we are stumped.
    We have not yet tried rolling back a user to Office 2007 as the upgrade initiative is enterprise-wide and rolling back would not be considered a solution.
    Anyone have any ideas beyond what has been tried so far?

    As mentioned, the TEMP folder is typically the problem. MS limits the size of this folder and you have two choices: 1. empty it, or 2. increase the size limit. I am not positive this is the issue, but it does crop up at times. It does not matter how big your hard drive is; it is a matter of the amount of space that MS has allocated for virtual memory. I am surprised that there is an issue with 64GB of RAM, but MS is real good at letting you know you can't have it all for use because you might want to open up something else. That is why a lot of big packages turn off some of the limits of Windows or use Linux.

  • Memory error after C042 errors on users database

    Hi,
    We have recently been getting memory error messages on the POA and on the client.
    The errors seem to follow several C042 errors:
    The database function 53 reported error [C042] on user4mn.db
    Error: Memory error. Memory function failure [8101] User:
    I cannot find any process that is accessing the user database at that time (GWcheck, Backup)
    The user also gets a memory error and has to restart the client.
    any ideas?

    On 9/7/2011 2:16 AM, pdjongh wrote:
    Have you run a full contents and structure check on the user?

  • Memory errors after Array is created SATA drive

    Hello,
    I am trying to set up a 655 Max board with a SATA drive. Once I define an array, I get memory errors. I am testing this using the Microsoft memory test boot CD. As soon as I delete the array from the drive, the memory test works successfully. I am using FastTrak with a Promise 376 driver.
    Brian

    I have been searching the forum for links between RAID arrays and memory issues. I noticed some people have random stop errors when loading XP (which is what I was also seeing).
    Why would the Promise RAID controller have an effect on the system RAM like that? I noticed that as I ran the Microsoft memory test, the screen would start to get all garbled, and every once in a while information from the FastTrak screen would show up. Hmmmm....

  • Thinkpad Yoga memory error after bios update

    I updated the BIOS to version 1.25 for the Windows 10 installation. Now I have errors and sometimes a blue screen. The Lenovo Solution Center test gives a memory error: WME800800-RK7CBK

    Dear Luca,
    Welcome to the Lenovo community.
    Could you please send us your machine's serial number?
    Thanks for using the Lenovo community.

  • T43 1875 upper memory ERROR after firmware update

    Hi, I got a new HD (HTS541612J9AT00) and ran a bootable CD with the latest firmware update file to correct Error 2010. However, after rebooting the system I get the following message (and the system freezes):
    Starting Caldera DR-DOS
    ...(Copyright  bla..bla ...bla) 
    EMM386: Cannot find an unused 64 kb range reserve of upper memory to use for an EMS page frame.
    Any thoughts on how to solve this problem with the upper memory?

    I'm having exactly the same problem with my ThinkPad.
    Nobody around with any thoughts?

  • Weblogic generating out of memory error when commandline run doesn't

    Hello,
    I am just beginning to use WebLogic. I have a jar file which runs fine if run from the command line with the options:
    "C:\Program Files (x86)\Java\jre6\bin\java" -jar -Xms1024m -Xmx1024m Report.jar
    It connects to Oracle, selects some data (around 500k records), and writes it to a CSV.
    When I run the same jar from within a web application (I mean, obviously, a servlet calling the jar's main method), the webapp generates an out-of-memory error after only 80k records.
    I tried changing the server startup arguments in the configuration from the console (Server Start, then Arguments) and then restarting.
    I just wrote the same thing there: -Xms1024m -Xmx1024m
    I guess I am missing something. Please share your answers.
    Environment :
    Win2k8, weblogic 10gR3, jdk 1.5. The application is installed as a service.
    Thanks,
    Neetesh
    Edited by: user13312817 on 5 Dec, 2011 12:15 AM

    If you are not using NodeManager, then I don't think those server settings actually control anything. If you are just exploring WLS, then I suspect you are simply starting an AdminServer in a basic domain and don't have a cluster/NodeManager-based environment. I could be wrong, please correct as needed!
    If you are simply starting your WebLogic Server instance using the startWebLogic.sh|cmd script, you can set an environment variable in the command shell that will be picked up and used when the server starts:
    set USER_MEM_ARGS=-Xmx1024m -Xms1024m
    (apply the appropriate *nix syntax as needed)
    Then use the startWebLogic.sh|cmd script to start the server and test your application.
    It may very well be the case that your particular application consumes more than 1 GB of heap, so you may need more. Remember that you now have a server environment running around your "main" class, so there is bound to be extra memory use that could be sneaking your heap usage over 1 GB, for example.
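    As a quick sanity check that the memory arguments actually took effect, the JVM can report its own heap ceiling at startup; a minimal sketch (log this from the servlet or a startup class):

    ```java
    public class HeapCheck {
        // Maximum heap the JVM will attempt to use (reflects -Xmx), in MB.
        static long maxHeapMb() {
            return Runtime.getRuntime().maxMemory() / (1024 * 1024);
        }

        public static void main(String[] args) {
            System.out.println("Max heap: " + maxHeapMb() + " MB");
        }
    }
    ```

    If the webapp logs a much smaller number than 1024, the -Xmx flag is not reaching the server JVM.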
    -steve-

  • Oracle Service Bus For loop getting out of memory error

    I have a business service based on a JCA adapter that fetches an undetermined amount of records from a database. I then need to upload those to another system using a web service designed by an external source. This web service will only accept up to X records per call.
    The process:
    for each object in the Jca Response
          Insert object into Service callout Request body
          if object index = number of objects in jca response or object index = next batch index
               Invoke service callout
               Append service callout Response to a total response object (xquery transform)
               increase next batch index by Batch size
               reset service callout to empty body
           endif
    end for
    replace body  with total response object.
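    The loop above can be sketched generically outside OSB; in this sketch the service callout is stubbed as a function parameter, and all names are illustrative, not OSB API:

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Function;

    public class BatchCallout {
        // Accumulate items into a request batch, invoke the callout whenever
        // the batch is full or the input is exhausted, and append each
        // response to a total-response list.
        static <T, R> List<R> processInBatches(
                List<T> items, int batchSize, Function<List<T>, R> callout) {
            List<R> totalResponse = new ArrayList<>();
            List<T> batch = new ArrayList<>();
            for (T item : items) {
                batch.add(item);
                if (batch.size() == batchSize) {
                    totalResponse.add(callout.apply(batch)); // invoke service callout
                    batch = new ArrayList<>();               // reset request body
                }
            }
            if (!batch.isEmpty()) {
                totalResponse.add(callout.apply(batch));     // final partial batch
            }
            return totalResponse;
        }
    }
    ```

    With 89 records and a batch size of 2 this makes 45 callouts, matching the scenario described below.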
    If I use the data set that has only 5 records and a batch size of 2, the process works fine.
    If I use a data set with 89 records and a batch size of 2, I get the out-of-memory error below after about 10 service callouts.
    The quantity of data in the objects is pretty small: less than 1 kB for each JCA object.
    Server Name:
    AdminServer
    Log Name:
    ServerLog
    Message:
    Failed to process response message for service ProxyService Sa/Proxy Services/DataSync:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray:
    [C, size 67108880 java.lang.OutOfMemoryError: allocLargeObjectOrArray:
    [C, size 67108880 at org.apache.xmlbeans.impl.store.Saver$TextSaver.resize(Saver.java:1700)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.preEmit(Saver.java:1303) at
    org.apache.xmlbeans.impl.store.Saver$TextSaver.emit(Saver.java:1234)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitXmlns(Saver.java:1003)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitNamespacesHelper(Saver.java:1021)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.emitElement(Saver.java:972)
    at org.apache.xmlbeans.impl.store.Saver.processElement(Saver.java:476)
    at org.apache.xmlbeans.impl.store.Saver.process(Saver.java:307)
    at org.apache.xmlbeans.impl.store.Saver$TextSaver.saveToString(Saver.java:1864)
    at org.apache.xmlbeans.impl.store.Cursor._xmlText(Cursor.java:546)
    at org.apache.xmlbeans.impl.store.Cursor.xmlText(Cursor.java:2436)
    at org.apache.xmlbeans.impl.values.XmlObjectBase.xmlText(XmlObjectBase.java:1500)
    at com.bea.wli.sb.test.service.ServiceTracer.getXmlData(ServiceTracer.java:968)
    at com.bea.wli.sb.test.service.ServiceTracer.addDataType(ServiceTracer.java:944)
    at com.bea.wli.sb.test.service.ServiceTracer.addDataType(ServiceTracer.java:924)
    at com.bea.wli.sb.test.service.ServiceTracer.addContextChanges(ServiceTracer.java:814)
    at com.bea.wli.sb.test.service.ServiceTracer.traceExit(ServiceTracer.java:398)
    at com.bea.wli.sb.pipeline.debug.DebuggerTracingStep.traceExit(DebuggerTracingStep.java:156)
    at com.bea.wli.sb.pipeline.PipelineContextImpl.exitComponent(PipelineContextImpl.java:1292)
    at com.bea.wli.sb.pipeline.MessageProcessor.finishProcessing(MessageProcessor.java:371)
    at com.bea.wli.sb.pipeline.RouterCallback.onReceiveResponse(RouterCallback.java:108)
    at com.bea.wli.sb.pipeline.RouterCallback.run(RouterCallback.java:183)
    at weblogic.work.ContextWrap.run(ContextWrap.java:41)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:545)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256) at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
    Subsystem:
    OSB Kernel
    Message ID:
    BEA-382005
    It appears to be the service callout that is the problem (it calls another OSB service that logs in and performs the data upload to the external service), because if I change the batch size up to 100, the loop loads all 89 records into the callout request and executes it fine. If I use a small batch size, then I run out of memory.
    Are there some settings I need to change? Is there a better way in OSB (less memory-intensive than a service callout in a for loop)?
    Thanks.

    hi,
    Could you please let me know if you got rid of this issue, as we are also facing the same problem.
    Thanks,
    SV

  • Supply memory error on M201dw what does it mean?

    I just purchased an HP LaserJet Pro 201dw, and when I plugged it in I got a Supply Memory Warning. What does this mean?

    Hi @JJprinter ,
    I understand that you are getting a supply memory error after setting up the new printer. I will certainly do my best to help you.
    Make sure all the orange packing material and the seal are removed from the toner cartridge, along with any packing material inside the printer. Here is a picture of the toner and the orange packing material that needs to be removed.
    I have provided the hardware installation guide that you can take a look at: LaserJet Pro M201, M202.
    If you need further assistance, just let me know.
    Have a nice day!
    Thank You.
    Gemini02
    I work on behalf of HP

  • CR XI - "Out of memory" Error

    Crystal XI - Oracle 9i
    I have a report with a group sort, all with totals, for the first certain number of accounts, hiding the rest and giving totals of displayed, hidden, and a grand total. When I run this report I run into an "Out of memory" error at approximately 4,715K records.
    Any help is appreciated!
    Thank you, Pad

    Hi
    In this case, the java.lang.OutOfMemoryError is caused by excessive threads being created. There are various reasons why the number of threads might be excessive.
    The more memory you give to the JVM, the more likely you are to get java.lang.OutOfMemoryError: unable to create new native thread, because the heap and the native thread stacks compete for the same address space.
    To allow more threads, you may have to reduce the memory allocated to the JVM.
    http://www.egilh.com/blog/archive/2006/06/09/2811.aspx
    http://jroller.com/page/rreyelts/20040909
    Use the lsof -p PID command (on Unix® platforms) to see how many threads are active for this process.
    Determine whether there is a maximum number of threads per process defined by the operating system. If the limit is too low for the application, try raising the per-process thread limit.
    Examine the application code to determine whether there is code that is creating threads or connections (such as LDAP connections) and not destroying them. You could dump the Java™ threads to see if an excessive number has been created.
    If you find that too many connections are opened by the application, make sure that any thread the application creates is destroyed. An enterprise application (.ear) or web application (.war) runs under a long-running JVM™; just because the application is finished does not mean that the JVM process ends. It is imperative that an application free any resources that it allocates.
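    One common fix for runaway thread creation is to reuse a small fixed pool instead of spawning a thread per task, and to shut the pool down when the work is done; a minimal sketch (names are illustrative):

    ```java
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class BoundedPool {
        // Run `tasks` tasks on a fixed pool of `threads` threads, then shut
        // the pool down so its threads are destroyed rather than leaked.
        static int runTasks(int tasks, int threads) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            AtomicInteger done = new AtomicInteger();
            for (int i = 0; i < tasks; i++) {
                pool.execute(() -> done.incrementAndGet());
            }
            pool.shutdown();                              // accept no new tasks
            pool.awaitTermination(10, TimeUnit.SECONDS);  // let queued work finish
            return done.get();
        }
    }
    ```

    The native-thread count stays at `threads` no matter how many tasks are submitted, which is exactly what the advice above is aiming at.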
    Regards
    Sumit Jain

  • Getting 'Out of memory' error while opening the file. I have tried several versions of Adobe 7.0,9.0,X1. It is creating issue to convert PDF into TIFF. Please provide the solution ASAP

    Hello All,
    I am getting an "Out of memory" error while opening the file. I have tried several versions of Adobe Reader: 7.0, 9.0, XI.
    Also, it is creating an issue converting the PDF into TIFF. Please provide a solution ASAP.

    I am using Adobe Reader XI. When I open the PDF it gives an "Out of memory" error; after scrolling, the PDF gives another alert, "Insufficient data for an image." After clicking through both alerts it loads the full data of the PDF. It is not happening with all PDFs; a couple of PDFs have this issue. Because of this error my software is not able to print these PDFs to TIFF. My OS is Windows 7 x64. I tried it on Win2012R2 and XP, and the same issue occurs there.
    It has become a critical issue for my production.

  • Camcorder video error: "Unable to record. Resolution not supported by memory card."

    We purchased two Samsung Droid Charges the first day they were released and recently they've both been getting this error after 8-15 seconds of recording in HD (1280x720): "Warning: Unable to record. Resolution not supported by memory card."
    Called Verizon tech support, but they referred me to Samsung. Not knowing much about microSDHC cards, I was alarmed to discover that the factory pre-installed 32 GB card is only rated Class 2 for speed. Samsung told me you need a minimum of speed Class 4 or higher to record at this resolution. The only problem is you need an SD card installed in order to use the camcorder, period. In other words, all of us who paid an arm and a leg for the best phone Verizon sells now have to go buy our own microSD card that actually supports all of the phone's features as advertised?! The lowest price I've seen for the faster cards is $70. Are we the only two Charge owners experiencing this problem? I've read somewhere that lower-speed-class cards may initially support faster speeds than they are rated for until they accumulate more files, so the problem may go undetected for some time, until you have more files stored in different locations on the card (making them difficult to access at the faster speeds). This may explain why we are the first ones to get this error, since we've had the phones the longest and have only 50% free space remaining on the SD card. Oh, Samsung also said they won't be able to replace the card with one that has the proper speed rating. It kind of seems like false advertising to say a feature is available (HD camcorder) and then not mention that the memory card provided isn't sufficient for that feature (separate upgrade required). Anyone else sharing the same frustrations? Ideas?

    Thanks for the response. I've tried everything on that page. It's not an issue with the cable, or firewire port or the cable being connected properly, as the iSight camera works fine with iChat. I've disconnected the only other firewire device on the computer, didn't make any difference. Camera works fine in iChat, so it's not the camera itself. And, as I mentioned in the post, the same camera, if hooked up to my laptop, works just fine recording in QuickTime. So, it appears to be an issue with the QuickTime installation on this computer. I've done a search for the same error I'm getting and found some other occurrences with the same problems, but none that offered a solution.
