DB_DUP causes transaction out of memory

Hi, I have been using Berkeley DB for storing high-volume real-time tick data (25k-50k ticks/sec). It has been working like a charm for some time. Today I tried to allow duplicates (entries with the same key) with the DB_DUP option, and all of a sudden I started getting the following error...
Error No: 12, Db::put: Cannot allocate memory
Unable to allocate memory for transaction detail
My DB_CONFIG file has the following entries...
set_cachesize 0 536870912 0
set_lg_regionmax 1000000
set_tx_max 2000
I googled and found that increasing tx_max may solve this. I went up to 10000, but I still get the same errors (it just took longer before they started).
Any help is appreciated.
db_ = new Db(env_, 0);
db_->open(NULL, sfile.c_str(), NULL, DB_BTREE, DB_CREATE | DB_AUTO_COMMIT | DB_THREAD | DB_DUP, 0);

void store_in_db(Db *db_, Dbt *k_, Dbt *v_) {
    int i_tries = 0;
    while (i_tries < 10) {
        try {
            int ret = db_->put(NULL, k_, v_, 0);
            if (ret != 0) {
                cout << "Could not write to DB(Err Code: " << ret << ")\n\tKey: " << (char *)(k_->get_data()) << endl;
            }
            return;
        } catch (DbDeadlockException &e) {
            i_tries++;
            cout << "Deadlock detected: Try #" << i_tries << endl;
        } catch (DbException &e) {
            cout << "Could not write to DB, Key: " << (char *)(k_->get_data()) << endl << "Error No: " << e.get_errno() << ", " << e.what() << endl;
            return;
        }
    }
}
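For what it's worth, in the Berkeley DB C++ API duplicate support is normally enabled with Db::set_flags(DB_DUP) before Db::open(), rather than by passing DB_DUP among the open flags. A minimal sketch of that pattern (the helper name and structure here are illustrative, not the poster's code):

#include <string>
#include <db_cxx.h>

// Illustrative sketch: enable duplicate data items for the same key via set_flags()
// before opening; the open flags themselves stay the same.
Db *open_tick_db(DbEnv *env, const std::string &sfile)
{
    Db *db = new Db(env, 0);
    db->set_flags(DB_DUP);                      // allow duplicates (unsorted)
    db->open(NULL, sfile.c_str(), NULL, DB_BTREE,
             DB_CREATE | DB_AUTO_COMMIT | DB_THREAD, 0);
    return db;
}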

Hi Mike, thanks for your response. I ran the code again and ran db_stat -t twice while it was running and once when it started throwing the "Error No: 12, Db::put: Cannot allocate memory.. Unable to allocate memory for transaction detail" errors. As you can see below, 'Snapshot transactions' and 'Maximum snapshot transactions' are the same and keep increasing. The application fails when they go above 2000 (the limit I have set in DB_CONFIG). I open the environment with the following code...
     u_int32_t envFlags = DB_CREATE | DB_RECOVER | DB_INIT_MPOOL | DB_INIT_TXN | DB_INIT_LOCK | DB_INIT_LOG | DB_THREAD;
     env_ = new DbEnv(0);
     try {
          env_->set_lk_detect(DB_LOCK_MINWRITE);
          env_->set_lg_max(1000*1024*1024);
          env_->log_set_config(DB_LOG_AUTO_REMOVE, 1);
          env_->set_flags(DB_TXN_WRITE_NOSYNC, 1);     // 1 sets the flag, 0 clears the flag
          env_->open("/RSIGrid/TickStore", envFlags, 0);
     } catch (DbException &e) {
          cout << "DbException while opening environment: " << e.get_errno() << ", " << e.what() << endl;
          return 1;
     }
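As a side note (not from the original post): the same limits that DB_CONFIG sets can also be set programmatically on the DbEnv handle before DbEnv::open(); when both are present, the DB_CONFIG values generally take precedence. A minimal sketch, reusing the envFlags from the code above:

     // Illustrative only: the DB_CONFIG limits expressed in code, set before open().
     DbEnv env(0);
     env.set_cachesize(0, 536870912, 0);   // 512MB cache (third arg: number of cache regions)
     env.set_lg_regionmax(1000000);        // logging region size
     env.set_tx_max(2000);                 // simultaneous active transactions
     env.open("/RSIGrid/TickStore", envFlags, 0);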
[nmittal@development TickStore]$ db_stat -t
1/6583654 File/offset for last checkpoint LSN
Wed Jul 1 08:15:19 2009 Checkpoint timestamp
0x80015b7c Last transaction ID allocated
2000 Maximum number of active transactions configured
0 Active transactions
5 Maximum active transactions
88956 Number of transactions begun
0 Number of transactions aborted
88956 Number of transactions committed
612 Snapshot transactions
612 Maximum snapshot transactions
0 Number of transactions restored
768KB Transaction region size
2412 The number of region locks that required waiting (0%)
Active transactions:
[nmittal@development TickStore]$
[nmittal@development TickStore]$ db_stat -t
1/28447874 File/offset for last checkpoint LSN
Wed Jul 1 08:16:49 2009 Checkpoint timestamp
0x8002e37d Last transaction ID allocated
2000 Maximum number of active transactions configured
0 Active transactions
5 Maximum active transactions
189309 Number of transactions begun
0 Number of transactions aborted
189309 Number of transactions committed
1340 Snapshot transactions
1340 Maximum snapshot transactions
0 Number of transactions restored
768KB Transaction region size
5105 The number of region locks that required waiting (0%)
Active transactions:
[nmittal@development TickStore]$
[nmittal@development TickStore]$ db_stat -t
1/45426368 File/offset for last checkpoint LSN
Wed Jul 1 08:18:20 2009 Checkpoint timestamp
0x800481da Last transaction ID allocated
2000 Maximum number of active transactions configured
0 Active transactions
5 Maximum active transactions
295386 Number of transactions begun
0 Number of transactions aborted
295386 Number of transactions committed
2134 Snapshot transactions
2134 Maximum snapshot transactions
0 Number of transactions restored
768KB Transaction region size
8210 The number of region locks that required waiting (0%)
Active transactions:
[nmittal@development TickStore]$
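If it helps to watch the same counters from inside the application rather than via db_stat, here is a hedged sketch using DbEnv::txn_stat(); the DB_TXN_STAT field names are taken from recent 4.x headers and should be verified against your db.h:

#include <cstdio>
#include <cstdlib>
#include <db_cxx.h>

// Illustrative sketch: poll the transaction statistics that db_stat -t prints.
void log_txn_stats(DbEnv *env)
{
    DB_TXN_STAT *stats = NULL;
    if (env->txn_stat(&stats, 0) == 0 && stats != NULL) {
        std::printf("active=%lu snapshot=%lu max_snapshot=%lu configured_max=%lu\n",
                    (unsigned long)stats->st_nactive,
                    (unsigned long)stats->st_nsnapshot,
                    (unsigned long)stats->st_maxnsnapshot,
                    (unsigned long)stats->st_maxtxns);
        std::free(stats);   // the stat buffer is allocated by the library
    }
}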

Similar Messages

  • Printsubmitter out of memory error

    Hi,
    I am facing a printsubmitter error message in my process that calls the Output service to merge data from an XML input file with the XDP to generate a PDF document.
    The upstream process (run by a scheduler every 1 min) calls a small Java DSC to generate the XML from a database and puts it in the watched folder for the process that is failing. A large number of documents have to be generated (over 10,000) and this is done one at a time. This works well for the first 500-600 files and then all files start giving printsubmitter errors like the one below.
    At this point I am not sure if this is a problem related to the database connection, memory, or the XML file generated. Do I need to increase memory for the Forms service? How can this be done?
    I am attaching one of the failure logs below. Any help would be greatly appreciated.
    Failure Time----Fri Mar 07 15:17:52 IST 2014
    source location ---- Reason of failure is-----com.adobe.printSubmitter.PrintException: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.RenderFormException, cause: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.FormServerException
    com.adobe.printSubmitter.PrintException: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.RenderFormException, cause: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.FormServerException
    0 : Out of Memory in com.adobe.livecycle.formsservice.exception.RenderFormException, cause: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.FormServerException
    0 : Out of Memory
    0 : Out of Memory
    IDL:com/adobe/document/xmlform/RenderException:1.0
    com.adobe.livecycle.output.exception.OutputException: com.adobe.printSubmitter.PrintException: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.RenderFormException, cause: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.FormServerException
              at com.adobe.printSubmitter.PrintServer.execute(PrintServer.java:678)
              at com.adobe.printSubmitter.PrintServer.submit(PrintServer.java:233)
          at com.adobe.printSubmitter.service.OutputServiceImpl.generateOutputInTxn(OutputServiceImpl.java:278)
          at com.adobe.printSubmitter.service.OutputServiceImpl.generatePDFOutputInTxn(OutputServiceImpl.java:411)
          at com.adobe.printSubmitter.service.OutputServiceImpl.access$100(OutputServiceImpl.java:84)
          at com.adobe.printSubmitter.service.OutputServiceImpl$2.doInTransaction(OutputServiceImpl.java:362)
          at com.adobe.idp.dsc.transaction.impl.ejb.adapter.EjbTransactionBMTAdapterBean.doRequiresNew(EjbTransactionBMTAdapterBean.java:218)
          at sun.reflect.GeneratedMethodAccessor530.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.jboss.invocation.Invocation.performCall(Invocation.java:386)
          at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionContainer.java:233)
          at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionInterceptor.java:156)
          at org.jboss.ejb.plugins.CallValidationInterceptor.invoke(CallValidationInterceptor.java:63)
          at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:121)
          at org.jboss.ejb.plugins.AbstractTxInterceptorBMT.invokeNext(AbstractTxInterceptorBMT.java:173)
          at org.jboss.ejb.plugins.TxInterceptorBMT.invoke(TxInterceptorBMT.java:77)
          at org.jboss.ejb.plugins.StatelessSessionInstanceInterceptor.invoke(StatelessSessionInstanceInterceptor.java:173)
          at org.jboss.ejb.plugins.SecurityInterceptor.process(SecurityInterceptor.java:228)
          at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:211)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.process(PreSecurityInterceptor.java:97)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.invoke(PreSecurityInterceptor.java:81)
          at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:205)
          at org.jboss.ejb.plugins.ProxyFactoryFinderInterceptor.invoke(ProxyFactoryFinderInterceptor.java:138)
          at org.jboss.ejb.SessionContainer.internalInvoke(SessionContainer.java:650)
          at org.jboss.ejb.Container.invoke(Container.java:1092)
          at org.jboss.ejb.plugins.local.BaseLocalProxyFactory.invoke(BaseLocalProxyFactory.java:436)
          at org.jboss.ejb.plugins.local.StatelessSessionProxy.invoke(StatelessSessionProxy.java:103)
          at com.sun.proxy.$Proxy243.doRequiresNew(Unknown Source)
          at com.adobe.idp.dsc.transaction.impl.ejb.EjbTransactionProvider.execute(EjbTransactionProvider.java:133)
          at com.adobe.idp.dsc.transaction.impl.DefaultTransactionTemplate.execute(DefaultTransactionTemplate.java:79)
          at com.adobe.printSubmitter.service.OutputServiceImpl.invokeGeneratePDFOutputWithSMT(OutputServiceImpl.java:357)
          at com.adobe.printSubmitter.service.OutputServiceImpl.generatePDFOutput2(OutputServiceImpl.java:338)
          at sun.reflect.GeneratedMethodAccessor1269.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at com.adobe.idp.dsc.component.impl.DefaultPOJOInvokerImpl.invoke(DefaultPOJOInvokerImpl.java:118)
          at com.adobe.idp.dsc.interceptor.impl.InvocationInterceptor.intercept(InvocationInterceptor.java:140)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.DocumentPassivationInterceptor.intercept(DocumentPassivationInterceptor.java:53)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor$1.doInTransaction(TransactionInterceptor.java:74)
          at com.adobe.idp.dsc.transaction.impl.ejb.adapter.EjbTransactionBMTAdapterBean.doBMT(EjbTransactionBMTAdapterBean.java:197)
          at sun.reflect.GeneratedMethodAccessor542.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.jboss.invocation.Invocation.performCall(Invocation.java:386)
          at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionContainer.java:233)
          at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionInterceptor.java:156)
          at org.jboss.ejb.plugins.CallValidationInterceptor.invoke(CallValidationInterceptor.java:63)
          at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:121)
          at org.jboss.ejb.plugins.AbstractTxInterceptorBMT.invokeNext(AbstractTxInterceptorBMT.java:173)
          at org.jboss.ejb.plugins.TxInterceptorBMT.invoke(TxInterceptorBMT.java:77)
          at org.jboss.ejb.plugins.StatelessSessionInstanceInterceptor.invoke(StatelessSessionInstanceInterceptor.java:173)
          at org.jboss.ejb.plugins.SecurityInterceptor.process(SecurityInterceptor.java:228)
          at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:211)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.process(PreSecurityInterceptor.java:97)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.invoke(PreSecurityInterceptor.java:81)
          at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:205)
          at org.jboss.ejb.plugins.ProxyFactoryFinderInterceptor.invoke(ProxyFactoryFinderInterceptor.java:138)
          at org.jboss.ejb.SessionContainer.internalInvoke(SessionContainer.java:650)
          at org.jboss.ejb.Container.invoke(Container.java:1092)
          at org.jboss.ejb.plugins.local.BaseLocalProxyFactory.invoke(BaseLocalProxyFactory.java:436)
          at org.jboss.ejb.plugins.local.StatelessSessionProxy.invoke(StatelessSessionProxy.java:103)
          at com.sun.proxy.$Proxy243.doBMT(Unknown Source)
          at com.adobe.idp.dsc.transaction.impl.ejb.EjbTransactionProvider.execute(EjbTransactionProvider.java:95)
          at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor.intercept(TransactionInterceptor.java:72)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.InvocationStrategyInterceptor.intercept(InvocationStrategyInterceptor.java:55)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.InvalidStateInterceptor.intercept(InvalidStateInterceptor.java:37)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.AuthorizationInterceptor.intercept(AuthorizationInterceptor.java:188)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.JMXInterceptor.intercept(JMXInterceptor.java:48)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.engine.impl.ServiceEngineImpl.invoke(ServiceEngineImpl.java:121)
          at com.adobe.idp.dsc.routing.Router.routeRequest(Router.java:131)
          at com.adobe.idp.dsc.provider.impl.base.AbstractMessageReceiver.routeMessage(AbstractMessageReceiver.java:93)
          at com.adobe.idp.dsc.provider.impl.vm.VMMessageDispatcher.doSend(VMMessageDispatcher.java:225)
          at com.adobe.idp.dsc.provider.impl.base.AbstractMessageDispatcher.send(AbstractMessageDispatcher.java:66)
          at com.adobe.idp.dsc.clientsdk.ServiceClient.invoke(ServiceClient.java:208)
          at com.adobe.workflow.engine.PEUtil.invokeAction(PEUtil.java:893)
          at com.adobe.idp.workflow.dsc.invoker.WorkflowDSCInvoker.transientInvoke(WorkflowDSCInvoker.java:350)
          at com.adobe.idp.workflow.dsc.invoker.WorkflowDSCInvoker.invoke(WorkflowDSCInvoker.java:158)
          at com.adobe.idp.dsc.interceptor.impl.InvocationInterceptor.intercept(InvocationInterceptor.java:140)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.DocumentPassivationInterceptor.intercept(DocumentPassivationInterceptor.java:53)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor$1.doInTransaction(TransactionInterceptor.java:74)
          at com.adobe.idp.dsc.transaction.impl.ejb.adapter.EjbTransactionBMTAdapterBean.doRequiresNew(EjbTransactionBMTAdapterBean.java:218)
          at sun.reflect.GeneratedMethodAccessor530.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.jboss.invocation.Invocation.performCall(Invocation.java:386)
          at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionContainer.java:233)
          at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionInterceptor.java:156)
          at org.jboss.ejb.plugins.CallValidationInterceptor.invoke(CallValidationInterceptor.java:63)
          at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:121)
          at org.jboss.ejb.plugins.AbstractTxInterceptorBMT.invokeNext(AbstractTxInterceptorBMT.java:173)
          at org.jboss.ejb.plugins.TxInterceptorBMT.invoke(TxInterceptorBMT.java:77)
          at org.jboss.ejb.plugins.StatelessSessionInstanceInterceptor.invoke(StatelessSessionInstanceInterceptor.java:173)
          at org.jboss.ejb.plugins.SecurityInterceptor.process(SecurityInterceptor.java:228)
          at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:211)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.process(PreSecurityInterceptor.java:97)
          at org.jboss.ejb.plugins.security.PreSecurityInterceptor.invoke(PreSecurityInterceptor.java:81)
          at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:205)
          at org.jboss.ejb.plugins.ProxyFactoryFinderInterceptor.invoke(ProxyFactoryFinderInterceptor.java:138)
          at org.jboss.ejb.SessionContainer.internalInvoke(SessionContainer.java:650)
          at org.jboss.ejb.Container.invoke(Container.java:1092)
          at org.jboss.ejb.plugins.local.BaseLocalProxyFactory.invoke(BaseLocalProxyFactory.java:436)
          at org.jboss.ejb.plugins.local.StatelessSessionProxy.invoke(StatelessSessionProxy.java:103)
          at com.sun.proxy.$Proxy243.doRequiresNew(Unknown Source)
          at com.adobe.idp.dsc.transaction.impl.ejb.EjbTransactionProvider.execute(EjbTransactionProvider.java:145)
          at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor.intercept(TransactionInterceptor.java:72)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.InvocationStrategyInterceptor.intercept(InvocationStrategyInterceptor.java:55)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.InvalidStateInterceptor.intercept(InvalidStateInterceptor.java:37)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.AuthorizationInterceptor.intercept(AuthorizationInterceptor.java:188)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.interceptor.impl.JMXInterceptor.intercept(JMXInterceptor.java:48)
          at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptorChainImpl.java:60)
          at com.adobe.idp.dsc.engine.impl.ServiceEngineImpl.invoke(ServiceEngineImpl.java:121)
          at com.adobe.idp.dsc.routing.Router.routeRequest(Router.java:131)
          at com.adobe.idp.dsc.provider.impl.base.AbstractMessageReceiver.routeMessage(AbstractMessageReceiver.java:93)
          at com.adobe.idp.dsc.provider.impl.vm.VMMessageDispatcher.doSend(VMMessageDispatcher.java:225)
          at com.adobe.idp.dsc.provider.impl.base.AbstractMessageDispatcher.send(AbstractMessageDispatcher.java:66)
          at com.adobe.idp.dsc.clientsdk.ServiceClient.invoke(ServiceClient.java:208)
          at com.adobe.idp.jobmanager.execution.workadapter.AbstractExecutableJob.invokeRequest(AbstractExecutableJob.java:127)
          at com.adobe.idp.jobmanager.execution.workadapter.PersistentExecutableJob.execute(PersistentExecutableJob.java:60)
          at com.adobe.idp.dsc.workmanager.adapter.UnManagedAsynchronousWorkAdapter.run(UnManagedAsynchronousWorkAdapter.java:39)
          at org.jboss.resource.work.WorkWrapper.execute(WorkWrapper.java:205)
          at org.jboss.util.threadpool.BasicTaskWrapper.run(BasicTaskWrapper.java:260)
          at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
          at java.lang.Thread.run(Thread.java:662)
     Caused by: com.adobe.printSubmitter.PrintException: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.RenderFormException, cause: 0 : Out of Memory in com.adobe.livecycle.formsservice.exception.FormServerException
          at com.adobe.printSubmitter.util.FormSubmitter.submit(FormSubmitter.java:211)
          at com.adobe.printSubmitter.util.FormSubmitter.submit(FormSubmitter.java:113)
          at com.adobe.printSubmitter.util.Splitter.endElementNoBatch(Splitter.java:204)
          at com.adobe.printSubmitter.util.Splitter.endElement(Splitter.java:179)
          at org.apache.xerces.parsers.AbstractSAXParser.endElement(Unknown Source)
          at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanEndElement(Unknown Source)
          at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
              at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
              at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
              at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
              at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
              at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
              at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
              at com.adobe.printSubmitter.PrintServer.execute(PrintServer.java:541)
              ... 137 more
    Caused by: com.adobe.livecycle.formsservice.exception.RenderFormException: 0 : Out of Memory
              at com.adobe.formServer.FormServer.renderForm(FormServer.java:233)
              at com.adobe.formServer.FormServer.renderForm(FormServer.java:281)
              at com.adobe.formServer.docservice.FormsDocService.renderForm(FormsDocService.java:558)
              at com.adobe.printSubmitter.util.FormSubmitter.submit(FormSubmitter.java:187)
              ... 150 more
    Caused by: com.adobe.livecycle.formsservice.exception.FormServerException: 0 : Out of Memory
              at com.adobe.formServer.PA.XMLFormAgentWrapper.doPAExecute(XMLFormAgentWrapper.java:457)
              at com.adobe.formServer.PA.XMLFormAgentWrapper.execute(XMLFormAgentWrapper.java:203)
              at com.adobe.formServer.Controller.doXFARender(Controller.java:809)
              at com.adobe.formServer.Controller.doRender(Controller.java:623)
              at com.adobe.formServer.Controller.render(Controller.java:140)
              at com.adobe.formServer.FormServer.renderForm(FormServer.java:214)
              ... 153 more
    Caused by: com.adobe.document.xmlform.RenderException: IDL:com/adobe/document/xmlform/RenderException:1.0
              at com.adobe.formServer.PA.XMLFormAgentWrapper.doPAExecute(XMLFormAgentWrapper.java:422)
              ... 158 more

    This should help : http://help.adobe.com/en_US/livecycle/10.0/SharePointConfig/WS98f05ad7ea54cfa4-2fe92c60138b24d77a7-7fff.html
    Thanks,
    Wasil

  • Unknown File, Out of Memory

    When opening my project sequence across a gigabit Ethernet connection to an Xsan, I first get: File Error: Unknown File, followed by: Error: Out of Memory.
    The sequence is full of JPEGs, TIFFs, and some 10-bit Uncompressed. How can I get to view this project?
    Si

    As mentioned above, picture files such as JPEGs require RAM in order to play back. You mentioned you have a lot of TIFFs and JPEGs in this project. I'm still willing to bet that is exactly what is causing the Out of Memory error.
    Sure, you can copy files over the same network with great speeds, but try taking all of those JPEGs and transferring them simultaneously. I doubt you'll get great speeds doing that.
    You could have your editor use the Media Manager to copy the project to an external drive.

  • Java Memory Management/Out of Memory

    Hi Guys,
    I have a few questions about Java memory management,
    because I keep encountering a lot of out of memory errors, which I think is because Java does not handle Vector/ArrayList reinitialisation automatically.
    Assume I have 2 million records in the database; I will process them every 80000 and store them in a Vector:
    while (true) {
        list = new Vector();
        list = GetResultFromDatabase(); // process every 80000
        if (list.size() > 0) {          // my Vector list contains 80000
            // loop over the 80000
            // process some logic and data
        }
        list.clear();
        list = null;
    }
    If you see, I need to call list.clear() and list = null on every pass so it won't cause out of memory.
    Before I put those 2 lines in, I always hit an out of memory exception.
    It seems the garbage collector cannot reclaim the memory if I don't do this explicitly.
    Is the memory occupied by the Vector not recoverable if we don't explicitly clear it and set it to null?
    Because in terms of logic it wouldn't cause a problem if I just
    did this statement after each pass, like below:
    list = new Vector(); which will reinstantiate the object.
    Thanks.

    Damn, I should have read your post again.
    Look here:
    while(true)
    list = new Vector();
    What you are doing is creating a new Vector object every time the while loop does an iteration, so when your while loop does 40000 loops there will be 40000 new objects in your memory.
    I suggest moving the declaration outside the while loop:
      list = new Vector();
    while(true)
    ///rest of loop
    } This could also be a problem.
    Hope it helps :-)
    werns

  • Out of memory error in java- flex communication using RTMP

    We are using Flex and Java in our application. Due to an RTMP call to fetch data every 5 seconds, it is hitting the database, establishing a connection and keeping it open; hence this is causing the out of memory issue. Can someone help with this issue?

    You probably want to ask here: http://forums.adobe.com/community/flex

  • Out of memory when reading USER_SDO_GEOM_METADATA

    Hi,
    I'm running a plsql procedure from sqldeveloper against a Oracle10g (Spatial) server.
    The procedure mainly performs a SDO_NN search.
    I was expecting the procedure processing time to be about 48h (due to the huge tables involved)
    But I always get the same kind of errors, breaking the process after about six hours have elapsed:
    ORA-29902: error in executing ODCIIndexStart() routine
    ORA-13203: failed to read USER_SDO_GEOM_METADATA view
    ORA-13203: failed to read USER_SDO_GEOM_METADATA view
    ORA-29400: data cartridge error
    ORA-04030: out of process memory when trying to allocate 16396 bytes. (QERHJ hash-joi,QERHJ Bit vector)
    extending memory could be a workaround
    but it wouldn't explain why the read of USER_SDO_GEOM_METADATA always causes the "out of memory".
    Thanks for your advice or explanation
    Clem

    > but it wouldn't explain why the read of USER_SDO_GEOM_METADATA always causes the "out of memory".
    Reading USER_SDO_GEOM_METADATA wouldn't be the cause of 'out of memory'. It is more likely that some rogue process caused you to run out of memory and the next thing that was supposed to happen was a read of USER_SDO_GEOM_METADATA. This read failed as there was no memory left for the query to perform.
    So the failed read of USER_SDO_GEOM_METADATA is the symptom rather than the cause.
    A few general points:
    - What exact version of the 10g are you on? There were several memory related bugs fixed over the life of 10g. Ensure you are on the very latest patchset before spending any more time on this.
    - Try to ensure your data is valid. i.e. it passes SDO_GEOM.VALIDATE_LAYER_WITH_CONTEXT(). If you've a very big table, then it could be tricky to get all the data valid, but be aware that invalid data can lead to all sorts of problems including memory leaks.
    - Follow the general guidelines for ORA-04030 errors. In particular, ensure you have a sensible PGA_AGGREGATE_TARGET value.
    Following that, I'd recommend posting the key parts of your pl/sql procedure here in case you're doing something in the code that is causing this error.

  • "Out of memory" error when compiling loki-like templates w/Typelists

    Using Forte C++ 5.3 with the latest [2002/03/20] patches, attempts to compile a loki-like Functor template [download at www.awl.com/cseng/titles/0-201-70431-5 or see "Modern C++ Design" by A. Alexandrescu] whose family of operator()() functions takes up to 15 parameters cause an "Out of memory" error. The top(1) command showed that the compiler's process size exceeded 2.0G and that the compiler had run for 20+ minutes.
    At 10 parameters, the compiler quickly [ <2 minutes ] and successfully compiles the same source. At 11 parameters, the compilation is very slow [ >5 minutes] yet still successful.
    Platform: SPARC SunFire 280R 500MHz

    I'm having similar problems:
    One source file that has a 13-argument recursive template takes a long time to build, and uses almost 0.75GB. Optimized builds of the same file never complete, and fail with:
    SunWS_cache: Error: Error occurred in invoked compiler, /opt/SUNWspro/bin/../WS6U2/bin/CC [process fork failure, errno = 12]. Aborting....
    > operator()() functions take up to 15 parameters cause
    > an "Out of memory" error. The top(1) command showed
    > that the compiler's process size exceeded 2.0G and
    > that the compiler had run for 20+ minutes.
    > At 10 parameters, the compiler quickly [ <2 minutes ]
    > and successfully compiles the same source. At 11
    > parameters, the compilation is very slow [ >5 minutes]
    > yet still successful.
    > Platform: SPARC SunFire 280R 500MHz

  • Thread Count and Out Of Memory Error

    Hi,
    I was wondering if setting the ThreadPoolSize to a value which is too low can
    cause an out of memory error. My thought is that when it is time for Weblogic
    to begin garbage collection, if it does not get a thread fast enough it is possible
    that memory would be used up before the garbage collection takes place.
    I am asking this because I am trying to track down the cause of an out-of-memory
    occurrence, while at the same time I believe I need to raise the ThreadPoolSize.
    Thanks,
    Mark

    Oops ...
    > I was wondering if setting the ThreadPoolSize to a value which is too
    > low can cause an out of memory error.
    No, but the opposite can be true.
    > My thought is that when it is time for Weblogic
    > to begin garbage collection, if it does not get a thread fast enough it is
    > possible that memory would be used up before the garbage collection
    > takes place.
    Weblogic doesn't do GC ... that's the JVM and if it needs a thread it will
    not be using one of Weblogic's execute threads.
    > I am asking this because I am trying to track down the cause of an
    > out-of-memory occurrence
    It could be configuration (new vs. old heap for example), but it is probably
    just data that you are holding on to or native stuff (e.g. type 2 JDBC
    driver objects) that you aren't cleaning up correctly.
    > while at the same time I believe I need to raise the ThreadPoolSize.
    Wait until you fix the memory issue.
    Peace,
    Cameron Purdy
    Tangosol, Inc.
    Clustering Weblogic? You're either using Coherence, or you should be!
    Download a Tangosol Coherence eval today at http://www.tangosol.com/
    "Mark Glatzer" <[email protected]> wrote in message
    news:[email protected]..

  • InDesign - out of memory error

    Hi All,
    One of our clients is experiencing an issue whilst using InDesign CS4 version 6.0.5; they are constantly interrupted by an “Out of Memory” error while using InDesign.
    They have been experiencing this issue for some time now and we have performed memory upgrades and also upgraded to version 6.0.5 (which was meant to solve the issue) several months ago and yet they are still getting this issue.
    The issue that they are experiencing is the following error that was said to have been resolved in the 6.0.5 update.
    Document containing hundreds of text frames and custom baseline grids takes a long time to open and
    causes an “out of memory” error, followed by an unexpected quit. [2253219]
    Has there been another hot fix or a known resolution for this issue? Is anyone else experiencing this?
    Cheers,
    Allan

    Hi Scott,
    Their system has the following specs;
    OS:     Windows Vista Business 32-Bit
    RAM:  3GB DDR2
    CPU:   E8400 @ 3.00Ghz (Dual Core)
    Graphics: Onboard Intel G33/G31
    Regards,
    Allan

  • Repeated Opening of  database in a Txn  causes Logging region out of memory

    Hi
    BDB 4.6.21
    When I open and close a single database file repeatedly, it causes the error message "Logging region out of memory; you may need to increase its size". I have set the 65KB default size for set_lg_regionmax. Is there any workaround for solving this issue other than increasing the value of set_lg_regionmax? Even if we set it to a higher value, we cannot predict how the clients of BDB will open and close a database file. Following is a stand-alone program with which one can reproduce the scenario.
    int main()
    {
        const int SUCCESS = 0;
        ULONG uEnvFlags = DB_CREATE | DB_INIT_MPOOL | DB_INIT_LOG | DB_INIT_TXN | DB_INIT_LOCK | DB_THREAD; // | DB_RECOVER;
        LPCSTR lpctszHome = "D:\\Nisam\\Temp";
        int nReturn = 0;
        DbEnv* pEnv = new DbEnv( DB_CXX_NO_EXCEPTIONS );
        nReturn = pEnv->set_thread_count( 20 );
        nReturn = pEnv->open( lpctszHome, uEnvFlags, 0 );
        if( SUCCESS != nReturn )
            return 0;
        DbTxn* pTxn = 0;
        char szBuff[MAX_PATH];
        UINT uDbFlags = DB_CREATE | DB_THREAD;
        lstrcpy( szBuff, "DBbbbbbbbbbbbbbbbbbbbbbbbbbb________0" ); // some long name
        // First create the database
        {
            Db Database( pEnv, 0 );
            nReturn = Database.open( pTxn, szBuff, 0, DB_BTREE, uDbFlags, 0 );
            nReturn = Database.close( 0 );
        }
        for( int nCounter = 0; 10000 > nCounter; ++nCounter )
        {
            // Now repeatedly open and close the above created database
            pEnv->txn_begin( pTxn, &pTxn, 0 );
            Db Database( pEnv, 0 );
            nReturn = Database.open( pTxn, szBuff, 0, DB_BTREE, uDbFlags, 0 );
            if( SUCCESS != nReturn )
            {
                // when the count reaches 435, the error occurs
                pTxn->abort();
                Database.close( 0 );
                pEnv->close( 0 );
                return 0;
            }
            pTxn->abort();
            pTxn = 0;
            Database.close( 0 );
        }
        return 0;
    }
    By the way, following is the content of my DB_CONFIG file
    set_tx_max 1000
    set_lk_max_lockers 10000
    set_lk_max_locks 100000
    set_lk_max_objects 100000
    set_lock_timeout 20000
    set_lg_bsize 1048576
    set_lg_max 10485760
    #log region: 66KB
    set_lg_regionmax 67584
    set_cachesize 0 8388608 1
    Thanks and Regards
    Nisam

    Hi Nisam,
    I was able to reproduce the problem using Berkeley DB 4.6.21. The problem is with releasing the FNAME structure in certain cases involving aborted transactions. In a situation where you continuously (in a loop) do a transactional open/abort/close of databases, you will notice (as you did) that the log region size needs to be increased (set_lg_regionmax).
    This problem was identified and reproduced yesterday (thanks for letting us know about this) and is reported as SR #15953. It will be fixed in the next release of Berkeley DB and is currently in code review/regression testing. I have a patch that you can apply to Berkeley DB 4.6 and have confirmed that your test program runs with the patch applied. If you send me email at (Ron dot Cohen at Oracle) I'll send the patch to you.
    As you noticed, committing the transaction will run cleanly without error. You could do that (with the DB_TXN_NOSYNC suggestion below), but you may not even need transactions for this.
    I want to expand a bit on my recommendation that you not abort transactions in the manner that you are doing (though with the patch you can certainly do that). First, opening and closing a database is a heavyweight operation. Typically you create/open your databases and keep them open for the life of the application (or a long time).
    You also mentioned that you noticed commits may have taken a longer time. We can talk about that (if you email me), but you could consider using the DB_TXN_NOSYNC flag at the cost of losing durability. Make sure that this suggestion will work with your application requirements.
    Even if you have (create/open/get/commit/abort), a single get operation should not need a transaction. In that case there would be no logging for the open and close, and therefore the sequence would be faster. This was a code snippet, so what you have in your application may be a lot more complicated and justify what you have done. But the simple test case above should not require a transaction since you are doing a single atomic get.
    I hope this helps!
    Ron Cohen
    Oracle Corporation
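    A minimal sketch of the pattern recommended above, reusing the variable names from the test program (this is an illustration of the advice, not code from the patch): open the database once, keep the handle for the life of the loop, and commit rather than abort when a transaction is used at all.

    // Illustrative sketch: one long-lived database handle, transactions committed instead of aborted.
    Db db( pEnv, 0 );
    db.open( NULL, szBuff, 0, DB_BTREE, DB_CREATE | DB_AUTO_COMMIT | DB_THREAD, 0 );

    for( int nCounter = 0; 10000 > nCounter; ++nCounter )
    {
        DbTxn* pTxn = 0;
        pEnv->txn_begin( 0, &pTxn, 0 );
        // ... do the get/put work against db here ...
        pTxn->commit( 0 );      // or skip the transaction entirely for a single atomic get
    }

    db.close( 0 );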

  • HT201317 Dear, I have a question please: if I want to keep a folder of pictures in my iCloud account and then delete this folder off my phone, would it be deleted from my iCloud account too? Because I'm running out of memory on my iPhone

    Dear, I have a question please: if I want to keep a folder of pictures in my iCloud account and then delete this folder off my phone, would it be deleted from my iCloud account too? Because I'm running out of memory on my iPhone.
    Thanks

    You can't use iCloud to store photos.  There's a service called Photo Stream which is used to sync photos from one device to another, but this can't be used to store photos permanently.
    Always sync photos from a device to your computer.  Then you can delete them from the device.
    Read this...  import photos to your computer...
    http://support.apple.com/kb/HT4083

  • What causes my iPhone 5s to keep running out of memory with any new download? I recently updated to iOS 8.2

    What causes my iPhone 5s to keep running out of memory without any new download? I recently updated to iOS 8.2

    I meant to say without downloading or receiving anything

  • Result Set Causing out of memory issue

    Hi,
    I am having trouble fixing the memory issue caused by a result set. I am using JDK 1.5 and SQL Server 2000 as the backend. When I try to execute a statement, the result set returns a minimum of 400,000 records and I have to go through each and every record one by one, apply some business logic and update the rows; after updating around 1000 rows my application is going out of memory. Here is the original code:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    I am planning to fix the code in this way:
    Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                          ResultSet.CONCUR_UPDATABLE);
    stmt.setFetchSize(50);
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    But one of my colleagues told me that the setFetchSize() method does not work with the SQL Server 2000 driver.
    So please suggest how I can fix this issue. I am sure there is a way to do this, but I am just not aware of it.
    Thanks for your help in advance.

    Here is the full-fledged code. There is a Team Connect and TopLink API being used. The code has already been developed and it works for 2-3 hours and then it fails. I just have to fix the memory issue. Please suggest something:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // where vo is the value object obtained from the rs row by row
        if (updateInfo(vo, user)) {
            logger.info("updated : " + rs.getString("number_string"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();
    private boolean updateInfo(CostCenter vo, YNUser tcUser) {
              boolean updated;
              UnitOfWork unitOfWork;
              updated = false;
              unitOfWork = null;
              List projList_m = null;
              try {
                   logger.info("Before vo.getId() HERE i AM" + vo.getId());
                   unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
                   ExpressionBuilder expressionBuilder = new ExpressionBuilder();
                   Expression ex1 = expressionBuilder.get("application")
                             .get("projObjectDefinition").get("uniqueCode").equal(
                                       "TABLE-NAME");
                   Expression ex2 = expressionBuilder.get("primaryKey")
                             .equal(vo.getPrimaryKey());// primaryKey;
                   Expression finalExpression = ex1.and(ex2);
                   ReadAllQuery projectQuery = new ReadAllQuery(FQUtility
                             .classForEntityName("EntryTable"), finalExpression);
                   List projList = (List) unitOfWork.executeQuery(projectQuery);
                   logger.info("list value1" + projList.size());
                   TNProject project_hist = (TNProject) projList.get(0); // primary key
                   // value
                   logger.info("vo.getId1()" + vo.getId());
                   BNDetail detail = project_hist.getDetailForKey("TABLE-NAME");
                   project_hist.setNumberString(project_hist.getNumberString());
                   project_hist.setName(project_hist.getName());
                   String strNumberString = project_hist.getNumberString();
                   TNHistory history = FNHistFactory.createHistory(project_hist,
                             "Proj Update");
                   history.addDetail("HIST_TABLE-NAME");
                   history.setDefaultCategory("HIST_TABLE-NAME");
                   BNDetail histDetail = history.getDetailForKey("HIST_TABLE-NAME");
                   String strName = project_hist.getName();
                   unitOfWork.registerNewObject(histDetail);
                   setDetailCCGSHistFields(strNumberString, strName, detail,
                             histDetail);
                   logger.info("No Issue");
                   TNProject project = (TNProject) projList.get(0);
                   project.setName(vo.getName());
                   logger.info("vo.getName()" + vo.getName());
                   project.setNumberString(vo.getId());
                   BNDetail detailObj = project.getDetailForKey("TABLE-NAME"); // required
                   setDetailFields(vo, detailObj);//this method gets the value from vo and sets in the detail_up object
                   FNClientSessionManager.commit(unitOfWork);
                   updated = true;
                   unitOfWork.release();
              } catch (Exception e) {
                   logger.warn("update: caused exception, "
                             + e.getMessage());
               unitOfWork.release();
          }
          return updated;
     }
    Now I have tried to change the code a little bit, and I added the following lines:
                        updated = true;
                     FNClientSessionManager.release(unitOfWork);
                     project_hist=null;
                     detail=null;
                     history=null;
                     project=null;
                     detailObj=null;
                        unitOfWork.release();
                        unitOfWork=null;
                     expressionBuilder=null;
                     ex1=null;
                     ex2=null;
                     finalExpression=null;
    and I also added code to request the garbage collector after every 5th update:
    if (updateInfo(vo, user)) {
        logger.info("project update : " + rs.getString("number_string"));
        projCount++;
        // call garbage collector every 5th record update
        if (projCount % 5 == 0) {
            System.gc();
            logger.debug("Called Garbage Collector on " + projCount + "th update");
        }
    }
    But now the code won't even update a single record. So please look into the code and suggest something so that I can stop banging my head against the wall.

  • Help! Mid files cause "out of memory" session shut down.

    When I export .mid files from Reason 7 and import them into Logic Pro 9, it gives me the error message "out of memory" and shuts down the session. Help!!

    I tried a number of values that were multiples of 1024 (2048, etc).
    The only thing that worked seemed to be changing the number and rebooting the server machine.
    Not sure what caused the initial problem though.  Saw very weird activity in the FMS console.  Had a few hundred users connected, as I typically do for one of my events, and the chat application was repeatedly unloaded from FMS.  When it was reloaded the FMS console would show crazy large numbers in the "Clients" column - numbers like 3954.  Then it would drop to like 340.  Then shoot up again to wildly large numbers.
    Is that the result of a memory problem?

  • Top Link Causes out of memory issue when millions of records need to update

    Hello everyone,
    I am using TopLink 9.0.4 in a batch process. The batch process reads from the temp table (the temp table has millions of records, one month's worth of data, which need to be updated). The database being used is SQL Server 2005. Below is the snippet of code. It works for 6-7 hours and crashes after that due to out of memory:
    ExpressionBuilder expressionBuilder = new ExpressionBuilder();
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // where vo is the value object obtained from the rs row by row
        if (updateInfo(vo, user, expressionBuilder)) {
            logger.info("updated : " + rs.getString("col_name"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();
    private boolean updateInfo(ProjectVO vo, YNUser tcUser, ExpressionBuilder expressionBuilder) {
        boolean updated;
        updated = false;
        try {
            // ... update logic elided in the original post ...
            updated = true;
        } catch (Exception e) {
            logger.warn("update: caused exception, " + e.getMessage());
        }
        return updated;
    }
    Edited by: user8981696 on Jan 14, 2010 1:00 PM

    Thanks for your reply.
    Please find below the answers to you suggestions/concerns:
    You seem to be using raw JDBC to select all of the records in a single result set, not sure if this may be causing a memory issue. You could try paging through the results instead.
    Ans: I have modified the code to get 1000 records each time, and I am getting the ResultSet by using a PreparedStatement instead of a regular Statement object.
    What type of caching are you using?
    Ans: No caching is being used. If you have some thoughts on caching, please suggest or post some sample code. Again, no AppServer is being used; it's just a regular Java process (batch process), so I don't know how to do caching in a simple Java process.
    You may also wish to try the latest 9.0.4 patch release, or try the 10.1.3 version, or the latest EclipseLink 2.0 release.
    Ans: Where can I find the latest patch release 9.0.4?
    Any help/suggestion is really appreciated!
