OutOfMemoryException ...

The method below is called several thousand times when 50 users are working with the same screen/form. Result: 2 hours later, the application throws an OutOfMemoryException. The profiler indicates that this method creates many XMLAttr instances when calling the setAttribute method. These objects are not garbage collected by the JVM.
What is wrong with this method? Or is this a bug in the XML parser (Oracle)?
The mDataRowModelDataRowElement object is created when the user starts working with the form, and it only becomes eligible for garbage collection when the user exits that form. That never happens here, because the application throws an OutOfMemoryException (after about 2 hours) before the user can finish his/her work.
public Element getRowElement( int pRow, boolean pFull )
    throws DTMException
{
    // Performance: this method is called several thousand times.
    Element dataRowElement = (Element)m_dataRowsElement.getChildNodes().item( pRow );
    if ( dataRowElement != null && pFull )
    {
        NodeList dataCellsElements = dataRowElement.getChildNodes();
        NodeList modelDataCellsElements = mDataRowModelDataRowElement.getChildNodes();
        int dataCellsElementsLength = dataCellsElements.getLength();
        Element dataCellElement;
        Element modelDataCellElement;
        // ???? look into making this a separate method as something
        // similar is also called from the dataCells.
        for ( int i = 0; i < dataCellsElementsLength; i++ )
        {
            dataCellElement = (Element)dataCellsElements.item( i );
            modelDataCellElement = (Element)modelDataCellsElements.item( i );
            dataCellElement.setAttribute( "entName", modelDataCellElement.getAttribute( "entName" ) );
            dataCellElement.setAttribute( "SQLName", modelDataCellElement.getAttribute( "SQLName" ) );
            // we cannot change this, otherwise it thinks it has changed:
            // dataCellElement.setAttribute( "oldVal", modelDataCellElement.getAttribute( "oldVal" ) );
            dataCellElement.setAttribute( "cls", modelDataCellElement.getAttribute( "cls" ) );
            dataCellElement.setAttribute( "keyOrd", modelDataCellElement.getAttribute( "keyOrd" ) );
            dataCellElement.setAttribute( "ord", modelDataCellElement.getAttribute( "ord" ) );
            dataCellElement.setAttribute( "upd", modelDataCellElement.getAttribute( "upd" ) );
        }
    }
    return dataRowElement;
}

Yes, this method is called many times, but the application is being tested with LoadRunner. After the attributes are created for a row (query), the next step in the test scenario is a rollback operation, which removes the rows from the DOM object; each row holds the attributes created in the method shown above.
This is how the rows are removed from the DOM (m_dataRowsElement); a short sketch follows below:
1. create a NodeList ( m_dataRowsElement.getChildNodes() )
2. create a loop
3. call the removeChild method inside the loop: m_dataRowsElement.removeChild( node );
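A minimal sketch of that removal loop, assuming m_dataRowsElement is the parent element holding the data rows (this is not the original code):

// Sketch of the rollback removal described above. getChildNodes() returns a
// *live* NodeList, so removing while iterating forward would skip nodes;
// walking backwards avoids that.
void removeAllDataRows( org.w3c.dom.Element dataRowsElement )
{
    org.w3c.dom.NodeList rows = dataRowsElement.getChildNodes();  // step 1: create a NodeList
    for ( int i = rows.getLength() - 1; i >= 0; i-- )             // step 2: loop
    {
        org.w3c.dom.Node node = rows.item( i );
        dataRowsElement.removeChild( node );                      // step 3: removeChild
    }
}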

Similar Messages

  • Urgent: System.OutOfMemoryException in SQL Reporting Services 2005 Web Service

    Hi All,
    On a production server with 8 GB of RAM, we are running a Windows service that uses the Reporting Services 2005 Web Service to retrieve the image-format bytes of reports ranging from 10 to 300 pages.
    The problem is that when the image bytes are retrieved with the Render method for a report of more than around 30-40 pages, the Report Web Service throws an exception of type System.OutOfMemoryException. This does not happen with smaller reports. Please note that we are retrieving high-DPI images and have no option but to do so.
    Even though the production server has 8 GB of RAM and we have enabled the /3GB switch in boot.ini of Windows Server 2003, the problem still exists.
    I have already read the contents of the article http://support.microsoft.com/kb/909678, but as I mentioned, even the large amount of physical memory does not seem to help, and shortening the report is not an option.
    Also note that a large number of records is not the problem in itself, as our application works fine when we generate Excel files instead of images.
    Following is the detailed exception that we receive.
    System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
       at System.String.GetStringForStringBuilder(String value, Int32 startIndex, Int32 length, Int32 capacity)
       at System.Text.StringBuilder.set_Capacity(Int32 value)
       at System.Xml.BufferBuilder.ToString()
       at System.Xml.XmlTextReaderImpl.ParseText()
       at System.Xml.XmlTextReaderImpl.ParseElementContent()
       at System.Xml.XmlTextReaderImpl.Read()
       at System.Xml.XmlTextReader.Read()
       at System.Xml.XmlTextReaderImpl.InitReadElementContentAsBinary()
       at System.Xml.XmlTextReaderImpl.ReadElementContentAsBase64(Byte[] buffer, Int32 index, Int32 count)
       at System.Xml.XmlTextReader.ReadElementContentAsBase64(Byte[] buffer, Int32 index, Int32 count)
       at System.Xml.Serialization.XmlSerializationReader.ReadByteArray(Boolean isBase64)
       at System.Xml.Serialization.XmlSerializationReader.ToByteArrayBase64(Boolean isNull)
       at Microsoft.Xml.Serialization.GeneratedAssembly.XmlSerializationReaderReportExecutionService.Read34_RenderResponse()
       at Microsoft.Xml.Serialization.GeneratedAssembly.ArrayOfObjectSerializer25.Deserialize(XmlSerializationReader reader)
       at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle, XmlDeserializationEvents events)
       at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle)
       at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
       at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
       at ReportExecution2005.ReportExecutionService.Render(String Format, String DeviceInfo, String& Extension, String& MimeType, String& Encoding, Warning[]& Warnings, String[]& StreamIds)

    Hi Hameer,
    This error might be caused by the memory limit.  There are two settings that will affect the memory configuration.
    When you render a report through the Reporting Services Web service, the Reporting Services Web service obtains the "MemoryLimit" setting from the Machine.config file. 
    A scheduled report is rendered by the Report Server Windows service. The Report Server Windows service obtains the "MemoryLimit" setting from the RSReportServer.config file.
    These values represent a percentage of physical memory. You can adjust them to change the memory limit.
    If there are any more questions, please let me know.
    Thanks.

  • Crystal report Using Push Method (OutOfMemoryException)

    Hello,
    I am developing reports using SAP Crystal Reports with the push method (which uses a DataSet to bind data to the report), and I want to display a large amount of data, but I am getting an OutOfMemoryException because of the DataSet.
    Is there any solution to this problem?

    This is my code, and I think it is correct; I used it with previous versions of Crystal Reports (CR10) and it worked fine.
    Now I am using Crystal Reports v13 SP1 with Visual Studio 2010.
    When I used the DataSet, the next-page and zoom functions worked fine, but with a stored procedure they do not work.
    Note: when I open the data tree on the left and then click next page or zoom, it becomes functional; it's very weird.
            protected void Page_Init(object sender, EventArgs e)
            {
                if (!IsPostBack)
                {
                    reportDocument = (ReportDocument)this.Page.Session["_reportDocument"];
                    if (reportDocument != null)
                    {
                        string rptFile = reportDocument.FileName.Split('\\', '/').Last();
                        LoadReportRessource(reportDocument);
                        TableLogOnInfo log = new TableLogOnInfo();
                        ConnectionStringSettings conn = WebConfigurationManager.ConnectionStrings[0]; // the first connection string
                        SqlConnectionStringBuilder SConn = new SqlConnectionStringBuilder(conn.ConnectionString);
                        log.ConnectionInfo.ServerName = SConn.DataSource;
                        log.ConnectionInfo.DatabaseName = SConn.InitialCatalog;
                        log.ConnectionInfo.UserID = SConn.UserID;
                        log.ConnectionInfo.Password = SConn.Password;
                        log.ConnectionInfo.Type = ConnectionInfoType.SQL;
                        for (int i = 0; i < reportDocument.Database.Tables.Count; i++)
                            reportDocument.Database.Tables[i].ApplyLogOnInfo(log);
                        for (int i = 0; i < CrystalReportViewer1.LogOnInfo.Count; i++)
                            CrystalReportViewer1.LogOnInfo[i].ConnectionInfo = log.ConnectionInfo;
                        CrystalReportViewer1.ParameterFieldInfo = (ParameterFields)this.Page.Session["_paramFields"];
                        CrystalReportViewer1.ReportSource = reportDocument;
                        CrystalReportViewer1.DataBind();
                        CrystalReportViewer1.ShowFirstPage();
                    }
                }
                else
                {
                    CrystalReportViewer1.ReportSource = (ReportDocument)this.Page.Session["_reportDocument"];
                }
            }

  • PreparedStatement execution cause OutOfMemoryException

    Hello,
    I use a PreparedStatement for getting data from the database. It works well unless there is a lot of data to fetch. If the result set has many rows (over 1,000,000) I get an OutOfMemory error. I tried using a Statement instead of a PreparedStatement and setting the parameter values manually, but it doesn't work (on a PostgreSQL database):
    String sqlQuery = "select * from employee_tmp where hire_date > ?";
    PreparedStatement st = connection.prepareStatement(sqlQuery);
    st.setDate(1, myDate);
    ResultSet rs = st.executeQuery();
    The result is OK, but for a lot of data I get the OutOfMemory error.
    String sqlQuery = "select * from employee_tmp where hire_date > " + myDate.toString();
    Statement st = connection.createStatement();
    ResultSet rs = st.executeQuery(sqlQuery);
    Getting the data from the database is fast, but I get all records, not only those from the first case.
    Thank you in advance for any help.
    Agata

    agad wrote:
    Hello,
    I use PreparedStatement for getting data from database. It works well unless there is much data to get. If the result set has many rows (over 1000000) I get OutOfMemory error. I tried use Statement instead of PreparedStatement and tried to set my parameters value manually, but it doesn't work (in postgre database):
    sqlQuery = select * from employee_tmp where hire_date > ?;
    PreparedStatement st = connection.prepareStatement(sqlQuery);
    st.setDate(1, myDate);
    ResultSet rs = st.executeQuery();
    Result is OK, but for much data I get OutOfMemory error.
    sqlQuery = select * from employee_tmp where hire_date > myDate.toString();
    Statement st = connection.createStatement();
    ResultSet rs = st.executeQuery(sqlQuery);
    Getting data from database is fast, but I get all records, not only those from the first case.
    Thank you in advance for any help.
    Agata

    The PreparedStatement itself should definitely not cause an OutOfMemoryException. Are you maybe looping through the ResultSet and storing the rows in a List or something? If it is truly the driver throwing it, then ask in a PostgreSQL driver forum, and probably post a bug report (to PostgreSQL, not Sun).
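    As a side note, a common way to avoid materializing a huge result set in memory with the PostgreSQL JDBC driver is to turn off auto-commit and set a fetch size, so rows are streamed in batches. A minimal sketch, reusing the connection and myDate variables from the question:

    // Sketch: stream rows instead of loading the whole result set into memory.
    // The PostgreSQL JDBC driver only uses cursor-based fetching when auto-commit
    // is off and a non-zero fetch size is set on a forward-only result set.
    connection.setAutoCommit(false);
    PreparedStatement st = connection.prepareStatement(
            "select * from employee_tmp where hire_date > ?");
    st.setDate(1, myDate);
    st.setFetchSize(1000);              // fetch 1000 rows per round trip
    ResultSet rs = st.executeQuery();
    while (rs.next()) {
        // process one row at a time; do not accumulate rows in a List
    }
    rs.close();
    st.close();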

  • URGENT - java.lang.OutOfMemoryException

    Hi, I am having problems with a simple JSP-based application which is deployed on Oracle Application Server 10g (version 9.0.3.1). When only 1 person is logged on it works fine, however when more than 1 person attempts to log on the following error is displayed:
    Server error: java.lang.OutOfMemoryException
    <<no stack trace found>>
    The Java heap on the application server is set to 512 MB, and the home page is pulling about 2000 records and displaying them in an HTML table. Any help would be greatly appreciated, as the deadline for this project is a few days away. Thanks

    It seems that even a larger memory setting cannot prevent this OutOfMemoryException. The reasonable way out is to reduce the memory usage of your application.
    The first thought that occurs to me is that you do not have to pull all 2000 records from the database in one swoop. You can query the database on demand. A human being can hardly take a serious look at 2000 records on one page. Ask the user to provide more restrictions on their search, split the result across several pages, and then make the more restrictive query. (Think of Google: a query can return millions of matches, but we normally only see the first ten.)
    > When only 1 person is logged on it works fine, however when more than 1 person attempts to log on ...
    You can cache the HTML output in strings somewhere, say, in pageContext with application scope, the first time a client accesses your page. Next time, just output the strings. Then no matter how many people are viewing the same page, the memory usage will hardly go up. This applies to the case when the content of the output does not change much.
    You may find other ways to reduce the memory usage. Good luck!
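    To illustrate the paging idea, here is a minimal sketch of fetching a single page of rows on demand over JDBC, assuming an Oracle database and a hypothetical EMPLOYEE table (not the poster's actual schema); connection, pageSize and pageNumber would come from the surrounding code:

    // Sketch: fetch only one page of rows instead of all 2000 at once.
    // Classic Oracle ROWNUM pagination; table and column names are made up.
    String sql =
        "SELECT * FROM ( " +
        "  SELECT t.*, ROWNUM rn FROM ( " +
        "    SELECT emp_id, emp_name FROM employee ORDER BY emp_name " +
        "  ) t WHERE ROWNUM <= ? " +
        ") WHERE rn > ?";
    PreparedStatement ps = connection.prepareStatement(sql);
    ps.setInt(1, pageNumber * pageSize);        // upper bound, e.g. 3 * 50
    ps.setInt(2, (pageNumber - 1) * pageSize);  // lower bound, e.g. 2 * 50
    ResultSet rs = ps.executeQuery();
    while (rs.next()) {
        // render this row into the HTML table
    }
    rs.close();
    ps.close();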

  • Installing Oracle Forms Reports 1112 throws OutOfMemoryException: PermSpace

    Installing Oracle Forms Reports 11.1.2 on Windows 2008 64-bit throws OutOfMemoryException: PermSpace when running the ASInstance configuration. I tried to configure the JAVA_OPTIONS variable to increase the MaxPermSize parameter, but it is not working.
    Regards,
    Néstor Boscán

    The server has 16GB of memory and this is a new installation. I tried to configure JAVA_OPTIONS=-XX:MaxPermSize=512m and JAVA_OPTIONS=-XX:MaxPermSize=1024m but it didn't work.

  • Full Optimize Error - OutOfMemoryException

    Hello
    Has anyone come across the following error when running a full Optimize from SAP BPC Administration?
    Error Message: Exception of type System.OutOfMemoryException was thrown.
    I've raised a support call, but would appreciate any help on knowing why this happens and how I can resolve it.
    Many thanks
    Hina

    The size of your SQL log file needs to be increased.  Depending on your drive space you can change your autogrowth options to unrestricted and that should solve the issue.

  • Can creating new object without object reference cause OutOfMemoryException

    I am getting an OutOfMemoryException in my application. After looking at the logs and doing some analysis, I think that creating a new object without attaching it to a reference is causing the issue.
    The simplified code is as below:
    void valuate(int tradeNum) {
        new CaptureTrade().process(tradeNum); // 1
    }
    Will the above code, called n number of times, cause an OutOfMemoryException?
    Should I use something like this instead:
    void valuate(int tradeNum) {
        CaptureTrade ct = new CaptureTrade();
        ct.process(tradeNum); // 2
    }
    Can the first program cause an OutOfMemoryException which can be rectified using the second piece of code?

    ashay wrote:
    I am getting OutOfMemoryException in my application. After looking at the logs and doing some analysis I think creating a new object and not attaching it to a reference is causing the issue.
    The simplified code is as below:
    void valuate(int tradeNum){
    new CaptureTrade().process(tradeNum); //1
    Will the above code called n number of times cause OutOfMemoryException?
    Should I use something like this instead:
    void valuate(int tradeNum){
    CaptureTrade ct = new CaptureTrade();
    ct.process(tradeNum); //2
    Can the first program cause OutOfMemoryException which can be rectified using the second piece of code?
    What happened when you tried it?
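    For what it is worth, the two forms are equivalent as far as garbage collection is concerned: in both cases the CaptureTrade instance becomes unreachable once the call returns, so neither version should leak by itself. A small sketch (CaptureTrade's body is hypothetical here):

    // Both call styles leave the temporary object unreachable afterwards, so it is
    // eligible for GC either way. A leak would have to come from what process()
    // does internally, e.g. storing 'this' in a static collection.
    class CaptureTrade {
        void process(int tradeNum) {
            // ... valuation work ...
        }
    }

    class Valuator {
        void valuateAnonymous(int tradeNum) {
            new CaptureTrade().process(tradeNum); // temporary, collectable after the call
        }

        void valuateWithLocal(int tradeNum) {
            CaptureTrade ct = new CaptureTrade();
            ct.process(tradeNum);                 // ct becomes unreachable when the method returns
        }
    }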

  • System.OutOfMemoryException Oracle.DataAccess.Client.OracleParameter

    Hi,
    I am working on a legacy system where end users reported an exception while loading a page. It is developed in ASP.NET 2.0 with Oracle.DataAccess. The exception is not reproducible on any of the staging or development servers, and it is not consistent, i.e. sometimes it works and sometimes it does not. The exception is as follows:
    System.OutOfMemoryException: Insufficient memory to continue the execution of the program.
       at System.Runtime.InteropServices.Marshal.AllocCoTaskMem(Int32 cb)
       at Oracle.DataAccess.Client.OracleParameter.PreBind_Char()
       at Oracle.DataAccess.Client.OracleParameter.PreBind(OracleConnection conn, IntPtr errCtx, Int32 arraySize)
       at Oracle.DataAccess.Client.OracleCommand.ExecuteNonQuery()
    The way it is called in the data access layer is as follows (some parameters removed just to make the flow clear):
    public void GetXXXData(
          UserCredentials credentials
          , decimal? P_IN_ID
          , string P_IN_TYPE
          , out string[] P_OUT_XX_ID
          , out string[] P_OUT_XX_NAME
          , int? outputArrayLength)
    {
          if (outputArrayLength == null)
              outputArrayLength = 3000;
          OdacCmd cmd = null;
          P_OUT_XX_ID = null;
          P_OUT_XX_NAME = null;
          cmd = OdacFactory.CreateCommand("PK_XX_QAS1.pr_xx_data_q", credentials);
          cmd.AddParam("P_IN_ID", P_IN_ID);
          cmd.AddParam("P_IN_QUAL_TYPE", P_IN_TYPE);
          IDataParameter param_P_OUT_XX_ID = cmd.AddOutputParameterString("P_OUT_XX_ID", outputArrayLength.Value);
          IDataParameter param_P_OUT_XX_NAME = cmd.AddOutputParameterString("P_OUT_XX_NAME", outputArrayLength.Value);
          cmd.ExecuteNonQuery();
          P_OUT_XX_ID = OdacFactory.GetStringArray(param_P_OUT_XX_ID.Value);
          P_OUT_XX_NAME = OdacFactory.GetStringArray(param_P_OUT_XX_NAME.Value);
    }
    Any help on this is greatly appreciated.
    -Jyothish George

    What version of the client / odp.net are you using?
    I took your code and pretty much pasted it into Visual Studio Express:
    Module Module1
      Sub Main()
        TestOracle()
        TestMicrosoft()
      End Sub
      Private Sub TestOracle()
        Dim builder As New Oracle.DataAccess.Client.OracleConnectionStringBuilder
        builder.DataSource = "lt10gr2"
        builder.UserID = "scott"
        builder.Password = "tiger"
        Dim oraConn As Oracle.DataAccess.Client.OracleConnection = New Oracle.DataAccess.Client.OracleConnection(builder.ConnectionString)
        Dim oraCMD As New Oracle.DataAccess.Client.OracleCommand("SELECT empno, ename FROM emp", oraConn)
        Dim MyDA As New Oracle.DataAccess.Client.OracleDataAdapter(oraCMD)
        Dim ds As New DataSet
        MyDA.Fill(ds)
      End Sub
      Private Sub TestMicrosoft()
        Dim builder As New System.Data.OracleClient.OracleConnectionStringBuilder
        builder.DataSource = "lt10gr2"
        builder.UserID = "scott"
        builder.Password = "tiger"
        Dim oraConn As System.Data.OracleClient.OracleConnection = New System.Data.OracleClient.OracleConnection(builder.ConnectionString)
        Dim oraCMD As New System.Data.OracleClient.OracleCommand("SELECT empno, ename FROM emp", oraConn)
        Dim MyDA As New System.Data.OracleClient.OracleDataAdapter(oraCMD)
        Dim ds As New DataSet
        MyDA.Fill(ds)
      End Sub
    End Module
    It ran with no errors on my system.
    C:\>gacutil /l | find "Oracle.DataAccess"
      Oracle.DataAccess, Version=2.102.2.20, Culture=neutral, PublicKeyToken=89b483f429c47342, processorArchitecture=x86
    - Mark

  • Java.lang.OutOfMemoryException during JSP compilation

    Hi,
    Hunting down a problem at our customer's site, I upgraded our test server from WebLogic 8.1 SP5 to SP6, on the hunch that this might have influenced the problem. What I ended up doing was a complete wipe of our SP5 installation and a fresh SP6 setup.
    That was a very interesting experience. Lots of changes: APP-INF/lib working, Windows service creation working. Interesting.
    However, the application failed a few pages in with an OutOfMemoryException during JSP compilation. I upped the JVM's memory with -Xms and -Xmx, but it still fails. It didn't fail with SP5, with all settings at their defaults.
    Anybody have an idea?
              Bert Laverman
              Perot Systems Nederland BV
              <pre>javax.servlet.ServletException: JSP compilation of /SomeJSPFile.jsp failed: weblogic.utils.compiler.CodeGenerationException: Exception: 'java.lang.OutOfMemoryError' while trying to invoke: serviceMethod at line 46 - with nested exception:
              [weblogic.utils.compiler.CodeGenerationException: Exception: 'java.lang.OutOfMemoryError' while trying to invoke: serviceMethod at line 46 - with nested exception:
              [java.lang.reflect.InvocationTargetException - with target exception:
              [java.lang.OutOfMemoryError]]]
                   at weblogic.servlet.jsp.JspStub.prepareServlet(JspStub.java:256)
                   at weblogic.servlet.jsp.JspStub.prepareServlet(JspStub.java:196)
                   at weblogic.servlet.internal.ServletStubImpl.getServlet(ServletStubImpl.java:598)
                   at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:406)
                   at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:526)
                   at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:28)
                   at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
                   at com.company.application.impl.TransactionInViewFilter.doFilter(TransactionInViewFilter.java:89)
                   at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
                   at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:7053)
                   at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
                   at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
                   at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3902)
                   at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2773)
                   at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
                   at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
              </pre>

    > How did you resolve this problem? I got the same error ever since upgrading the server from SP5 to SP6. Thanks in advance.
    The thing I found is that the startWebLogic.cmd and startManagedWebLogic.cmd scripts (which is where I did the memory settings) are not used at all if you're using managed servers. If you want to adjust memory, it has to be done through the settings for that server in the WebLogic console, which is a pain, because you cannot enter one field and have WLS provide defaults for the rest: it's all-or-nothing. When I finally got the settings right, the managed server process correctly used them and the problem disappeared.
              Cheers,
              Bert

  • [U8.1][C#] HttpWebRequest's GetResponseAsync fails with OutOfMemoryException

    Hi everyone,
    I'm having a strange issue with fetching data from a web service. Here's my code, with a few boring bits trimmed:
    private async Task<IEnumerable<string[]>> GenericGrab(string query)
    {
        var req = (HttpWebRequest) WebRequest.Create(new Uri("http://example.org/" + query));
        HttpWebResponse res = null;
        try
        {
            res = (HttpWebResponse) await req.GetResponseAsync().ConfigureAwait(false); // BOOM!
            StreamReader read = new StreamReader(res.GetResponseStream());
            string l;
            while (true) {
                l = await read.ReadLineAsync().ConfigureAwait(false);
                if (l == null) break;
                // Process lines here
            }
            // Return result of processing
        }
        // Boring exception handlers skipped
        finally
        {
            if (res != null)
                res.Dispose();
        }
    }
    This works fine on requests that return 20 KB of data, but dies immediately after running the line containing GetResponseAsync, with an OutOfMemoryException (wrapped in the XAML-generated break on unhandled exception), on requests that return 600 KB of data. (The way this code is normally used is to first request a small amount of data, then a larger amount. If I remove the second call, I can do hundreds of request cycles without running into an exception.)
    I find this rather strange because, as far as I understand, GetResponseAsync should not even read in all the data; after all, if it did, using a response stream would not make much sense. So, what could be the issue?

    > "/usr/ccs/bin/nm" fails to demangle
    Demangler installed on Solaris by default was unable to process all kinds of gcc-style-mangled names.
    Demangler in Beta is updated to handle all those names but it does not help your system 'nm'.
    You can use c++filt that is part of Beta to demangle it.
    > Ut96 versus Ut98
    Ut stands for "unnamed type".
    It seems that va_list presence in signature leads to unstable mangling.
    Converting it to something else might help. Say, wrap into the struct with some name.
    regards,
      Fedor.

  • Wldeploy OutOfMemoryException

    Hello,
    I am using the wldeploy ant task to deploy my exploded EAR with the no-stage option. Every 3 or 4 deployments I get an OutOfMemoryException. I tried to tune the JVM parameters without success.
    My project is composed of many third-party libs (about 25 jars).
    I understand that the JVM stores class definitions separately from the heap. Thus, may the issue be linked to the classpath management of WebLogic, which loads all the classes each time?
    I am using WL 8.1 SP5 with JDK 1.4_08.
    Any hints? thx

    Are you getting OOM errors during the deployment? Or after when the app is running?

  • System.InsufficientMemoryException: Failed to allocate a managed memory buffer of 268435456 bytes. The amount of available memory may be low. --- System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.

    AppFabric 1.1 server setup on 3 Windows Server 2008 R2 machines.
    Client: Windows 7 64-bit.
    Because there is no bulk update, we are trying to persist around 4000 objects wrapped in a CLR object.
     [Serializable]
     public class CacheableCollection<T> : ICacheable, IEnumerable<T>
         where T : class, ICacheable
     {
         [DataMember]
         private Dictionary<string, T> _items;

         public IEnumerator<T> GetEnumerator()
         {
             return _items.Values.GetEnumerator();
         }

         IEnumerator IEnumerable.GetEnumerator()
         {
             return _items.Values.GetEnumerator();
         }

         [DataMember]
         public string CacheKey { get; private set; }

         public T this[string cacheKey] { get { return _items[cacheKey]; } }

         public CacheableCollection(string cacheKey, T[] items)
         {
             if (string.IsNullOrWhiteSpace(cacheKey))
                 throw new ArgumentNullException("cacheKey", "Cache key not specified.");
             if (items == null || items.Length == 0)
                 throw new ArgumentNullException("items", "Collection items not specified.");
             this.CacheKey = cacheKey;
             _items = items.ToDictionary(p => p.CacheKey, p => p);
         }
     }
    We tried with the following options on server and client
    Server:
     <advancedProperties>
                <partitionStoreConnectionSettings leadHostManagement="false" />
                <securityProperties mode="None" protectionLevel="None">
                    <authorization>
                        <allow users="[email protected]" />
                        <allow users="[email protected]" />
                    </authorization>
                </securityProperties>
                <transportProperties maxBufferSize="500000000" />
            </advancedProperties>
    Client: 
     <transportProperties connectionBufferSize="131072" maxBufferPoolSize="500000000"
                           maxBufferSize="838860800" maxOutputDelay="2" channelInitializationTimeout="60000"
                           receiveTimeout="600000"/>
    I see different people experiencing different memory size issues. What is the actual memory limit for an object that can be pushed to AppFabric?
    Can someone please help?
    Stack trace:
    Test method Anz.Cre.Pdc.Bootstrapper.Test.LoaderFuncCAOTest.AppFabPushAndRetrieveData threw exception: 
    System.InsufficientMemoryException: Failed to allocate a managed memory buffer of 268435456 bytes. The amount of available memory may be low. ---> System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
    System.Runtime.Fx.AllocateByteArray(Int32 size)
    System.Runtime.Fx.AllocateByteArray(Int32 size)
    System.Runtime.InternalBufferManager.PooledBufferManager.TakeBuffer(Int32 bufferSize)
    System.Runtime.BufferedOutputStream.ToArray(Int32& bufferSize)
    System.ServiceModel.Channels.BufferedMessageWriter.WriteMessage(Message message, BufferManager bufferManager, Int32 initialOffset, Int32 maxSizeQuota)
    System.ServiceModel.Channels.BinaryMessageEncoderFactory.BinaryMessageEncoder.WriteMessage(Message message, Int32 maxMessageSize, BufferManager bufferManager, Int32 messageOffset)
    System.ServiceModel.Channels.FramingDuplexSessionChannel.EncodeMessage(Message message)
    System.ServiceModel.Channels.FramingDuplexSessionChannel.OnSendCore(Message message, TimeSpan timeout)
    System.ServiceModel.Channels.TransportDuplexSessionChannel.OnSend(Message message, TimeSpan timeout)
    System.ServiceModel.Channels.OutputChannel.Send(Message message, TimeSpan timeout)
    Microsoft.ApplicationServer.Caching.WcfClientChannel.SendMessage(EndpointID endpoint, Message message, TimeSpan timeout, WaitCallback callback, Object state, Boolean async)
    Microsoft.ApplicationServer.Caching.WcfClientChannel.Send(EndpointID endpoint, Message message, TimeSpan timeout)
    Microsoft.ApplicationServer.Caching.WcfClientChannel.Send(EndpointID endpoint, Message message)
    Microsoft.ApplicationServer.Caching.DRM.SendRequest(EndpointID address, RequestBody request)
    Microsoft.ApplicationServer.Caching.RequestBody.Send()
    Microsoft.ApplicationServer.Caching.DRM.SendToDestination(RequestBody request, Boolean recordRequest)
    Microsoft.ApplicationServer.Caching.DRM.ProcessRequest(RequestBody request, Boolean recordRequest)
    Microsoft.ApplicationServer.Caching.DRM.ProcessRequest(RequestBody request, Object session)
    Microsoft.ApplicationServer.Caching.RoutingClient.SendMsgAndWait(RequestBody reqMsg)
    Microsoft.ApplicationServer.Caching.DataCache.SendReceive(RequestBody reqMsg)
    Microsoft.ApplicationServer.Caching.DataCache.ExecuteAPI(RequestBody reqMsg)
    Microsoft.ApplicationServer.Caching.DataCache.InternalPut(String key, Object value, DataCacheItemVersion oldVersion, TimeSpan timeout, DataCacheTag[] tags, String region)
    Microsoft.ApplicationServer.Caching.DataCache.Put(String key, Object value, String region)
    Anz.Cre.Pdc.DataCache.DataCacheAccess.Put[T](String cacheName, String regionName, T value) in C:\SVN\2.3_Drop3\app\Src\Anz.Cre.Pdc.DataCache\DataCacheAccess.cs: line 141
    Anz.Cre.Pdc.DataCache.DataCacheAccess.Put[T](String cacheName, String regionName, Boolean flushRegion, T value) in C:\SVN\2.3_Drop3\app\Src\Anz.Cre.Pdc.DataCache\DataCacheAccess.cs: line 372
    Anz.Cre.Pdc.Bootstrapper.Test.LoaderFuncCAOTest.AppFabPushAndRetrieveData() in C:\SVN\2.3_Drop3\app\Src\Anz.Cre.Pdc.Bootstrapper.Test\LoaderFuncCAOTest.cs: line 281

    Essentially what we are trying to do is the following.
    We have different kinds of objects in our baseline: objects of type CAO, Exposures, and Limits that change every day after close of business day.
    We wanted to push these different objects in to respective named regions in the cache. 
    Region Name     Objects
    CAO                   ienumerable<caos>
    Exposures           ienumerable<exposures>
    Limits                ienumerable<limits>
    We have a producer that pushes this data into the cache, and consumers of this data that act on it when it becomes available.
    Now the issue we are facing is that when we try to push around 4000 CAO objects (roughly 300 MB in size when serialized as XML), we get the above error. Increasing the sizes on the client and the cache cluster didn't help.
    The other alternative we were thinking about is chunking before pushing, because AppFabric doesn't support streaming. We might be able to push this data successfully if we chunk it, but what about the consumers? Wouldn't they face the same memory issue when we use getallobjectsinregion?
    We thought that if there were a way to figure out the keys in the region, the consumers could probably fetch the objects one by one. However, there is no such API.
    The only option I see is using AppFabric notifications, which MSDN says isn't a reliable way.
    Please help.

  • OutOfMemoryException because Image is too big

    hi,
    I'm trying to learn Java by developing a simple 2D action game. I want to draw the ground of the game world (the map) onto an image, so that I can copy onto the screen the piece of the image the player is currently looking at.
    My problem is that createImage() (used in a JComponent) throws an OutOfMemoryException because the map seems to be too big. How can I deal with this problem? Is there a way to not load the whole image into memory, but to keep it on the hard disk?
    thanx

    Objects are stored on the heap. The default heap size is 64 MB. If your image is bigger than that, you can increase the heap size when your program is started by specifying a command-line parameter.
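    For example (a sketch; the main class name MyGame is hypothetical), starting the JVM with a 256 MB maximum heap:
    java -Xmx256m MyGame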

  • System.OutOfMemoryException On SBOItem Synching Of Web Tools 625

    After installing the newest version of Web Tools, v625, and synching the new data, I get the following error:
    Exception of type 'System.OutOfMemoryException' was thrown.
       at SAPbobsCOM.CompanyClass.GetBusinessObject(BoObjectTypes Object)
       at NetPoint.SynchSBO.SBOObjects.SBOPriceBrowser..ctor(Company company, SecurityTicket st)
       at NetPoint.SynchSBO.SBOObjects.SBOItem.SBOToNetPoint(SBOQueueObject qData)
       at NetPoint.SynchSBO.SynchObjectBase.Synch()
    This error appears once for each SBOItem that tries to synch for about 1700 of our 10000+ SBOItems. After that the SynchManager stops. Everything up to that point seems to synch OK. Why would this be happening? There is ample space on my HD, 2Gb of RAM, and the peak memory request hasn't gone past 2.8Gb. Do I need another 2Gb of RAM? Is it possible that SynchManager blew up while Windows was allocating more Virtual Memory, in which case should I permanently set it to at least 4Gb?

    Hi Michael,
    If memory serves me, this happens because B1 has the trace turned on.
    The solution is to open your
    SAP\SAP Business One\Log\b1LogConfig.xml and turn the trace off.
    Check to see that the trace attribute is set to 0 on the B1logger.
    <b1logger Mode="A" MaxFileSize="5" MaxNumOfMsg="500" LogStack="0" Activate = "1">
             <Components SystemMessage="1" SQLMessage="0" Trace="0" General="1" StockTool="0" Upgrade="0" Performance="0">
            <Component name="SystemMessage">
                 <Severities Note="0" Warning="0" Error="1" CriticalError="1" BeProtective="1" AuditFailure="1">
                      <Severity name="BeProtective" LogStack="1"/>
                 </Severities>
            </Component>
             </Components>
             <Severities Note="0" Warning="0" Error="1" CriticalError="1" BeProtective="1" AuditFailure="1"/>
       </b1logger>
