System out of memory loading large project

Folks:
I am opening a large solution in SSDT (Visual Studio 2013 with Update 4 - 12.0.41025.0). The solution contains a rather large OLTP database project; in addition, we run focused data marts off of this common solution, so the solution contains 148 projects.
When I attempt to open the solution, I get the exception below. When I close the dialog, Visual Studio crashes. I can never do anything productive.
In Task Manager I see less than 50% of memory used. I have 16 GB installed, and less than 8 GB is in use when Visual Studio crashes, so there must be an artificial limit somewhere.
How do I get around this?
Microsoft.Data.Tools.Schema.Sql.Build.SqlPackageException
Exception of type 'System.OutOfMemoryException' was thrown.
   at Microsoft.Data.Tools.Schema.Sql.Build.SqlPackageContent.GetStream()
   at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.ReadDataSchemaModelHeaderFromPackage(SqlPackage package, Boolean readCustomData)
   at Microsoft.VisualStudio.Data.Tools.Package.Internal.DatabaseProjectOrSqlSchemaFileReferenceNodeExtender.ReadHeaderData()
   at Microsoft.VisualStudio.Data.Tools.Package.Internal.DatabaseProjectOrSqlSchemaFileReferenceNodeExtender.TryReadHeaderData(Exception& ex)
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.ProjectReferenceController.IsValidReferenceVerifySqlServerVersions(IDatabaseReferenceNode refNode, Boolean& canShowDefault, String& reason)
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.ProjectReferenceController.IsValidReference(IDatabaseFileReferenceNode refNode, Boolean& canShowDefault, String& reason)
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.ProjectReferenceController.IsValidReference(IDatabaseReferenceNode refNode, String& reason)
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.ProjectReferenceController.GetReferenceData(IDatabaseReferenceNode referenceNode, CustomSchemaData& schemaData)
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.FileManagerFeature.ProcessReferencesInQueue()
   at Microsoft.VisualStudio.Data.Tools.Package.Project.Features.FileManagerFeature.OnIdle()
   at Microsoft.VisualStudio.Data.Tools.Package.Project.DatabaseProjectNode.OnIdle()
   at Microsoft.VisualStudio.Data.Tools.Package.Project.SqlProjectIdleProcessingComponent.FDoIdle(UInt32 grfidlef)
John

Visual Studio is a 32-bit process, so the most it could possibly use is 2 GB, regardless of how much physical memory is free.
Honestly, I have used Visual Studio for a while (from VB6/VC6 through VS.NET 2002 all the way to 2015) and it is a memory hog as it is. I really can't imagine what it is like to have 148 projects in one solution - it must be really slow. I guess you are using build configurations to limit what you build each time, or your build times alone will be painful!
I really would look at splitting the solution into smaller parts; I don't think you will have much luck until you do that.

Similar Messages

  • BPC 5.0.502 BPC Web Error Message: Exception of type System out of memory

    While updating a web page in the content library, the page crashed and we received the error message "Exception of type System out of memory. Exception was thrown". What does this mean? We are now unable to open this sheet without receiving the same message. Any remedies would be appreciated.

    Hello,
        Did you receive this specific error message only when managing that web page, or for all pages?
        Did you try to stop all COM+ components on the application servers (on each one, if you are using more than one)? It looks to be related to some memory problem.
        If the problem occurs only for a specific web page, it can be related to the content of that page.
    Best regards,
    Mihaela

  • System.OutOfMemoryException

    I am running a TestStand sequence. Using the .NET adapter, I create an instance of a class and call methods of a class library I created in VB.NET. I tested my class library in VB.NET and everything works in that environment. The sequence I call is pretty simple:
    create an instance of my class, then
    call a method that sets up and initializes a camera
    call a method that runs through the tests of a test sequence
    On the first pass the sequence runs. On the second pass - whether in a For loop or after starting the entire TestStand sequence again - it initializes the camera, but then TestStand reports a System.OutOfMemoryException in an underlying assembly/DLL of my DLL. Yet when I call my DLL from an .exe I made to test it, it works fine.
    Thanks
    Thanks

    Thanks for your reply.
    I call the setup and initialization of the camera once; it is the method that runs a sequence of tests that I call repeatedly, and it fails on the second call. The interesting thing is that if I close TestStand completely after the first run and then run my TestStand sequence again, it reports the same error. It seems something is not being released or cleared. Keep in mind, I only get this with TestStand; I do not get it when I run a test .exe that calls my DLL. I have since tested calling my .exe (which calls my DLL) from TestStand, and that works. The problem is clearly in using TestStand to create the class and call the methods.
    Thanks
    Thanks

  • Out of memory error - WLW Schema Project build

    hi
    I wasn't sure which group (Workshop vs XMLBeans), but here goes.
    I'm doing a personal eval of the WLS 8.1.1 platform.
    My current focus is on XML Schema support, XMLBeans, transformations etc., and in particular how it all copes with large schemas.
    I'm using the Justice XML Data Dictionary (3.0.0.0) as an example of a large schema (http://it.ojp.gov/jxdd/).
    I am getting an out of memory error when I build this within WLW.
    The build appears to be in the javac phase (and the error occurs when memory use goes over 256M), so I think it's the ANT task.
    WLW itself doesn't seem to be running out of memory, and the build machine is not running out of resources.
    Where do I configure this? The <xmlbean> tag? Is there any documentation?
    regards
    Jim Nicolson
    =============== Build trace below ====================
    Build project JXDDSchema started.
    BUILD STARTED
    build:
    check-uptodate:
    build-sub:
    Updating property file: C:\data\projects\bea\integration\wlitest\.workshop\.ide\JXDDSchema\build.properties
    Deleting directory: C:\DOCUME~1\JIMNIC~1\LOCALS~1\Temp\.wlw-temp\wlw_compile38940\JXDDSchema
    Created dir: C:\DOCUME~1\JIMNIC~1\LOCALS~1\Temp\.wlw-temp\wlw_compile38940\JXDDSchema
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\jxdds_3.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_639-2t_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ut_offender-tracking-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-gun_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-other-transactions_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-uniform-offense_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\mn_offense_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_jcs-pub2.0-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\cap_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\nibrs_misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_6-4_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_3166_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-boat_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-securities_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-article_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\unece_rec20-misc_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-state-country_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_4217_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_10-4_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ansi_d20_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\iso_639-2b_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\fips_5-2_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-vehicle_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\dod_exec-12958_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\ncic_2000-personal-descriptors_1.0.0.0_full-doc.xsd
    Loading schema file C:\data\projects\bea\integration\wlitest\JXDDSchema\usps_states_1.0.0.0_full-doc.xsd
    Time to build schema type system: 6.75 seconds
    The system is out of resources.
    Consult the following stack trace for details.
    java.lang.OutOfMemoryError
    BUILD FAILED
    BUILD FAILED

    hi Kevin
    Thanks, I'll take a look there.
    I think it's a good move to make XMLBeans open source (I've been using Castor for the same purpose for the last two years).
    I was mainly focussed on XMLBeans as they are used in the WLW context, and my testing suggested that I would probably want to know more about the ANT build/tasks etc., just to be able to deal with unexpected issues.
    It's not urgent - I'm just doing a personal eval to come up to speed with the WLS 8.1 platform.
    thanks
    Jim Nicolson
    "Kevin Krouse" <[email protected]> wrote in message
    news:[email protected]..
    Sorry, the first URL should have been:
    http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2
    >
    --k
    On Tue, 30 Sep 2003 22:35:58 -0700, Kevin Krouse wrote:
    Hi Jim,
    Well, the XMLBean docs are in a process of moving into the Apache
    XMLBeans
    project (http://xml.apache.org/xmlbeans), so I can point you to where
    they
    are in the cvs repository. You can also look at the source if you'dlike!
    >>
    Here's an html doc for the xmlbean task:
    http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2http://cvs.apache.org/viewcvs.cgi/*checkout*/xml-xmlbeans/v1/xkit/anttask.html?rev=1.2
    >>
    Here's the code for the xmlbean task:
    http://cvs.apache.org/viewcvs.cgi/xml-xmlbeans/v1/src/xmlcomp/org/apache/xmlbeans/impl/tool/XMLBean.java?rev=HEAD&content-type=text/vnd.viewcvs-markup
    >>
    >>
    As to your second question about WLW becoming unresponsive, after the
    Schemas.jar is built, the source for the entire project is re-scanned
    which could take several minutes. The performance impact of rescanning
    the project is improved in SP2 and will be addressed even more in future
    releases. Stay tuned.
    --k

  • Out of memory error for large recordings

    Hi,
      My workflow process takes a list as input and loops through the workflow for all the items in the list. If my list contains 4 items, there will be 3*7 = 21 steps when I invoke the process, and I am able to check my recordings for debugging.
    If I give 5 inputs, it takes around 5*7 = 35 steps in the workflow. If I try to play the recording, I see the error "java heap space: Out of memory exception". I changed my heap to -Xmx1024m in workbench.ini. I also did: java -Xms<initial heap size> -Xmx<maximum heap size>. (Q1: do we need to restart JBoss after we do the second step?)
    Q2: My system is Windows 7 with 4 GB RAM. I have LiveCycle 8.2 with SP2, JBoss & MySQL. Do I need to set this Java RAM size in any of our JBoss server files?
    Q3: Is there any file to set the maximum permissible steps for a process recording? If so, please let me know. It would be a great help.
    Thanks,
    Chaitanya

    If the documents you are adding as input are very large, this could result in OOM errors, so that is something to consider.
    Some other things to try are as follows:
    1. Increase the memory allocated to JBoss by another 200-300 MB, as per the following documentation on the web:
    http://help.adobe.com/en_US/livecycle/9.0/workbenchHelp/help.html?topic=000097&topic=001837
    2. Read the following sections of the Workbench Help, which can be found on the web and describe how to limit the recording storage space used: http://help.adobe.com/en_US/livecycle/9.0/workbenchHelp/help.html?topic=000097&topic=001060
    I would suggest increasing your maxNumberOfRecordingEntries to about 200.
    Heather

  • System out of memory when exporting/migrating Solution Manager

    Dear all,
    since we decided to migrate our Solution Manager system (Win2003/SQL Server 2005) to a more appropriate server, we started an export using the NW70SR2 installation master CD, the one we used to install this Solution Manager instance.
    Everything went OK up to the Java export phase, where we hit an out of memory issue. Here's the relevant part of the sapinst.dev.log file:
    May 24, 2008 1:49:46 PM com.sap.inst.jload.Jload dbExport
    SEVERE: DB Error during export of BC_SLD_CHANGELOG
    May 24, 2008 1:49:46 PM com.sap.inst.jload.Jload printSQLException
    SEVERE: Message: The system is out of memory. Use server side cursors for large result sets:null. Result set size:262,182,182. JVM total memory size:518,979,584.
    May 24, 2008 1:49:46 PM com.sap.inst.jload.Jload printSQLException
    SEVERE: SQLState: null
    May 24, 2008 1:49:46 PM com.sap.inst.jload.Jload printSQLException
    SEVERE: ErrorCode: 0
    May 24, 2008 1:49:46 PM com.sap.inst.jload.db.DBConnection disconnect
    INFO: disconnected
    TRACE      [iaxxejsexp.cpp:199]
               EJS_Installer::writeTraceToLogBook()
    2008-05-24 13:49:49.755 JavaApplication execution finished
    TRACE      [iaxxejsexp.cpp:199]
               EJS_Installer::writeTraceToLogBook()
    2008-05-24 13:49:49.755 NWDB._callJLoad(export) done: throwing
    TRACE      [iaxxejsexp.cpp:199]
               EJS_Installer::writeTraceToLogBook()
    NWException thrown: nw.syscopy.jloadRunFailed:
    <html>Execution of JLoad tool 'C:\j2sdk1.4.2_14-x64\bin\java.exe -classpath F:\usr\sap\SMD\SYS\global\sltools\sharedlib\launcher.jar -showversion -Xmx512m com.sap.engine.offline.OfflineToolStart com.sap.inst.jload.Jload
    SMDCA/sapmnt/SMD/SYS/global/security/lib/tools/iaik_jce.jar;
    SMDCA/sapmnt/SMD/SYS/global/security/lib/tools/iaik_jsse.jar;
    SMDCA/sapmnt/SMD/SYS/global/security/lib/tools/iaik_smime.jar;
    SMDCA/sapmnt/SMD/SYS/global/security/lib/tools/iaik_ssl.jar;
    SMDCA/sapmnt/SMD/SYS/global/security/lib/tools/w3c_http.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/jload.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/antlr.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/exception.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/jddi.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/logging.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/offlineconfiguration.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/opensqlsta.jar;F:/usr/sap/SMD/SYS/global/sltools/sharedlib/tc_sec_secstorefs.jar;F:\usr\sap\SMD\DVEBMGS00\exe\mssjdbc\sqljdbc.jar -sec SMD,jdbc/pool/SMD,
    SMDCA/sapmnt/SMD/SYS/global/security/data/SecStore.properties,
    SMDCA/sapmnt/SMD/SYS/global/security/data/SecStore.key -dataDir E:/EXPORT/JAVA/JDMP -remove_trailing_blanks "C:/Documents and Settings/administrator.ATRSAP/removeTrailingBlanks.txt" -convert_empty_LOBs "C:/Documents and Settings/administrator.ATRSAP/convertEmptyLobs.txt" -convert_empty_strings "C:/Documents and Settings/administrator.ATRSAP/convertEmptyStrings.txt" -convert_empty_binary "C:/Documents and Settings/administrator.ATRSAP/convertEmptyBinary.txt"' aborts with return code 1.<br>SOLUTION: Check 'jload.log' and 'C:/Documents and Settings/administrator.ATRSAP/jload.java.log' for more information.</html>
    Actually we used a workaround to complete the phase, i.e. we manually re-issued the command and changed the heap value from -Xmx512m to -Xmx1024m.
    Is there a way to make sapinst pass a higher Java heap value, i.e. -Xmx1024m, to the Java export command by default?
    Thanks in advance and cheers.
    Franco.
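    For what it's worth, the "Use server side cursors" hint in the jload error refers to the Microsoft SQL Server 2005 JDBC driver's selectMethod connection property: with selectMethod=cursor the driver fetches large result sets through a server-side cursor instead of buffering them in the JVM heap. A sketch of the URL only (the host and database names are placeholders taken from the log, not a verified sapinst configuration):

```java
public class CursorUrl {
    public static void main(String[] args) {
        // selectMethod=cursor asks the 2005-era Microsoft driver to stream
        // rows through a server cursor rather than hold the whole result
        // set in client memory -- exactly the advice in the jload message.
        String url = "jdbc:sqlserver://SMDCA:1433"
                + ";databaseName=SMD"        // placeholder database name
                + ";selectMethod=cursor";
        System.out.println(url.contains("selectMethod=cursor"));
    }
}
```

    Whether sapinst's generated connection URL can be changed this way is a separate question; the property itself is standard for that driver.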

    It sounds like you may have a corrupt render file in your 7th episode.  You can set in and out points in 5 minute increments and export those segments to narrow down which clip is the offending clip. Then re-render the offending clip.
    I hope this helps.

  • Out of Memory problem with large IDocs (HRMD_A.HRMD_A06)

    Hello,
    we need help with the following problem:
    We send HRMD_A.HRMD_A06 IDocs in an IDoc->XI->IDoc scenario.
    In this scenario there is a small graphical mapping. When we send 100 IDocs, after 15-30 minutes all queues are stopped and the error "Out of Memory" shows up.
    We have configured large IDocs to run in their own queues (max. 3 in parallel), but the error is still there.
    The only solution so far is to restart the queues repeatedly, until eventually the IDocs all go through.
    Has anyone an idea what we can do?
    Regards,
    Christian

    Thanks for the answers.
    Now after 15-30 minutes a new error shows up: "Timelimit Overstepped" (in the German original, "Zeitlimit überschritten"), and all queues stop.
    What can I do?
    Regards
    Christian

  • General Error and Out of Memory Error - affecting multiple projects

    I am running Final Cut Pro 6 on my 1.25 GHz PowerPC G4 iMac, with 1.25 GB RAM. OS X 10.4.11.
    I have had this setup for more than a year and used it without any problems.
    I have two projects stored on an external LaCie firewire drive with more than 140GB disk space remaining. As of this morning, I cannot work on either of them: both are throwing the "General Error" message, and one of them is also telling me it's "Out of Memory". On project #1, the "Out of Memory" occurs whenever I try to open a clip in the viewer, following a "General Error" message. On project #2, the "General Error" message occurs when I try to open the project; it gets halfway through the process and then throws the error, leaving me unable to open the timeline.
    Both projects are short, less than 3 minutes long, and neither project contains any CMYK or grayscale graphics. Project #2 does have a short Motion sequence.
    Things I have tried:
    ~ restarting and hard-rebooting
    ~ trashing preferences
    ~ rebuilding permissions
    ~ trashing render files
    ~ creating a new project
    ~ searching this forum and Google for other answers
    Help?

    Thanks for the support, Jim. I've had terrible experiences with Firewire drives in the past, too; regrettably, there doesn't seem to be an affordable alternative at this point.
    I just looked up resetting the PMU, as that's not something I've done before. I really hope it's that simple, as the thought of recreating all these clips makes my head hurt. But I'll definitely try your suggestion of reconnecting individual media files first. I've been through that before.

  • Apple HDV codec and Out of Memory error on transferred project

    Hi,
    I recently captured and have been editing HDV footage fine on my iMac (surprisingly enough). My question: I transferred the entire project to a colleague to edit on his new PowerBook (much faster than my machine), and he gets two errors when he tries to open the sequence. 1. "Codec not found. You may be using a compression type without the corresponding hardware card." Then, 2. "Out of Memory". He can play the audio off the QT movies, but the display is only white. I looked up the Movie Info for the QT movies, and mine reads that it is the Apple HDV codec. The exact same clip's Movie Info on his PowerBook reads only HDV. Can anybody help? I'm pretty perplexed on this one.
    Thanks!

    It really would be useful if you included information like FCP versions, QT versions, how the footage was captured, and what codec was used.
    Do you and he have matching versions of FCP? Of QT?
    If he doesn't have the codec you used to capture, then he t'aint gonna play the file. It is as simple as that.
    x

  • Encore DVD 2.0 memory errors on large project

    Hello,
    I'm compiling a DVD for commercial distribution using Encore 2.0.
    It's a large project in that there are 4 x 3-minute video clips, 2000 timeline-arranged audio files, motion menus with audio, and quite a lot of buttons.
    Encore is misbehaving quite badly - it has problems with audio clips that are not the standard format (48000 Hz, 768 kbps bitrate, PCM audio), and the project previews only briefly before an 'internal memory error 0x000002' comes up.
    The workstation is an AMD64 dual-core 3800 on a Gigabyte skt 939 board, with a GeForce 6600 graphics card and 2 GB of DDR RAM. So no hardware problems, I think. And yes, it is an SSE2 processor - it'd have to be to run Encore at all.
    I can't even build a DVD image, due to the apparent size of the project. My client is certain that all of the audio timelines are necessary to the project, but I know that deleting them and cutting the size of the project down would cure the problems I'm having.
    I don't know if there is any obvious reason that Encore can't handle the project - Adobe Updater does not have any new updates for Encore, so I assume that this is as good as I'm gonna get.
    Any similar experiences? Any ideas? Thanks in advance,
    Ed Brown
    myweb.tiscali.co.uk/iline
    [email protected]

    You're right - some of the files are mono, and I have been batch processing them up to 48k stereo, thus doubling the bitrate.
    Originally the audio files were recorded in-camera during interviews. The audio of these interviews was chopped up and edited, and animation was produced to complement the selected sections of the audio interviews. Now we're putting the animation series (called Terra 2050) onto DVD and I'm in charge of the project. For the DVD extras, we wanted to include the full audio interviews.
    So I set up a basic DVD with Play All, Select Episode etc. Then I started to add a timeline for each person, dumped all of that person's interview onto the audio track, made a PSD of that person's name, so by choosing "Colin Pillinger" on the Interviewees menu, the DVD should jump to a timeline that plays a still saying "Colin Pillinger - Professor of Planetary Science" while Colin's interview plays over the top.
    The audio files are only half the problem, however.
    The structure is as follows:
    Top menu: Play All - Select Episode - Extras Menu
    Extras Menu: Behind the Scenes - Audio Interviews
    Audio Interviews: (a long list of buttons, with each person's name on each button, which links to that person's timeline)
    Each timeline contains; video track contains a PSD of that person's name; audio track contains several audio clips of that person's interview.
    So I do not have more than 99 timelines or 99 chapters in any one timeline. I think the project is too large to be cached in one go for Preview purposes, but I know it all works properly according to the Flowchart/Check Project windows.
    Should I just switch to Sonic Scenarist and stick my head in a DVD architecture manual for a month? Encore doesn't seem to handle 'large' (read professional quality) projects at all well.
    Thanks so much for your input - I really appreciate your help.
    Ed

  • Out of memory when converting large files using a Web service call

    I'm running into an out of memory error on the LiveCycle server when converting a 50 MB Word document with a Web service call. I've already tried increasing the heap size, but I'm at the limit for the 32-bit JVM on Windows. I could upgrade to a 64-bit JVM, but it would be a pain and I'm trying to avoid it. I've tried converting the 50 MB document using the LiveCycle admin UI and it works fine; the issue only occurs when using a web service call. I have a test client, and its memory spikes while it is generating the web service call, taking over a gig of memory. I assume it takes a similar amount of memory on the receiving end, which is why LiveCycle is running out of memory. Does anyone have any insight on why passing a 50 MB file requires so much memory? Is there any way around this?
    -Kelly

    Hi,
    You are correct that a complete 64-bit environment would solve this. The problem is that you get the out of memory error when the file is written to memory on the server. You can solve this by creating an interface which stores large files on the server hard disk instead, which allows you to convert files as large as LC can handle without any memory issue.
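    The interface the reply suggests - spooling large files to disk instead of building them up in memory - comes down to copying the incoming stream in small chunks. A minimal sketch in Java (the class and method names are mine for illustration, not LiveCycle APIs):

```java
import java.io.*;

public class StreamToDisk {
    // Copy an input stream to a file in small chunks, so the whole
    // document never has to fit in the JVM heap at once.
    static long spoolToDisk(InputStream in, File target) throws IOException {
        long total = 0;
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(target))) {
            byte[] buf = new byte[8192];   // fixed 8 KB buffer, regardless of file size
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a large upload with an in-memory stream; in the real
        // scenario this would be the web service request's input stream.
        byte[] payload = new byte[5 * 1024 * 1024];   // 5 MB of zeroes
        File tmp = File.createTempFile("upload", ".bin");
        tmp.deleteOnExit();
        long written = spoolToDisk(new ByteArrayInputStream(payload), tmp);
        System.out.println(written == payload.length && tmp.length() == written);
    }
}
```

    The heap cost here is the buffer, not the document, which is why a disk-backed interface sidesteps the 32-bit limit.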

  • Cannot Upload Resources to P6 v7 - System out of Memory

    Hello,
    When I import resource assignments to P6 v7, I get the following error message:
    "Your system is out of memory. Please close other running applications and try this operation again."
    I also noticed that the PM.exe process in Task Manager was consuming a huge amount of system memory, over 1 GB.
    After this error message, when I go back to my P6 schedule, I find that several activity durations have been zeroed out. I've never experienced this with earlier versions of P6. Has anyone encountered any of this before?
    Here's some more info about the schedule - it has 8600 activities and 7000+ resource assignments.
    Thanks for your help!

    Ashish777 wrote:
    Jason,
    Thanks for your reply. I would have thought that 2 GB of memory would be more than sufficient. Would increasing the memory to say 3 GB or 4 GB help? Incidentally, I did break the spreadsheet down into 7 "chunks" of about 1000 entries each, and was able to successfully import the spreadsheet that way.
    Also, what do you mean by API?
    Thanks again!

    It certainly won't hurt, but I doubt that even 3 or 4 GB is going to allow you to import that many assignments in one shot with the method you are using; since P6 is a 32-bit application, it cannot address that much memory.
    The API is the programmer's interface. It would require knowledge of Java programming; however, if you are going to be doing this sort of thing a lot, it would pay for itself many times over in speed, ease of use, and avoiding potential issues.
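    The batching workaround Ashish describes (7 chunks of about 1000 entries each) generalizes to any import list. A minimal sketch, with the batch size as an illustrative parameter:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedImport {
    // Split a large list into fixed-size batches so each import stays
    // within what a 32-bit client process can comfortably address.
    static <T> List<List<T>> chunk(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> assignments = new ArrayList<>();
        for (int i = 0; i < 7000; i++) assignments.add(i);
        List<List<Integer>> batches = chunk(assignments, 1000);
        System.out.println(batches.size());          // 7000 entries in batches of 1000
        System.out.println(batches.get(6).size());   // size of the last batch
    }
}
```

    Importing one batch at a time keeps the client's working set bounded, at the cost of more round trips.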

  • How to overcome a "System out of memory exception"?

    Hi,
    As I am running my program, I sometimes get an out of memory exception.
    I don't know exactly why, because I am always doing the same thing, so if I get this exception once it should always happen... (of course, while I am testing, no other program is running on my computer!).
    Anyway, I have 3 questions:
    1) Do you know how to eliminate this error? (I don't mind if the execution time is longer.)
    2) I have Win XP; do you think that using software to build ".exe" files could change the problem? If so, have you heard of a simple one? (I downloaded Excelsior JET, but it seems rather complicated to parametrize.)
    3) (last but not least) Can someone explain to me WHY there is this type of exception? I would have thought that when memory is full, there is a swap, and the program doesn't stop!
    I know that's a lot of questions in one! (although I tried to be short)
    Thanks

    In answer to your third question, the error occurs when the JVM runs out of memory, not the OS. Since the OS controls swapping, the fact that the memory space assigned to the JVM is running low won't cause swapping to take place. The solution is either a) use less space by reducing what you have loaded at any given time, or b) increase the amount of memory available to the JVM. You can use the -Xms, -Xmx and -Xss switches to increase the amount of memory available.
    Mark
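    Mark's point - that the limit is the JVM's, not the operating system's - can be checked from inside a program: the Runtime API exposes the heap ceiling that -Xmx establishes. A minimal sketch (class name is mine):

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // heap ceiling set by -Xmx (or the JVM default)
        long total = rt.totalMemory();  // heap currently reserved from the OS
        long free = rt.freeMemory();    // unused portion of the reserved heap
        // An OutOfMemoryError fires when the heap cannot grow past 'max',
        // no matter how much physical memory or swap the OS still has free.
        System.out.println(max >= total && total >= free && free >= 0);
    }
}
```

    Running it as `java -Xmx256m HeapInfo` versus `java -Xmx64m HeapInfo` shows the ceiling move. Of the switches Mark lists, -Xms sets the initial heap, -Xmx the maximum heap, and -Xss the per-thread stack size.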

  • Out of Memory Error and large video files

    I created a simple page that links to a few large video files (2.5 GB total size). On Preview or FTP upload, Muse crashes and gives an Out of Memory error. How should we handle very large files like this?

    Upload the files to your host using an FTP client (e.g. FileZilla) and hyperlink to them from within your Muse site.
    Muse is currently not designed to upload files this large. The upload functionality takes the simple approach of reading an entire linked file into RAM and then uploading it. Given Muse is currently a 32-bit application, it's limited to using 2 GB of RAM (or less) at any given time, regardless of how much RAM you have physically installed. We should add a check to the "Link to File..." feature so it rejects files larger than a few hundred megs and puts up an explanatory alert. (We're also hard at work on the move to being a 64-bit app, but that's not a small change.)
    In general your site visitors will have a much better experience viewing such videos if you upload them to a service like YouTube or Vimeo rather than hosting them yourself. Video hosting services provide a huge amount of optimization of video delivery that's not present on standard hosting: automatic resizing of the video to the appropriate resolution for the visitor's device (rather than potentially downloading a huge amount of unneeded data), transcoding to the video format required by the visitor's browser (rather than either having to do so yourself or having some visitors unable to view your video), automatic distribution of a highly viewed video to multiple data centers for better performance in multiple geographies, and no doubt tons of other stuff I'm not thinking of.

  • Decompressing LZMA .zip files always gives "System out of Memory" on AIR.

    AIR is supposed to support LZMA compression in addition to Deflate, but it throws an error every time I try to use it. ZIPs using the Deflate method decompress just fine.
    aByteArray.uncompress("lzma");
    should decompress the LZMA-compressed data, but always just throws a #1000 "System is Out of Memory" error.
    I have tried skipping the standard header in the zip, which should be 2 bytes of version info, an extra-properties length (n), and a field of length n. Even when I do that, the same thing happens. I have tried deducting the LZMA header length from the compressed data size in the ZIP header as well, and that makes no difference. I am also above AIR 3.3, so that should not be the issue. I also tried setting a small 'library size' when compressing. LZMA is supposed to be "suited for embedded applications", so mobile devices should surely be able to handle it.
    The 13-byte header I believe is only for certain standalone LZMA formats, so that should not apply.
    What am I doing wrong? How do I make it work?

    No. I try to avoid cluttering my apps with extra libraries if at all possible. A library may have a sizeable footprint when I only need a few things, and it might contain encryption code if it supports the whole zip specification, which apparently gets your app banned from export to some countries.
    I already have my app doing what I need with the most common Deflate method, but it would simply be a shame not to have LZMA support when it is built right into AIR. I would like to know what I'm doing wrong when I call uncompress with "lzma".
