Help! MIDI files cause an "out of memory" session shutdown.

When I export .mid files from Reason 7 and import them into Logic Pro 9, it gives me the error message "out of memory" and shuts down the session. Help!!


Similar Messages

  • Urgent:  JavaScript runtime is out of memory; server shutting down instance

    I'm getting this error with one chat application.
    According to
    http://www.adobe.com/livedocs/flashmediaserver/3.0/docs/help.html?content=09_config_status codes_1.html
    ..I should "Check the JavaScript runtime size for this application in the  configuration file".
    By default it was set to 1024.  Any attempt to increase it causes FMS not to start again.
    This problem just came out of the blue.  This same chat app has been in operation with no trouble for about 10 months.  I restart the server machine on which it resides before each new chat session.
    Any suggestions?

    I tried a number of values that were multiples of 1024 (2048, etc).
    The only thing that worked seemed to be changing the number and rebooting the server machine.
    I'm not sure what caused the initial problem, though. I saw very weird activity in the FMS console: I had a few hundred users connected, as I typically do for one of my events, and the chat application was repeatedly unloaded from FMS. When it was reloaded, the FMS console would show crazy large numbers in the "Clients" column - numbers like 3954. Then it would drop to around 340, then shoot up again to wildly large numbers.
    Is that the result of a memory problem?

  • Each time that I go into edit with Photoshop Elements 8 it causes a physical dump and shuts down

    Each time that I go into my Photoshop Elements 8 it causes a physical dump and shuts down. I have only used this program a few times, although I have had it a few years. The first couple of times it worked fine, but now it shuts me down each time. I have already tried uninstalling and reinstalling. Help!

    Well, can you try opening Photoshop Elements 8 itself (from the Start menu) and see whether it opens the Editor from the Welcome Screen?
    If it still crashes, please try a couple of possible solutions, as described below:
    Solution 1:
    Close Elements.
    Launch the Photoshop Elements Welcome Screen and hold down ctrl + alt + shift as you click Editor.
    Continue to hold the keys until you see a message box asking if you want to delete Photoshop Elements settings file; click Yes. Elements will open with default preferences.
    Solution 2: If a network printer is attached, try launching without the network connection, uninstall the printer, or set a different printer as the default.
    Solution 3: Try launching with your anti-virus software turned off, or remove PSE from its conflict list.
    Solution 4:
    On the drive on which you have installed PSE (on my machine it is C:),
    go to C:\Program Files\Adobe\Photoshop Elements 10\Locales\<locale>\Plug-Ins\Import-Export and you will find the TWAIN plug-in. Remove that plug-in from that location and copy it somewhere else.
    Now launch PSE and check if it works.

  • HT5129 I have 30+ iPhoto Libraries, so every time I open one, iPhoto wants to copy the 3 events I had out there when MobileMe shut down. Those 3 events are saved in the appropriate iPhoto libraries. Wish I had deleted them on MobileMe! How do I turn this process off?

    I have 30+ iPhoto Libraries, so every time I open one, iPhoto wants to copy the 3 events I had out there when MobileMe shut down. Those 3 events are already saved in the appropriate iPhoto libraries. Wish I had deleted them on MobileMe! How do I turn this process off?

    As Larry mentioned, I use a different DAM (digital asset management) application, Media Pro 1. For other possible DAM apps, visit The DAM Forum, which is for professional photographers. There the various apps are discussed along with workflows. If nothing else, you'll get some good information on managing digital "negatives".
    Media Pro lets you manage the folder organization of your photos and write keywords, titles, and other metadata directly back to the files. No matter what app you go with, have a backup plan in place to keep current backups of your source photos on separate hard drives.

  • Good morning. There is a problem with the camera; please help me. When I turn on the camera, it shuts down automatically

    Good morning. There is a problem with the camera; please help me. When I turn on the camera, it shuts down automatically.
    I'm in Iraq.

    Adobe Reader is free. It always has been.
    You have obviously downloaded and installed Acrobat (30 day trial version) and the trial period has expired.
    You need to uninstall Acrobat (using the Control Panel and Add/Remove Programs or Programs and Features [Vista/7])
    Then run the Windows Installer Cleanup Utility http://support.microsoft.com/kb/290301
    to completely remove the Acrobat installer files.
    Then download Reader 9, install it, and go about reading PDFs for free.
    EDIT: Please, please please.... do NOT post a phone number in the forums.
    You're inviting trouble if you do.
    Not everyone who reads these pages is here to help fellow Adobe product users.
    Message was edited by: Mac+Win=Me

  • HT201401 What causes the iPhone 4S to shut down to a black screen when it has full power?

    What causes the iPhone 4S to shut down to a black screen when it has full power? I have cleared the open icons and it works for one day then shuts down again.

    Software hiccup, glitch, or maybe a faulty battery. Restore the iPhone with iTunes on your computer and see if this helps. If the problem persists and you think it is serious enough to fix, all iPhone 4S units have a full warranty. Make a Genius Bar reservation and take the iPhone to Apple for resolution.

  • Tried to reinstall OS X Mountain Lion 10.8.4, and every time I start the installation I get a "request timed out" message and have to shut down my MacBook Pro Retina. Now when I start up, it automatically starts Internet Recovery

    I tried to reinstall OS X Mountain Lion 10.8.4, and every time I start the installation I get a "request timed out" message and have to shut down my MacBook Pro Retina. Now when I start up, it automatically starts Internet Recovery for two hours. After the Internet Recovery, I get the same problem when reinstalling OS X.

    Gave up on the GUI. This script, derived from http://blog.netnerds.net/2012/07/os-x-how-to-setup-nat-on-lion-and-mountain-lion/ , seems to work for me:
    gwdev=en0      # This is my WiFi connection (has Internet connection)
    targetdev=en1  # This is the USB to Ethernet adapter (to give Internet connection)
    # HOST_ADDR is expected to be set beforehand to the address to give the bridge interface.
    /sbin/ifconfig $targetdev down                # take the target interface down
    /sbin/ifconfig bridge0 create                 # create a bridge interface
    /sbin/ifconfig bridge0 up
    /sbin/ifconfig bridge0 addm $gwdev            # add the WiFi device to the bridge
    /sbin/ifconfig bridge0 $HOST_ADDR             # give the bridge its address
    /sbin/route add default -interface bridge0 -ifscope bridge0 -cloning
    /usr/sbin/sysctl -w net.inet.ip.forwarding=1  # enable IP forwarding
    /sbin/ipfw add 100 divert natd ip from any to any via $gwdev   # divert traffic on the WiFi device to natd
    /usr/sbin/natd -interface $gwdev -use_sockets -same_ports -unregistered_only -dynamic -clamp_mss -enable_natportmap -natportmap_interface $targetdev

  • Please help: my iPod touch is completely shut down and it doesn't even show the battery icon. Please help!

    Please help: my iPod touch is completely shut down and it doesn't even show the battery icon. Please help!

    Please don't use a bolded font.  It's annoying.
    What happened immediately prior to the screen shutting down?

  • Cron running Perl script causes Out Of Memory at 63000kb

    Hi,
    I run a script to record an internet stream at set times using cron (I'm using icecream, a Perl script). This runs fine when executed by 'at' or manually from the command line, but when it is started by cron it dies with the error 'Out of memory!'. The machine isn't extremely powerful, but it is more than capable of doing this simple task!
    From my research I understand that it may have something to do with the /etc/security/limits.conf file, though apparently changing these values often doesn't have the desired effect.
    The output of 'ulimit -a' is as follows (I can't see any limiting factors here):
    [aratclif@server ~]$ ulimit -a
    core file size (blocks, -c) 0
    data seg size (kbytes, -d) unlimited
    scheduling priority (-e) 20
    file size (blocks, -f) unlimited
    pending signals (-i) 1931
    max locked memory (kbytes, -l) 64
    max memory size (kbytes, -m) unlimited
    open files (-n) 1024
    pipe size (512 bytes, -p) 8
    POSIX message queues (bytes, -q) 819200
    real-time priority (-r) 0
    stack size (kbytes, -s) 8192
    cpu time (seconds, -t) unlimited
    max user processes (-u) 1931
    virtual memory (kbytes, -v) unlimited
    file locks (-x) unlimited
    [aratclif@server ~]$
    and here is the crond.log output from when it died (there is nothing else of any relevance in the log - just a load of song names broadcast by the stream!)
    (aratclif) CMDOUT (
    LED ZEPPELIN - WHOLE LOTTA LOVE [63000 K]Out of memory!)
    The command is started by my user's crontab and so the ulimit displayed by cron is the same as when executed from the command line:
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1308]: (aratclif) CMD (ulimit -a)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (core file size (blocks, -c) 0)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1309]: (CRON) EXEC FAILED (/usr/sbin/sendmail): No such file or directory
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (data seg size (kbytes, -d) unlimited)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (scheduling priority (-e) 20)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (file size (blocks, -f) unlimited)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (pending signals (-i) 1931)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (max locked memory (kbytes, -l) 64)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (max memory size (kbytes, -m) unlimited)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (open files (-n) 1024)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (pipe size (512 bytes, -p) 8)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (POSIX message queues (bytes, -q) 819200)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (real-time priority (-r) 0)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (stack size (kbytes, -s) 8192)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (cpu time (seconds, -t) unlimited)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (max user processes (-u) 1931)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (virtual memory (kbytes, -v) unlimited)
    Apr 10 23:06:01 localhost /USR/SBIN/CROND[1307]: (aratclif) CMDOUT (file locks (-x) unlimited)
    Any help would be greatly appreciated!
    Thanks!
    Last edited by ajratcliffe (2012-04-10 23:13:01)

    No, the files are all the same. It seems like just a few are causing the issue, and what's weird is that some of them are cut versions of clips that render fine elsewhere in the timeline.

  • Large DataTable causes out of memory exception

    Hello Support 
    We have a DataTable that returns 112,970 records and has 51 columns.
    When we try to generate an .xls file or display the results in a grid view, a "System.OutOfMemoryException" is thrown.
    OS : Windows 2003 Enterprise 32 Bit
    HP DL380 G4
    CPU 2x3.6GHZ
    6 GB RAM
    Can you help us to find a resolution to this issue?
    Thank you
    Shrenik
    Maurice

    Thanks for the reply.
    1. The .XLSX format sometimes allows us >= 133,000 records in a spreadsheet, and sometimes it throws an exception of type 'System.OutOfMemoryException'.
    2. We are using the ASP.NET GridView.
    Exception:
     ExceptionObject : Message : Exception of type 'System.OutOfMemoryException' was thrown.
    Data : System.Collections.ListDictionaryInternal
    InnerException : Nothing
    TargetSite : System.String ToBase64String(Byte[], Int32, Int32, System.Base64FormattingOptions)
    StackTrace :    at System.Convert.ToBase64String(Byte[] inArray, Int32 offset, Int32 length, Base64FormattingOptions options)
       at System.Web.UI.ObjectStateFormatter.Serialize(Object stateGraph, Purpose purpose)
       at System.Web.UI.ObjectStateFormatter.System.Web.UI.IStateFormatter2.Serialize(Object state, Purpose purpose)
       at System.Web.UI.Util.SerializeWithAssert(IStateFormatter2 formatter, Object stateGraph, Purpose purpose)
       at System.Web.UI.HiddenFieldPageStatePersister.Save()
       at System.Web.UI.Page.SavePageStateToPersistenceMedium(Object state)
       at System.Web.UI.Page.SaveAllState()
       at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) HelpLink : Nothing Source : mscorlib HResult : -2147024882
    Please see the code below for more information.
    protected void Button1_Click(object sender, EventArgs e)
    {
        try
        {
            DataTable _dt = GetDataFromDB();
            ViewState["dtQueryResults"] = _dt;
            BindgvQueryResults();
            messageLabel.Text = "";
        }
        catch (Exception ex)
        {
            messageLabel.Text = "Query execute error: " + ex.Message;
        }
    }

    private void BindgvQueryResults()
    {
        if (ViewState["dtQueryResults"] != null)
        {
            DataView _dv = ((DataTable)ViewState["dtQueryResults"]).DefaultView;
            if (ViewState["gvQRSortExpression"] != null && ViewState["gvQRSortDirection"] != null)
            {
                _dv.Sort = ViewState["gvQRSortExpression"].ToString() + ViewState["gvQRSortDirection"].ToString();
            }
            gvQueryResults.DataSource = _dv;
            gvQueryResults.DataBind();
        }
    }

    private DataTable GetDataFromDB()
    {
        SqlCommand _Cmd = null;
        SqlConnection _Con = null;
        try
        {
            _Con = DBInteraction.InstantiateConnection();
            _Cmd = new SqlCommand();
            _Cmd.Connection = _Con;
            _Cmd.CommandTimeout = 300;
            if (_Con.State != ConnectionState.Open)
                _Con.Open();
            try
            {
                _Cmd.CommandText = CriteriaBuilder1.QueryTransformer.Sql;
            }
            catch (NullReferenceException)
            {
                throw new ApplicationException("Error message.");
            }
            if (string.IsNullOrEmpty(CriteriaBuilder1.QueryTransformer.Sql))
                throw new ApplicationException("This query does not have any text. Please add views in the query and try again.");
            _Cmd.CommandType = CommandType.Text;
            DataTable _dt = new DataTable();
            SqlDataAdapter adaPTer = new SqlDataAdapter(_Cmd);
            adaPTer.AcceptChangesDuringFill = false;
            adaPTer.Fill(_dt);
            if (_dt.Rows.Count == 0)
                throw new ApplicationException("Your request did not return any results.");
            return _dt;
        }
        catch (Exception ex)
        {
            throw ex;
        }
        finally
        {
            if (_Con.State == ConnectionState.Open)
                _Con.Close();
        }
    }
    Maurice

  • Result Set Causing out of memory issue

    Hi,
    I am having trouble fixing a memory issue caused by a result set. I am using JDK 1.5 and SQL Server 2000 as the backend. When I try to execute a statement, the result set returns a minimum of 400,000 records, and I have to go through each record one by one, apply some business logic, and update the rows; after updating around 1,000 rows my application goes out of memory. Here is the original code:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    I am planning to fix the code in this way:
    Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                          ResultSet.CONCUR_UPDATABLE);
    stmt.setFetchSize(50);
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        System.out.println("doing some logic here");
    }
    rs.close();
    stmt.close();
    But one of my colleagues told me that the setFetchSize() method does not work with the SQL Server 2000 driver.
    So please suggest how to fix this issue. I am sure there is a way to do this, but I am just not aware of it.
    Thanks for your help in advance.

    Here is the full-fledged code. TeamConnect and the TopLink API are being used. The code has already been developed, and it works for 2-3 hours and then fails. I just have to fix the memory issue. Please suggest something:
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // where vo is the value object obtained from the rs row by row
        if (updateInfo(vo, user)) {
            logger.info("updated : " + rs.getString("number_string"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();
    private boolean updateInfo(CostCenter vo, YNUser tcUser) {
              boolean updated;
              UnitOfWork unitOfWork;
              updated = false;
              unitOfWork = null;
              List projList_m = null;
              try {
                   logger.info("Before vo.getId() HERE i AM" + vo.getId());
                   unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
                   ExpressionBuilder expressionBuilder = new ExpressionBuilder();
                   Expression ex1 = expressionBuilder.get("application")
                             .get("projObjectDefinition").get("uniqueCode").equal(
                                       "TABLE-NAME");
                   Expression ex2 = expressionBuilder.get("primaryKey")
                             .equal(vo.getPrimaryKey());// primaryKey;
                   Expression finalExpression = ex1.and(ex2);
                   ReadAllQuery projectQuery = new ReadAllQuery(FQUtility
                             .classForEntityName("EntryTable"), finalExpression);
                   List projList = (List) unitOfWork.executeQuery(projectQuery);
                   logger.info("list value1" + projList.size());
               TNProject project_hist = (TNProject) projList.get(0); // primary key value
                   logger.info("vo.getId1()" + vo.getId());
                   BNDetail detail = project_hist.getDetailForKey("TABLE-NAME");
                   project_hist.setNumberString(project_hist.getNumberString());
                   project_hist.setName(project_hist.getName());
                   String strNumberString = project_hist.getNumberString();
                   TNHistory history = FNHistFactory.createHistory(project_hist,
                             "Proj Update");
                   history.addDetail("HIST_TABLE-NAME");
                   history.setDefaultCategory("HIST_TABLE-NAME");
                   BNDetail histDetail = history.getDetailForKey("HIST_TABLE-NAME");
                   String strName = project_hist.getName();
                   unitOfWork.registerNewObject(histDetail);
                   setDetailCCGSHistFields(strNumberString, strName, detail,
                             histDetail);
                   logger.info("No Issue");
                   TNProject project = (TNProject) projList.get(0);
                   project.setName(vo.getName());
                   logger.info("vo.getName()" + vo.getName());
                   project.setNumberString(vo.getId());
                   BNDetail detailObj = project.getDetailForKey("TABLE-NAME"); // required
                   setDetailFields(vo, detailObj);//this method gets the value from vo and sets in the detail_up object
                   FNClientSessionManager.commit(unitOfWork);
                   updated = true;
                   unitOfWork.release();
              } catch (Exception e) {
                   logger.warn("update: caused exception, "
                             + e.getMessage());
               unitOfWork.release();
          }
          return updated;
     }
     Now I have tried to change the code a little bit, and I added the following lines:
                        updated = true;
                     FNClientSessionManager.release(unitOfWork);
                     project_hist=null;
                     detail=null;
                     history=null;
                     project=null;
                     detailObj=null;
                        unitOfWork.release();
                        unitOfWork=null;
                     expressionBuilder=null;
                     ex1=null;
                     ex2=null;
                     finalExpression=null;
    and I also added code to request the garbage collector after every 5th update:
    if (updateInfo(vo, user)) {
        logger.info("project update : " + rs.getString("number_string"));
        projCount++;
        // call garbage collector every 5th record update
        if (projCount % 5 == 0) {
            System.gc();
            logger.debug("Called Garbage Collector on " + projCount + "th update");
        }
    }
    But now the code won't even update a single record. So please look into the code and suggest something so that I can stop banging my head against the wall.
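    For reference, one pattern that avoids holding the whole 400,000-row result set in memory is key-based paging: fetch a fixed-size chunk ordered by the primary key, process it, close the ResultSet, and ask for the next chunk starting after the last key seen. The sketch below is only an illustration: it assumes a numeric primary key column named "id", keeps the placeholder table and column names from this thread, and the PagedReader class and method names are made up here. (With the SQL Server 2000 driver, setting the SelectMethod=cursor connection property is also commonly suggested so rows are streamed through a server cursor instead of being buffered client-side.)

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PagedReader {

        // Process the matching rows in chunks of 1000, keyed on the assumed "id" column.
        public void processAll(Connection con) throws Exception {
            PreparedStatement ps = con.prepareStatement(
                "SELECT TOP 1000 * FROM tablename WHERE field = 'done' AND id > ? ORDER BY id");
            try {
                long lastId = 0;
                boolean more = true;
                while (more) {
                    ps.setLong(1, lastId);
                    ResultSet rs = ps.executeQuery();
                    more = false;
                    try {
                        while (rs.next()) {
                            more = true;
                            lastId = rs.getLong("id");
                            // apply the business logic / update for this row here
                        }
                    } finally {
                        rs.close(); // release the driver's buffers for this chunk
                    }
                }
            } finally {
                ps.close();
            }
        }
    }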

  • Loading JPEG/Gif images causes Out Of Memory Error

    Hello Java gurus, please shed some light on this.....
    I am not a Java newbie, but I am a Java newbie in the area of dealing with graphics. I am using JDK 1.4.2_01 on Win XP with a 2.4 GHz CPU and 512 MB of RAM. When I try to read a JPEG image (or a few) into a JLabel using ImageIcon, I get an out of memory error every time. This happens whether I am reading the image out of a jar file or out of a local directory. The JPEGs are 8.5" x 11", and range between 150 KB and 900 KB. I am trying to load approximately 10 images in a JTabbedPane using new JLabel(ImageIcon) on individual JPanels (one JPanel for each JPEG). I get the java.lang.OutOfMemoryError whether I try to load 1 image or 10 images. Anyone got a clue as to what I am doing wrong? I actually do get one image loaded before the error gets thrown, so I know that it is trying to load the right image. What is going on here? Is it an ImageIcon problem, or have others run into this too? It has brought this project to a grinding halt, and I had thought this would be the easier part! Please help me out, gurus! Thanks!

    What options/values are you suggesting that I set? Or do you mean something else? Thank you for any ideas that can help me solve this problem. I never had any issues dealing with using ImageIcon for toolbar button graphics (file sizes of 1-3kb), but now with dealing with full page graphics I have run into a brick wall with this out of memory error.
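    For context on why this happens: a JPEG decompresses to roughly width x height x 4 bytes in memory regardless of its compressed file size, so an 8.5 x 11 inch page scanned at 300 dpi (about 2550 x 3300 pixels) needs on the order of 30 MB of heap, and ten of those easily exceed the small default heap of a 1.4-era JVM. The usual remedies are to raise the heap (for example, java -Xmx256m YourApp) and/or to scale each page down to its display size before wrapping it in an ImageIcon. The sketch below shows the scaling approach; the class and method names are illustrative, not from the original post.

    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;
    import javax.swing.ImageIcon;
    import javax.swing.JLabel;

    public final class ScaledPageLabel {

        // Decode the JPEG, copy it into a small BufferedImage at display size,
        // and let the full-resolution image be garbage-collected.
        public static JLabel load(File jpegFile, int targetWidth) throws IOException {
            BufferedImage full = ImageIO.read(jpegFile);
            int targetHeight = full.getHeight() * targetWidth / full.getWidth();
            BufferedImage scaled =
                new BufferedImage(targetWidth, targetHeight, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = scaled.createGraphics();
            g.drawImage(full, 0, 0, targetWidth, targetHeight, null);
            g.dispose();
            full.flush(); // release the full-resolution pixel data
            return new JLabel(new ImageIcon(scaled));
        }
    }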

  • Top Link Causes out of memory issue when millions of records need to update

    Hello everyone,
    I am using TopLink 9.0.4 in a batch process. The batch process reads from a temp table (the temp table has millions of records, one month's worth of data, which need to be updated). The database being used is SQL Server 2005. Below is a snippet of the code. It works for 6-7 hours and then crashes due to running out of memory:
    ExpressionBuilder expressionBuilder = new ExpressionBuilder();
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("Select * from database tablename where field= 'done'");
    while (rs != null && rs.next()) {
        // where vo is the value object obtained from the rs row by row
        if (updateInfo(vo, user, expressionBuilder)) {
            logger.info("updated : " + rs.getString("col_name"));
            projCount++;
        }
    }
    rs.close();
    stmt.close();
    private boolean updateInfo(ProjectVO vo, YNUser tcUser, ExpressionBuilder expressionBuilder) {
        boolean updated = false;
        try {
            updated = true;
        } catch (Exception e) {
            logger.warn("update: caused exception, "
                    + e.getMessage());
        }
        return updated;
    }
    Edited by: user8981696 on Jan 14, 2010 1:00 PM

    Thanks for your reply.
    Please find below the answers to your suggestions/concerns:
    You seem to be using raw JDBC to select all of the records in a single result set, not sure if this may be causing a memory issue. You could try paging through the results instead.
    Ans: I have modified the code to fetch 1,000 records at a time, and I am getting the ResultSet by using a PreparedStatement instead of a regular Statement object.
    What type of caching are you using?
    Ans: No caching is being used. If you have some thoughts on caching, please suggest them or post some sample code. Again, there is no app server being used; it's just a regular Java process (a batch process), so I don't know how to do caching in a simple Java process.
    You may also wish to try the latest 9.0.4 patch release, or try the 10.1.3 version, or the latest EclipseLink 2.0 release.
    Ans: Where can I find the latest 9.0.4 patch release?
    Any help/suggestion is really appreciated!
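    One approach worth sketching here, using the helper names already shown in this thread (so treat it as an outline under assumptions, not working code): process the rows in fixed-size batches and commit and release the TopLink unit of work after each batch, so the registered objects for millions of rows are never all held at once. Passing the unit of work into updateInfo(), the buildVo() helper, and the batch size of 500 are assumptions made for the sake of the example.

    // Imports for UnitOfWork and the thread's own classes (FNClientSessionManager,
    // YNUser, ProjectVO) are whatever the existing batch process already uses.
    public int processInBatches(ResultSet rs, YNUser tcUser) throws Exception {
        final int batchSize = 500;   // illustrative value
        int processed = 0;
        int inBatch = 0;
        UnitOfWork unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
        try {
            while (rs != null && rs.next()) {
                ProjectVO vo = buildVo(rs);                // hypothetical helper: value object from the row
                if (updateInfo(vo, tcUser, unitOfWork)) {  // assumed signature: unit of work passed in
                    processed++;
                    inBatch++;
                }
                if (inBatch >= batchSize) {
                    FNClientSessionManager.commit(unitOfWork); // flush this batch
                    unitOfWork.release();                      // drop its cached objects
                    unitOfWork = FNClientSessionManager.acquireUnitOfWork(tcUser);
                    inBatch = 0;
                }
            }
            FNClientSessionManager.commit(unitOfWork);         // commit the final partial batch
        } finally {
            unitOfWork.release();
        }
        return processed;
    }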

  • Data merge with football tickets. Printing causes out of memory errors. Can only print 10 at a time!

    Here's the deal. I'm printing football tickets for the local high school: 10 tickets per 8.5 x 11 sheet, all black and white. I couldn't data merge into one whole document of tickets because it kept failing, so I ended up with 3 documents. The first two have 50 pages of tickets and the last has 37 pages.
    When I go to print, I can only print 10 pages at a time or InDesign straight up crashes, or I get an "Out of Memory" error, which is bull because I've got plenty of RAM available (unless it's complaining about lack of HD space for some reason).
    Incredibly frustrating.
    Each ticket has 3 images on it. Two of the images are the same. Those two images are about 103kb .ai files. Very very simple. The other image is a TIFF that is about 76kb. So there are 1000 .ai files / document and 500 TIFFs.
    Here's a video I took of it ripping: http://www.youtube.com/watch?v=UCjvdmXuaYs&feature=c4-overview&list=UUf_1UFp80YLZJfCyIxXEgkg
    Also, when printing, I have Send Data set to Optimized Subsampling instead of "All". That doesn't seem to help.
    Any thoughts?
    IDCS 5.5
    OSX 10.8.4

    I just ran a sample on my laptop.
    AI file, duplicated, is 284k. Image is roughly 200k. The CSV has 2054 records, and the merged document with 10 per page is 250 megabytes. PDF is just under 50 megs.
    This laptop has only 4 gigs memory but has a lot of free disk space. It took a good amount of time to do the merge, a bit longer to create the PDF-X1a PDF.
    Mike

  • Out of memory session store + in memory replication

    I think I know the answer, but I am going to ask anyway.
              Apparently, if I use file-based session persistence, WebLogic will manage
              the in-memory session cache and store least recently used sessions to disk
              in order to preserve memory.
              Of course, file-based persistence is slower than in-memory replication, and
              it doesn't work so well if we don't have a shared filesystem for our cluster
              (we don't), but the ability to manage memory in this way is very compelling.
              Is it possible to have a hybrid, where session state is replicated between
              nodes using in-memory replication techniques, but each node only keeps a
              limited number of sessions in memory and swaps the rest out to disk?
              I think the answer is no, but it seems like a good feature.
              

              "Erik Speckman" <[email protected]> wrote in message
              news:[email protected]..
              I don't think you can explicitly tune it, but of course any stuff in memory
              that doesn't get used for a while is going to end up on disk anyway if your
              physical memory is all allocated. Your swap system will probably do as good
              a job of this as being able to specify how many sessions are kept in memory
              would do.
              Loz
              
