Best Practice - Message Store size

Is there a recommended max size for the message store?
iPlanet 5.2

The official recommendation is mostly "that depends" . . .
If you intend to do backups, then store partition size depends on backup parallelism and how long you will tolerate backups running.
If you're talking about maximum size of the store itself, there's no hard limit, only performance limits.
We have seen large systems with up to 500,000 users on a single box, with many hundreds of gigabytes of store split into 40 or 50 partitions, work very successfully on large boxes.
Better to give us some idea of your needs, and perhaps we can offer advice.

Similar Messages

  • Best practice?-store images outside the WAR file?

    I have an EAR project with several thousand images that are constantly changing. I do not want to store the images in the WAR project since it will take an extremely long time to redeploy with every image change. What is the best practice for storing images? Is it proper to put them in the WAR and re-deploy? Or is there a better solution?

    Perryier wrote:
    Can you expand on this? Where do they get deployed and in what format? How do I point to them on a JSP? I am using Sun Application Server 9.0, and I don't really think this has a "stand alone" web server. How will this impact it?
    You could install any web server you want (Apache?). The request comes in, and if it matches something like .jpg or .gif or whatever, you serve up the file. If the request is for a JSP or the like, you forward it to the app server (Sun App Server in your case); i.e., your web server acts as a content-aware proxy.
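    If a separate web server isn't an option, another common approach is a small servlet that streams files from a directory outside the WAR, so images can change without a redeploy. The sketch below is only illustrative: the /var/images directory, the class name, and the /images/* mapping are assumptions, not anything from the original thread.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Streams image files stored outside the WAR; map it to a url-pattern such as /images/*
    public class ExternalImageServlet extends HttpServlet {
        private static final String IMAGE_DIR = "/var/images"; // hypothetical directory

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String name = req.getPathInfo();              // e.g. "/logo.png"
            if (name == null || name.contains("..")) {    // crude path-traversal guard
                resp.sendError(HttpServletResponse.SC_BAD_REQUEST);
                return;
            }
            File file = new File(IMAGE_DIR, name);
            if (!file.isFile()) {
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            resp.setContentType(getServletContext().getMimeType(file.getName()));
            resp.setContentLength((int) file.length());
            InputStream in = new FileInputStream(file);
            try {
                OutputStream out = resp.getOutputStream();
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                }
            } finally {
                in.close();
            }
        }
    }
    The images live on the filesystem, so updating one is just a file copy; only code changes need a redeploy.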

  • Best practice to store date in database

    which best practices do you know to do this?

    Snoopybad wrote:
    which best practices do you know to do this?
    1. Understand the business requirements that drive the need to store such a value.
    2. Understand the difference between 'time', 'date' and 'timestamp' (date and time), and also understand that a time interval (a measurement of passing time) is not the same as any of those.
    3. Understand what timezone means exactly.
    4. Look at the database itself to understand exactly how it stores such values. This includes ensuring that you understand exactly how the timezone is handled.
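    As a hedged illustration of points 2-4 in JDBC terms: one common pattern is to store the instant in a TIMESTAMP column and pass an explicit UTC Calendar so the driver does not silently apply the JVM's default timezone. The events table, its columns, and the class name below are invented for the example.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Timestamp;
    import java.util.Calendar;
    import java.util.TimeZone;

    public class EventDao {
        /** Inserts a row, storing the instant in a TIMESTAMP column interpreted as UTC.
            The table events(name VARCHAR, occurred_at TIMESTAMP) is hypothetical. */
        public void insertEvent(Connection con, String name, long epochMillis)
                throws SQLException {
            Calendar utc = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO events (name, occurred_at) VALUES (?, ?)");
            try {
                ps.setString(1, name);
                // The Calendar tells the JDBC driver which timezone to use when
                // converting the Timestamp into the database's TIMESTAMP value.
                ps.setTimestamp(2, new Timestamp(epochMillis), utc);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }
    Reading the value back with the same UTC Calendar keeps the round trip symmetric, which is where most timezone surprises show up.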

  • Message Store size

    Hi
    1) How can I find the size of the message store? If possible, give me the commands.
    2) How do I find out the number of mailboxes in the message store?
    Thanks in Advance

    Hi
    1) How can I find the size of the message store? If possible, give me the commands.
    Assuming you're on some version of Unix:
    cd ../store
    du -s
    2) How do I find out the number of mailboxes in the message store?
    There are several commands that can give you useful data. I would look at the documentation for the mboxutil command; its output can be parsed by grep or other tools to give you exactly what you're looking for.
    You could also use find, like this:
    cd ../store
    find . -name TRASH -print | wc -l
    Since there is actually no directory named "INBOX", but any valid mailbox will also have a trash folder, look for that; wc -l counts the number of lines that the find command gives you.
    If you want the total number of folders, look for "store.idx"; each folder has one.
    mboxutil will be faster and have less impact on your system, though.

  • What are the best practices for average size of an Adobe PDF file on a Web Server to be downloaded?

    We know at a certain point, an extremely large file can be slow in downloading/accessing from a web site, and lessens the end-user's experience.  How large is too large? What is a good average or suggested size for Adobe PDF files in general?
    Thank you for any guidance.

    I'm sure it varies depending on one's opinion, but we have a policy that they should be under 1-1.5 MB. This isn't always possible, of course. Our PDFs are mostly informational and contain little or no graphics.
    I would say that it greatly depends on the PDF. If you get up over that size, you should do your best to optimize it for your users. Sometimes you can only do so much.

  • Best practice to reduce size of BIA trace files

    Hi,
    I saw an alert on the BIA monitor that says 'check size of trace files'. Most of my trace files are above 20MB. I clicked on details and it says "Check the size of your trace files. Remove or move the trace files with the memory usage that is too high or trace files that are no longer needed."
    I would like to reduce these trace files but am not sure what is the safest way to do it. Any suggestions would be appreciated!
    Thanks.
    Mimosa

    Mimosa,
    Let's be clear here first: the tracing set via SM50 is for tracing on the ABAP side of BI, not the BIA.
    Yes, it is safe to move/delete TrexAlertServer.trc, TrexIndexServer.trc, etc. at the OS level. You can also right-click the individual trace in the "Trace" tab of the TREX Admin Tool (python); I believe there are options to delete them there, but it is certainly okay to do this at the OS level. They are simply recreated when new traces are generated.
    I would recommend that you simply zip the files and move the .zip files to another folder in case SAP support needs them to analyze an issue. As long as they aren't huge, and if hard disk space permits, this shouldn't be an issue. After that you will need to delete the trace files. Note that if a trace file has an open handle registered to it, you won't be able to delete or move it, so it might be a good idea to do this task when system activity is low or non-existent.
    2 things also to check:
    1. Make sure the python trace is not on.
    2. In the python TREXAdmin Tool, check the Alerts tab and click "Alert Server Configuration". Make sure the trace level is set to "error".
    Hope that helps. As always check the TOM for any concerns:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/46e11c10-0e01-0010-1182-b02db2e8bafb

  • ASM on SAN datafile size best practice for performance?

    Is their a 'Best Practice' for datafile size for performance?
    In our current production, we have 25GB datafiles for all of our tablespaces in ASM on 10GR1, but was wondering what the difference would be if I used say 50GB datafiles? Is 25GB a kind of mid point so the data can be striped across multiple datafiles for better performance?

    We will be using Redhat Linux AS 4 update u on 64-bit AMD Opterons. The complete database will be on ASM, though not the binaries. All of the datafiles we currently have in our production system are 25GB files. We will be using RMAN-->Veritas tape backup and RMAN-->disk backup. I just didn't know if anybody out there was using smallfile tablespaces with 50GB datafiles or not. I can see that one of our tablespaces will probably be close to 4TB.

  • Best Practices needed -- question regarding global support success stories

    My customer has a series of Go Lives scheduled throughout the year and is now concerned about an October EAI (Europe, Asia, International) go live.  They wish to discuss the benefits of separating a European go Live from an Asia/International go live in terms of support capabilities and best practices.  The European business is definitely larger and more important than the Asia/International business and the split would allow more targeted focus on Europe.  My customer does not have a large number of resources to spare and is starting to think that supporting the combined go live may be too much (i.e., too much risk to the businesses) to handle.
    The question for SAP is regarding success stories and best practices.
    From a global perspective, do we recommend this split?  Do most of our global customers split a go live in Europe from a go live in Asia/International (which is Australia, etc.)?  Can I reference any of these customers?  If the EAI go live is not split, what is absolutely necessary for success, etc., etc.?  For example, if a core team member plus local support is required in each location, then this may not be possible with the resources they have ...
    I would appreciate any insights, best practices, success stories, or "war" stories you might be aware of.
    Thank you in advance and best regards,
    Barbara

    Hi, this is purely based on customer requirements.
    I have a friend in an organization which went live in 38 centers at the same time.
    With the latest technologies in networking, distance does not make any difference.
    The organization where I currently work has global business locations. In my current organization the go live was done in phases. They went live first in the region where the business was largest and most important as far as revenue was concerned. Then, after stabilizing this region, a group of consultants went to the rest of the regions for the go live there.
    Both the companies referred to above are successfully on SAP and are leading partners with SAP. Unfortunately I am not authorized to give you the names of the organizations as references, as you requested.
    But in your case, if you have a shortage of manpower, you can do it in phases by first going live in the European market and then going live in the other regions in phases.
    Warm Regards

  • Best practice for storing user's generated file?

    Hi all,
    I have a web application where the user draws an image in an applet and is able to send the image via MMS.
    I wonder what the best practice is for storing the user's image before sending the MMS.

    java.util.prefs
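    A minimal sketch of what the terse java.util.prefs suggestion could look like, assuming the drawn image is small: Preferences caps each stored value at a few kilobytes, so this only suits thumbnails, and larger images are usually written to a temp file instead. The class and node names are made up for illustration.
    import java.util.prefs.BackingStoreException;
    import java.util.prefs.Preferences;

    public class DraftImageStore {
        // Hypothetical preferences node; any unique path works.
        private static final Preferences NODE = Preferences.userRoot().node("mmsdraft");

        /** Stores a small PNG byte array under the given key.
            Preferences limits each value to a few KB, so this only suits tiny images. */
        public static void saveDraft(String key, byte[] pngBytes) {
            NODE.putByteArray(key, pngBytes);
        }

        public static byte[] loadDraft(String key) {
            return NODE.getByteArray(key, null);
        }

        public static void clearDraft(String key) throws BackingStoreException {
            NODE.remove(key);
            NODE.flush();
        }
    }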

  • Best practice for searching on surname/lastname/name in Dutch

    I'm looking for a best practice to store names of persons, but also names of companies, in my database.
    I always store them as is (seems logical since you need to be able to display the original input-name) but I also want to store them transformed in some sort of way so I can easily search on them with LIKE! (Soundex, Metaphone, Q-Gram, ...)
    I know SOUNDEX and DIFFERENCE are included in SQLServer, but they don't do the trick.
    If somebody searches for the phrase "BAKKER", you should find names like "Backer", "Bakker", ... but also "De Backer", "Debecker", ... and this is where SOUNDEX fails ...
    Does someone know some websites to visit, or someone already wrote a good function to transform a string that I can use to store the names but also to transform my search data?
    (Example:  (Pseudo lang :-))
    function MakeSearchable (sString)
      sString = sString.Replace(" ", ""); // Remove spaces
      sString = sString.Replace("CK", "K");
      sString = sString.Replace("KK", "K");
      sString = sString.Replace("C", "S");
      sString = sString.Replace("SS", "S");
      return sString;
    end function
    Greetz,
    Tim

    Thanks for the response, but unfortunately the provided links are not much help:
    - The first link is about an article I don't have access to (I'm not a registered user).
    - The second link is about Integration Services. This is nice for integration stuff, but I need this functionality within a frontend.
    - The third link is for use in Excel.
    Maybe I'm looking for the wrong thing when wanting to create an extra column with "cleaned" up data. Maybe there's another solution from within my frontend or business layer, but I simply want a textbox on a form where users can type a search-value like
    "BAKKER". The result of the search should return names like "DEBACKER", "DE BEKKER", "BACKER", "BAKRE", ...
    I used to work in a hospital where they wrote their own SQL-function (on an Interbase database) to do this: They had a column with the original name, and a column with a converted name:
    => DEBACKER => Converted = DEBAKKER
    => DE BEKKER => Converted = DEBEKKER
    => BACKER => Converted = BAKKER
    => BAKRE => Converted = BAKKER
    When you searched for "BAKKER", you did a LIKE operation on the converted column ...
    What I am looking for is a good function to convert my data as above.
    Greetz,
    Tim
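    Purely as a rough sketch of the kind of conversion function described above (not the hospital's actual routine; the replacement rules are illustrative and would need tuning for Dutch names), the "converted" column could be populated with something like this and the LIKE run against it:
    public final class NameNormalizer {
        private NameNormalizer() {}

        /** Very rough normalization for a searchable "converted" column. */
        public static String makeSearchable(String s) {
            if (s == null) {
                return null;
            }
            String r = s.toUpperCase().replaceAll("[^A-Z]", ""); // drop spaces and punctuation
            r = r.replace("CK", "K");
            r = r.replace("KK", "K");
            r = r.replace("C", "S");
            r = r.replace("SS", "S");
            return r;
        }

        public static void main(String[] args) {
            System.out.println(makeSearchable("De Backer")); // DEBAKER
            System.out.println(makeSearchable("BAKKER"));    // BAKER
        }
    }
    The usual approach is to fill the converted column on insert/update and index it, so a search for "BAKKER" is first normalized the same way and then compared with LIKE against the indexed column.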

  • Query Best Practice for Reports

    I am new to Apex and I am wondering what the best practice is for storing your SQL queries for reports.  I am a believer in storing all SQL behind package functions or procedures.  It looks like the only options for report pages are to use a direct SQL query, or a function that returns a query as a string.  Yes, the function method counts as putting the code in Oracle, but not really: the query is still compiled and parsed on the Apex side.  It would be nice if Apex could handle a cursor, but I have read that it doesn't directly; you have to have a function that returns a cursor and then create a pipelined function that calls the cursor function.  That is kind of silly.  Is there some other way to do this?
    Apex 4.2
    Oracle 11.2.0.2
    Thanks for any input.
    Jeff

    Hi Jeff,
    I'm not necessarily a believer in packaging queries. I'm a little more pragmatic: I believe it may make sense in environments where the client just expects a result set that it then manipulates for the purposes of presentation, pagination, etc. Apex has a different architecture, in that the client is purely an HTML presentation layer (the browser) and the presentation, pagination, etc. are formulated in the database along with the data using the Oracle Web Toolkit, a set of internal packages that produce HTML. Note that handling and manipulating ref cursors inside PL/SQL is not a joy; they were mainly designed to be passed out to external clients (often to shield programmers who don't, or won't even try to, understand relational concepts).
    This means that when you create a report based on a query, the Apex engine will manipulate that base query, depending on the display requirements and pagination requirements of your report, before it submits that query to the database for execution. To get an idea of how this manipulation occurs, you can run your report in debug mode and check the actual query that is submitted to the database. If the query is presented as an already executed ref cursor, then the Apex engine can't execute in the way that it does. As you have already found out, the only way of using packaged queries returning ref cursors is by the use of a pipelined function, so that the Apex engine can treat the result as a normal query.
    This is the architecture of Apex, and I suspect that re-engineering the Apex engine to handle ref cursors natively, as opposed to using a pipelining trick, would be a considerable change. I hope this at least helps to explain why ref cursors and Apex don't mix. I personally don't see the purpose of having an abstraction layer of packaged queries below an abstraction layer of an API such as Apex. SQL is a perfectly good API.
    Regards
    Andre

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one directory structure, as it is now, the Unix team is afraid we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+), or one with Planning apps where users can run large, long-running rules, I would recommend you separate the applications onto separate volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times, although I've had mixed results in getting the server/disk SMEs to assist in these kinds of efforts.
    A more advanced thing to worry about is journaling filesystems that share a common cache for all disks within a VG.
    Regards,
    -John

  • Edge Animate Best Practices

    Dear Community,
    I have been racking my brain to try to understand why my animation appears constantly jerky. I am new to Edge, and not a coder, and I'm wondering why this happens:
    http://www.christinaciardullo.com/ESO_HolidayCard_2014_4.html
    What am I missing? I wanted to ask some best practices regarding
    File Sizes: My PNGs generally range from 30-90 KB, with larger background images up to 500 KB. Is this too much? What are appropriate file sizes for this kind of animation?
    Animation Lengths: 2-3 second transitions, and constant back-and-forth motions. Too much at once?
    Overlapping Layers: There is a lot of opacity changing in this - Is it not good to have things animate one on top of the other?
    Animating Re-sizing: Maybe this is too much to ask? To zoom out?
    Thank You!
    C

    Based on my experience, zoom-ins and zoom-outs of large files are generally the number one culprit in performance slowdowns. In your animation, between the zoom-out on the ball and background and the animations on the balls (multiple fade transitions and rotations), there's a lot to process at once. My first couple of suggestions are to decrease the file size of the background image (the tree) and to only begin the ball animations once the zoom-out has completed. I'm guessing the background image is the 500 KB image you referred to. Try taking it down to whatever quality is best before it's noticeably compromised. There's no hard and fast rule for appropriate sizes; much depends on what you want to do with it and the platform it's going to run on. For instance, I noticed that the animation runs smoothly in Chrome while it has the hardest time in Firefox.

  • Storing data - best practice?

    Hi,
    I wonder if there is any best practice for storing data in my EP 6.0 portal? For instance, on a standard website, if you have a list of events, each event can be stored in a related SQL database and can then be fetched and updated whenever necessary.
    What is the best way to do this when developing portal content? The reason I am asking is that I want to develop a Web Dynpro application where I can select a date and then display all registered events on that day in my portal.
    Best regards
    Øyvind Isaksen

    Okay, and then use an RFC call from the Web Dynpro application to fetch data from the SAP database?
    This answered my question.
    Best regards
    Øyvind Isaksen

  • MRP Exception Messages - Best practices

    Dear colleagues,
    I am currently dealing with a large SAP installation where the system is already live in 50+ countries and the number of manufacturing plants is in the hundreds. MRP generates exception messages, but the volume is significantly high. Planners have lost faith in these messages, so the client is not getting the full benefit of reacting to MRP exceptions.
    Have any of you dealt with similar business situations? If yes, what best practices and what system design have you put in place to simplify the process of resolving exception messages.
    Thanks,
    PKV

    Hello Pavan,
    MD06 should be the planner's best friend, as it helps him stay on top of all future expected shortfalls and talk to production schedulers or vendors to bring material in on time for production or delivery to the customer.
    The best way to handle exception messages in a plant is to divide them by MRP controller and by procurement type, external or internal.
    Planners should analyse MD06 after every MRP run if possible, or at least go through their MD06 once a week.
    However, in my experience, at most places the material master planning data is not as clean as the planners would like it to be. But MD06 offers them an opportunity to update the material master as they see fit, i.e. planners can change the planned delivery time, MRP lot size, MRP type, etc.
    You may have to find out why the planners have lost faith in MRP messages. The system (MRP) can only give you good results if the master data is good.
    Hope this helps.
    Thanks,
    Ram
