Reducing SQL memory

Forum,
What impact will reducing the memory SQL uses have on SAP B1?
Whilst I appreciate this isn't something that's advised to be altered, and ideally the more memory the better, I have an instance where a third-party application runs alongside SQL Server, and the amount of memory SQL Server is using is causing issues and slowing everything down.
Regards,

You can tune the memory parameter according to your situation. Monitor "Page life expectancy" in the DB performance history and make sure it doesn't drop very low. Microsoft considers a value below 300 a problem, but it is preferable to keep this number high at all times.
Thanks
Mushtaq
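    The monitoring and the memory cap described above can be sketched in T-SQL like this (the 4096 MB value is purely a placeholder; choose a limit that leaves room for the OS and the third-party application):

    ```sql
    -- Check Page Life Expectancy (seconds a data page stays in the buffer pool).
    -- Values that stay low under load suggest buffer-pool memory pressure.
    SELECT [object_name], counter_name, cntr_value AS page_life_expectancy_sec
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy'
      AND [object_name] LIKE '%Buffer Manager%';

    -- Cap SQL Server's memory (value in MB; 4096 is only an example).
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 4096;
    RECONFIGURE;
    ```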

Similar Messages

  • SQL memory manager latch at top of the list

    select * from V$LATCH order by wait_time desc shows wait time = 451524661079 on SQL memory manager latch. What is making this so high?

    user498912 wrote:
Thank you .. the pointer to the bug is helpful. We are having performance issues and complaints of slowness only during a 2- or 3-hour window each weekday, from about 7 AM to 10 AM EDT. This is definitely peak load. AWR shows mostly disk I/O issues. The plan is to add about 20 GB more memory to the server. The SGA is currently at around 61 GB with Automatic Shared Memory Management enabled. The PGA advisory shows nothing, with a cache hit percentage of 95.93%.
I would look at v$sga_resize_ops (or v$memory_resize_ops) to see if there is pressure to move memory around in that time period. Your symptoms could simply be showing load on the buffer cache causing the library cache to shrink, leading to aggressive demands for library cache locks etc. If so, then fixing a minimum shared_pool_size (or generally going to manual memory management) may be the best bet. Also worth checking if there is some inefficient SQL running at that time that results in lots of random I/O - eliminating the cause of disk reads may be the best solution to reducing demand for memory.
    Regards
    Jonathan Lewis
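    The v$sga_resize_ops check suggested above can be sketched as follows (adjust the time filter to cover the 7-10 AM problem window; frequent shrink/grow cycles on the shared pool or buffer cache indicate pressure to move memory around):

    ```sql
    -- Recent SGA component resize operations
    SELECT component, oper_type, initial_size, final_size, start_time
    FROM   v$sga_resize_ops
    WHERE  start_time > SYSDATE - 1
    ORDER  BY start_time;
    ```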

  • Reducing the memory utilisation of my database

    Hi,
    I want to reduce the memory utilisation of my database, and I want to know which SQL statements Oracle has allocated OS memory to. I have the AWR reports with me.
    My questions:
    1. Which section of the AWR report gives exactly this information?
    ("SQL ordered by Sharable Memory" doesn't help.)
    2. Alternatively, can you tell me some views or tables I can query in my database to get the needed information?
    3. How can I reduce the memory utilisation once I have identified the problematic SQL statements?
    Thanks,
    Sach

    I'm not sure that I understand your question. Can you clarify a couple points for me?
    What memory are we talking about here? Normally, most of the RAM allocated to Oracle is going to be SGA. But SGA isn't associated with any particular SQL statement, at least not in a fashion that I could contemplate doing reporting on. Individual SQL statements require RAM temporarily in the PGA during execution, but it sounds like you're not interested in that.
    What is the problem you are trying to solve here? If you want to reduce the amount of RAM allocated to Oracle from the operating system, you should be able to do that without analyzing any specific SQL statements by adjusting memory parameters. Mentioning what version of Oracle, what parameters you've set, and how much you'd like to reduce memory consumption would be helpful if you want specific suggestions for parameters to change.
    What does "problematic sqls" mean in this context?
    Justin
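    For question 2 above, one possible starting point (a sketch only; v$sql reflects shared-pool usage per cursor, not OS memory per statement) is:

    ```sql
    -- Top 10 cursors by shared-pool memory
    SELECT *
    FROM (SELECT sql_id, sharable_mem, persistent_mem, runtime_mem
          FROM   v$sql
          ORDER  BY sharable_mem DESC)
    WHERE ROWNUM <= 10;
    ```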

  • Sql memory usage increase each time win app runs

    Hi,
    SQL memory usage increases each time the win app runs. Why does it work like this? Is it normal?
    I restarted SQL Server 2012. Only my win app uses the SQL Server.
    First run of winapp.
    start memory usage : 211.800 KB
    close memory usage: 528.136 KB
    Second run of xaf app.
    start memory usage : 528.136 KB
    close memory usage: 996.844 KB
    Third run of xaf app
    start memory usage : 996.844 KB
    close memory usage: 997.640 KB
    Fourth run of xaf app
    start memory usage : 997.640 KB
    close memory usage: 1.104.864 KB

    Hi,
    SQL memory usage increases each time the win app runs. Why does it work like this? Is it normal?
    Yes, it is perfectly normal for SQL Server to acquire and hold onto large amounts of memory indefinitely.  This memory improves performance by avoiding disk I/O, query plan compilation and costly memory management. 
    On a dedicated SQL Server you should usually let SQL Server manage memory dynamically. It will release memory if it detects memory pressure. But if you often run other applications on the server that need significant amounts of memory (e.g. IIS, application services), you may want to set max server memory as suggested.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com
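    To see why the memory stays allocated, here is a sketch of what is occupying the buffer pool (a standard DMV; each page is 8 KB):

    ```sql
    -- Buffer pool usage per database: the cached data pages SQL Server
    -- retains between runs of the application.
    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*) * 8 / 1024  AS buffer_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY buffer_mb DESC;
    ```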

  • SQL Memory Usage

    Hello
    I am updating some items with the DTW: 9,000 items in total, split into 9 files of 1,000 records, updating just the picturname field. When I start to upload the files, SQL memory usage grows more and more. By the time I try to upload the third file the memory has grown too much; the DTW says the file updated OK, but it doesn't update the database. So I need to restart the SQL service (mssqlserver) and do it again. Sometimes that doesn't work, so I need to restart the server.
    Any ideas??
    Jacobo

    Hi,
    Check this information about DI API memory consumption in the WIKI SDK FAQs:
    https://www.sdn.sap.com/irj/scn/wiki?path=/display/b1/faq_sdk
    DI API
    Memory consumption
    New connection mechanism in 2007 vs. classical
    1. In the old method the DI API was loaded into the same process with the Add-On and was an "actual part" of it, so calls to the DI-API were very quick and direct
    2. In the new method there is one common DI-API which is part of the Core B1. All the Add-Ons will use the same DI-API IF (!!!) they work in the new method. The calls to the DI-API now have to "go between" two processes. This means that they go through more processing, and although it's all on the same machine and no actual communication (i.e., network traffic) is happening, the system's CPU and memory are "working harder".
    The impact is extremely small on an individual call level, but for an Add-On that makes a large amount of calls this difference accumulates...
    There is no huge additional CPU or Memory consumption. Most of the impact is on the Response Time level. Some of it is CPU consumption and some of it is Context Switch waiting.
    3. This trade-off between memory consumption and response time is actually another reason why R&D thought it was a good idea to leave the new method optional, based on a decision by the developer.
    Jeyakanthan

  • Best practice on SQL memory allocation

    Hi experts,
    Is there a best practice to set the max amount of allocated SQL memory?
    Regards

    AWE needs to be enabled only if we are talking about a 32-bit OS.
    A max memory limit should always be specified, because SQL Server 2005 or 2008 will try to use as much memory as is available, and the problem is that it does not actually release the memory afterwards.
    The amount of memory to allocate to each component depends on the processes running for that application.
    In your case, set max memory for SQL Server to around 40% of the total memory on that server.
    Set SSAS to take between 30 and 40% of the total memory, and the rest will be used by the application server and the OS.
    Again, this configuration is not necessarily optimal, but it will avoid out-of-memory problems on that server.
    Kind Regards
    Sorin Radulescu

  • Latch free(SQL memory manager latch)

    Hi All,
    Can someone help me find anything about "latch free (SQL memory manager latch)"? This latch is causing SQL statements to perform badly. I would be thankful if someone could provide a resolution plan for this latch.
    Thanks in advance.

    Hi,
    Thanks for your reply, but in this case I already know it is the "SQL memory manager latch". All I wanted to know is what this latch is and how to resolve it.

  • Reduce Physical Memory Amount on Windows 2003

    Hi all,
    I need more information about reducing the physical memory on Windows 2003 64-bit.
    I have a Front End Exchange Server 2003 with 48 GB of physical memory, and I want to reduce it to 16 GB. Is there any impact on the existing OS or Exchange Server?
    Regards,
    Rengga Patria 

    Hi Rengga,
    For the impact on the whole operating system, it depends on the server's workload.
    As for the impact on the Exchange Server role, as far as I know, 4 GB of random access memory (RAM) is the maximum amount of memory that an Exchange Server computer can efficiently use.
    Regarding physical memory and Exchange server 2003, the following articles can be referred to for more information.
    Pushing the Limits of Windows: Physical Memory
    http://blogs.technet.com/b/markrussinovich/archive/2008/07/21/3092070.aspx
    Exchange Server 2003 Processor and Memory Scalability
    http://technet.microsoft.com/en-us/library/aa996184(v=exchg.65).aspx
    Exchange 2003 Memory Configuration change for Windows 2003 (PAE Support)
    http://blogs.technet.com/b/exchange/archive/2005/07/05/407330.aspx
    In addition, since this concerns the Exchange Server role, in order to get better advice we can ask for suggestions in the following Microsoft TechNet Exchange Server Forum.
    Exchange Server Forum
    http://social.technet.microsoft.com/Forums/exchange/en-US/home?category=exchangeserver
    Best regards,
    Frank Shen

  • SSIS / SQL Memory Contention

    Hi all,
    I'm running SSIS and SQL server on a single PC. 
    I have an ETL process consisting of several SSIS packages to load data from a source OLTP to a data warehouse.
    OLTP DB resides on SQL 2012 instance.
    Datawarehouse resides on SQL 2008R2.
    My SSIS packages 'hang' and report that they are failing to swap out memory buffers - they're all held by the SQL 2012 (source DB) instance.
    On my PC I've configured max memory on each SQL instance:
    PC Memory = 16GB
    SQL 2012 Max Memory = 6GB
    SQL 2008r2 Max Memory = 4GB
    My ETL package now runs smoothly. 
    My question is - what is the best practice for managing this memory contention in a production environment? I do not want to limit the available memory to my OLTP DB.
    My ETL runs once per day, overnight. Perhaps I could reconfigure max memory before and after each ETL process? But I don't like this, as it would require a server restart and a cold cache for my OLTP apps.
    Thanks for reading. Appreciate suggestions.
    Clay

    OK, it is clearer now :-)
    This is a development server, and I assume you have to keep several instances in order to re-create an environment like production's (I hope this is the reason; otherwise you need a very good reason for holding several instances on one machine that has resource problems, since each instance needs a lot of resources just to keep running, and most of the time it is better to have one instance with all the databases than several instances on the same machine). By the way, I actually have the same problem in my development environment (I have about 8 instances of different database servers). Everything I write from now on is recommended only for this development-environment situation!
    The following points should be taken into account and acted on accordingly. This is a small list that could take 2 books if we had the time; it is just some points to think about:
    * Remember that SQL Server tries to use as much memory as you let it in order to execute more efficiently. It does not take other instances or applications directly into consideration.
    * In order to understand what needs to be done, you need to monitor what specifically is using all the memory:
        ** Look at sys.dm_os_memory_clerks.
        ** Look at DBCC MEMORYSTATUS.
        ** Look at this link: http://www.mssqltips.com/sqlservertip/2304/how-to-identify-microsoft-sql-server-memory-bottlenecks/
        ** It is a good time for Google: https://www.google.co.il/search?q=SQL+Server+monitoring+memory
    * Once you have found the cause and source of the memory use, you can focus on solving it and start reducing memory usage before the ETL process starts (and maybe during it as well):
        ** Look at DBCC FREEPROCCACHE.
        ** Look at DBCC DROPCLEANBUFFERS.
        ** Sometimes we can write a better query.
        ** We can push some of the operations to disk instead of RAM.
        ** Set the memory options (as you did by controlling max server memory; check this link for more info).
        ** Look at SET AUTO_CLOSE ON (this is usually not recommended, but in your case, with several instances, it can sometimes help a lot).
    * The ETL process uses not only SQL Server but also the application that executes it; SQL Server only sees the queries. While SQL Server is great at managing multiple users and multiple parallel actions (for example, using locks), some ETL tools are not so good at it. You need to check whether SQL Server returned an error or the failure came from the ETL process itself. Please post the full error message for more specific information.
    * Remember that some actions in SQL Server use memory outside the buffer pool, and some are not governed by max server memory (for example, some CLR usage).
    As I mentioned, I could continue this document for days :-)
    I hope this is helpful as it is :-), and if you have more information (like error messages from log files) we might focus on other options.
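    The sys.dm_os_memory_clerks check mentioned above can be sketched like this (the pages_kb column exists from SQL Server 2012; on 2008R2 use single_pages_kb + multi_pages_kb instead):

    ```sql
    -- Top 10 memory clerks: shows which component (buffer pool, plan cache,
    -- etc.) holds the memory, before deciding what to free ahead of the ETL run.
    SELECT TOP (10) [type],
           SUM(pages_kb) / 1024 AS memory_mb
    FROM sys.dm_os_memory_clerks
    GROUP BY [type]
    ORDER BY memory_mb DESC;
    ```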

  • Reduce SQLDeveloper memory footprint with JDK 1.7

    Hi!
    Some time ago in another thread (Re: Memory problems with Oracle SQL Developer) there was a suggestion to try the new Garbage-First garbage collector, which should be production-ready in JDK 1.7.
    I use SQLDeveloper with JDK 1.7 on 64bit Linux with good results:
    - everything feels faster, snappier
    - fonts rendering is different, but it is OK
    - the bugs noted in other threads are not a showstopper for me (the connections pane not showing up on startup, not being able to scroll more than 1 OCI array size of records in results grid)
    In the above mentioned thread there is a suggestion that the new garbage collector should improve memory footprint of SQLDeveloper, however, this is not my experience, since it behaves pretty much the same as with JDK 1.6 (resident size between 700 and 900 MB).
    Do I need to use these options (as per the referring thread) to enable the new garbage collector (see below), or is it switched on by default in JDK 1.7? The reduced memory footprint would be very welcome, because I use Oracle Warehouse Builder at the same time (also a Java app) and there is always a lot of memory pressure.
    AddVMOption -XX:+UnlockExperimentalVMOptions
    AddVMOption -XX:+UseG1GC
    AddVMOption -XX:+G1YoungGenSize=25m
    AddVMOption -XX:+G1ParallelRSetUpdatingEnabled
    AddVMOption -XX:+G1ParallelRSetScanningEnabled
    Thanx
    Aleksander

    Hi Aleksander,
    Glad to hear of your good report on Java 7's HotSpot VM regarding performance -- it has various enhancements, of which the new garbage collector is just one. In terms of interpreting memory footprints, take a look at:
    http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html#generation_sizing
    Note the diagram indicates total heap size does not include the permanent generation memory. Xmx limits the heap size (the young and tenured generation). MaxPermSize limits class and method metadata plus static variable content. (Apparently starting back in Java 5 there are even some cases where the permanent generation space can be shared by multiple VM instances to improve start-up time and reduce memory usage.) These two limits control distinct, non-overlapping areas of memory.
    When monitoring a Java application's heap consumption with a profiling tool, I doubt the reported usage will exceed the Xmx limit by much. Monitoring with Windows Task Manager, however, can be a bit misleading. I have read several critiques in years past on how Task Manager reports program memory consumption. "Mem Usage" is actually the working set size. "VM Size" is program private memory rather than the true virtual size. And who knows how it tracks the Java VM's permanent generation size. Will it depend on whether it is shared or not?
    So I cannot really recommend any additional parameters to you. Just trust in the Xmx setting and hope that SQL Developer keeps any memory leaks to a minimum.
    Hope this helps,
    Gary

  • SQL Memory issue

    I am storing files in SQLite, but when I store a large (400+ MB) file the system throws this exception:
    Error #1000: The system is out of memory.
        at flash.data::SQLStatement/internalExecute()
        at flash.data::SQLStatement/execute()
    Please give suggestion
    Thanks

    One SQL Server box:
    The system administrator pointed out: SQL Server uses almost all memory, and server memory usage almost hits 100%.
    How to fix this problem?
    This is not a problem but normal behavior. SQL Server will use as much memory as you provide to it, so it is good to set a proper value for the max server memory setting in sp_configure. See the example in the MS link below. Set proper memory usage for SQL Server, leaving enough RAM (4-5 GB) for the OS. This only covers memory set for the buffer pool; there are memory allocations outside the buffer pool which are done directly by the OS.
    http://technet.microsoft.com/en-us/library/ms178067.aspx
    To read more about memory
    http://social.technet.microsoft.com/wiki/contents/articles/22316.sql-server-memory-and-troubleshooting.aspx

  • Creating large polygons in PL/SQL - memory leak?

    I'm using PL/SQL to create a polygon with a very large number of vertices. I use SQL*Plus to execute the procedure and check the memory usage in Windows Task Manager before and after. After the procedure finishes, 200K or more of memory does not get released; when I exit SQL*Plus, the memory is released. I wonder if I'm misunderstanding garbage collection in PL/SQL?
    Background: This is an experiment to prove the feasibility of composing polygons in PL/SQL. I need a procedure that selects x and y vertex coordinates from a table where the x and y are stored as normal numbers. The procedure will then compose an SDO_GEOMETRY and insert it into a spatial table. The procedure may process many thousands of polygons in this way. My experiment is to show that PL/SQL can 1) be used to populate geometry with a large number of vertices, and 2) that performance is adequate for this job.
    As a side note: I found that PL/SQL will generate an error when the ordinate array exceeds about 1,040,000 ordinates (500,000 2D vertices).
    My real question though is this: What should I do in my procedure to release the memory? Once I figure this out, I will be composing and destroying the geometry object once for each iteration of a loop that generates the object.
    Sample:
    SET SERVEROUTPUT ON
    DECLARE
      geom        MDSYS.SDO_GEOMETRY;
      arrElemInfo MDSYS.SDO_ELEM_INFO_ARRAY;
      arrOrd      MDSYS.SDO_ORDINATE_ARRAY;
      x           INTEGER := 0;
      y           INTEGER := 0;
    BEGIN
      -- Start offset 1, etype 3, interpretation 1: simple polygon whose
      -- vertices are connected by straight line segments
      arrElemInfo := MDSYS.SDO_ELEM_INFO_ARRAY(1, 3, 1);
      arrOrd := MDSYS.SDO_ORDINATE_ARRAY();
      FOR i IN 1 .. 500000 LOOP
        x := 1;
        y := i;
        arrOrd.EXTEND(2);
        arrOrd(arrOrd.COUNT - 1) := x;
        arrOrd(arrOrd.COUNT) := y;
      END LOOP;
      -- Close the polygon
      arrOrd.EXTEND(2);
      arrOrd(arrOrd.COUNT - 1) := arrOrd(1);
      arrOrd(arrOrd.COUNT) := arrOrd(2);
      geom := MDSYS.SDO_GEOMETRY(2003, NULL, NULL, arrElemInfo, arrOrd);
      DBMS_OUTPUT.PUT_LINE(geom.sdo_ordinates.COUNT);
    END;
    /

    Hi Dale,
    There have been a couple of PL/SQL bug fixes in 9i related to memory use with objects. It is possible you have run into one of those issues.
    Also, I've noticed a one-time cost associated with certain functions that cause memory usage to jump once, then stabilize.
    Also, the maximum size for a varray column in Oracle (and hence the max size for the sdo_ordinates and sdo_elem_info arrays) is 1048576.
    hope this helps,
    dan
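    On Dale's original question of releasing memory inside a long-running session, one hedged approach is to delete the collection and then call DBMS_SESSION.FREE_UNUSED_USER_MEMORY, e.g.:

    ```sql
    -- Sketch: release a large collection, then ask Oracle to return unused
    -- session (PGA) memory to the operating system. Whether the OS-level
    -- number actually drops depends on platform and allocator behaviour.
    DECLARE
      arrOrd MDSYS.SDO_ORDINATE_ARRAY := MDSYS.SDO_ORDINATE_ARRAY();
    BEGIN
      arrOrd.EXTEND(1000000);
      -- ... populate and use the array ...
      arrOrd.DELETE;                          -- drop all elements
      DBMS_SESSION.FREE_UNUSED_USER_MEMORY;   -- hand memory back to the OS
    END;
    /
    ```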

  • Reducing the memory footprint of our Sybase ASE based SolMan install

    Hello All,
    We are doing a test install of SAP Solution Manager 7.01 on Sybase ASE 15.7.
    Since this is just a test setup, we started off with a lower-than-recommended hardware configuration (4 GB RAM only) due to time constraints and since we were 'assured' that we could do basic testing with this setup.
    While post install performance of SolMan was decent, performance during solman_setup (setting up technical monitoring) has become appalling. We are not able to complete the configuration process at all as the SolMan configuration web application has become very unpredictable and extremely slow.
    The SolMan install is centralized, on a Windows 2008 box. Windows Task Manager shows consistent memory usage of up to 90-95%. We also tried reducing the total number of work processes to just 8, but that did not help much. We see in Task Manager > Resource Monitor that the sqlserver.exe process has a shareable working set close to 2 GB of RAM, whereas the committed memory is much less (34 MB). Please tell us about any memory optimization we can perform for SolMan / Sybase ASE in order to complete the technical monitoring setup using solman_setup. We were hoping to change the 'total logical memory' setting for the DB directly using the DBACOCKPIT tcode (in order to reduce the max memory setting), but could not do so, as it seems to be read-only. We could not find much documentation or posts regarding memory optimization for the DB. Please help out. Thanks!
    -Regards,
    Arvind

    FWIW ... ASE's 'max memory' setting can be changed on the fly, while 'total logical memory' is a calculated value that you cannot change (ie, it's 'read only'; changing 'max memory' will cause 'total logical memory' to change automatically). [NOTE: DBACOCKPIT is a SAP-provided application that sits on top of ASE; while I know what's doable when connected directly to ASE I do not know if DBACOCKPIT has disabled the ability to change some configuration settings like 'max memory'.]
    As for the SolMan performance issues ... I'd recommend reposting your issue in the SAP Applications on ASE discussion group where you're likely to get the attention of more folks with SAP application (on ASE) experience.  (While someone may jump in here with SolMan suggestions, SolMan is a SAP application and this group isn't really geared towards SAP applications.)

  • Is this the best approach to reduce the memory ??

    Hi -
    I have been given a task to reduce heap memory so that the system can support more users. I have used various suggestions from this forum to find out the size of the object in memory, and I have reached a point where I think I have an approximate (not 100% accurate) size of the object in memory.
    The object contains some objects of other classes which are created when it is created. The intent was to initialize the nested objects once and use them in the main object. I saw a significant reduction in the size of the object when I create these objects locally in the methods that use them.
    Before moving the objects to method level
    class A {
        Object b = new Object();
        Object c = new Object();
        Object d = new Object();

        public void method1() {
            b.someMethod();
        }
        public void method2() {
            b.someMethod();
        }
        public void method3() {
            c.someMethod();
        }
        public void method4() {
            c.someMethod();
        }
        public void method5() {
            d.someMethod();
        }
        public void method6() {
            d.someMethod();
        }
    }
    After moving the objects to method level
    class A {
        public void method1() {
            Object b = new Object();
            b.someMethod();
        }
        public void method2() {
            Object b = new Object();
            b.someMethod();
        }
        public void method3() {
            Object c = new Object();
            c.someMethod();
        }
        public void method4() {
            Object c = new Object();
            c.someMethod();
        }
        public void method5() {
            Object d = new Object();
            d.someMethod();
        }
        public void method6() {
            Object d = new Object();
            d.someMethod();
        }
    }
    Note: this object remains in the HTTP session for at least 2 hours, and I cannot change the session timeout.
    Is this a better approach to reduce the heap size? What are the side effects of creating all the objects in the local methods, which will be on the stack?
    Thanks in advance

    The point is not that the objects are on the stack - they aren't, all objects are in heap, but that they have a much shorter life. They'll become unreachable as soon as the method exits, rather than surviving until the session times out. And the garbage collector will probably recycle them pretty promptly, because they remain in "Eden space".
    (In future versions of the JVM Sun is hoping to use "escape analysis" to reclaim such objects even faster).
    Of course some objects might have a significant creation overhead, in which case you might want to consider creating some kind of pool of them from which one could get borrowed for the duration of the call. With simple objects, though, the overheads of pooling are likely to be higher.
    Are these objects modified during use? If not then you might simply be able to create one instance of each for the whole application, and simply change the fields in the original class to static. The decision depends on thread safety.

  • PL/SQL memory issues

    Hi,
    is there any way to test how much memory is consumed by a PL/SQL program? I need to test memory consumption for the examples listed below. Many thanks for any solution. Cejen
    declare
      type dci_test1 is table of varchar2(4000) index by binary_integer;
      type ds_test1 is record (
        i1 varchar2(4000),
        i2 varchar2(4000),
        i3 varchar2(4000)
      );
      sTest  ds_test1;   -- How much memory is allocated now? 12kB + overhead or 0kB + overhead?
      ciTest dci_test1;
    begin
      sTest.i1 := 'A';   -- How much memory is allocated now? 12kB + overhead, 4kB + overhead or 1B + overhead?
      ciTest(1) := 'A';  -- How much memory is allocated now? 4kB + overhead or 1B + overhead?
    end;
    /

    Quick google search..
    http://oracle-online-help.blogspot.com/2006/11/precaution-while-defining-data-types-in.html
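    Beyond the link, one direct way to measure a block's memory cost is to query the session's own PGA statistics before and after running the block and compare the values (standard v$mystat / v$statname views; requires SELECT access to them):

    ```sql
    -- Current session PGA, in bytes
    SELECT n.name, s.value
    FROM   v$mystat s
           JOIN v$statname n ON n.statistic# = s.statistic#
    WHERE  n.name IN ('session pga memory', 'session pga memory max');
    ```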
