High database growth

Hi,
We are on version 4.7.
Our concern is that the database is currently 980 GB and growing by about 55 GB per month. We want to restrict the database growth as early as possible, because it is hampering system performance.
We would like to know the following:
1. What steps should we follow to control the growth?
2. At what level should we start our analysis: database or application level?
3. Which areas should we check at the database and application levels?
4. At the application level, is there any way to identify customizations that are unnecessarily creating chained / multiple transactions?
Regards,
mukesh Badhwar
+91 - 9810449604

Hi
I would suggest you go through the link below, which will give you a good idea of which tables normally need space management at regular intervals.
http://help.sap.com/saphelp_nw70/helpdata/EN/08/5742084ae611d1894f0000e829fbbd/frameset.htm
The relevant tablespaces are listed below.
PSAPBTAB
Transaction data tables. Objects in this tablespace might expand very rapidly.
PSAPSTAB
Master data tables. Objects in these tablespaces might expand very rapidly.
PSAPCLU
Clustered tables, such as financial tables. Objects in these tablespaces might expand very rapidly.
PSAPPOOL
Pool tables, containing customization tables.
PSAPPROT
Spool (that is, print) requests, protocols
Based on the sizes, you may need to use BRTOOLS to extend the affected tablespaces.
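Not from the original reply, but as an illustration: assuming the system runs on Oracle (typical for a 4.7 installation), a quick look at the largest segments in one of these tablespaces shows which objects to target first. 'PSAPBTAB' below is only an example; substitute the tablespace you are analysing:
SELECT owner, segment_name, segment_type, ROUND(bytes/1024/1024) AS size_mb
  FROM (SELECT owner, segment_name, segment_type, bytes
          FROM dba_segments
         WHERE tablespace_name = 'PSAPBTAB'  -- example tablespace, adjust as needed
         ORDER BY bytes DESC)
 WHERE ROWNUM <= 20;
The biggest, fastest-growing tables found this way are the candidates for space management (archiving, deletion or reorganisation).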
Thanks
Sukrut S

Similar Messages

  • One of the BizTalk Server processes in the affected computer is being throttled for significant periods because of high database size exceeding the threshold

    Hello Experts,
    I have a complex BizTalk 2013 farm with 20 servers and 15 hosts. In my production environment, even when there is no traffic, I am getting throttling errors from SCOM for all of my production hosts.
    Error : One of the BizTalk Server processes in the affected computer is being throttled for significant
    periods because of high database size exceeding the threshold
    I checked the following:
    1. MsgBoxDB size 748732 KB
    2. Spool table size  53 MB
    3. Tracking DB size 26724 KB
    4. Host settings --  Message Queue Size = 100, MsgCount = 50000 , Spool MP = 10 , Tracking MP = 10
    5. Ran Message Box Viewer and did not find any errors related to DB size. (Which counter should I focus on in MBV?)
    Note: for the DB sizes I am quoting the full backup size, because it does not include the log file size.
    Please suggest where I should focus.
    Is SCOM reporting correctly, or is everything actually fine in BizTalk?
    Thanks
    Yagya
    https://www.mcpvirtualbusinesscard.com/VBCServer/card.aspx?tag=YagyaDattMishra&wa=wsignin1.0

    Hi Yagyam
    I remember this error from SCOM when you use the standard SCOM BizTalk management pack.
    Check the event log of the server. Do you see any errors there? That could give some clue to the root cause. Whenever you get this alert from SCOM, there should be corresponding entries in the event log relating to the alert raised by SCOM.
    Is message processing by your BizTalk hosts normal? Run Performance Monitor to find the bottleneck and check whether throttling is really occurring.
    As mentioned in this blog, check the state of the SQL Agent on BizTalk's SQL database server, and maybe try restarting it:
    http://blogs.msdn.com/b/timdel/archive/2008/11/19/why-i-love-scom.aspx
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow next to my reply.

  • How to determine database growth for new B1 installation

    Hello B1 people,
    I'm working on the capacity planning for my new Business One project and need to know how to determine database growth. Has anyone done something like this before?
    Which tables should I consider?
    Thanks in advance.

    Hello Gabriel,
    I think it is difficult to make any predictions about database growth because this depends on the volume of your business transactions, the number of users, the continuity of your master data...
    For example, a small number of users can create a lot of transactions (accounting, logistics) with big documents, such as sales orders with more than 100 items, while a lot of users who primarily look things up and only create small transactions won't create much data; the two scenarios influence database growth in totally different ways.
    You should collect information about the volume of daily business to make any forecasts.
    If you start with an empty database, the first big growth will come when you upload your master data, but if only a few business transactions follow, you shouldn't use that first increase for your calculation.
    Is this your very first B1 installation, or do you already have B1 systems running? If you use the EWA service on a running B1 installation on a regular basis, you will get a good impression of database growth and transaction volume, but you should only use that as a basis for your calculation if the business of the new installation is similar to that of the already running company.
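    Not part of the original reply, but if you already have a Business One company database on SQL Server to look at, a size breakdown per table can help decide which tables to include in the forecast. This is only a sketch against the standard SQL Server catalog views; run it in the company database you want to inspect:
    -- Largest tables in the current database by reserved space
    SELECT  t.name AS table_name,
            SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb,
            SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_cnt
    FROM    sys.dm_db_partition_stats AS ps
    JOIN    sys.tables AS t ON t.object_id = ps.object_id
    GROUP BY t.name
    ORDER BY reserved_mb DESC;
    The tables at the top of this list are the ones whose daily growth is worth estimating.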
    Hope that helps,
    Sandra

  • Scripts To Check Database Growth in Oracle 10g

    Hi All,
    I need your help developing a script to find out the database growth in Oracle 10g on a daily, weekly and monthly basis.
    In our production database, tablespace growth is huge and we are adding data files frequently. Management is asking for a database growth report and I need to present it. Is there a script that will serve this purpose?
    My database version is 10.2.0.5.
    Please help.
    Regards,
    Arijit

    1000103 wrote:
    Hi All,
    I need your help developing a script to find out the database growth in Oracle 10g on a daily, weekly and monthly basis.
    In our production database, tablespace growth is huge and we are adding data files frequently. Management is asking for a database growth report and I need to present it. Is there a script that will serve this purpose?
    only the report that you create
    How do I ask a question on the forums?
    SQL and PL/SQL FAQ
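    Not from the thread itself, but one simple way to build that report yourself is to log the total allocated size once a day (for example via a DBMS_SCHEDULER job or cron) and then diff the samples. A minimal sketch; the table name db_growth_log is made up for illustration:
    CREATE TABLE db_growth_log (
      sample_date DATE DEFAULT SYSDATE,
      total_mb    NUMBER
    );
    -- run once a day
    INSERT INTO db_growth_log (total_mb)
      SELECT ROUND(SUM(bytes)/1024/1024) FROM dba_data_files;
    COMMIT;
    -- daily growth, once a few samples exist
    SELECT sample_date,
           total_mb,
           total_mb - LAG(total_mb) OVER (ORDER BY sample_date) AS growth_mb
      FROM db_growth_log
     ORDER BY sample_date;
    Weekly and monthly figures are then just aggregations of the same log table.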

  • How to monitor the database growth?

    Dear All,
    How do we monitor database growth in SAP?
    Is there a transaction code available to check this?
    advance thanks,
    Sundar  C
    Note: suitable answers will get maximum reward points.

    Hi,
    You can check the database growth using transaction code DB02.
    You can also use transaction code DB02OLD (if the SAP system is on NetWeaver 2004s) and click on "Space statistics" to monitor the database growth.
    Thanks and Regards,

  • How to check Database Growth for a DB on ASM?

    Hi there
    I have been using the following script to check the database growth (for DBs on Filesystem):
    SELECT b.tsname tablespace_name ,
      MAX(b.used_size_mb) cur_used_size_mb ,
      ROUND(AVG(inc_used_size_mb),2) avg_increas_mb
    FROM
      (SELECT a.days,
        a.tsname ,
        used_size_mb ,
        used_size_mb - LAG (used_size_mb,1) OVER ( PARTITION BY a.tsname ORDER BY a.tsname,a.days) inc_used_size_mb
      FROM
        (SELECT TO_CHAR(sp.begin_interval_time,'MM-DD-YYYY') days ,
          ts.tsname ,
          MAX(ROUND((tsu.tablespace_usedsize* dt.block_size )/(1024*1024),2)) used_size_mb
        FROM dba_hist_tbspc_space_usage tsu ,
          dba_hist_tablespace_stat ts ,
          dba_hist_snapshot sp,
          dba_tablespaces dt
        WHERE tsu.tablespace_id    = ts.ts#
        AND tsu.snap_id            = sp.snap_id
        AND ts.tsname              = dt.tablespace_name
        AND sp.begin_interval_time > sysdate-7
        GROUP BY TO_CHAR(sp.begin_interval_time,'MM-DD-YYYY'),
          ts.tsname
        ORDER BY ts.tsname,
          days
        ) a
      ) b
    GROUP BY b.tsname
    ORDER BY b.tsname;
    And I think it always gave me good results until I ran this script on a DB (10.2.0.5) on ASM.
    Is it because databases on ASM are maintained differently, or is it most probably that there has simply been no activity on this database in the last 7 days? I even ran the query for the last 90/180 days and it still returned the following results:
    TABLESPACE_NAME                CUR_USED_SIZE_MB AVG_INCREAS_MB
    SYSAUX                                   574.38            .36
    SYSTEM                                   514.69              0
    DATA                                    1593.25              0
    IDX                                         .06              0
    UNDOTBS1                                  69.06          -3.84
    USERS                                     96.13              0
    Thanks in advance!

    I have no reason to believe tablespaces under ASM are maintained differently, so the most likely explanation is a change in DB activity.
    What is your AWR retention? The default is 7 days, so if yours is set at the default then running the query for 90/180 days would not give you more information.
    If you want to get useful output from your scripts, you might need to adjust the AWR retention and size the SYSAUX tablespace accordingly.
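    For reference (not part of the original reply), the current AWR settings can be checked in DBA_HIST_WR_CONTROL, and the retention can be raised with DBMS_WORKLOAD_REPOSITORY. The values below are only examples:
    -- current snapshot interval and retention
    SELECT snap_interval, retention FROM dba_hist_wr_control;
    -- example: keep 90 days of snapshots (retention is given in minutes), taken every 60 minutes
    BEGIN
      DBMS_WORKLOAD_REPOSITORY.MODIFY_SNAPSHOT_SETTINGS(
        retention => 90*24*60,
        interval  => 60);
    END;
    /
    Remember that a longer retention also means a larger SYSAUX tablespace.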
    If the DB is monitored by Oracle Enterprise Manager(OEM), you can do tablespace forecasting based on the metrics collected by OEM. I did a presentation about this (NYOUG,VirtaThon) a while back:
    http://iiotzov.files.wordpress.com/2011/08/iotzov_oem_repository.pdf
    http://iiotzov.files.wordpress.com/2012/05/oem-repository-a-second-look.doc
    Iordan Iotzov

  • Huge database Growth

    Hello Guys,
    We have been observing very high database growth in the PRD environment.
    We have to add at least one 25 GB data file per week to tablespace PSAPSR3.
    I had a look in DB02 at the top sizes and top growth:
    Owner     Name     Partition     Type     Tablespace     Size(MB)     Chg.Size/day     #Extents     #Blocks     Next Extent(MB)
    SAPSR3     LIPS          TABLE     PSAPSR3     21367.000     364.433     520     2734976     2.500
    SAPSR3     BSIS          TABLE     PSAPSR3     16460.000     277.667     442     2106880     10.000
    SAPSR3     CE11000          TABLE     PSAPSR3     16360.000     262.500     441     2094080     10.000
    SAPSR3     VBFA          TABLE     PSAPSR3     15402.000     265.133     425     1971456     10.000
    SAPSR3     GLPCA          TABLE     PSAPSR3     15171.000     259.867     425     1941888     10.000
    SAPSR3     FAGLFLEXA          TABLE     PSAPSR3     13738.000     232.667     399     1758464     10.000
    SAPSR3     ACCTIT          TABLE     PSAPSR3     12788.000     215.067     384     1636864     10.000
    SAPSR3     ARFCSDATA          TABLE     PSAPSR3     12350.000     410.400     380     1580800     2.500
    SAPSR3     RFBLG          TABLE     PSAPSR3     11433.000     194.667     363     1463424     2.500
    SAPSR3     CE41000_ACCT          TABLE     PSAPSR3     11177.000     184.000     360     1430656     10.000
    SAPSR3     VBAP          TABLE     PSAPSR3     9663.000     156.433     336     1236864     10.000
    SAPSR3     VBRP          TABLE     PSAPSR3     8308.000     140.800     313     1063424     2.500
    SAPSR3     FAGL_SPLINFO          TABLE     PSAPSR3     7960.000     135.200     308     1018880     20.000
    SAPSR3     MSEG          TABLE     PSAPSR3     7936.000     134.400     307     1015808     10.000
    SAPSR3     BSIS~0          INDEX     PSAPSR3     7488.000     132.267     300     958464     2.500
    SAPSR3     VBFA~0          INDEX     PSAPSR3     7304.000     123.533     299     934912     2.500
    SAPSR3     DBTABLOG          TABLE     PSAPSR3     7303.000     83.200     300     934784     10.000
    SAPSR3     COEP          TABLE     PSAPSR3     6991.000     119.467     293     894848     10.000
    SAPSR3     CE41000          TABLE     PSAPSR3     6144.000     91.733     279     786432     10.000
    SAPSR3     FAGLFLEXA~3          INDEX     PSAPSR3     6028.000     104.533     278     771584     2.500
    SAPSR3     FAGL_SPLINFO_VAL~0          INDEX     PSAPSR3     5702.000     98.133     273     729856     2.500
    SAPSR3     FAGLFLEXA~0          INDEX     PSAPSR3     5568.000     98.133     270     712704     2.500
    We have around 12,000 sales orders daily.
    I want to know why the database is growing at such an alarming pace, or at least find the transactions that are causing the huge volume of inserts and updates.
    Regards
    Abhishek

    Hi Abhishek,
    In addition to the above, a very interesting area to work on periodically is Data Volume Management.
    SAP has released version 6.3 of its Data Volume Management guide.
    Click on this link
    https://websmp101.sap-ag.de/~sapidb/011000358700005044382000E
    This guide covers almost all tables that show considerable data growth and the preventive actions that can be taken to keep the total database size under control. Basically, it covers the areas of prevention, aggregation, deletion and archiving.
    Combining the guide's recommendations with good space management activities, such as table reorganizations, will definitely keep the system away from performance issues caused by database size.
    This is an ongoing project at some customer sites.
    Br,
    Venky

  • Reg: DB2 Database growth

    Dear Friends,
    I want to know how to show database growth on a daily, monthly and module-wise basis,
    for example for
    MM, FICO, SD, PP, QM, etc.
    How do I show DB growth? Are there any standard reports for this in SAP?
    Regards
    kesav

    > I want to know how to show database growth on a daily, monthly and module-wise basis
    The "module-wise" part is very difficult to achieve. To what would you relate a material master record: MM or SD?
    > How do I show DB growth? Are there any standard reports for this in SAP?
    Check transaction DBACOCKPIT.
    Markus

  • Estimating database growth

    Hi All,
    My requirement: I am trying to estimate the database growth of the R/3 PRD system in order to come up with some statistics on how much needs to be archived.
    I know the current size of the database, which is some X TB, and we are adding data at a rate of around 100-150 GB every month.
    How can I come up with an estimate of the likely size of the database at the end of fiscal years 2007 and 2008, in order to roughly estimate what the extent of archiving should be?
    Thanks
    Janani

    Hi Janani,
    A very tricky process... anyway, an estimate is always just an "estimate"; what I mean is that the actuals will differ. Here is how I did it:
    Have at hand the following:
    - present size X TB
    - yearly growth
    Put these in an Excel sheet and calculate the expected growth for the coming years; in my calculations I assume that DB growth will be 125% in the successive years.
    If you are already using archiving, then you might want to factor that in as well: take the (assumed) percentage of data that can be archived in a year, say 20%, calculate the savings, and add the space required for storing the archive files, which are of course compressed.
    Apart from this, you may want to take into consideration the QA and DEV systems, mirrors, DR systems, etc., which will add to the required storage space.
    Did I make it confusing?
    regards,
    naveen

  • Sybase database growth statistics lost after reboot

    Hi all gurus,
    I would like to check database growth in dbacockpit for an ASE database (Database SDI, Space, Show Growth), but the DB only has statistics since the last start of the database (2 days ago, for maintenance). Do I have to assume that we have lost all previous statistics, or can we still view the older ones?
    Thanks a lot,
    SP

    It seems that I have solved the problem with the character set. It's not really something that I understand, but it works: I changed the startup script that is used in /etc/init.d/rc5 to start the server, specifically the shell used by the script. It was not working when I used #!/bin/bash or #!/bin/sh, but it works when I use #!/bin/csh.

  • Live cache database growth

    Dear all,
    How can we get the database growth of the liveCache database over a period of time?
    Please guide me to where I can find these values.
    Thanks n Regards,
    KK

    Hello,
    "How can be we get the database growth of livecache database for a period of time?
      Please guide me the path to find these values."
    1. Please let us know if you need the general statistics of the database growth or please collobarate more about your question.
        The Fill level of the database (for example, the size of the data volumes, number of permanently and temporarily used pages) statistics is collected by the DB analyzer every 15min. You could use the DB analyzer & check the statistics in DBAN_FILLING.csv
    file which is created every day, if the DB analyzer is activated.
       Please review SAP Note 530394 (Bottleneck Analysis with Database Analyzer).
    See the MaxDB library documentation:
    http://maxdb.sap.com/doc/7_7/default.htm -> Tools -> Database Analyzer -> More Information: Database Analyzer Log Files
    2. Please let us know the version of the database.
    Thank you and best regards, Natalia Khlopina

  • How to find the database growth rate?

    I want to forecast disk growth for one year. How do I find the database growth rate?
    Rahul

    This is code authored by Richard Ding that will log database sizes to a table. If you run it every day, you can go back and compare the database size differences day to day, week to week, month to month, and year over year. That is how I forecast growth over time.
    Note: a database name local to your environment is required, so change [YOURDATABASENAME] to whatever local database you wish to use. I will also post the DDL to create the target table. Create that table in the database you name in the stored procedure code and everything should run fine.
    USE [master]
    GO
    /****** Object: StoredProcedure [dbo].[sp_SDS] Script Date: 04/22/2015 09:32:53 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE PROCEDURE [dbo].[sp_SDS]
    @TargetDatabase sysname = NULL, -- NULL: all dbs
    @Level varchar(10) = 'Database', -- or "File"
    @UpdateUsage bit = 0, -- default no update
    @Unit char(2) = 'MB' -- Megabytes, Kilobytes or Gigabytes
    AS
    /*
    ** author: Richard Ding
    ** date: 4/8/2008
    ** usage: list db size AND path w/o SUMmary
    ** test code: sp_SDS -- default behavior
    ** sp_SDS 'maAster'
    ** sp_SDS NULL, NULL, 0
    ** sp_SDS NULL, 'file', 1, 'GB'
    ** sp_SDS 'Test_snapshot', 'Database', 1
    ** sp_SDS 'Test', 'File', 0, 'kb'
    ** sp_SDS 'pfaids', 'Database', 0, 'gb'
    ** sp_SDS 'tempdb', NULL, 1, 'kb'
    */
    SET NOCOUNT ON;
    IF @TargetDatabase IS NOT NULL AND DB_ID(@TargetDatabase) IS NULL
    BEGIN
    RAISERROR(15010, -1, -1, @TargetDatabase);
    RETURN (-1)
    END
    IF OBJECT_ID('tempdb.dbo.##Tbl_CombinedInfo', 'U') IS NOT NULL
    DROP TABLE dbo.##Tbl_CombinedInfo;
    IF OBJECT_ID('tempdb.dbo.##Tbl_DbFileStats', 'U') IS NOT NULL
    DROP TABLE dbo.##Tbl_DbFileStats;
    IF OBJECT_ID('tempdb.dbo.##Tbl_ValidDbs', 'U') IS NOT NULL
    DROP TABLE dbo.##Tbl_ValidDbs;
    IF OBJECT_ID('tempdb.dbo.##Tbl_Logs', 'U') IS NOT NULL
    DROP TABLE dbo.##Tbl_Logs;
    CREATE TABLE dbo.##Tbl_CombinedInfo (
    DatabaseName sysname NULL,
    [type] VARCHAR(10) NULL,
    LogicalName sysname NULL,
    T dec(10, 2) NULL,
    U dec(10, 2) NULL,
    [U(%)] dec(5, 2) NULL,
    F dec(10, 2) NULL,
    [F(%)] dec(5, 2) NULL,
    PhysicalName sysname NULL );
    CREATE TABLE dbo.##Tbl_DbFileStats (
    Id int identity,
    DatabaseName sysname NULL,
    FileId int NULL,
    FileGroup int NULL,
    TotalExtents bigint NULL,
    UsedExtents bigint NULL,
    Name sysname NULL,
    FileName varchar(255) NULL );
    CREATE TABLE dbo.##Tbl_ValidDbs (
    Id int identity,
    Dbname sysname NULL );
    CREATE TABLE dbo.##Tbl_Logs (
    DatabaseName sysname NULL,
    LogSize dec (10, 2) NULL,
    LogSpaceUsedPercent dec (5, 2) NULL,
    Status int NULL );
    DECLARE @Ver varchar(10),
    @DatabaseName sysname,
    @Ident_last int,
    @String varchar(2000),
    @BaseString varchar(2000);
    SELECT @DatabaseName = '',
    @Ident_last = 0,
    @String = '',
    @Ver = CASE WHEN @@VERSION LIKE '%9.0%' THEN 'SQL 2005'
    WHEN @@VERSION LIKE '%8.0%' THEN 'SQL 2000'
    WHEN @@VERSION LIKE '%10.0%' THEN 'SQL 2008'
    WHEN @@VERSION LIKE '%11.0%' THEN 'SQL 2012'
    WHEN @@VERSION LIKE '%12.0%' THEN 'SQL 2014'
    END;
    SELECT @BaseString =
    ' SELECT DB_NAME(), ' +
    CASE WHEN @Ver = 'SQL 2000' THEN 'CASE WHEN status & 0x40 = 0x40 THEN ''Log'' ELSE ''Data'' END'
    ELSE ' CASE type WHEN 0 THEN ''Data'' WHEN 1 THEN ''Log'' WHEN 4 THEN ''Full-text'' ELSE ''reserved'' END' END +
    ', name, ' +
    CASE WHEN @Ver = 'SQL 2000' THEN 'filename' ELSE 'physical_name' END +
    ', size*8.0/1024.0 FROM ' +
    CASE WHEN @Ver = 'SQL 2000' THEN 'sysfiles' ELSE 'sys.database_files' END +
    ' WHERE '
    + CASE WHEN @Ver = 'SQL 2000' THEN ' HAS_DBACCESS(DB_NAME()) = 1' ELSE 'state_desc = ''ONLINE''' END + '';
    SELECT @String = 'INSERT INTO dbo.##Tbl_ValidDbs SELECT name FROM ' +
    CASE WHEN @Ver = 'SQL 2000' THEN 'master.dbo.sysdatabases'
    WHEN @Ver IN ('SQL 2005', 'SQL 2008', 'SQL 2012', 'SQL 2014') THEN 'master.sys.databases'
    END + ' WHERE HAS_DBACCESS(name) = 1 ORDER BY name ASC';
    EXEC (@String);
    INSERT INTO dbo.##Tbl_Logs EXEC ('DBCC SQLPERF (LOGSPACE) WITH NO_INFOMSGS');
    -- For data part
    IF @TargetDatabase IS NOT NULL
    BEGIN
    SELECT @DatabaseName = @TargetDatabase;
    IF @UpdateUsage <> 0 AND DATABASEPROPERTYEX (@DatabaseName,'Status') = 'ONLINE'
    AND DATABASEPROPERTYEX (@DatabaseName, 'Updateability') <> 'READ_ONLY'
    BEGIN
    SELECT @String = 'USE [' + @DatabaseName + '] DBCC UPDATEUSAGE (0)';
    PRINT '*** ' + @String + ' *** ';
    EXEC (@String);
    PRINT '';
    END
    SELECT @String = 'INSERT INTO dbo.##Tbl_CombinedInfo (DatabaseName, type, LogicalName, PhysicalName, T) ' + @BaseString;
    INSERT INTO dbo.##Tbl_DbFileStats (FileId, FileGroup, TotalExtents, UsedExtents, Name, FileName)
    EXEC ('USE [' + @DatabaseName + '] DBCC SHOWFILESTATS WITH NO_INFOMSGS');
    EXEC ('USE [' + @DatabaseName + '] ' + @String);
    UPDATE dbo.##Tbl_DbFileStats SET DatabaseName = @DatabaseName;
    END
    ELSE
    BEGIN
    WHILE 1 = 1
    BEGIN
    SELECT TOP 1 @DatabaseName = Dbname FROM dbo.##Tbl_ValidDbs WHERE Dbname > @DatabaseName ORDER BY Dbname ASC;
    IF @@ROWCOUNT = 0
    BREAK;
    IF @UpdateUsage <> 0 AND DATABASEPROPERTYEX (@DatabaseName, 'Status') = 'ONLINE'
    AND DATABASEPROPERTYEX (@DatabaseName, 'Updateability') <> 'READ_ONLY'
    BEGIN
    SELECT @String = 'DBCC UPDATEUSAGE (''' + @DatabaseName + ''') ';
    PRINT '*** ' + @String + '*** ';
    EXEC (@String);
    PRINT '';
    END
    SELECT @Ident_last = ISNULL(MAX(Id), 0) FROM dbo.##Tbl_DbFileStats;
    SELECT @String = 'INSERT INTO dbo.##Tbl_CombinedInfo (DatabaseName, type, LogicalName, PhysicalName, T) ' + @BaseString;
    EXEC ('USE [' + @DatabaseName + '] ' + @String);
    INSERT INTO dbo.##Tbl_DbFileStats (FileId, FileGroup, TotalExtents, UsedExtents, Name, FileName)
    EXEC ('USE [' + @DatabaseName + '] DBCC SHOWFILESTATS WITH NO_INFOMSGS');
    UPDATE dbo.##Tbl_DbFileStats SET DatabaseName = @DatabaseName WHERE Id BETWEEN @Ident_last + 1 AND @@IDENTITY;
    END
    END
    -- set used size for data files only; leave the totals from sys.database_files unchanged, since they already cover the log files
    UPDATE dbo.##Tbl_CombinedInfo
    SET U = s.UsedExtents*8*8/1024.0
    FROM dbo.##Tbl_CombinedInfo t JOIN dbo.##Tbl_DbFileStats s
    ON t.LogicalName = s.Name AND s.DatabaseName = t.DatabaseName;
    -- set used size and % values for log files:
    UPDATE dbo.##Tbl_CombinedInfo
    SET [U(%)] = LogSpaceUsedPercent,
    U = T * LogSpaceUsedPercent/100.0
    FROM dbo.##Tbl_CombinedInfo t JOIN dbo.##Tbl_Logs l
    ON l.DatabaseName = t.DatabaseName
    WHERE t.type = 'Log';
    UPDATE dbo.##Tbl_CombinedInfo SET F = T - U, [U(%)] = U*100.0/T;
    UPDATE dbo.##Tbl_CombinedInfo SET [F(%)] = F*100.0/T;
    IF UPPER(ISNULL(@Level, 'DATABASE')) = 'FILE'
    BEGIN
    IF @Unit = 'KB'
    UPDATE dbo.##Tbl_CombinedInfo
    SET T = T * 1024, U = U * 1024, F = F * 1024;
    IF @Unit = 'GB'
    UPDATE dbo.##Tbl_CombinedInfo
    SET T = T / 1024, U = U / 1024, F = F / 1024;
    SELECT DatabaseName AS 'Database',
    type AS 'Type',
    LogicalName,
    T AS 'Total',
    U AS 'Used',
    [U(%)] AS 'Used (%)',
    F AS 'Free',
    [F(%)] AS 'Free (%)',
    PhysicalName
    FROM dbo.##Tbl_CombinedInfo
    WHERE DatabaseName LIKE ISNULL(@TargetDatabase, '%')
    ORDER BY DatabaseName ASC, type ASC;
    SELECT CASE WHEN @Unit = 'GB' THEN 'GB' WHEN @Unit = 'KB' THEN 'KB' ELSE 'MB' END AS 'SUM',
    SUM (T) AS 'TOTAL', SUM (U) AS 'USED', SUM (F) AS 'FREE' FROM dbo.##Tbl_CombinedInfo;
    END
    IF UPPER(ISNULL(@Level, 'DATABASE')) = 'DATABASE'
    BEGIN
    DECLARE @Tbl_Final TABLE (
    DatabaseName sysname NULL,
    TOTAL dec (10, 2),
    [=] char(1),
    used dec (10, 2),
    [used (%)] dec (5, 2),
    [+] char(1),
    free dec (10, 2),
    [free (%)] dec (5, 2),
    [==] char(2),
    Data dec (10, 2),
    Data_Used dec (10, 2),
    [Data_Used (%)] dec (5, 2),
    Data_Free dec (10, 2),
    [Data_Free (%)] dec (5, 2),
    [++] char(2),
    Log dec (10, 2),
    Log_Used dec (10, 2),
    [Log_Used (%)] dec (5, 2),
    Log_Free dec (10, 2),
    [Log_Free (%)] dec (5, 2) );
    INSERT INTO @Tbl_Final
    SELECT x.DatabaseName,
    x.Data + y.Log AS 'TOTAL',
    '=' AS '=',
    x.Data_Used + y.Log_Used AS 'U',
    (x.Data_Used + y.Log_Used)*100.0 / (x.Data + y.Log) AS 'U(%)',
    '+' AS '+',
    x.Data_Free + y.Log_Free AS 'F',
    (x.Data_Free + y.Log_Free)*100.0 / (x.Data + y.Log) AS 'F(%)',
    '==' AS '==',
    x.Data,
    x.Data_Used,
    x.Data_Used*100/x.Data AS 'D_U(%)',
    x.Data_Free,
    x.Data_Free*100/x.Data AS 'D_F(%)',
    '++' AS '++',
    y.Log,
    y.Log_Used,
    y.Log_Used*100/y.Log AS 'L_U(%)',
    y.Log_Free,
    y.Log_Free*100/y.Log AS 'L_F(%)'
    FROM
    ( SELECT d.DatabaseName,
    SUM(d.T) AS 'Data',
    SUM(d.U) AS 'Data_Used',
    SUM(d.F) AS 'Data_Free'
    FROM dbo.##Tbl_CombinedInfo d WHERE d.type = 'Data' GROUP BY d.DatabaseName ) AS x
    JOIN
    ( SELECT l.DatabaseName,
    SUM(l.T) AS 'Log',
    SUM(l.U) AS 'Log_Used',
    SUM(l.F) AS 'Log_Free'
    FROM dbo.##Tbl_CombinedInfo l WHERE l.type = 'Log' GROUP BY l.DatabaseName ) AS y
    ON x.DatabaseName = y.DatabaseName;
    IF @Unit = 'KB'
    UPDATE @Tbl_Final SET TOTAL = TOTAL * 1024,
    used = used * 1024,
    free = free * 1024,
    Data = Data * 1024,
    Data_Used = Data_Used * 1024,
    Data_Free = Data_Free * 1024,
    Log = Log * 1024,
    Log_Used = Log_Used * 1024,
    Log_Free = Log_Free * 1024;
    IF @Unit = 'GB'
    UPDATE @Tbl_Final SET TOTAL = TOTAL / 1024,
    used = used / 1024,
    free = free / 1024,
    Data = Data / 1024,
    Data_Used = Data_Used / 1024,
    Data_Free = Data_Free / 1024,
    Log = Log / 1024,
    Log_Used = Log_Used / 1024,
    Log_Free = Log_Free / 1024;
    DECLARE @GrantTotal dec(11, 2);
    SELECT @GrantTotal = SUM(TOTAL) FROM @Tbl_Final;
    INSERT INTO [YOURDATABASENAME].[dbo].[DBSize]
    ([Weight]
    ,[DBName]
    ,[Used]
    ,[Free]
    ,[Total]
    ,[Data]
    ,[Data_Used]
    ,[Log]
    ,[Log_Used]
    ,[DT])
    SELECT
    CONVERT(dec(10, 2), TOTAL*100.0/@GrantTotal) AS 'WEIGHT (%)',
    DatabaseName AS 'DATABASE',
    CONVERT(VARCHAR(12), used) AS 'USED',
    CONVERT(VARCHAR(12), free) AS 'FREE',
    TOTAL,
    CONVERT(VARCHAR(12), Data) AS 'DATA',
    CONVERT(VARCHAR(12), Data_Used) AS 'DATA_USED',
    CONVERT(VARCHAR(12), Log) AS 'LOG',
    CONVERT(VARCHAR(12), Log_Used) AS 'LOG_USED',
    GETDATE()
    FROM @Tbl_Final
    WHERE DatabaseName LIKE ISNULL(@TargetDatabase, '%')
    ORDER BY DatabaseName ASC;
    IF @TargetDatabase IS NULL
    SELECT CASE WHEN @Unit = 'GB' THEN 'GB' WHEN @Unit = 'KB' THEN 'KB' ELSE 'MB' END AS 'SUM',
    SUM (used) AS 'USED',
    SUM (free) AS 'FREE',
    SUM (TOTAL) AS 'TOTAL',
    SUM (Data) AS 'DATA',
    SUM (Log) AS 'LOG'
    FROM @Tbl_Final;
    END
    RETURN (0)
    GO
    USE [YOURDATABASENAME]
    GO
    /****** Object: Table [dbo].[DBSize] Script Date: 04/22/2015 09:49:10 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [dbo].[DBSize](
    [UID] [int] IDENTITY(1,1) NOT NULL,
    [Weight] [decimal](18, 2) NULL,
    [DBName] [varchar](250) NULL,
    [Used] [decimal](18, 2) NULL,
    [Free] [decimal](18, 2) NULL,
    [Total] [decimal](18, 2) NULL,
    [Data] [decimal](18, 2) NULL,
    [Data_Used] [decimal](18, 2) NULL,
    [Log] [decimal](18, 2) NULL,
    [Log_Used] [decimal](18, 2) NULL,
    [DT] [datetime] NULL,
    CONSTRAINT [PK_DBSize] PRIMARY KEY CLUSTERED
    (
    [UID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    SET ANSI_PADDING OFF
    GO
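    Once sp_SDS is scheduled (for example as a daily SQL Agent job) and the DBSize table has collected a few days of samples, the day-over-day growth can be read back with a windowed query. This is only a sketch, assuming SQL Server 2012 or later for LAG and the same [YOURDATABASENAME] placeholder as above; the figures are in the unit you log with (MB by default):
    USE [YOURDATABASENAME]
    GO
    -- day-over-day growth per database, based on the samples logged by sp_SDS
    SELECT DBName,
           DT,
           Total,
           Total - LAG(Total) OVER (PARTITION BY DBName ORDER BY DT) AS Growth
    FROM dbo.DBSize
    ORDER BY DBName, DT;
    GO
    Aggregating the same data by week or month gives the longer-term trend for forecasting.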

  • Rapid database growth - how to check?

    We have been experiencing rapid database growth in the past few days.
    How do we check what is causing this?
    We are on ECC 5.0 / Win / SQL Server
    Thanks
    Prasad

    Hi,
    > How do we check what is causing this?
    What is your business? What interfaces do you have into your system?
    Have you done a recent take-on of data or had an influx of orders?
    What do you call rapid DB growth?
    Mark

  • Monitor Database Growth

    Hello,
    I just took over the role of junior DBA in a new organization and would like to find out how much our DBs have grown in the past year. Unfortunately, the previous DBA did not monitor growth over time. Is there a way to determine how much the entire database has grown from one point in time to another without having monitored the growth? Our database versions range from 7 to 10g.

    Hi,
    Have a look at this thread:
    Re: DB growth!
    It looks at reporting database growth/shrinkage at monthly intervals.
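    Not from the linked thread, but one rough approximation when no history was collected: the creation dates of the data files show when space was added over time. This is only a sketch and ignores free space inside the files:
    SELECT TO_CHAR(creation_time, 'YYYY-MM') AS month_added,
           ROUND(SUM(bytes)/1024/1024) AS mb_added
      FROM v$datafile
     GROUP BY TO_CHAR(creation_time, 'YYYY-MM')
     ORDER BY 1;
    On the older releases (7, 8i) the same idea applies, though column availability may differ slightly.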
    regards
    Alan

  • High Database server load from expensive SQL statements

    Dear all,
    I am facing a problem on the production server: there is a high database server load from expensive SQL statements, as per the EWA report.
    Buffer Load [%]     Disk Load [%]     CPU Load [%]
    55     69     0
    Analysis of DB SQL CACHE
    EXPENSIVE SQL STATEMENTS OVERVIEW
    Object Name     CPU Load [%]     I/O Load [%]     Elapsed Time [%]     Executions     Records Processed
    BSIS        1     9     0     22     90462
    CDEF$     6     6     0     2131113     2575694
    BSAD     1     3     0     21     408576
    MKPF     1     32     0     180     3899
    ICOL$     9     1     0     2575694     8703798
    OBJ$     6     1     0     3405254     3400023
    COL$     12     0     0     2138793     22919657
    MKPF     1     13     0     75     396
    MCHB     7     0     0     366543     41708
    Please suggest, step by step, how to reduce the load caused by the expensive SQL statements on these tables.
    Regards

    Hi,
    In transaction code ST05 I have found the following details:
    Duration  |Obj. name |Op.     |Recs.|RC   |Statement
    4         |TSP03A    |REOPEN  |    0|     |SELECT WHERE "NAME" = 'LOCL' AND "P" = 'S_CLIENTS'
    344       |TSP03A    |FETCH   |    1|    0|
    42        |ZRIN      |DECLARE |    0|     |SELECT WHERE "MANDT" = :A0 AND "BCQ" = :A1
    442       |ZRIN      |PREPARE |    0|     |SELECT WHERE "MANDT" = :A0 AND "BCQ" = :A1
    3         |ZRIN      |OPEN    |    0|     |SELECT WHERE "MANDT" = '600' AND "BCQ" = 'BCQ'
    2,181,565 |ZRIN      |FETCH   |    4| 1403|
    706,267   |DBA_SEGME |FETCH   |   99|    0|
    8,248     |DBA_SEGME |FETCH   |   99|    0|
    44,994    |DBA_SEGME |FETCH   |   99|    0|
    67,713    |DBA_SEGME |FETCH   |   99|    0|
    1,367,923 |DBA_SEGME |FETCH   |   99|    0|
    16,735    |TADIR     |FETCH   |    1| 1403|
    5         |TADIR     |REOPEN  |    0|     |SELECT WHERE "PGMID" = 'R3TR' AND "OBJECT" = 'TABL' AND "OBJ_NAME" = 'CME__TEXT'
    22,415    |TADIR     |FETCH   |    1| 1403|
    4         |TADIR     |REOPEN  |    0|     |SELECT WHERE "PGMID" = 'R3TR' AND "OBJECT" = 'TABL' AND "OBJ_NAME" = 'CME_PATTERN_SIGN'
    232       |TADIR     |FETCH   |    1| 1403|
    4         |TADIR     |REOPEN  |    0|     |SELECT WHERE "PGMID" = 'R3TR' AND "OBJECT" = 'TABL' AND "OBJ_NAME" = 'CML_ARC_DEADLINE'
    19,189    |TADIR     |FETCH   |    1| 1403|
    4         |TADIR     |REOPEN  |     |     |
    706,267   |DBA_SEGME |FETCH   |   99|    0|
    44,994    |DBA_SEGME |FETCH   |   99|    0|
    67,713    |DBA_SEGME |FETCH   |   99|    0|
    Please suggest how to  reduce it.
    Regards,
