Question on gather table stats.

Guys,
I am trying to gather stats on a table and it is taking a long time to finish (it has been running for about 28 hours). After dropping some partitions and creating a couple of new partitions by splitting the lead partition, I executed the command below to gather stats on the table, including the newly created partitions. Our database is Oracle 10g and the OS is Windows 2003 Server. Can someone please let me know if I am missing any parameters?
exec dbms_stats.gather_table_stats( -
     ownname          => 'TRACK', -
     tabname          => 'TRK_DTL_STAGE', -
     estimate_percent => dbms_stats.auto_sample_size, -
     cascade          => true, -
     degree           => 5);
Thanks for all your help.

user12055522 wrote:
Thanks for the reply. I am not seeing any waits, and I do see activity against this session when I look at the long ops. Would I be better off killing the current session and just gathering the stats for the partitions I created from the split?
How can it be that you do not see any waits? It may simply be that you did not check for them.
Killing the existing session is your decision, and I do not see any problem with doing that.
Regards
Anurag
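
If the session is killed, one option worth considering is to gather stats only for the partitions created by the split instead of re-scanning the whole table. A minimal sketch in the same SQL*Plus style as the original command (the partition name TRK_DTL_P_NEW is a placeholder, not from the post):

-- Repeat for each partition created by the split (placeholder partition name)
exec dbms_stats.gather_table_stats( -
     ownname          => 'TRACK', -
     tabname          => 'TRK_DTL_STAGE', -
     partname         => 'TRK_DTL_P_NEW', -
     granularity      => 'PARTITION', -
     estimate_percent => dbms_stats.auto_sample_size, -
     cascade          => true, -
     degree           => 5);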

Similar Messages

  • Question on Exporting Table Stats to another database

    I have a question exporting Table Stats from one schema in database A to another schema in another database B.
    Currently running Oracle 9.0.2.6 for unix in both prod and dev.
    Currently table stats are gathered using the ANALYZE TABLE command. We currently don't use the DBMS_STATS package to gather table statistics.
    Question:
    If I execute DBMS_STATS.EXPORT_TABLE_STATS in database A, can I import the stats to database B if I'm only using ANALYZE TABLE to gather table stats? Do I need to execute DBMS_STATS.GATHER_TABLE_STATS in database A prior to executing DBMS_STATS.EXPORT_TABLE_STATS?
    The overall goal is to take table stats from Production in its current state and import them into a Development environment to be used for testing data / processes.
    Yes we will be upgrading to Oracle 10 / 11g in near future.
    Yes we will be changing our method of gathering table stats by using the DBMS_STATS package.
    Thanks,
    Russ D

    Hi,
    If I execute DBMS_STATS.EXPORT_TABLE_STATS in database A can I import them to database B if I'm only using the ANALYZE TABLE to gather table stats?
    You need to use the DBMS_STATS package to gather and export the statistics if you want to migrate them to another database.
    Do I need to execute the DBMS_STATS.GATHER_TABLE_STATS package in database A prior to executing DBMS_STATS.EXPORT_TABLE_STATS?
    Yes, you need to execute DBMS_STATS.GATHER_TABLE_STATS first.
    Good luck.
    Regards.
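
    For reference, a minimal sketch of the gather/export/import round trip, assuming a statistics table named STATS_TAB and some mechanism (e.g. exp/imp) to copy that table from database A to database B; the schema and table names are placeholders:
    -- In database A: gather, create a stat table, then export into it
    exec dbms_stats.gather_table_stats(ownname => 'SCOTT', tabname => 'EMP', cascade => true);
    exec dbms_stats.create_stat_table(ownname => 'SCOTT', stattab => 'STATS_TAB');
    exec dbms_stats.export_table_stats(ownname => 'SCOTT', tabname => 'EMP', stattab => 'STATS_TAB');
    -- Copy SCOTT.STATS_TAB to database B (exp/imp or a database link), then in database B:
    exec dbms_stats.import_table_stats(ownname => 'SCOTT', tabname => 'EMP', stattab => 'STATS_TAB');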

  • Gather table stats taking longer for Large tables

    Version : 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) is taking longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take a little longer (it effectively runs SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats? Apart from the row count and index info, what other information is gathered by gather table stats?
    Does table size actually matter for stats collection?

    Max wrote:
    Version : 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) is taking longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take a little longer (it effectively runs SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats? Apart from the row count and index info, what other information is gathered by gather table stats?
    09:40:05 SQL> desc user_tables
    Name                            Null?    Type
    TABLE_NAME                       NOT NULL VARCHAR2(30)
    TABLESPACE_NAME                        VARCHAR2(30)
    CLUSTER_NAME                             VARCHAR2(30)
    IOT_NAME                             VARCHAR2(30)
    STATUS                              VARCHAR2(8)
    PCT_FREE                             NUMBER
    PCT_USED                             NUMBER
    INI_TRANS                             NUMBER
    MAX_TRANS                             NUMBER
    INITIAL_EXTENT                         NUMBER
    NEXT_EXTENT                             NUMBER
    MIN_EXTENTS                             NUMBER
    MAX_EXTENTS                             NUMBER
    PCT_INCREASE                             NUMBER
    FREELISTS                             NUMBER
    FREELIST_GROUPS                        NUMBER
    LOGGING                             VARCHAR2(3)
    BACKED_UP                             VARCHAR2(1)
    NUM_ROWS                             NUMBER
    BLOCKS                              NUMBER
    EMPTY_BLOCKS                             NUMBER
    AVG_SPACE                             NUMBER
    CHAIN_CNT                             NUMBER
    AVG_ROW_LEN                             NUMBER
    AVG_SPACE_FREELIST_BLOCKS                   NUMBER
    NUM_FREELIST_BLOCKS                        NUMBER
    DEGREE                              VARCHAR2(10)
    INSTANCES                             VARCHAR2(10)
    CACHE                                  VARCHAR2(5)
    TABLE_LOCK                             VARCHAR2(8)
    SAMPLE_SIZE                             NUMBER
    LAST_ANALYZED                             DATE
    PARTITIONED                             VARCHAR2(3)
    IOT_TYPE                             VARCHAR2(12)
    TEMPORARY                             VARCHAR2(1)
    SECONDARY                             VARCHAR2(1)
    NESTED                              VARCHAR2(3)
    BUFFER_POOL                             VARCHAR2(7)
    FLASH_CACHE                             VARCHAR2(7)
    CELL_FLASH_CACHE                        VARCHAR2(7)
    ROW_MOVEMENT                             VARCHAR2(8)
    GLOBAL_STATS                             VARCHAR2(3)
    USER_STATS                             VARCHAR2(3)
    DURATION                             VARCHAR2(15)
    SKIP_CORRUPT                             VARCHAR2(8)
    MONITORING                             VARCHAR2(3)
    CLUSTER_OWNER                             VARCHAR2(30)
    DEPENDENCIES                             VARCHAR2(8)
    COMPRESSION                             VARCHAR2(8)
    COMPRESS_FOR                             VARCHAR2(12)
    DROPPED                             VARCHAR2(3)
    READ_ONLY                             VARCHAR2(3)
    SEGMENT_CREATED                        VARCHAR2(3)
    RESULT_CACHE                             VARCHAR2(7)
    09:40:10 SQL>
    Does table size actually matter for stats collection?
    Yes.
    Handle:     Max
    Status Level:     Newbie
    Registered:     Nov 10, 2008
    Total Posts:     155
    Total Questions:     80 (49 unresolved)
    why so many unanswered questions?
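
    To see what a gather actually records beyond the row count, the dictionary can be queried afterwards; a minimal sketch with an illustrative table name:
    select num_rows, blocks, avg_row_len, sample_size, last_analyzed
    from   user_tab_statistics
    where  table_name = 'MY_BIG_TABLE';
    -- Column-level statistics and histograms gathered in the same pass
    select column_name, num_distinct, num_nulls, histogram
    from   user_tab_col_statistics
    where  table_name = 'MY_BIG_TABLE';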

  • Analyze Vs Gather Table Stats

    hi!
    I have some confusion about the ANALYZE and gather table stats commands. Please answer my questions to clear this up:
    1 - What's the major difference between ANALYZE and gather table stats?
    2 - Which is better?
    3 - Suppose I run the ANALYZE/gather stats command while transactions are being performed on the table being processed; how will performance be affected?
    Hope you will support me.
    regards
    Irfan Ahmad

    [email protected] wrote:
    1 - What's the major difference between analyze and gather table stats?
    2 - Which is better?
    3 - Suppose I run the analyze/stats command while transactions are being performed on the table; how will performance be affected?
    1. ANALYZE is older and is probably not being extended with new functionality or support for new features.
    2. Overall, dbms_stats is better - it should support everything.
    3. Any queries already running when the stats are gathered will continue to use the old stats.
    Although dbms_stats is probably better, I find the ANALYZE syntax LOTS easier to remember! The temptation to be lazy and just use ANALYZE in development is very high. For advanced environments, though, dbms_stats will probably work better.
    There is one other minor difference between ANALYZE and dbms_stats. There's a column in the user/all/dba_histograms views called endpoint_actual_value that I have seen ANALYZE populate with the data value the histogram was created for, but have not seen dbms_stats populate.
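
    A short side-by-side sketch of the two approaches on a hypothetical table T1 (ANALYZE shown only for comparison; DBMS_STATS is the supported way to gather optimizer statistics):
    -- Old approach
    analyze table t1 compute statistics;
    -- DBMS_STATS approach
    exec dbms_stats.gather_table_stats( -
         ownname    => user, -
         tabname    => 'T1', -
         method_opt => 'FOR ALL COLUMNS SIZE AUTO', -
         cascade    => true);
    -- Compare what each one leaves in the histogram view
    select column_name, endpoint_number, endpoint_actual_value
    from   user_histograms
    where  table_name = 'T1';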

  • How to gather table stats for tables in a different schema

    Hi All,
    I have a table present in one schema and I want to gather stats for the table in a different schema.
    I gave GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;
    And when I tried to execute the command to gather stats using
    DBMS_STATS.GATHER_TABLE_STATS (OWNNAME=>'SCHEMA1', TABNAME=>'T1');
    The call is failing.
    Is there any way we can gather table stats for tables in one schema from another schema?
    Thanks,
    MK.

    You need to grant ANALYZE ANY to SCHEMA2.
    SY.
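
    A minimal sketch of the combination that addresses this, reusing SCHEMA1/SCHEMA2/T1 from the question (object grants alone are not enough for DBMS_STATS on another schema's table):
    -- As a DBA:
    grant analyze any to schema2;
    -- As SCHEMA2:
    exec dbms_stats.gather_table_stats(ownname => 'SCHEMA1', tabname => 'T1', cascade => true);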

  • Gather table stats

    Hi All,
    DB version:10.2.0.4
    OS:Aix 6.1
    I want to gather table stats for a table because a query that uses this table is running slowly. I also noticed that the query does a full table scan on this table, and the table was last analyzed two months ago.
    I am planning to execute the statement below to gather the stats. The table has about 50 million records:
      COUNT(*)
    ----------
      51364617
    I expect this is going to take a long time if I execute it like below.
    EXEC DBMS_STATS.gather_table_stats('schema_name', 'table_name');
    My doubts are listed below:
    1. Can I also use the estimate_percent parameter when gathering the stats?
    2. What percentage should I specify for estimate_percent?
    3. What difference will using estimate_percent make?
    Thanks in advance

    If you are worried about the stats gathering process running for a long time, consider gathering stats in parallel.
    1. Can you use estimate_percent? Sure! Go for it.
    2. What % to use? Why not let the database decide with auto_sample_size? Various "rules of thumb" have been thrown around over the years, usually around 10% to 20%.
    3. What difference will it make? Very little, probably. Occasionally you might see where a small sample size makes a difference, but in general it is perfectly ok to estimate stats.
    Perhaps something like this:
    BEGIN
      dbms_stats.gather_table_stats(
        ownname          => user,
        tabname          => 'MY_TABLE',
        estimate_percent => dbms_stats.auto_sample_size,
        method_opt       => 'for all columns size auto',
        cascade          => true,
        degree           => 8);
    END;
    /

  • Temp table, and gather table stats

    One of my developers is generating a report from Oracle. He loads a subset of the data he needs into a temp table, then creates an index on the temp table, and then runs his report from the temp table (which is a lot smaller than the original table).
    My question is: Is it necessary to gather table statistics for the temp table, and the index on the temp table, before querying it ?

    It depends. Just yesterday I had a very bad experience with stats: one of my tables had NUM_ROWS = 300 while COUNT(*) was about 7 million, on database version 9.2.0.6 (a release with plenty of optimizer bugs). Queries started breaking, with lots of buffer busy waits and latch free waits. It took a while to figure out, but once I deleted the stats everything came back under control. My point is that statistics can be both good and bad: once you start collecting them, you should keep an eye on them.
    Thanks.
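
    If the "temp" table is an ordinary staging table (not a global temporary table), gathering stats right after the load and index creation is usually cheap compared with the report itself; a minimal sketch with placeholder names:
    -- After loading the staging table and creating its index;
    -- cascade => true gathers the index statistics at the same time.
    exec dbms_stats.gather_table_stats( -
         ownname          => user, -
         tabname          => 'REPORT_STAGE', -
         estimate_percent => dbms_stats.auto_sample_size, -
         cascade          => true);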

  • Gather table stats after shrink?

    Hello,
    do I need (or is it useful) to run dbms_stats.gather_table_stats after a table shrink, or does Oracle update the statistics automatically?
    Regards,
    Arndt

    Hi,
    I'd suggest writing a small script and running it from crontab to gather statistics automatically.
    You can do the same via Enterprise Manager, scheduling it on a daily/hourly basis as you wish.
    select 'Gather Schema Stats' from dual;
    select 'Started: '|| to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') from dual;
    -- exec dbms_stats.gather_schema_stats('<my_schema>');
    select '- for all columns size auto' from dual;
    exec DBMS_STATS.gather_schema_stats(ownname => '<my_schema>', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL COLUMNS SIZE AUTO', degree => DBMS_STATS.AUTO_DEGREE, cascade => true, force => true);
    select '- for all indexed columns size skewonly' from dual;
    exec DBMS_STATS.gather_schema_stats(ownname => '<my_schema>', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL INDEXED COLUMNS SIZE SKEWONLY', degree => DBMS_STATS.AUTO_DEGREE, cascade => true, force => true);
    select 'Finished: '|| to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') from dual;
    Good luck!
    http://dba-star.blogspot.com/
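
    To answer the original question more directly: a shrink does not refresh optimizer statistics by itself, so gathering the table's stats afterwards (or letting the automatic stats job pick it up) keeps figures such as BLOCKS in line with the new segment size. A minimal sketch with a placeholder table name:
    alter table my_table enable row movement;
    alter table my_table shrink space;
    exec dbms_stats.gather_table_stats(ownname => user, tabname => 'MY_TABLE', cascade => true);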

  • Gather table stats takes time for big table

    The table has millions of records and it is partitioned. When I analyze it using the following syntax it takes more than 5 hours, and the table has one index.
    I tried auto sample size and also changed the estimate percentage to values like 20, 50, 70, etc., but the time is almost the same.
    exec dbms_stats.gather_table_stats(ownname=>'SYSADM', tabname=>'TEST', granularity=>'ALL', estimate_percent=>100, cascade=>TRUE);
    What should I do to reduce the analyze time for big tables? Can anyone help me?

    Hello,
    The behaviour of ESTIMATE_PERCENT may change from one release to another.
    In some releases, when you specify a "too high" (>25%, ...) ESTIMATE_PERCENT, you in fact collect the statistics over 100% of the rows, as in COMPUTE mode:
    Using DBMS_STATS.GATHER_TABLE_STATS With ESTIMATE_PERCENT Parameter Samples All Rows [ID 223065.1]
    For later releases, 10g or 11g, you have the possibility to use the following value:
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE
    In fact, you may use it even in 9.2, but in that release it is recommended to use a specific estimate value.
    Moreover, starting with 10.1 it is possible to schedule the statistics collection by using DBMS_SCHEDULER and to specify a window so that the job doesn't run during production hours.
    So, the answer may depend on the Oracle release and also on the application (SAP, PeopleSoft, ...).
    Best regards,
    Jean-Valentin
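
    Building on the reply above, a minimal sketch for the partitioned table from the question: let AUTO_SAMPLE_SIZE pick the sample and let Oracle choose the granularity and degree, instead of ESTIMATE_PERCENT => 100 with granularity 'ALL':
    exec dbms_stats.gather_table_stats( -
         ownname          => 'SYSADM', -
         tabname          => 'TEST', -
         granularity      => 'AUTO', -
         estimate_percent => dbms_stats.auto_sample_size, -
         degree           => dbms_stats.auto_degree, -
         cascade          => true);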

  • Gather Schema Stat running long in R12

    Hi:
    I scheduled Gather Schema Statistics (all with 10%) on two PROD instances last night. One completed in about 2 hours (that one normally has more activity), while the other has been running since 12:00 AM. Please give me the steps and commands to troubleshoot it. Is it still running? I have checked that there is no blocking. BTW, I ran Gather Schema Statistics for AP with 10% yesterday afternoon and it completed in less than 2 minutes. Both are on 12.1.3 and 11.1.0.7, on Linux. Thank you in advance.

    I guess I have to do one of these. Thank you.
    Another question: why do some tables not have a date under LAST_ANALYZED? When I checked ALL_TABLES and sorted on LAST_ANALYZED, I saw about 1/3 of the tables with a blank date. They are from all kinds of schemas, for example the two tables below, yet some AR and AP tables are analyzed.
    AR_SUBMISSION_CTRL_GT
    AP_PERIOD_CLOSE_EXCPS_GT
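
    A quick way to list the never-analyzed tables is to query the dictionary; the _GT names above are typically global temporary tables, which stats jobs generally skip, so a blank LAST_ANALYZED for them is expected. An illustrative query:
    select owner, table_name, temporary
    from   dba_tables
    where  last_analyzed is null
    order  by owner, table_name;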

  • Different Ways To Gather Table Statistics

    Can anyone tell me the difference(s), if any, between these two ways to refresh table stats:
    1. Execute this procedure which includes everything owned by the specified user
    EXEC DBMS_STATS.gather_schema_stats (ownname => 'USERNAME', degree => dbms_stats.auto_degree);
    2. Execute this statement for each table owned by the specified user
    analyze table USERNAME.TABLENAME compute statistics;
    Generally speaking, is one way better than the other? Do they act differently and how?

    In Oracle's automatic stats collection, not all objects are included.
    Only those tables which have stale stats are taken for stats collection. I don't remember the exact threshold off the top of my head, but it's either 10% or 20%, i.e. the tables where more than 10% (or 20%) of the data has changed are marked as stale, and only those stale objects are considered for stats collection.
    Do you really think each and every object/table has to be analyzed every day? How long does it take when you gather stats for all objects?
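
    A hedged sketch (10g and later) of how to check and gather only the stale objects yourself, outside the automatic job; flushing the monitoring info just makes the modification counters current before checking:
    exec dbms_stats.flush_database_monitoring_info;
    -- Which of my tables are currently considered stale?
    select table_name, stale_stats, last_analyzed
    from   user_tab_statistics
    where  stale_stats = 'YES';
    -- Gather only the stale statistics for the schema
    exec dbms_stats.gather_schema_stats(ownname => user, options => 'GATHER STALE');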

  • Gather table Statistics

    i need collect the statistics for few tables from a schema. i am doing the following:
    i am creating a table to save the statistics.
    begin
      dbms_stats.create_stat_table(ownname => 'TEST',
                                   stattab => 'TEST_ORD');
    end;
    /
    and then i am gathering the table stats for a particular table as follows:
    exec dbms_stats.gather_table_stats( -
         ownname          => 'BNS', -
         tabname          => 'ORDERS', -
         estimate_percent => 10, -
         stattab          => 'TEST_ORD', -
         statown          => 'TEST', -
         statid           => 'CR');
    My question is: how can I do this for multiple tables? The reason is that I have to create a job to run this script automatically every day.
    Can I put this into a procedure? Or are there any other ideas?
    Appreciate all your help!

    In many environments the default job does a pretty good job of generating statistics, but I have to agree that there are environments where it does not produce very good statistics, which is why I mentioned the ability to lock statistics. For anyone who has not had to deal with statistics much: you can generate a workable set of statistics manually and then lock them in place, so that the provided automatic statistics generation task does not re-generate that specific set of statistics.
    Also, by default the 10g task collects histograms. If histograms were not in use in 9i, then removing the histograms often 'fixes' most of the query performance issues encountered upon upgrading to 10g in these cases. 11g adds the ability to set dbms_stats parameters at the table level and have those settings retained for future stats collections.
    Yes, cron assumes a UNIX (or Linux) platform, but Windows also has a job scheduler available. I am not sure what difference RAC would make; we use cron tasks against RAC all the time, and we normally just run the tasks against the local instance.
    Nicely organized script/code library.
    HTH -- Mark D Powell --
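
    To the original "multiple tables / daily job" question: one hedged option is a small procedure that loops over the tables of interest and gathers each one into the same stat table; it can then be scheduled with DBMS_SCHEDULER or DBMS_JOB. The table list and names below are placeholders:
    create or replace procedure gather_selected_stats is
    begin
      for t in (select column_value as table_name
                from   table(sys.odcivarchar2list('ORDERS', 'ORDER_LINES', 'CUSTOMERS')))
      loop
        dbms_stats.gather_table_stats(
          ownname          => 'BNS',
          tabname          => t.table_name,
          estimate_percent => dbms_stats.auto_sample_size,
          stattab          => 'TEST_ORD',
          statown          => 'TEST',
          statid           => 'CR',
          cascade          => true);
      end loop;
    end;
    /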

  • Why Oracle not using the correct indexes after running table stats

    I created an index on the table and ran a SQL statement. I found via the explain plan that the index was being used and the plan was cheaper, which is what I wanted.
    Later I gathered stats on all the tables and found, again via the explain plan, that the same SQL now uses a different index and a more costly plan. I don't know what is going on. Why is this happening? Any suggestions?
    Thx

    I just wanted to know the cost using the index.
    To gather histograms, use the following (method_opt is the parameter that causes the package to collect histograms):
    BEGIN
      DBMS_STATS.GATHER_SCHEMA_STATS(
        ownname          => 'SCHEMA',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        block_sample     => TRUE,
        method_opt       => 'FOR ALL COLUMNS SIZE AUTO',
        degree           => 4,
        granularity      => 'ALL',
        cascade          => TRUE,
        options          => 'GATHER');
    END;
    /

  • How to gather schema stats in 11.5.9

    Hi,
    I am trying to gather schema stats for 11.5.9, but the schema name parameter does not show a list of schemas. Can I enter ALL?
    Regards
    Taher

    Hi Taher,
    You can either enter a schema name, or even ALL should work.
    If you refer to the link Helios posted: Gather Schema Statistics is run on the schemas or tables where data changes frequently. So if you enter ALL, it will gather statistics for all the schemas, which might not be required, and since it runs across all the schemas it might take a long time to complete.
    It is good practice to run gather statistics at the object level where there is a lot of data growth or change.
    Regards,
    Mithun

  • Create Create table statement dynamicallly

    I am trying to create a stored procedure that returns a "CREATE TABLE" statement dynamically for a table called "Employee" in the database. How can I create a dynamic CREATE TABLE statement using sys.tables? The CREATE TABLE statement should contain all the columns from the Employee table. I am using SQL Server 2008 R2.

    Hi SSAS_5000,
    If you don't care about the constraints and dependencies on the table Employee, and what you'd like is just the table structure, you may use the statements below instead of generating and executing a CREATE statement from an SP:
    SELECT TOP 1 * INTO desiredTable FROM Employee;
    TRUNCATE TABLE desiredTable;
    If you have question, feel free to let me know.
    Eric Zhang
    TechNet Community Support
