Dynamic sampling and statistics collection

Hi all,
I'm a newbie and I have a question; please correct me if I'm wrong.
The documentation I studied says that dynamic sampling collects statistics on objects, but I'm not sure whether that happens during query execution or as part of an automatic job.
I also read that starting with 10g, Oracle collects stats for objects as a job (using the DBMS_STATS package); the job is scheduled for new or stale objects and runs between 10 pm and 6 am.
If Oracle already runs a job to collect stats, what is dynamic sampling good for?
Please fill me in. Can somebody also explain the query optimizer components?
Thanks in advance.
Dev

Assume stats are collected every day at 02:00. Beginning at 08:00, users start making changes. By 11:45 the stats may no longer bear a close relationship to what was collected at 02:00 that morning.
I thought the explanation in the docs was clear. What doc are you reading (please post the link).
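Dynamic sampling fills exactly that gap: it lets the optimizer sample the object at parse time when statistics are missing or suspect. A minimal sketch of how to try it on a single statement (the table and column names are made up for illustration, and level 4 is just an example value):

-- Hypothetical demo table that has no statistics gathered yet
CREATE TABLE t AS
  SELECT level AS id, MOD(level, 10) AS bucket
  FROM dual CONNECT BY level <= 10000;

-- Ask the optimizer to sample T at hard parse time for this statement only
SELECT /*+ dynamic_sampling(t 4) */ COUNT(*)
FROM   t
WHERE  bucket = 1;

-- Or enable the same behaviour for the whole session
ALTER SESSION SET optimizer_dynamic_sampling = 4;

The sampling happens when the statement is optimized, which is precisely the moment at which the nightly job's statistics may be missing or already stale.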

Similar Messages

  • (10g) Automatic Optimizer Statistics Collection

    Product: ORACLE SERVER
    Date written: 2006-07-21
    PURPOSE
    This document introduces the Oracle 10g new feature Automatic Optimizer
    Statistics Collection and describes how it works.
    Explanation
    1. Overview
    Optimizer statistics are collected automatically by GATHER_STATS_JOB.
    This job is owned by SYS and its OBJECT_TYPE is JOB.
    It gathers statistics for all objects in the database that have missing
    or stale statistics.
    2. Configuration and mechanism of automatic statistics collection
    1) STATISTICS_LEVEL = TYPICAL | ALL
    2) Statistics are gathered by the predefined GATHER_STATS_JOB.
    3) When the job runs, it determines the following:
    - which objects have missing or stale statistics;
    - the appropriate sampling percentage needed to produce good statistics;
    - the columns that require histograms and the histogram sizes;
    - the degree of parallelism for the statistics collection;
    - the priority order in which objects' statistics should be gathered.
    3. Description of GATHER_STATS_JOB
    This job is created at database creation time and is managed by the Scheduler.
    GATHER_STATS_JOB gathers statistics by calling the
    DBMS_STATS.GATHER_DATABASE_STATS_JOB_PROC procedure.
    This procedure behaves very much like the DBMS_STATS.GATHER_DATABASE_STATS
    procedure with the 'GATHER AUTO' option. The difference is that
    GATHER_DATABASE_STATS_JOB_PROC prioritizes the objects that need statistics
    and processes them in that order, so the objects whose statistics most need
    updating are processed first.
    This ensures that the most needed statistics are gathered before the
    maintenance window closes.
    4. Statistics on dictionary objects
    1) Starting with Oracle Database 10g, statistics can also be gathered on
    dictionary tables in order to obtain optimal performance.
    At any time, it is possible to gather statistics on the dictionary tables
    using the DBMS_STATS.GATHER_SCHEMA_STATS procedure.
    In this case the GATHER_SYS argument must be set to TRUE.
    2) A new procedure called DBMS_STATS.GATHER_DICTIONARY_STATS can also be
    used. Using it requires the new ANALYZE ANY DICTIONARY system privilege.
    This privilege allows a user without the SYSDBA privilege to analyze
    dictionary objects and fixed objects.
    3) The GATHER_DATABASE_STATS procedure has a new argument called
    GATHER_FIXED. Its default value is FALSE, so by default no statistics
    are created for fixed tables.
    Analyzing the fixed tables once, while a typical system workload is
    present, is sufficient.
    4) It is also possible to gather statistics on fixed tables using the
    GATHER_FIXED_OBJECTS_STATS procedure, to delete the statistics on all
    fixed tables, and to export or import fixed-table statistics.
    Example
    none
    Reference Documents
    <Note:266040.1>
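    For reference, a minimal sketch of the dictionary and fixed-object calls the note above mentions (run from a suitably privileged account; scheduling is left to your own maintenance policy):
    BEGIN
      -- Gather statistics on the data dictionary tables
      -- (requires ANALYZE ANY DICTIONARY if you are not connected as SYSDBA)
      DBMS_STATS.GATHER_DICTIONARY_STATS;
      -- Gather statistics on the fixed (X$) tables; best done once while a
      -- representative workload is running, as the note points out
      DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
    END;
    /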

    Hi,
    Please see here:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#i41448
    If the tables are changing very frequently, it is better to gather their stats manually; otherwise the volatile tables would keep coming up in the automatic stats job again and again.
    System stats and data dictionary stats are not collected by default, so there is no choice but to gather them manually.
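    A minimal sketch of such a manual gather for a volatile table, typically run right after the table is reloaded (the schema and table names are hypothetical):
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'SCOTT',          -- hypothetical schema
        tabname          => 'VOLATILE_TAB',   -- hypothetical table
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        cascade          => TRUE);            -- gather index statistics as well
    END;
    /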
    Aman....

  • How to stop the automatic statistics collection job after the maintenance window

    Hi,
    we are looking for a solution to stop the automatic statistics collection job once the maintenance window has finished.
    We disable all jobs except the automatic statistics collection, because this is the only one we want to run. Then we define specific values for the interval and duration parameters of the maintenance window to customize this task.
    But for these systems it is very important that this job/task stops immediately when the window is closed!!!
    So, how can we ensure this behavior?
    For Oracle 10g it is easy, because the statistics job always exists: it is possible to set its duration and create an additional event-based job which kills all jobs that run past their duration.
    In Oracle 11g the statistics job is created by the system while the maintenance window is open.
    We are not able to modify the parameters of this system job. After the maintenance window closes the job is still running - only with a different resource priority - but it is running.
    Please help me with this scenario.
    Thanks&Regards
    Prem

    > So basically you are saying that if none of the tables are changed then GATHER_STATS_JOB will not run, but I see tables are updated and still the job is not running. I did query dba_scheduler_jobs and the job is enabled and its state is SCHEDULED. Please see my previous post for the output.
    > Am I missing anything here? Do I need to look at some parameter settings?
    GATHER_STATS_JOB will run, and if there is any table in which there has been a 10 percent change in the data, it will gather statistics on that table. If no table has changed by more than 10 percent, it will not gather statistics.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#i41282
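    To see how much tracked change volume a table has accumulated, you can compare DBA_TAB_MODIFICATIONS with the stored row counts; a rough sketch (the schema name is hypothetical):
    -- Flush the in-memory DML monitoring counters so the view is current
    EXEC DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO;
    SELECT m.table_owner, m.table_name,
           m.inserts + m.updates + m.deletes AS changed_rows,
           t.num_rows
    FROM   dba_tab_modifications m
           JOIN dba_tables t
             ON  t.owner = m.table_owner
             AND t.table_name = m.table_name
    WHERE  m.table_owner = 'SCOTT';   -- hypothetical schema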
    Hope this helps.
    -Anantha

  • How to disable automatic statistics collection on tables

    Hi
    I am using Oracle 10g and we have a few tables which are frequently truncated and then have new rows added to them. Oracle automatically analyzes these tables at some point, which collects the table statistics, but at the wrong time (when the table is empty). This makes my query do a full table scan rather than use indexes, since the statistics were collected when the table was empty. Could anyone please let me know how to disable the automatic statistics collection feature of Oracle?
    Cheers
    Anantha PV

    First of all I think it's important that you understand why Oracle collects statistics on these tables: Because it considers the statistics of the object to be missing or stale. So if you just disable the statistics gathering on these tables then you won't have statistics at all or outdated statistics.
    So as said by the previous posts you should gather the statistics manually yourself anyway. If you do so right after loading the data into the truncated table, you don't need to disable the automatic statistics gathering as it only processes objects that are stale or don't have statistics at all.
    If you still think that you need to disable it there are several ways to accomplish it:
    As already mentioned, for particular objects you can lock the statistics using DBMS_STATS.LOCK_TABLE_STATS, or for a complete schema using DBMS_STATS.LOCK_SCHEMA_STATS. These statistics then won't be touched by the automatic gathering job. You can still gather statistics manually using the FORCE=>TRUE option of the GATHER_*_STATS procedures.
    If you want to change the automatic gathering job so that it only gathers statistics on objects owned by Oracle (data dictionary, AWR etc.), you can do so by calling DBMS_STATS.SET_PARAM('AUTOSTATS_TARGET', 'ORACLE'). This is the recommended method.
    If you disable the scheduled job as mentioned in the documentation, by calling DBMS_SCHEDULER.DISABLE('GATHER_STATS_JOB'), then no statistics at all will be gathered automatically, causing your data dictionary statistics to become stale over time, which could lead to suboptimal performance of queries on the data dictionary.
    All this applies to Oracle 10.2, some of the features mentioned might not be available in Oracle 10.1 (as you haven't mentioned your version of 10g).
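    A minimal sketch of the calls described above (10.2 syntax; the schema and table names are hypothetical):
    BEGIN
      -- Lock statistics for one table or for a whole schema
      DBMS_STATS.LOCK_TABLE_STATS(ownname => 'SCOTT', tabname => 'VOLATILE_TAB');
      -- DBMS_STATS.LOCK_SCHEMA_STATS(ownname => 'SCOTT');
      -- Recommended: restrict the automatic job to Oracle-owned objects only
      DBMS_STATS.SET_PARAM('AUTOSTATS_TARGET', 'ORACLE');
      -- Not recommended: disable the automatic job completely
      -- DBMS_SCHEDULER.DISABLE('GATHER_STATS_JOB');
      -- A manual gather still works on locked statistics with FORCE => TRUE
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCOTT',
                                    tabname => 'VOLATILE_TAB',
                                    force   => TRUE);
    END;
    /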
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle:
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • Sample and Hold Application.

    I am acquiring data with the NI ELVIS on an analog input channel A0.
    I need to sample and hold the data according to a 480 Hz (2.08 ms) clock as follows:
    1. Acquire data for the 0.69 ms (1st third of the cycle) and place the data into a new waveform.
    2. Acquire data from 0.69 ms to 1.39 ms (the 2nd third of the cycle) and place the data into a new waveform.
    3. Acquire data from 1.39 ms to 2.08 ms (last third of the cycle) and place the data into a new waveform.
    Currently, I have built a VI (attached) to generate two clock signals. The first clock signal turns on a red LED for 0.69 ms, then turns off for the rest of the cycle. The second clock signal turns on when the first clock signal transitions to a low state, stays on for the next 0.69 ms, and remains off for the rest of the cycle. Both LED illumination signals are picked up by a photodetector, which is why I need the sample and hold application explained above.
    If you could help that would be great. My current VI is attached. Thanks so much!
    -David
    Attachments:
    SaO2.vi ‏146 KB

    Correct me if I'm wrong, you are turning on two LEDs in a pattern and reading the light level. You want to have three separate waveforms for the light reading, based on when the lights are turned on and off.
    If this is the case, here is what I would try:
    Trigger your analog input reading off of the turning on of the first LED. You might have to put an initial delay into your clock pulse (the one that turns on the first LED). This initial delay will ensure that the LED is not turned on before the analog input has been started. Data collection will start when the first LED is turned on (the trigger) and will end after the appropriate amount of time. (You could end it in many ways: the user presses a stop button, you configure collection for a finite number of data points, the final pulse has ended, and so on.) After data collection is finished, take the single waveform that is storing all of the light data and use the functions in the Waveform palette to split it into your three waveform subsets.
    I suppose another way to produce the three waveforms would be to wire your light sensor into three analog inputs, then configure three separate voltage inputs with appropriate triggers based on the turning on and off of the LEDs. CH 1 triggers when LED 1 is turned on, CH 2 triggers when the LED 1 is turned off, CH 3 is triggered when LED 2 is turned on.
    I hope this is helpful in some way,
    Sam

  • Optimizer Statistics collection after upgrade from 8i to 10R2

    I just upgraded a database from 8.1.7 to 10gR2.
    What would be the best approach for optimizer statistics collection? We would like to open the database for testing, but I'm afraid some queries are going to run slowly without up-to-date stats. Should I run it manually, or let Oracle run its default stats collection job later on?
    Any suggestions?

    user594143
    You really need a strategy before an upgrade like this, but you have two options -
    a) try to make the 10g stats collection identical to the 8i stats collection. Check the code you used to run, check the 8i default values for the parameters in your current dbms_stats() calls, and write them in explicitly when you run the code under 10g.
    OR
    b) do a full 10g conversion. Get rid of your own collection code, clear out most of the old settings you had in your parameter file for fiddling with the optimizer, do a 'gather_schema_stats' then leave 10g to do its default thing and fix any problems that appear.
    If you have testing time on a non-production system, then (b) is the strategic option - although personally I think it tends to collect too many histograms and still needs some refinement; if you don't have any testing time and you're going straight into production then (a) is the least threatening option (and if someone's made you do that, you might also set the optimizer_features_enable to 8.1.7 until you can do some proper upgrade tests).
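    As a rough sketch of option (b), and of the fallback parameter mentioned in option (a) (the schema name and values shown are examples only):
    -- Option (b): let 10g gather schema statistics with its own defaults
    BEGIN
      DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'APP_OWNER');   -- hypothetical schema
    END;
    /
    -- Option (a) fallback: keep 8.1.7 optimizer behaviour until proper upgrade testing is done
    ALTER SYSTEM SET optimizer_features_enable = '8.1.7' SCOPE = SPFILE;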
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk

  • CDMC - Activate Statistics Collection

    Hi,
    I have a question related to CDMC Statistics Collection. It is not clear to me, after activating the collection, what the periods to evaluate and the period type should be.
    Please can you clarify?
    Best Regards,
    Zsuzsanna

    Hi Zsuzsanna,
    As shown in the screenshot in my first reply, it takes the given period and period type as input and fetches usage statistics data for that time period. To schedule the same collection every three months, you need to provide input in the next pop-up screen; there you can schedule three different jobs for the future as per your requirement. According to the period and period type given, these jobs will be scheduled and run in the statistics/production system and will update the usage information in the CDMC table in the statistics system.
    According to the given input it will schedule the jobs and run them in the statistics/production system. If you give 1 day, the job will run daily and will update and store usage statistics data in the CDMC database table in the statistics system itself, not in the Solution Manager system.
    This data only gets imported into Solution Manager when you create a new Clearing Analysis project and execute its activities.
    If you are expecting the latest usage results then yes, you need to create a new CDMC Clearing Analysis project every time, as this usage information will not get updated automatically in the CDMC project.
    Please let me know if you have any doubts.
    Best Regards,
    Pritish

  • Why does the snmpcoll process always start up even if SNMP Statistics Collection is turned off?

    Why does the snmpcoll process always start up even if SNMP
    Statistics Collection is turned off?
    The SNMP Statistics Collection field on the Server Status|SNMP
    Subagent Configuration form affects only the SNMP subagent, which is a separate entity from
    the collector process (snmpcoll). Currently, snmpcoll does not look
    at the SNMP configuration data when it starts up. It is started by
    the dispatcher and terminates when the dispatcher terminates. Many
    of the statistics are supposed to reflect cumulative values recorded since MTA
    initialization, such as the total number of messages sent and received.
    As a convenience, the collector process collects information in the
    background even while the SNMP subagent is turned off; this way,
    the values are available when the SNMP subagent is configured and
    turned on.

    Collections fall off on the Date of First Delinquency 7-7.5 years later; that is supposed to be reported as the first time you went late (1-30 days) on the OC (original creditor) in the chain of events leading up to its being farmed out for collection, as I understand it. The DOFD, though, sometimes doesn't get reported right, and by default it will be the date the collection is added to the bureaus. As an example, I have an awkward collection (not sure how I got it; looking at the payment history I thought I'd cancelled that account, but TWC billing apparently didn't think so) which I went late on in 2009 but which didn't get farmed out until late 2010. The DOFD is 6/09 looking at it, and it's expected to come off according to the bureaus 6/16 as a result. Different negatives have their own rules though.

  • Performance Problems - Index and Statistics

    Dear Gurus,
    I am having problems losing indexes and statistics on cubes. It seems my indexes are too old, which in fact they are not - they were just created a month back - and we check the indexes daily and the Manage tab returns RED.
    please help

    Dear Mr Syed ,
    The solution steps I mentioned in my previous reply already explain the so-called re-org of tables; however, to clarify more on that issue:
    Occasionally, the ORACLE Cost-Based Optimizer may calculate the estimated cost for a full table scan lower than that for an index scan, although the actual runtime of an access via an index would be considerably lower than the runtime of the full table scan. Some imperative points need to be considered in order to improve performance in problem areas such as long running times for change runs and aggregate activation and fill-ups.
    Performance problems based on a wrong optimizer decision indicate that something serious is missing at the database level, and we need to re-org the degenerated indexes in order to improve overall performance and avoid daily manual (RSRV + RSNAORA) activities on almost the same indexes.
    For re-organizing degenerated indexes, 3 options are available:
    1) DROP INDEX ..., and CREATE INDEX ...
    2) ALTER INDEX <index name> REBUILD (ONLINE PARALLEL x NOLOGGING)
    3) ALTER INDEX <index name> COALESCE [as of Oracle 8i (8.1) only]
    Each option has its pros and cons; option 2 seems to have the most to offer (a concrete example follows the list below).
    Advantages - option 2:
    1) Fast storage in a different tablespace possible
    2) Creates a new index tree
    3) Gives the option to change storage parameters without deleting the index
    4) As of Oracle 8i (8.1), you can avoid a lock on the table by specifying the ONLINE option. In this case, Oracle waits until the resource has been released, and then starts the rebuild. The "resource busy" error no longer occurs.
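    For illustration, a rebuild of a single degenerated index might look like this (the index name and the degree of parallelism are made up; check the exact syntax against your Oracle release):
    ALTER INDEX "SAPR3"."/BIC/E100069~P" REBUILD ONLINE PARALLEL 4 NOLOGGING;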
    I would still leave it to the database tech team to judge and take the final call on these.
    This modus operandi could be institutionalized for all the affected cubes and their indexes as well.
    However,I leave the thoughts with you.
    Hope it Helps
    Chetan
    @CP..

  • Hello.... I already loaded Logic 9 on my laptop hard drive... I would like to load all the samples and loops on an external drive... Do I need to reinstall from DVDs or can I drag and drop?

    Hello.... I already loaded Logic 9 on my laptop hard drive... I would like to load all the samples and loops on an external drive... Do I need to reinstall from DVDs or can I drag and drop?

    Thank you.... Would you know where they would be on my hard drive? Are they all together?

  • Is there any way to have a COMPLETE list of all samples and loops ( Logic 8

    I'm sure it's been posted previously, but anyway, I'm looking to buy new Apple Loops libraries and I have faced the fact that many of the sounds and loops produced by third-party manufacturers have already been included on the Logic installation discs.
    I have a list of the Apple Loops DVDs that have been included with Logic somewhere; the problem is, Apple didn't post (I believe) the credits of all the 3rd-party manufacturers that produced the libraries included with Logic or GarageBand. Is there any way to get a COMPLETE list of all the samples and loops that are included with Apple DAWs so I wouldn't duplicate anything? I'm pretty positive that Apple had other companies sound design and sample all the libraries for them, so is there also a way to get a list of those manufacturers? Again, the objective is to start upgrading the sound library without any possible duplication.
    Thanks in advance!

    Chris, I certainly don't mind additional questions and postings.
    I believe there are many issues with Apple Loops and Logic that need to be resolved, and people need to be aware of that. Unfortunately, unlike the old "German" version of Logic, there's no lifetime tech support; you can't even call and ask a question after 60 days, which isn't right for professional software of this level, especially considering that many things still remain vague in Logic even for developers and tech support people! (Believe me, I've called and asked!)
    Issues like that need to be resolved over the phone with the company, period!
    One thing I also learned over the years as a Mac OS X user: if something doesn't work, don't mess with it. Wipe your drive and re-install everything. This is very frustrating, I know, but unfortunately it is the only way to deal with OS X issues; if you have a problem with your system, don't try to fix it - it's never going to be the same again. I know it's off topic a little bit, but if your content is missing from the system files, then before installing your new Logic, back up your important files, wipe your drive and clean-install Mac OS X, run the updates and then install Logic. Most likely everything will be in its place, at least 90% or more. I gave up trying to make the two systems in my house 100% compatible, but it's OK if they're 90% or more identical. I spent an enormous amount of time trying to find out what's missing and where; I visited most of the forums and there's no clear answer.
    Message was edited by: Moderator

  • How to output sample and convert clocks to PFI lines of E-Series DAQ (DAQPad-6015)

    Hi,
    Can someone tell me how to output sample and convert clocks to PFI lines of E-Series DAQ (DAQPad-6015)?
    Thank you very much.
    Jack

    John --
    Windows is not an option for me. I like your idea of using a counter output -- it may be helpful as I am getting ramped up, but my application will eventually require both timer outputs.
    I have a legacy C application written for Macintosh, and I am in the process of moving it onto OS X. So my options are to use DAQmx Base, or write an in-kernel driver. I actually have already done the latter for 6024/6025 E-series boards (for another company); for this client I was hoping to use the DAQmx Base to allow an easy transition to M-series boards, without the cost of writing and supporting a low-level driver.
    The specific task I am doing is relatively straightforward. I record 2 channels of AI for a short period (usually about 250 ms.) and during this time I drive 2 external digital signals. Right now, I use the 2 timer outputs, which allows precise synchronization with the output and AI sampling.
    I appreciate your comments, and thanks in advance for any additional suggestions you can lob my way.
    --spg
    Scott Gillespie
    Applied Brain, Inc.

  • Help required for sample and Hold

    Hello,
    I am working on a project on cerebral oxygenation monitoring. The concept is similar to pulse oximetry. I have already developed the hardware, which includes the timing circuit, the LED driver for driving the red and IR LEDs, and the amplifier and filter stage. I am getting the pulse signal, but it is a multiplexed signal corresponding to the effect of both the red and IR LEDs. To separate it I need to use a sample-and-hold circuit, which I could achieve with an LF398 IC, but I want to minimise this part and instead take the multiplexed signal directly through the DAQ card into LabVIEW and create the sample and hold in LabVIEW.
    The sampling needs to be synchronous with the timing signal given to the red and IR LEDs. How can I achieve this in LabVIEW?
    For example: the red LED is triggered with a pulse of 1 ms with a repetition rate of 10 ms, so this timing pulse should trigger the sampling part.
    Please Help.

    CoastalMaineBird wrote:
    Not sure what you need the HOLD part for.
    Correct me if I'm wrong:
    You have a 1 mSec pulse, every 10 mSec.
    Each pulse triggers two LEDs:  RED and IR
    You have a single signal which contains the processed (through the body, or whatever) responses to BOTH of those signals.
    So, how does a S&H, hardware or otherwise, separate the two responses?
    Is one delayed in time, relative to the other?
    YES, you are right: a 1 ms pulse, every 10 ms.
    The figure above shows the trigger pulses generated using standard hardware. One triggers the IR LED and the other triggers the red LED. In the case above I had set the repetition rate to 4 ms.
    This is the signal which I may get.
    Now how do I separate these two signals in LabVIEW?

  • Hello, I have a 5 min. part of a piece of meditation music (I think it's from the 1970s-80s). Is there a website where I can upload my sample and it will tell me who it is? I want to get a new version.... Thanks

    Hello, I have a 5 min. part of a piece of meditation music (I think it's from the 1970s-80s). Is there a website where I can upload my sample and it will tell me who it is? I want to get a new version.... Thanks............ It's in MP3 format

    Well... it's not actually an iTunes or even an Apple question is it? But you can use either Shazam or Soundhound.
    If you do a search for both of them, you will find the various options available to you, which include apps for portable devices and, in the case of Shazam, a workaround that allows you to use an online option with your computer's microphone on either a PC or Mac.
    There may be other services as well, but I know these two.
    I do not receive any reward (financial or otherwise) as a result of mentioning either Shazam or Soundhound.

  • Report to list all computers and their collection membership

    Hi
    I am currently working on a site where direct membership is used for collections but a need has arisen to move to AD Queries.
    I have created a simple powershell script that creates groups based on the contents of a csv file and another script which populates this with the members listed in another csv file.
    To help speed up the process, is there a way to generate a report that lists ALL computers and their collection membership?
    The only built-in reports I seem to find require an input value of either computer name or collection ID. I simply need a report that lists the computer name in column 1 and the collection name in column 2, for all computers and all collections.
    Many Thanks,
    Matt Thorley

    -- Lists every computer together with each collection it belongs to
    select
        FCM.Name AS ComputerName,
        C.Name   AS CollectionName
    from
        dbo.v_Collection C
        join dbo.v_FullCollectionMembership FCM on C.CollectionID = FCM.CollectionID
    order by
        FCM.Name, C.Name
    Thanks to Garth for original query. I just modified it :)
    Anoop C Nair (My Blog www.AnoopCNair.com)
    - Twitter @anoopmannur -
    FaceBook Forum For SCCM
