SCHEMA STATS OR ANALYZE TABLE

Hi,
Can anyone tell me the advantages of schema stats or table stats (DBMS_STATS) over analyzing the table?
Regards,
Umair

Hi,
Following are the advantages of using DBMS_STATS to gather table, schema, or database stats:
A. ANALYZE always runs serially. DBMS_STATS can run in serial or parallel.
Whenever possible, DBMS_STATS calls a parallel query to gather statistics with
the specified degree of parallelism; otherwise, it calls a serial query or the
ANALYZE statement. Index statistics are not gathered in parallel.
B. ANALYZE calculates global statistics for partitioned tables and indexes
instead of gathering them directly. This can lead to inaccuracies for some
statistics, such as the number of distinct values.
C. For partitioned tables and indexes, ANALYZE gathers statistics for the
individual partitions and then calculates the global statistics from the
partition statistics. DBMS_STATS can gather separate statistics for each
partition as well as global statistics for the entire table or index. Depending
on the SQL statement being optimized, the optimizer may choose to use either
the partition (or subpartition) statistics or the global statistics.
D. For composite partitioning, ANALYZE gathers statistics for the subpartitions
and then calculates the partition statistics and global statistics from the
subpartition statistics. DBMS_STATS can gather separate statistics for
subpartitions, partitions, and the entire table or index. Depending on the SQL
statement being optimized, the optimizer may choose to use either the partition
(or subpartition) statistics or the global statistics.
E. ANALYZE cannot overwrite or delete some of the values of statistics that were
gathered by DBMS_STATS.
F. ANALYZE can gather additional information that is not used by the optimizer,
such as information about chained rows and the structural integrity of indexes,
tables, and clusters. DBMS_STATS does not gather this information. DBMS_STATS
gathers statistics only for cost-based optimization; it does not gather other
statistics. For example, the table statistics gathered by DBMS_STATS include
the number of rows, number of blocks currently containing data, and average row
length but not the number of chained rows, average free space, or number of
unused data blocks.
G. DBMS_STATS does not call statistics collection methods associated with
individual columns. Use the ANALYZE statement to gather such information.
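In practice the two approaches look like this; a minimal sketch, where the SCOTT/EMP names and the degree of parallelism are placeholders:

```sql
-- Old approach: always serial; deprecated for gathering optimizer statistics
ANALYZE TABLE scott.emp COMPUTE STATISTICS;

-- Recommended approach: supports sampling and parallelism
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SCOTT',
    tabname          => 'EMP',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,  -- let Oracle pick the sample size
    degree           => 4,       -- parallel degree; ANALYZE has no equivalent
    cascade          => TRUE);   -- also gather statistics on the table's indexes
END;
/
```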

Similar Messages

  • OMBPlus "Analyze table statements" and "Analyze table sample percentage"

    Hello!
    I want to use OMBPlus to fetch the properties a mapping has for "Analyze table statements" and "Analyze table sample percentage".
    I have tried to use OMBRETRIEVE MAPPING but I can't find the right GET PROPERTIES.
    I hope YOU can help me.
    Best regards,
    Tina

    Hi Tina,
    try
    OMBRETRIEVE MAPPING 'MAP_DUMMY' GET PROPERTIES (ANALYZE_TABLE_STATEMENTS)
    OMBRETRIEVE MAPPING 'MAP_DUMMY' GET PROPERTIES (ANALYZE_TABLE_SAMPLE_PERCENTAGE)
    Regards,
    Carsten.

  • SQL slow after weekly STATS job, then ANALYZE TABLE makes it fast again

    Oracle R11.2.0.2:
    I had some slow SQL / reports and found that the SQL is noticeably slow after the weekend, when the stats job BSLN_MAINTAIN_STATS_JOB and other jobs run weekly as SYS.
    I ran dbms_stats.GATHER_TABLE_STATS on the schema; it doesn't help.
    But when I run ANALYZE TABLE afterwards on only one or two tables of the schema, the SQL / report performance is good and fast again.
    In dba_tables I can see the last_analyzed date, and GLOBAL_STATS = NO (when the table was analyzed with ANALYZE) vs. GLOBAL_STATS = YES (when it was gathered with DBMS_STATS).
    What does the ANALYZE TABLE command do that makes my SQL run fast, while dbms_stats.GATHER_TABLE_STATS does not seem to work in this situation?
    regards

    astramare wrote:
    Oracle R11.2.0.2:
    I had some slow SQL / reports and found that the SQL is noticeably slow after the weekend, when the stats job BSLN_MAINTAIN_STATS_JOB and other jobs run weekly as SYS.
    I ran dbms_stats.GATHER_TABLE_STATS on the schema; it doesn't help.
    What options do you use for the gather_stats command?
    Have you also collected system stats?
    But when I run ANALYZE TABLE afterwards on only one or two tables of the schema, the SQL / report performance is good and fast again.
    ANALYZE TABLE is deprecated, but still does part of the work. It is not as complete as DBMS_STATS.
    In dba_tables I can see the last_analyzed date, and GLOBAL_STATS = NO (ANALYZE) vs. GLOBAL_STATS = YES (DBMS_STATS).
    It must have something to do with the way you use it.
    HTH
    FJFranken
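To see which mechanism last touched a table's statistics, you can query the dictionary; the schema name below is a placeholder:

```sql
SELECT table_name, last_analyzed, global_stats, num_rows
FROM   dba_tables
WHERE  owner = 'SCOTT'               -- placeholder schema
ORDER  BY last_analyzed NULLS FIRST;
```

GLOBAL_STATS = YES indicates globally gathered statistics (DBMS_STATS); NO indicates statistics from ANALYZE or aggregated from partition-level gathers.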

  • Exclude MV Table from gather schema stats

    Hey,
    I am running the daily stats gathering procedure below.
    But it is running at the same time as an MV refresh, and it fails because of that.
    Is it possible to exclude the materialized view from the schema stats procedure below?
    BEGIN
    dbms_stats.gather_schema_stats(ownname=>'SCOTT',estimate_percent=>dbms_stats.auto_sample_size,degree=>2);
    END;
    This is running on Oracle 10.2.1.0, on Linux env.

    You could lock the statistics on the tables you don't want to re-gather: dbms_stats.lock_table_stats
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_stats.htm#i1043993
    The schema-level gather will then skip those objects.
    Nicolas.
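A sketch of that approach; the owner and table names are placeholders:

```sql
-- Lock statistics on the MV's container table so the schema-level gather skips it
BEGIN
  DBMS_STATS.LOCK_TABLE_STATS(ownname => 'SCOTT', tabname => 'MY_MV');
END;
/

-- The existing daily job then ignores the locked table
BEGIN
  dbms_stats.gather_schema_stats(
    ownname          => 'SCOTT',
    estimate_percent => dbms_stats.auto_sample_size,
    degree           => 2);
END;
/
```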

  • Analyze table

    Dear all,
    What are the advantages if we do
    analyze table <table name> compute statistics
    Whenever I run this it consumes huge CPU and takes a long time.
    As far as I know, it will analyze the table and count the number of rows properly.
    Is there any advantage for an INDEX if we do this operation?
    Regards,
    Vamsi

    Hey,
    Actually this command is old (9i and below); it is still there for backwards compatibility.
    Since you are computing the statistics, Oracle reads all the records to get the stats. You can estimate statistics instead.
    Or, even better, you can use the newer DBMS_STATS package:
    EXEC DBMS_STATS.GATHER_TABLE_STATS (ownname=>'SCHEMA_NAME', tabname=>'TABLE_NAME', estimate_percent=>dbms_stats.auto_sample_size, degree=>2);
    This uses the DBMS_STATS package to estimate the statistics via a sample size chosen automatically by Oracle.
    You can use a similar command to estimate the stats of the whole schema:
    EXEC DBMS_STATS.GATHER_SCHEMA_STATS (ownname=>'SCHEMA_NAME', estimate_percent=>dbms_stats.auto_sample_size, degree=>2);
    From experience, creating an index is not enough; you also have to gather stats on the related table so that the execution plan gets optimized.
    This command gathers table stats, histograms, uniqueness, etc.

  • ORA-20000: Unable to analyze TABLE "ECI"."COUNTRY"

    Oracle9i 9.2.0.7 on Windows Server 2003 32bit
    Using the "ANALYZE" in the Enterrpise Manager Console
    begin
    dbms_stats.gather_table_stats(ownname=>'ECI',tabname=>'COUNTRY',partname=>NULL);
    end;
    ORA-20000: Unable to analyze TABLE "ECI"."COUNTRY", insufficient privileges or does not exist
    ORA-06512: at "SYS.DBMS_STATS", line 10292
    ORA-06512: at "SYS.DBMS_STATS", line 10315
    ORA-06512: at line 2
    Using SQLPLUS
    SQL>begin
    2>dbms_stats.gather_table_stats(ownname=>'ECI',tabname=>'country',partname=>NULL);
    3>end;
    4>/
    ORA-20000: Unable to analyze TABLE "ECI"."COUNTRY", insufficient privileges or does not exist
    ORA-06512: at "SYS.DBMS_STATS", line 10292
    ORA-06512: at "SYS.DBMS_STATS", line 10315
    ORA-06512: at line 2
    COMMENT: I noticed here that even though I specifically used (tabname=>'country') it still used "ECI"."COUNTRY" (all caps) when executing my statement.
    I also tested on other procedure.
    Using SQLPLUS
    SQL>begin
    2>dbms_redefinition.can_redef_table('ECI','country',dbms_redefinition.cons_use_pk);
    3>end;
    4>/
    BEGIN
    ERROR at line 1:
    ORA-00942: table or view does not exist
    ORA-06512: at "SYS.DBMS_REDEFINITION", line 8
    ORA-06512: at "SYS.DBMS_REDEFINITION", line 247
    ORA-06512: at line 2
    I don't understand why this error happens because
    a) the schema and table exist (I double checked)
    b) the error only happens on a single schema for only the old tables, when I create new tables I could "ANALYZE" it. I also can "ANALYZE" the indexes.
    c)I have used both the sys and system user logging in as SYSDBA
    In the following exercise, I noticed that "ECI"."productrange" will work but "ECI"."PRODUCTRANGE" won't:
    SQL>select count(*) from "ECI"."productrange";
    COUNT(*)
    8
    SQL>select count(*) from "ECI"."PRODUCTRANGE";
    select count(*) from "ECI"."PRODUCTRANGE"
    ERROR at line 1:
    ORA-00942: table or view does not exist
    Can anyone kindly help me?

    You should not create tables in Oracle with names enclosed in double quotes; Oracle then preserves the case, making it difficult for others to reference the table.
    Recreate the table without double quotes (perhaps via CTAS) and everything should work fine.
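A quick illustration of the quoting behaviour; the names are examples. DBMS_STATS upper-cases an unquoted name internally, but passing the name with embedded double quotes should preserve the case:

```sql
CREATE TABLE "country" (id NUMBER);  -- quoted: stored case-sensitively as country
CREATE TABLE country2  (id NUMBER);  -- unquoted: folded to COUNTRY2

-- Fails: 'country' is upper-cased to COUNTRY, which does not exist
-- EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ECI', tabname => 'country');

-- Works: embedded double quotes preserve the lower-case name
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ECI', tabname => '"country"');
```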

  • OWB ORA-20000 unable to analyze table

    During execution I received the error above. In the mapping, I have two targets in two different schemas. I receive the error on the target that is in a different schema than the mapping.
    Is there a workaround? We do not have rights to grant ANALYZE ANY TABLE to the schema owner of the mapping.

    In OWB select the mapping, context menu Properties -> Code generation options -> set Analyze table statements to false.
    Bye
    Detlef

  • Gather Schema Stats running long in R12

    Hi:
    I scheduled Gather Schema Stats (all with 10%) on two PROD instances last night. One completed in about 2 hrs (this one normally has more activities) while the other has still been running since 12:00am. Please give me the steps and commands to troubleshoot. Is it still running? I have checked that there is no blocking. BTW, I did one Gather Schema Stats for AP with 10% yesterday afternoon. It completed in less than 2 min. They are both on 12.1.3 and 11.1.0.7 on Linux. Thank you in advance.

    I guess I have to do one of these. Thank you.
    Another question: why do some tables not have a date under last_analyzed? When I checked all_tables sorted on last_analyzed, I saw 1/3 of the tables with a blank date. They are from all kinds of schemas, e.g. the two tables below. But some AR or AP tables are analyzed.
    AR_SUBMISSION_CTRL_GT
    AP_PERIOD_CLOSE_EXCPS_GT

  • Analyze table to flush cache

    Hi
    In the Oracle 8i Concepts book it states that
    "when the ANALYZE statement is used to update or delete the statistics of a table, cluster, or index, all shared SQL areas that contain statements referencing the analyzed schema object are flushed from the shared pool".
    I ANALYZEd a table on my server but when I subsequently looked in V$SQLAREA statements referencing the table were still there.
    Any ideas ?
    Richard Hennessy


  • Analyze table 10g steps

    Hi,
    DB: 10.2.0.4 RAC ASM
    OS: AIX 5.3L 64-bit
    I want to analyze tables for all users. Please give me the steps at the table and schema level.
    Thanks & Regards,
    Sunand

    CJ,
    dbms_utility.analyze_schema has been deprecated since 9i; you should be using dbms_stats.
    Sunand, by default there is a gather-stats job running on your database picking up any 'stale' statistics; have you disabled it?
    If you want/need to run it manually, dbms_stats.gather_database_stats is what you need. Documentation is here: http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#i41448
    Carl
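For reference, a manual database-wide gather might look like this; the parameters are illustrative, not mandatory:

```sql
BEGIN
  DBMS_STATS.GATHER_DATABASE_STATS(
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    degree           => DBMS_STATS.AUTO_DEGREE,  -- let Oracle choose the parallelism
    cascade          => TRUE);                   -- include index statistics
END;
/
```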

  • Analyze table taking a lot of time

    Hi,
    I am analyzing a fact table. It is taking almost 1 hour. Is there any solution for this?
    I am using compute statistics.
    regards,
    sandeep

    Hi,
    Why not DBMS_STATS, which collects stats in parallel and is much faster than the ANALYZE command? I strongly recommend DBMS_STATS for partitioned tables: much faster, and it collects both local and global stats.
    ANALYZE command for a partitioned table:
    analyze table <schema>.<table> partition (<partition_name>) estimate statistics sample 5 percent;
    Use GATHER_SCHEMA_STATS for whole-schema analysis.
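The DBMS_STATS equivalent for a single partition, gathering both local and global statistics, might look like this; the owner, table, and partition names are placeholders:

```sql
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname     => 'SCOTT',
    tabname     => 'SALES_FACT',
    partname    => 'P_2011_Q1',   -- placeholder partition name
    granularity => 'AUTO',        -- partition-level plus global statistics
    degree      => 4,
    cascade     => TRUE);
END;
/
```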

  • Analyze tables using a procedure

    Hello Friends,
    In my schema, I have to analyze all the tables (gather statistics) before I start the loading process to the target tables.
    I have created a procedure -
    create or replace procedure proc_analyze_tables as
    table_count number :=0;
    sqlstatement varchar2(4000);
    begin
    for i in ( select TNAME from TAB where TABTYPE='TABLE' AND TNAME ='ABC') loop
    sqlstatement := 'ANALYZE TABLE ' || i.TNAME || 'ESTIMATE STATISTICS ';
    EXECUTE IMMEDIATE sqlstatement ;
    dbms_output.put_line ( 'table name is ' || i.TNAME );
    end loop;
    end proc_analyze_tables;
    Is it appropriate to use the following statement?
    exec dbms_stats.gather_schema_stats(ownname=>'myschema_name', options=>'GATHER AUTO');
    When I execute the procedure, I get an invalid ANALYZE command error.
    How can I use the procedure, or is there any command that can be executed by an EXECUTE IMMEDIATE statement?
    thanks/kumar

    kumar73 wrote:
    Hello Friends,
    I my schema , I had to analyze all the tables ( gather statistics ) before I start the loading process to target tables.
    I have created a procedure -
    create or replace procedure proc_analyze_tables as
    table_count number :=0;
    sqlstatement varchar2(4000);
    begin
    for i in ( select TNAME from TAB where TABTYPE='TABLE' AND TNAME ='ABC') loop
    sqlstatement := 'ANALYZE TABLE ' || i.TNAME || 'ESTIMATE STATISTICS ';
    EXECUTE IMMEDIATE sqlstatement ;
    dbms_output.put_line ( 'table name is ' || i.TNAME );
    end loop;
    end proc_analyze_tables;
    When I execute the procedure, I am getting an invalid ANALYZE command error.
    How can I use the procedure, or is there any command that can be executed by an EXECUTE IMMEDIATE statement?
    thanks/kumar
    ANALYZE TABLE is obsolete and deprecated; use DBMS_STATS instead.
    The standard advice when (ab)using EXECUTE IMMEDIATE is to compose the SQL statement in a single VARCHAR2 variable, then print the variable before passing it to EXECUTE IMMEDIATE.
    COPY the statement & PASTE it into SQL*Plus to validate its correctness.
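Following that advice, a corrected sketch of the procedure using DBMS_STATS instead of dynamic ANALYZE (note that the original concatenated i.TNAME directly against 'ESTIMATE STATISTICS' with no space in between, which is what produced the invalid statement):

```sql
CREATE OR REPLACE PROCEDURE proc_analyze_tables AS
BEGIN
  -- Same driving query as the original post
  FOR i IN (SELECT tname FROM tab
            WHERE  tabtype = 'TABLE' AND tname = 'ABC')
  LOOP
    -- No dynamic SQL needed: DBMS_STATS takes the table name as a parameter
    DBMS_STATS.GATHER_TABLE_STATS(
      ownname          => USER,
      tabname          => i.tname,
      estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE);
    DBMS_OUTPUT.PUT_LINE('gathered stats for ' || i.tname);
  END LOOP;
END proc_analyze_tables;
/
```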

  • ANALYZE Tables

    I am trying to find out the best approach for Optimizer_Mode and ANALYZE tables. In v11.0.3 NAC, is it still RULE mode and no ANALYZE?
    What is the scenario in 11i?
    The RDBMS is currently 8.1.5.1 but will be 8.1.6.x this weekend.
    Also, does anyone have a shareable list of parameters for the init<SID>.ora? I have a machine with 18 CPUs and a huge chunk of RAM.
    Regards

    Remember:
    - RULE mode is for transactional Oracle Applications; Forms screens and so on are coded with RULE optimization in mind.
    - The benefits of ANALYZE can be proven with selected reports that use HINTS in the SELECT statement.
    => I suggest you monitor the reports submitted and their execution times, collect their names, and verify their code. You'll get a deep view of your system.
    Alternatively, analyze all schemas without question (careful, it can take long!), but then you will never know whether it was needed or not.

  • Stopping the schema stats gathering process

    Hi,
    I have a large schema in my DB (7 TB) on which I triggered a UNIX background job to compute the statistics for that schema. The SQL I used was:
    EXEC DBMS_UTILITY.ANALYZE_SCHEMA('<<schema_name>>','COMPUTE');
    However, since the schema is quite big, this stats computation has been running for the last 7 hours and I am not sure how long it will take to finish.
    My questions are:
    1. Is there any method to determine the status of this stats computation?
    2. If I kill this background job because I now have to run other data loading jobs on the schema and I don't want them to error out because of exclusive locks on those objects, will it affect DB performance in any way?
    3. If I kill this background process, will the entire stats computation be rolled back, or will it stop at the point where it was halted?
    Kindly advise.
    Thanks in advance.

    Hi,
    Thanks for the prompt response. I agree that I should be using dbms_stats with estimate for computing the schema stats. I re-analyzed the tables under the schema yesterday, so I hope I will not face the incorrect-cardinality problem now.
    However, I am now facing another problem with the DB. As a result of the data loading operation on the table, all subsequent DML operations are taking a very long time to complete, typically 30 minutes for what was previously a 10-second operation.
    What do you suggest is causing the problem and needs rectification?
    Thanks in advance.

  • SQL statement in a table not accepting a variable

    I have the following problem on 10.1.0.3.0 with a variable in an EXECUTE IMMEDIATE statement.
    Here is the code I am using:
    declare
    remote_data_link varchar2(25) := 'UDE_DATATRANSFER_LINK';
    FROM_SCHEMA VARCHAR2(40) := 'UDE_OLTP';
    l_last_process_date date := to_date(to_char(sysdate,'mm-dd-yyyy hh:mi:ss'),'mm-dd-yyyy hh:mi:ss') - 1;
    stmt varchar2(4000) := 'MERGE into applicant_adverseaction_info theTarget USING (select * from '||FROM_SCHEMA||'.applicant_adverseaction_info@'||remote_data_link||' where last_activity > :l_last_process_date ) theSource ON(theTarget.applicant_id = theSource.applicant_id) WHEN MATCHED THEN UPDATE SET theTarget.cb_used = theSource.cb_used, theTarget.cb_address = theSource.cb_address, theTarget.scoredmodel_id = theSource.scoredmodel_id, theTarget.last_activity = theSource.last_activity WHEN NOT MATCHED THEN INSERT(CB_USED, CB_ADDRESS, SCOREDMODEL_ID, APPLICANT_ID, LAST_ACTIVITY) values(theSource.cb_used, theSource.cb_address, theSource.scoredmodel_id, theSource.applicant_id, theSource.last_activity)';
    stmt2 varchar2(4000) := 'MERGE into edm_application theTarget USING (select * from '||from_schema||'.edm_application@'||remote_data_link||' where last_activity > :l_last_process_date) theSource ON (theTarget.edm_appl_id = theSource.edm_appl_id) WHEN MATCHED THEN UPDATE SET theTarget.APP_REF_KEY = theSource.APP_REF_KEY, theTarget.IMPORT_REF_KEY = theSource.IMPORT_REF_KEY, theTarget.LAST_ACTIVITY = theSource.LAST_ACTIVITY WHEN NOT MATCHED THEN INSERT (EDM_APPL_ID, APP_REF_KEY, IMPORT_REF_KEY, LAST_ACTIVITY) values(theSource.EDM_APPL_ID, theSource.APP_REF_KEY, theSource.IMPORT_REF_KEY, theSource.LAST_ACTIVITY)';
    v_error varchar2(4000);
    T_MERGE VARCHAR2(4000);
    stmt3 varchar2(4000);
    BEGIN
    select merge_sql
    INTO T_MERGE
    from transfertables
    where table_name= 'edm_application';
    remote_data_link:= 'UDE_DATATRANSFER_LINK';
    FROM_SCHEMA := 'UDE_OLTP';
    --DBMS_OUTPUT.PUT_LINE(SUBSTR(stmt2,1,200));
    --STMT2 := T_MERGE;
    dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
    EXECUTE IMMEDIATE stmt2 using l_last_process_date;
    --execute immediate stmt3 ;
    dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
    dbms_output.put_line(substr(stmt2,1,200));
    commit;
    EXCEPTION
    WHEN OTHERS THEN
    V_ERROR := SQLCODE||' '||SQLERRM;
    v_ERROR := V_ERROR ||' '||SUBSTR(stmt2,1,200);
    DBMS_OUTPUT.PUT_LINE(V_ERROR);
    --dbms_output.put_line(substr(stmt2,1,200));
    END;
    This works perfectly
    but if I change it to get the same statement in a db table
    declare
    remote_data_link varchar2(25) := 'UDE_DATATRANSFER_LINK';
    FROM_SCHEMA VARCHAR2(40) := 'UDE_OLTP';
    l_last_process_date date := to_date(to_char(sysdate,'mm-dd-yyyy hh:mi:ss'),'mm-dd-yyyy hh:mi:ss') - 1;
    stmt varchar2(4000) := 'MERGE into applicant_adverseaction_info theTarget USING (select * from '||FROM_SCHEMA||'.applicant_adverseaction_info@'||remote_data_link||' where last_activity > :l_last_process_date ) theSource ON(theTarget.applicant_id = theSource.applicant_id) WHEN MATCHED THEN UPDATE SET theTarget.cb_used = theSource.cb_used, theTarget.cb_address = theSource.cb_address, theTarget.scoredmodel_id = theSource.scoredmodel_id, theTarget.last_activity = theSource.last_activity WHEN NOT MATCHED THEN INSERT(CB_USED, CB_ADDRESS, SCOREDMODEL_ID, APPLICANT_ID, LAST_ACTIVITY) values(theSource.cb_used, theSource.cb_address, theSource.scoredmodel_id, theSource.applicant_id, theSource.last_activity)';
    stmt2 varchar2(4000) := 'MERGE into edm_application theTarget USING (select * from '||from_schema||'.edm_application@'||remote_data_link||' where last_activity > :l_last_process_date) theSource ON (theTarget.edm_appl_id = theSource.edm_appl_id) WHEN MATCHED THEN UPDATE SET theTarget.APP_REF_KEY = theSource.APP_REF_KEY, theTarget.IMPORT_REF_KEY = theSource.IMPORT_REF_KEY, theTarget.LAST_ACTIVITY = theSource.LAST_ACTIVITY WHEN NOT MATCHED THEN INSERT (EDM_APPL_ID, APP_REF_KEY, IMPORT_REF_KEY, LAST_ACTIVITY) values(theSource.EDM_APPL_ID, theSource.APP_REF_KEY, theSource.IMPORT_REF_KEY, theSource.LAST_ACTIVITY)';
    v_error varchar2(4000);
    T_MERGE VARCHAR2(4000);
    stmt3 varchar2(4000);
    BEGIN
    select merge_sql
    INTO T_MERGE
    from transfertables
    where table_name= 'edm_application';
    remote_data_link:= 'UDE_DATATRANSFER_LINK';
    FROM_SCHEMA := 'UDE_OLTP';
    --DBMS_OUTPUT.PUT_LINE(SUBSTR(stmt2,1,200));
    STMT2 := T_MERGE;
    dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
    EXECUTE IMMEDIATE stmt2 using l_last_process_date;
    --execute immediate stmt3 ;
    dbms_output.put_line(from_schema||' '||remote_data_link||' '||l_last_process_date);
    dbms_output.put_line(substr(stmt2,1,200));
    commit;
    EXCEPTION
    WHEN OTHERS THEN
    V_ERROR := SQLCODE||' '||SQLERRM;
    v_ERROR := V_ERROR ||' '||SUBSTR(stmt2,1,200);
    DBMS_OUTPUT.PUT_LINE(V_ERROR);
    --dbms_output.put_line(substr(stmt2,1,200));
    END;
    I get ORA-00900 invalid SQL statement.
    Can somebody explain why this happens?
    Thanks

    I agree with jan and anthony: your post is too long and ill-formatted. However, here's my understanding of your problem (with examples, though slightly different ones).
    1- I have a function that returns the number of records in any given table.
      1  CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4  BEGIN
      5    EXECUTE IMMEDIATE('SELECT count(*) FROM '||p_table) INTO v_cnt;
      6    RETURN v_cnt;
      7* END;
    SQL> /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    GET_COUNT('EMP')
                  14
    2- I decide to move the statement to a database table and recreate my function.
    SQL> CREATE TABLE test
      2  (stmt varchar2(2000))
      3  /
    Table created.
    SQL> INSERT INTO test
      2  VALUES('SELECT count(*) FROM p_table');
    1 row created.
    SQL> CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4     v_stmt varchar2(4000);
      5  BEGIN
      6     SELECT stmt INTO v_stmt
      7     FROM test;
      8     EXECUTE IMMEDIATE(v_stmt) INTO v_cnt;
      9     RETURN v_cnt;
    10  END;
    11  /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    SELECT get_count('emp')
    ERROR at line 1:
    ORA-00942: table or view does not exist
    ORA-06512: at "SCOTT.GET_COUNT", line 8
    ORA-06512: at line 1
    --p_table in the column is a string and has nothing to do with the p_table parameter of the function. And since there's no p_table table in my schema, the function returns an error on execution. I suppose this is what you mean by "sql statement in a table not accepting variable".
    3- I rectify the problem by recreating the function.
      1  CREATE OR REPLACE FUNCTION get_count(p_table varchar2)
      2     RETURN NUMBER IS
      3     v_cnt number;
      4     v_stmt varchar2(4000);
      5  BEGIN
      6     SELECT replace(stmt,'p_table',p_table) INTO v_stmt
      7     FROM test;
      8     EXECUTE IMMEDIATE(v_stmt) INTO v_cnt;
      9     RETURN v_cnt;
    10* END;
    SQL> /
    Function created.
    SQL> SELECT get_count('emp')
      2  FROM dual
      3  /
    GET_COUNT('EMP')
                  14
    Hope this gives you some idea.
    -----------------------
    Anwar
