Old Statistics

Dear All Gurus,
I am using an Oracle 10.2.0.3 database on a Windows environment. I need the statistics from two months ago, because last month my assistant analyzed the tables and that made the database performance slow.
Today I found out the problem. Please tell me how I can import the old statistics, or how I can improve my database performance.
Shahid

Shahid,
SQL> SELECT DBMS_STATS.GET_STATS_HISTORY_AVAILABILITY FROM DUAL;
GET_STATS_HISTORY_AVAILABILITY
01-AUG-09 10.28.50.421000000 AM +05:30
It means I am not able to restore statistics from before 01-Aug-2009.
Now, if I wish to restore statistics (but only to a point in time after 01-Aug-2009), then:
exec dbms_stats.restore_table_stats ( -
'SCOTT', -
'DEPT', -
'02-AUG-09 11.00.00.000000 AM -05:00');
If you wish to gather fresh statistics, then:
EXEC DBMS_STATS.gather_table_stats('SCOTT', 'EMPLOYEES');
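If you are not sure which timestamp to pass to restore_table_stats, you can look it up first; a minimal sketch, reusing the SCOTT.DEPT example above (DBA_TAB_STATS_HISTORY is described further down in this thread):
SELECT stats_update_time
  FROM dba_tab_stats_history
 WHERE owner = 'SCOTT'
   AND table_name = 'DEPT'
 ORDER BY stats_update_time;
Any of the listed timestamps (or a point in time after one of them) can then be passed to DBMS_STATS.RESTORE_TABLE_STATS as shown above.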
Hth
Girish Sharma
Edited by: Girish Sharma on Sep 1, 2009 4:47 AM

Similar Messages

  • How to retrieve old statistics backed up in SYSAUX tablespace.

    Hi, Folks
    From 10g onward, Oracle stores the previously collected statistics when new statistics are gathered. Old statistics can be queried via DBA_TAB_STATS_HISTORY, but that view doesn't show values like histograms, row counts, min, and max. I would like to compare the old statistics with the new ones. Is there any way to retrieve the previous statistics data backed up in the SYSAUX tablespace?
    [oracle@rh01 ~]$ sqlplus '/as sysdba'
    SQL*Plus: Release 10.2.0.3.0 - Production on Wed Sep 2 10:26:27 2009
    Copyright (c) 1982, 2006, Oracle. All Rights Reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    SQL> desc dba_tab_stats_history
    Name                              Null?    Type
    --------------------------------- -------- ----------------------------
    OWNER                                      VARCHAR2(30)
    TABLE_NAME                                 VARCHAR2(30)
    PARTITION_NAME                             VARCHAR2(30)
    SUBPARTITION_NAME                          VARCHAR2(30)
    STATS_UPDATE_TIME                          TIMESTAMP(6) WITH TIME ZONE
    SQL>

    If it is stored, you can query it. The trick is knowing where it is stored, and without specific information about what you are looking for there's not much I can do other than refer you to the tables whose names begin with the prefix 'WRH'.
    Try this for starters:
    select table_name from dba_tables where table_name like 'WRH%STAT%';
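    For the statistics history specifically, the backing tables in SYSAUX appear to be the SYS.WRI$_OPTSTAT_* tables rather than the WRH$ ones. A hedged sketch of how to find and query them follows; the column names OBJ#, ROWCNT, BLKCNT and SAVTIME are taken from a 10.2 dictionary and should be treated as assumptions:
    select table_name from dba_tables where table_name like 'WRI$_OPTSTAT%HISTORY';
    select o.owner, o.object_name, h.rowcnt, h.blkcnt, h.savtime
      from sys.wri$_optstat_tab_history h
      join dba_objects o on o.object_id = h.obj#
     where o.owner = 'SCOTT'          -- example schema from this thread
       and o.object_name = 'DEPT'     -- example table from this thread
     order by h.savtime;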

  • Tables with Old Statistics

    Hello guys,
    In my database I checked the log of "Check and Update Optimizer Statistics" (DB13) and I saw that stats_change_threshold is 50 days; then I ran a query and realized that 99% of the tables have old statistics.
    So my questions are:
    Why doesn't this process update the statistics of the tables that have old statistics?
    Are there other conditions for the DB13 process besides the stats_change_threshold parameter?
    Thanks in advance.

    So in my opinion this method isn't efficient, because in Table B's case, if I delete 10,000,000 rows the statistics won't be updated even though the table has had a change of 20% in its rows. What do you think about this? Is there any way to indicate these exceptions to SAP?
    Sure, there is.
    Look for the DBSTATC table (maintenance transaction DB21) in the SAP notes.
    You'll find a full-fledged explanation of what can be done with this table...
    However, your example is a bit flawed (just as the whole heuristic here is).
    Basically, two key factors influence whether or not new statistics are required:
    1. the amount of change in data volume
    and
    2. the change of the data distribution.
    For 1. the 50% rule is usually more than sufficient.
    If I have a rather small table and add more than 50% of data volume, then it might be that the new statistics lead to a more efficient access plan from the optimizer (e.g. using an index instead of doing a full table scan).
    For 2. the rule is not too well suited, as even a small amount of changed data could change the data distribution in a way that would lead to very different access plans. However, changes of this kind usually need to be covered with special attention anyhow, e.g. using histograms and literals instead of bind variables.
    Based on the experience with SAP systems, the 50% rule is not too bad.
    There's even a note by Jürgen Kirschner describing a different approach to statistics handling.
    In this note the statistics are more seen as table models for the CBO. In this view the ultima ratio is: get your statistics to a point where your system does perform as you like it and then freeze the statistics (or just stop collecting them).
    That way there is no risk of having changed execution plans due to new statistics (bad Monday morning syndrome...).
    And for several tables SAP also releases custom tailored statistics that should be implemented and frozen, just because of the special nature of the table usage (famous example TRFCQ... tables).
    As I already wrote: make sure to check the notes!
    All the stuff I mentioned here is fully explained in them - far better than I can do it here.
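    As a hedged, Oracle-level illustration of point 2 (in an SAP system this would normally be driven through DBSTATC/brconnect rather than called directly; the schema, table, and column names here are made up):
    -- Gather statistics with a histogram (up to 254 buckets) on a hypothetical skewed column.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname    => 'SAPSR3',                        -- hypothetical schema
        tabname    => 'ZSKEWED_TAB',                   -- hypothetical table
        method_opt => 'FOR COLUMNS STATUS SIZE 254');  -- histogram on the skewed STATUS column
      END;
    /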
    Thanks for all.
    PS: If we ever meet, we will drink a couple of beers together!
    Looking forward to that
    Cheers,
    Lars

  • SQL Performance - Old Statistics

    Hi,
    Could someone please explain whether or not statistics influence query performance - other than selecting correct indexes?
    I have a query that is performing badly. The statistics on the query table are quite old, and the data volumes
    have changed significantly (from 1mil to 15mil).
    The query plan is using the expected index, so I am reluctant to gather new stats, in case of additional unwanted impact.
    Should I be considering other benefits of statistics besides just the execution path?
    (The data is linear, so skew should not be a factor.)
    Thanks

    Hi,
    user9515105 wrote:
    Could someone please explain whether or not statistics influence query performance - other than selecting correct indexes?
    I have a query that is performing badly. The statistics on the query table are quite old, and the data volumes
    have changed significantly (from 1mil to 15mil).
    The query plan is using the expected index, so I am reluctant to gather new stats, in case of additional unwanted impact.
    Such a difference in the data volume can be a reason why the optimizer would be better off using a full table scan than an index access.
    Oracle's rule of thumb is that if more than about 10 percent of a table has to be read, it is better to use a full table scan instead of an index.
    The optimizer has the old information that there are only 1 million records, so the index seems like a good choice.
    But the statement probably has to read 2 million of the 15 million records, and in the worst case these records also have to be processed in a nested-loop join with another table. Then performance can get very bad.
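    If you do decide to refresh the statistics, one hedged way to limit the risk of "additional unwanted impact" is to save the current statistics first so they can be restored if the new plans regress. A minimal sketch (the owner, table, and statistics-table names are hypothetical):
    -- 1. Save the current (old) statistics into a user statistics table.
    exec dbms_stats.create_stat_table(ownname => 'APPUSER', stattab => 'STATS_BACKUP');
    exec dbms_stats.export_table_stats(ownname => 'APPUSER', tabname => 'BIG_TABLE', stattab => 'STATS_BACKUP');
    -- 2. Gather fresh statistics reflecting the 15 million rows.
    exec dbms_stats.gather_table_stats(ownname => 'APPUSER', tabname => 'BIG_TABLE', cascade => TRUE);
    -- 3. If the new plans misbehave, put the old statistics back.
    exec dbms_stats.import_table_stats(ownname => 'APPUSER', tabname => 'BIG_TABLE', stattab => 'STATS_BACKUP');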

  • SQL Tuning Advisor says I have old statistics but they were collected today

    Hi all,
    Oracle 10.2.0.4.0 64-bit
    Win 2003 Standard Edition 64-bit
    I was looking at some problem code of ours earlier through Enterprise Manager, and decided to use the advisor to see what it recommended about one statement. (Apologies at this point for not being 100% sure of the name of the advisor, as I am working in Spanish, but I guess it may be "SQL Tuning Advisor"?)
    The recommendations were all to do with gathering optimizer statistics on the various tables and indexes involved, as they were "out-dated". I have checked the LAST_ANALYZED columns in DBA_TABLES and the statistics were collected automatically, as always, at 02:30 today. The advisor task was run at about 10:30.
    Has anyone else seen this? Is it a bug? Or is the advisor very intelligent and suggesting that the data in the tables has changed dramatically since 02:30 this morning and that yes, those stats really need collecting again?
    Regards,
    Ados

    Hi Niall,
    Thanks for replying. My guess was that it was probably doing something based on how much the content of the table had changed, but I didn't know about the 10% threshold. Thanks for that.
    However, would it be looking in the STALE_STATS column in ALL/DBA_TAB_STATISTICS and ALL/DBA_IND_STATISTICS to see this?
    (This is how I would do it, however I am a mere mortal, unlike the Oracle advisor.. )
    I checked there already and all of the tables and indexes in question have the value "NO".
    So I still don't get it
    ?:|
    Regards,
    Ados
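    For reference, the kind of check being described might look like the sketch below ('SCOTT' is a hypothetical schema). In 10g the STALE_STATS column is driven by table-monitoring data that is only flushed periodically, so it can lag; DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO forces a flush first.
    -- Flush the in-memory DML monitoring counters so STALE_STATS is current.
    exec dbms_stats.flush_database_monitoring_info;
    -- List the objects the dictionary currently considers stale.
    SELECT owner, table_name, stale_stats, last_analyzed
      FROM dba_tab_statistics
     WHERE owner = 'SCOTT'
       AND stale_stats = 'YES';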

  • Reporting on old and new statistics data

    Hello,
    I have an upgraded system from 3.5 to BI 7.0, and I have installed the Admin Cockpit for statistics content.
    Also, in the 3.5 version we had created Z* reports based on the multiprovider.
    After the upgrade it is not possible to get data from RSDDSTAT because it is obsolete, and we can get data from the new tables.
    Now I want to report based on the old statistics content as well as the new content.
    How can I report based on both the old data and the new data?
    Please help me with valuable inputs.
    Thanks in Advance.
    Regards
    M.A
    Edited by: M.A on May 3, 2010 12:31 PM
    Please give me a suggestion on how we can report based on old and new statistics data.

    Hello Anand,
    Thanks for the response.
    My requirement is: I have my 3.5 statistics data in my InfoCubes (0BWTC_C01 to C05). Now the system upgrade is done.
    Now my report has to show me the statistics of the 3.5 and BI 7.0 data together. Is it possible to create a MultiProvider based on the 3.5 statistics content cube and the BI 7.0 statistics cube?
    Thanks,
    Regards
    M.A

  • Disable Statistics for specific Tables

    Is it possible to disable statistics for specific tables???

    If you want to stop gathering statistics for certain tables, you would simply not call DBMS_STATS.GATHER_TABLE_STATS on those particular tables (I'm assuming that is how you are gathering statistics at the moment). The old statistics will remain around for the CBO, but they won't be updated. Is that really what you want?
    If you are currently using GATHER_SCHEMA_STATS to gather statistics, you would have to convert to calling GATHER_TABLE_STATS on each table. You'll probably want to have a table set up that lists what tables to exclude and use that in the procedure that calls GATHER_TABLE_STATS.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
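    One hedged way to implement this is a small driver block over an exclusion list; the table name below is made up for illustration:
    -- Hypothetical exclusion list: tables named here keep their old statistics.
    CREATE TABLE stats_exclusions (table_name VARCHAR2(30) PRIMARY KEY);
    -- Gather statistics for every table in the current schema except the excluded ones.
    BEGIN
      FOR t IN (SELECT table_name
                  FROM user_tables
                 WHERE table_name NOT IN (SELECT table_name FROM stats_exclusions))
      LOOP
        DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => t.table_name, cascade => TRUE);
      END LOOP;
    END;
    /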

  • SQL 2008 R2 Best Practices for Updating Statistics for a 1.5 TB VLDB

    We currently have a ~1.5 TB VLDB (SQL 2008 R2) that services both OLTP and DSS workloads pretty much on a 24x7x365 basis. For many years we have been updating statistics (full scan- 100% sample size) for this VLDB once a week on the weekend, which
    is currently taking up to 30 hours to complete.
    Somewhat recently we have been experiencing intermittent issues while statistics are being updated, which I doubt is just a coincidence. I'd like to understand exactly why the process of updating statistics can cause these issues (timeouts/errors). My theory
    is that the optimizer is forced to choose an inferior execution plan while the needed statistics are in "limbo" (stuck between the "old" and the "new"), but that is again just a theory. I'm somewhat surprised that the "old" statistics couldn't continue to
    get used while the new/current statistics are being generated (like the process for rebuilding indexes online), but I don't know all the facts behind this mechanism yet so that may not even apply here.
    I understand that we have the option of reducing the sample percentage/size for updating statistics, which is currently set at 100% (full scan).  Reducing the sample percentage/size for updating statistics will reduce the total processing time, but
    it's also my understanding that doing so will leave the optimizer with less than optimal statistics for choosing the best execution plans. This seems to be a classic case of not being able to have one’s cake and eat it too.
    So in a nutshell I'm looking to fully understand why the process of updating statistics can cause access issues and I'm also looking for best practices in general for updating statistics of such a VLDB. Thanks in advance.
    Bill Thacker

    I'm with you. Yikes is exactly right with regard to suspending all index optimizations for so long. I'll probably start a separate forum thread about that in the near future, but for now let's stick to the best practices for updating statistics.
    I'm a little disappointed that multiple people haven't already chimed in about this and offered up some viable solutions. Like I said previously, I can't be the first person in need of such a thing. This database has 552 tables with a whole lot more statistics
    objects than that associated with those tables. The metadata has to be there for determining which statistics objects can go (not utilized much if at all so delete them- also produce an actual script to delete the useless ones identified) and what
    the proper sample percentage/size should be for updating the remaining, utilized statistics (again, also produce a script that can be used for executing the appropriate update statistics commands for each table based on cardinality).
    The above solution would be much more ideal IMO than just issuing a single update statistics command that samples the same percentage/size for every table (e.g. 10%). That's what we're doing today at 100% (full scan).
    Come on SQL Server Community. Show me some love :)
    Bill Thacker

  • Statistics in BI 7.0

    Hi All,
    We are in the process of a BI technical upgrade. I read somewhere that the BI statistics concept has changed in BI 7.0. The statistics are part of our scope.
    It would be great if anyone could share a "How to: Statistics in BI 7.0" guide, or let me know the changes and the necessary actions to be taken to make sure statistics work in BI 7.0.
    Regards,
    Suman

    Suman,
    Some of the key points from the presentation below to keep in mind while configuring BI statistics in version 7.0:
    a) The new technical content for BI statistics renders the previous "BW statistics" content obsolete with SAP NetWeaver 7.0 BI.
    b) The BI Administration Cockpit is based on new InfoProviders that are delivered with the new technical content for BI statistics.
       - There is no provision for migrating old statistics into the new InfoCubes.
    c) The new BI statistics content provides:
       - InfoCubes for historical data
       - Virtual Providers for current data
       - MultiProviders for a combined view of current and historical data
    d) The OLAP statistics table RSDDSTAT is no longer updated; it has been split/extended into several different RSDDSTAT* tables.
    e) As of SAP NetWeaver 7.0 BI, transaction ST03 is based on the Technical Content InfoProviders (unlike prior releases). Therefore, using transaction ST03 for BI monitoring requires the Technical Content to be activated and populated periodically with statistics data.
    See full details in this useful presentation:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/50f95b21-7fb8-2910-0c81-f6935d4c654b?quicklink=events&overridelayout=true
    Regards,
    Ramesh.

  • System Statistics

    Hi,
    I am moving my 10.2.0.3 instance from HP-UX PA-RISC to HP-UX Itanium.
    I am planning to gather system and dictionary stats based on the Oracle Metalink note:
    How to gather statistics on SYS objects and fixed_objects? (Doc ID 457926.1)
    To gather the dictionary stats:-
    SQL> EXEC DBMS_STATS.GATHER_SCHEMA_STATS ('SYS');
    SQL> exec DBMS_STATS.GATHER_DATABASE_STATS (gather_sys=>TRUE);
    SQL> EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
    Gather_fixed_objects_stats also gathers statistics for dynamic tables, e.g. the X$ tables, which are loaded into the SGA during startup. Gathering statistics for fixed objects is normally recommended if poor performance is encountered while querying dynamic views, e.g. V$ views.
    Since fixed objects record current database activity, statistics gathering should be done when the database has a representative load so that the statistics reflect the normal database activity.
    To gather the fixed objects stats use:
    EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
    I have two questions:
    1. Should I run all four commands mentioned above in order to collect system stats?
    2. How can I roll back those statistics in case of poor performance against the dictionary tables?
    Thanks
    Yoav

    SQL> exec dbms_stats.create_stat_table(user,'STAT_TIMESTAMP');
    PL/SQL procedure successfully completed.
    SQL> exec dbms_stats.export_system_stats('STAT_TIMESTAMP');
    PL/SQL procedure successfully completed.
    After moving to production I will gather system/dictionary stats, and in case of a problem I will import those stats.
    Is that correct? YES
    Second , what about the 4 command :
    SQL> EXEC DBMS_STATS.GATHER_SCHEMA_STATS ('SYS');
    SQL> exec DBMS_STATS.GATHER_DATABASE_STATS (gather_sys=>TRUE);
    SQL> EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
    SQL> EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
    Should I run them all?
    If you run these statistics commands, then later, if you want to roll back the statistics, you need to run the corresponding IMPORT statistics package procedures so that the old statistics are reloaded into the database. :)
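    A minimal sketch of that export/import round trip, reusing the STAT_TIMESTAMP statistics table created above (export before gathering, import only if performance degrades):
    -- Before gathering new stats: save the current dictionary statistics.
    exec dbms_stats.export_dictionary_stats(stattab => 'STAT_TIMESTAMP', statown => user);
    -- If dictionary queries become slow afterwards: restore the saved statistics.
    exec dbms_stats.import_dictionary_stats(stattab => 'STAT_TIMESTAMP', statown => user);
    -- The system statistics exported earlier can be brought back the same way.
    exec dbms_stats.import_system_stats(stattab => 'STAT_TIMESTAMP', statown => user);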

  • DELETE STATISTICS

    I was wondering if we delete statistics on a table is it possible to restore those old statistics back to the table?? Thank you in advance.

    Prior to the delete of the statistics, you can export them using DBMS_STATS.EXPORT_TABLE_STATS and restore them later if needed using the corresponding import procedures.
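    A minimal sketch of that approach, reusing the SCOTT.DEPT example from earlier in this thread (the statistics-table name is made up):
    -- 1. Create a statistics table and save the current stats for the table.
    exec dbms_stats.create_stat_table(ownname => 'SCOTT', stattab => 'STATS_KEEP');
    exec dbms_stats.export_table_stats(ownname => 'SCOTT', tabname => 'DEPT', stattab => 'STATS_KEEP');
    -- 2. Delete the statistics.
    exec dbms_stats.delete_table_stats(ownname => 'SCOTT', tabname => 'DEPT');
    -- 3. Later, restore the old statistics from the saved copy.
    exec dbms_stats.import_table_stats(ownname => 'SCOTT', tabname => 'DEPT', stattab => 'STATS_KEEP');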

  • Cost Based Optimizer Statistics

    Hi,
    I just wanted to check on how to do this activity............
    In my production system in R3........on HPUX 11.23 and Oracle DB 9i and ECC 5.0....
    When I go to DB02.......Checks.......Date of Table Analysis......The output shows as below
    Date of last analysis    SAPDAT    SYSTEM    others
    never analyzed                0       129       195
    older one year                0         0    40,871
    31 - 365 days                 0         0       782
    8 - 30 days                   0         0     2,942
    0 - 7 days                    0         0       337
    Total                         0       129    45,127
    How do I go about doing the analysis for all of them so that there is an up-to-date status?
    Thanks in advance.
    Alfred

    You can force a creation of new statistics by using the "-f collect" force option:
    brconnect -u / -c -f stats -t all -f collect
    Nevertheless I usually would not recommend this because you can expect a high runtime of the statistics creation and "old" statistics are not "bad" statistics. Instead it is normal that statistics of static tables are months or years old. See note 825653 (7) for more information.
    Regards
    Martin

  • Automatic statistics generation in 11.2

    Dear experts!
    I've initialized the automatic statistics collector in my 11.2 db as follows:
    BEGIN
    DBMS_AUTO_TASK_ADMIN.ENABLE(
    client_name => 'auto optimizer stats collection',
    operation => NULL,
    window_name => NULL);
    DBMS_SCHEDULER.SET_ATTRIBUTE('SYS.MONDAY_WINDOW', 'repeat_interval', 'freq=daily;byday=MON;byhour=11;byminute=0;bysecond=0');
    DBMS_SCHEDULER.SET_ATTRIBUTE('SYS.WEDNESDAY_WINDOW', 'repeat_interval', 'freq=daily;byday=WED;byhour=11;byminute=0;bysecond=0');
    DBMS_SCHEDULER.SET_ATTRIBUTE('SYS.FRIDAY_WINDOW', 'repeat_interval', 'freq=daily;byday=FRI;byhour=11;byminute=0;bysecond=0');
    DBMS_SCHEDULER.SET_ATTRIBUTE('SYS.THURSDAY_WINDOW', 'repeat_interval', 'freq=daily;byday=THU;byhour=11;byminute=0;bysecond=0');
    DBMS_SCHEDULER.SET_ATTRIBUTE('SYS.TUESDAY_WINDOW', 'repeat_interval', 'freq=daily;byday=TUE;byhour=11;byminute=0;bysecond=0');
    DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'DBUSER',
    options => 'GATHER AUTO',
    estimate_percent => dbms_stats.auto_sample_size);
    END;
    /
    After this initial procedure the automatic maintenance task checks from Monday to Friday at 11 am whether the statistics of the database objects are still OK. For objects with old statistics, new statistics are gathered.
    The first question is: with which parameters does the automatic maintenance task execute dbms_stats.gather_... for the objects with old statistics? In my opinion it will take the same options as specified in the initial procedure (options => 'GATHER AUTO' and estimate_percent => dbms_stats.auto_sample_size). Am I right?
    The second question also relates to the parameters of the dbms_stats procedure, but in another context:
    Some of the tools we use also generate statistics after an intensive workload; they execute the following:
    execute DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'DBUSER',
    options => 'GATHER EMPTY',
    estimate_percent => 70);
    execute DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'DBUSER',
    options => 'GATHER STALE',
    estimate_percent => 70);
    So what happens to my automatic maintenance task after these two procedure calls? Do these two procedures overwrite the initially provided option (GATHER AUTO) which is required for automatic statistics collection? If yes, does it mean that the automatic statistics collector doesn't maintain the database objects any more?
    Thanks for your help!
    Best regards!
    Markus

    1. There is a database and table option called MONITORING. By default this is turned on from 10g, so the database is watching for changes and will collect statistics on tables/objects where needed with the GATHER AUTO option.
    2. No, the GATHER AUTO setup won't be touched, but you will recollect stats on the objects you specified in the function/procedure call. And if those objects don't change in the future, AUTO GATHER may not touch those stats anymore.
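    As a hedged cross-check of point 1, in 11.2 you can confirm that the automatic task is enabled and inspect the defaults it will use; the views and DBMS_STATS.GET_PREFS calls below are standard, but adjust the preference names to whatever you care about:
    -- Is the automatic optimizer statistics collection client enabled?
    SELECT client_name, status
      FROM dba_autotask_client
     WHERE client_name = 'auto optimizer stats collection';
    -- Which maintenance windows will it run in?
    SELECT window_name, repeat_interval, enabled
      FROM dba_scheduler_windows;
    -- Global defaults the automatic job will use.
    SELECT dbms_stats.get_prefs('ESTIMATE_PERCENT') AS estimate_percent,
           dbms_stats.get_prefs('METHOD_OPT')       AS method_opt
      FROM dual;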
    This is a doc reference list I created when I was working on similar issue. :)
    Best Practices for automatic statistics collection on Oracle 10g [ID 377152.1]
    https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&doctype=BULLETIN&id=377152.1
    How to check what automatic statistics collection is scheduled on 10g [ID 377143.1]
    https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&id=377143.1
    How To Extend Maintenance Windows For GATHER_STATS_JOB for More Than 8 Hours? [ID 368475.1]
    https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&id=368475.1
    Oracle 10g PL/SQL Packages and Types Reference - DBMS_SCHEDULER documentation
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_sched.htm#CIHHBGGI
    Oracle 10g PL/SQL Packages and Types Reference - DBMS_STATS documentation
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_stats.htm#CIHBIEII
    Oracle® Database Performance Tuning Guide 10g Release 2 (10.2) 14 Managing Optimizer Statistics 14.2 Automatic Statistics Gathering
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#i41282
    Statistics Best Practices: How to Backup and Restore Statistics [ID 464939.1]
    https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&id=464939.1
    Managing CBO Stats during an upgrade to 10g or 11g [ID 465787.1]
    https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&doctype=HOWTO&id=465787.1

  • Difference: BI Statistics 3.x to 7.0

    Hello All,
    we want to implement a cost sharing model regarding our different InfoAreas.
    Therefore, we want to use BI Statistics.
    In the old BI Statistics 3.x I find a multicube 0BWTC_C10 which provides all key figures that we need and additionally the navigation object InfoArea.
    But as we migrated to 7.0 I read that certain InfoCubes have been replaced, for example
    0tct_mc01 replaces 0bwtc_c02
    0tct_mc02 replaces 0bwtc_c02 and so on.
    However, I cannot find a multicube which combines all of these new InfoCubes.
    Here are my questions:
    Do I have to use the new InfoCubes, or can I also load my query statistics (new 7.0 queries) into the old InfoCubes and use the old MultiCubes?
    Can I also use the old InfoCubes for data-loading statistics even though I use the new technology? I still use InfoPackages and DTPs.
    Thank you in advance.
    best regards,
    Ilona

    The old statistics-related targets are obsolete, so you have to go ahead with the new Cockpit targets.
    So if you want to read the earlier statistics information, you can use RSA1OLD and check the historical data there. In BI 7.0 the earlier data is not migrated, hence you need to work with the new data flow.
    In terms of DTPs, it's optional.
    Hope this helps..

  • Any precautions before importing old statistic?

    Hello All,
    On my Oracle 9i database I exported statistics of a schema in a table using following commands:
    exec dbms_stats.create_stat_table(ownname => 'SCHEMA', stattab => 'STAT_TABLE');
    exec dbms_stats.export_schema_stats(ownname => 'SCHEMA', stattab => 'STAT_TABLE');
    I need to import them back for the schema. I will use the following step:
    exec dbms_stats.import_schema_stats(ownname => 'SCHEMA', stattab => 'STAT_TABLE');
    Do I need to delete the existing stats before I import the old ones, or should I run the above import command directly?
    Thanks,

    The precautions would really depend on when & why you exported the statistics. In general, I would probably delete the existing statistics first.
    Be aware that when you import the statistics, Oracle is going to invalidate all your cached SQL plans, so the database is going to start hard parsing like crazy. Query plans should change back to the plans that were present with the old statistics, assuming the data was the same. If you have monotonically increasing columns with histograms (i.e. a create_date column with a histogram), old statistics with new data may generate new, bad query plans because queries would be passing in bind values greater than the histogram's top value. If there are new objects, new columns, or new histograms since the statistics were exported, those objects/ columns/ histograms would now be missing and would need to be collected again.
    Justin
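    A minimal sketch of the "delete first, then import" approach, reusing the SCHEMA and STAT_TABLE names from the question above:
    -- Drop the current statistics for the schema, then load the saved ones back.
    exec dbms_stats.delete_schema_stats(ownname => 'SCHEMA');
    exec dbms_stats.import_schema_stats(ownname => 'SCHEMA', stattab => 'STAT_TABLE');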
