Exporting and Importing Statistics for Schema Objects

Hello All,
I am trying to gather stats for optimization using GATHER_SCHEMA_STATS for all objects under a schema. The manual I am reading says it covers both tables and indexes, and that we can include all partitions too. As a safety measure, a friend advised me to back up the old stats into a user-defined table, so that if the new stats cause a performance problem we can fall back on the saved copy. So I created a user table and exported the stats from the existing schema. My question is: do I have to create a separate table for the index stats, or does it all go into the one table? I hope my question is clear.
Generally the manuals don't teach this kind of thing; we either learn it by trial and error or get solutions from great resolvers like you people.
Hope to hear soon.
Thanks in Advance.

One table for both table and index stats is enough; here is a little test for you:
SCOTT@demo102> exec DBMS_STATS.CREATE_STAT_TABLE('SCOTT','MYTBL');
PL/SQL procedure successfully completed.
SCOTT@demo102> desc mytbl
Name                                      Null?    Type
STATID                                             VARCHAR2(30)
TYPE                                               CHAR(1)
VERSION                                            NUMBER
FLAGS                                              NUMBER
C1                                                 VARCHAR2(30)
C2                                                 VARCHAR2(30)
C3                                                 VARCHAR2(30)
C4                                                 VARCHAR2(30)
C5                                                 VARCHAR2(30)
N1                                                 NUMBER
N2                                                 NUMBER
N3                                                 NUMBER
N4                                                 NUMBER
N5                                                 NUMBER
N6                                                 NUMBER
N7                                                 NUMBER
N8                                                 NUMBER
N9                                                 NUMBER
N10                                                NUMBER
N11                                                NUMBER
N12                                                NUMBER
D1                                                 DATE
R1                                                 RAW(32)
R2                                                 RAW(32)
CH1                                                VARCHAR2(1000)
SCOTT@demo102> create table mytable_obj as select * from all_objects;
Table created.
SCOTT@demo102> create index myindex on mytable_obj(object_id);
Index created.
SCOTT@demo102> exec dbms_stats.gather_table_stats(ownname=>'SCOTT',tabname=>'MYTABLE_OBJ',cascade=>true);
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> exec dbms_stats.export_table_stats('SCOTT','MYTABLE_OBJ',stattab=>'MYTBL')
PL/SQL procedure successfully completed.
SCOTT@demo102> exec dbms_stats.delete_table_stats('SCOTT','MYTABLE_OBJ');
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
-->No value
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
-->No value
SCOTT@demo102>  exec dbms_stats.import_table_stats('SCOTT','MYTABLE_OBJ',stattab=>'MYTBL');
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
07/06/06
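If you want to do the same thing at schema level (which matches the original GATHER_SCHEMA_STATS question), the calls look like this; just a sketch, the statid label is an example and any name will do:
exec dbms_stats.export_schema_stats(ownname=>'SCOTT',stattab=>'MYTBL',statid=>'BEFORE_REGATHER');
exec dbms_stats.gather_schema_stats(ownname=>'SCOTT',cascade=>true);
-- if the new stats hurt performance, put the saved set back
exec dbms_stats.import_schema_stats(ownname=>'SCOTT',stattab=>'MYTBL',statid=>'BEFORE_REGATHER');
One stattab holds the table, index and column statistics for as many objects (and as many saved versions, via statid) as you like.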
SCOTT@demo102>
Nicolas.

Similar Messages

  • Please provide me unix shell script (export and import of database schema)

please, I am new to unix
please give me a sample unix shell script (export and import of a database schema)

Instead of providing you with a ready-made unix shell script for your requirement, I will give you the hints to prepare it on your own.
Create a file with a .sh extension, for example exp_scott.sh:
#!/bin/sh
# Specify and set all required environment variables (values are site-specific).
ORACLE_HOME=<your_oracle_home>; export ORACLE_HOME
PATH=$ORACLE_HOME/bin:$PATH; export PATH
NLS_LANG=<your_nls_lang>; export NLS_LANG
# Run the exp utility (not the shell "export" builtin) with the required clauses.
exp scott/tiger@orcl owner=scott file=scott.dmp log=scott_emp.log
# Compress the dump file.
compress scott.dmp
    Refer any unix/linux documents for scripting basics.
    http://www.bijoos.com/ora7/oracle_unix.htm
    Regards,
    Sabdar Syed.
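For the import side the same skeleton applies; a rough sketch, assuming the dump file created above:
# Uncompress the dump file first.
uncompress scott.dmp.Z
# Import the whole dump (or use fromuser/touser to map schemas).
imp scott/tiger@orcl file=scott.dmp log=scott_imp.log full=y
Refer to the Utilities guide for the full list of exp/imp clauses.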

  • Require help in understanding exporting and importing statistics.

    Hi all,
I am a bit new to statistics.
    Can anyone please explain me in detail about these commands.
    1) exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'MRP' , tabname => 'MRP_ATP_DETAILS_TEMP', estimate_percent => 100 ,cascade => TRUE);
    2) exec DBMS_STATS.CREATE_STAT_TABLE ( ownname => 'MRP', stattab => 'MRP_ATP_3');
    3) exec DBMS_STATS.EXPORT_TABLE_STATS ( ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP',statid => 'MRP27jan14');
    4) exec DBMS_STATS.IMPORT_TABLE_STATS ( ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP');
    I understand that these commands are used to export and import table statistics.
But please, can anyone help me understand this in detail?
    Thanks in advance.
    Regards,
    Shiva.

    Shiva,
    Please post the details of the application release, database version and OS.
    Please see (FAQ: Statistics Gathering Frequently Asked Questions (Doc ID 1501712.1) -- What is the difference between DBMS_STATS and FND_STATS).
For exporting/importing statistics, a summary of the FND_STATS subprograms can be found in (Doc ID 122371.1):
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_details?c_name=FND_STATS&c_owner=APPS&c_type=PACKAGE&c_detail_type=source
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_details?c_name=FND_STATS&c_owner=APPS&c_type=PACKAGE%20BODY&c_detail_type=source
    Thanks,
    Hussein
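To expand a little on the DBMS_STATS side (on EBS the FND_STATS wrappers Hussein mentions are the supported route), here is what each of the four calls in the original post does, with comments added as a rough guide:
-- 1) Gather fresh optimizer statistics for the table, sampling 100% of the rows,
--    and for its indexes as well (cascade => TRUE).
exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'MRP', tabname => 'MRP_ATP_DETAILS_TEMP', estimate_percent => 100, cascade => TRUE);
-- 2) Create a user table MRP_ATP_3 in the MRP schema that can hold saved statistics.
exec DBMS_STATS.CREATE_STAT_TABLE (ownname => 'MRP', stattab => 'MRP_ATP_3');
-- 3) Copy the current dictionary statistics for the table (plus its index and column stats)
--    into MRP_ATP_3, labelled with the statid 'MRP27jan14'.
exec DBMS_STATS.EXPORT_TABLE_STATS (ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP', statid => 'MRP27jan14');
-- 4) Copy the saved statistics back from MRP_ATP_3 into the dictionary,
--    overwriting whatever has been gathered since.
exec DBMS_STATS.IMPORT_TABLE_STATS (ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP');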

  • Export and import of a schema

    Hi all,
I'm having some trouble doing a simple export and then import of a schema (data + structure: tables, sequences, triggers, indexes) from 10g XE to a server of the same version on another machine.
I tried different approaches:
1) Export via "SQL script", both as a file and as SQL commands. When I try to import it I get this error:
Not found
The requested URL /apex/f was not found on this server
2) (translated from Italian, sorry if it is not exact in English) Loading/Downloading data -> Download -> I tried both file and XML format. In this case I must select one table at a time... but only the structure is exported, no data... fortunately I have only 3 tables to export, but what if I had many tables?
In short, what is the best practice to export a schema, including its data, from one server to another of the same version?
Thank you

I get this message from the command line:
    C:\Documents and Settings\user>expdp unime
Export: Release 10.2.0.1.0 - Production on Thursday, 26 February 2009 12:48:08
Copyright (c) 2003, 2005, Oracle.  All rights reserved.
Password:
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39145: directory object parameter must be specified and non-null
C:\Documents and Settings\user>
I get the same from sqlplus:
    C:\Documents and Settings\user>sqlplus
    SQL*Plus: Release 10.2.0.1.0 - Production on Gio Feb 26 12:41:48 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
Enter user-name: unime
Enter password:
Connected to:
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    SQL> CONN sys/**** as SYSDBA
Connected.
    SQL> ALTER USER unime IDENTIFIED BY **** ACCOUNT UNLOCK;
User altered.
    SQL> GRANT CREATE ANY DIRECTORY TO unime;
Grant succeeded.
    SQL> CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';
Directory created.
    SQL> GRANT READ, WRITE ON DIRECTORY test_dir TO unime;
Grant succeeded.
    SQL> host expdp unime
Export: Release 10.2.0.1.0 - Production on Thursday, 26 February 2009 12:44:40
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    Password:
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39145: directory object parameter must be specified and non-null
    SQL> quit
Disconnected from Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
Edited by: Silicio on 26-feb-2009 12.47
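All three errors say the same thing: expdp was started without a DIRECTORY (and DUMPFILE) parameter, so it has nowhere to write the dump and log files. A sketch of how the call would normally look, using the test_dir object created above (file names are just examples, and the directory path must exist on the database server):
expdp unime schemas=UNIME directory=TEST_DIR dumpfile=unime.dmp logfile=unime_exp.log
On the target machine, create a matching directory object, copy unime.dmp into it, and run:
impdp unime schemas=UNIME directory=TEST_DIR dumpfile=unime.dmp logfile=unime_imp.log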

  • How to export and import signatures for addon Signature Auto Paste Prefill Fourms?

    Working with TB 24.7.0 and add-on Signature Auto Paste Prefill Fourms 4.2. I want to copy (export) the signatures to another computer where I want to import them. The developer of the add-on was not able to help me. The export function gives you three choices of format. The import function does NOT work.
    Can anyone provide detailed instructions on how to export and import the signatures from one computer to another? I am using windows XP.
    Bob Ptaker

If the add-on author is unable to assist you, what hope do we mere mortals have?

  • Need help with export, truncate and import partitions for schema

I need to export a couple of partitions from our Oracle 10g data warehouse, truncate the partitions and re-import the data using the exported file. How can I do this?
    In summary here is what I need to do:
    1) Export partition P_SUB_1 of table1
    2) Export partition P_SUB2 of table2
    3) Truncate P_SUB1 partition of table1
    4) Truncate P_SUB2 partition of table2
    5) Rebuild indexes on table1 and table2
    6) Re-import data into each partition from export files.

If you have enough free space it might be easier to keep these partitions in the database and do an exchange.
    That means:
    create table tmp_sub1 as select * from table1 where 1=0;
    create table tmp_sub2 as select * from table2 where 1=0;
    alter table table1 exchange partition sub1 with table tmp_sub1;
alter table table2 exchange partition sub2 with table tmp_sub2;
After that you have the data of your original partitions in the tmp tables, and the corresponding partitions in the partitioned table are empty and ready for the loading process.
If the loading process fails and you need to restore the partitions, just reverse the exchange statements and you are done.
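If you do want to stick with the export/truncate/import route from the original post, a rough Data Pump sketch (the owner name, directory object and file names are placeholders, and table1/partition P_SUB_1 stands in for both table/partition pairs):
expdp system directory=DP_DIR dumpfile=t1_p_sub_1.dmp logfile=t1_p_sub_1_exp.log tables=OWNER.TABLE1:P_SUB_1
alter table table1 truncate partition p_sub_1;
impdp system directory=DP_DIR dumpfile=t1_p_sub_1.dmp logfile=t1_p_sub_1_imp.log tables=OWNER.TABLE1:P_SUB_1 table_exists_action=append
Afterwards rebuild any global indexes (alter index ... rebuild), or add update global indexes to the truncate statement.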

  • Connection for added/updated objects during export and import

    Hi,
Please help me.
During export and import of updated/added objects to production,
do we need to create the data servers manually, or how can this be managed for an existing object?
    Thanks..

    eg AD_CTX_DDL
    APPS_ARRAY_DDL
    APPS_DDL
    ORA-20000 APPS_DDL/APPS_ARRAY_DDL package(s) missing or invalid in schema CTXSYS (Doc ID 944150.1)
CS_KB_CTX_PKG. I think these are packages.
Run the following scripts from the $CS_TOP/patch/115/sql directory:
    cskbdstb.pls
    cskbdsts.pls
Also I am missing certain privileges on packages given to system and application users by sys, e.g. DBMS_SHARED_POOL. I have already executed adgrants.
Have you run recreate grants and synonyms from adadmin?
    Thanks,
    Hussein

  • Export and import parameter

    Hello friends,
    DB: oracle 10 G
    O/S: Sun soalris 10.
I have two Sun systems, both running Oracle 10g:
one is the production database and the other is the test database.
Both systems have the same schema. I want to bring my test database up to date with production. Until now I have taken a complete export of the one schema from the production database and simply imported it into the test database, but this process takes a very long time: the export dump file is almost 2 GB, and the import on the test database takes almost 5 to 6 hours.
The test database schema's data is about 10 days older than the production database.
So I just want to know: is there any parameter in IMPORT so that the import process picks up only the changed information from the export dump file?
I am using the traditional export and import method.

Thanks for your reply,
but right now I don't have a Data Pump export dump file, and my production database is at a remote site.
Taking a Data Pump export and copying the dump over the network would again take a very long time, and the network would be busy transferring a 2 GB file.
Right now, at my site where the test DB is located, I have only the traditional export dump file.
Any more suggestions on this?
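If Data Pump is available on the test server and a database link to production can be created, one option worth testing is a network-mode import, which pulls the schema straight across without writing any dump file (all names below are assumptions):
impdp test_user schemas=MYSCHEMA network_link=PROD_LINK directory=DP_DIR table_exists_action=replace logfile=refresh_myschema.log
Note that it still moves the full schema over the network on every run; the traditional exp/imp tools have no parameter that merges only the changed rows.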

  • Automate exports and imports (file names)

    Hi all,
I would like to do a simple export and import into another schema in another database.
I am very comfortable with the process; all I need is how to set the dump file and log file names to include the date and the name of the database being exported and imported.
For example, when I do the export I want the dump file and log file names to be exp_ORCL_230805.dmp and exp_ORCL_230805.log for database ORCL exported on 230805.
For the import I want the log file name to be imp_TEST_230905.log.
I am doing this export and import on a Windows XP client, but the databases reside on UNIX AIX boxes.
Thanks a lot

Set up ORACLE_SID before you start the imp/exp session on the client, then you can use file=exp_%ORACLE_SID%_%DATE%.dmp etc. (Note that if your Windows locale has a date format with whitespace you might have to write a script to get rid of it.)
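On a Windows client that could look something like the sketch below (the connect string and the date substring offsets are assumptions; the %DATE% format depends on the locale, so adjust accordingly):
@echo off
rem build a DDMMYY stamp such as 230805 from %DATE% -- offsets are locale-dependent
set ORACLE_SID=ORCL
set STAMP=%DATE:~0,2%%DATE:~3,2%%DATE:~8,2%
exp scott/tiger@%ORACLE_SID% owner=scott file=exp_%ORACLE_SID%_%STAMP%.dmp log=exp_%ORACLE_SID%_%STAMP%.log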

  • EDL Exporting and Importing 101 - Questions

    We are testing Premiere's EDL exporting and importing capabilities for the first time, and are having issues. We set up a very simple test project with only 13 clips and a timeline with only 10 edits. This is what it looks like:
    http://img819.imageshack.us/img819/760/j00s.jpg
    We then exported an EDL, started a new Premiere project, and imported the EDL that we had created. It looks like this:
    http://imageshack.us/scaled/landing/32/6hd9.jpg
    In the bin, Premiere has now turned our 13 clips into 41. There are a lot of duplicates, some of them appear to be audio files, even though they're still listed as .mxf, and the start and end times are all over the place. Our two questions are: Why is it making all of these clips? And how do you link the media? We started linking some media manually, but because the start and end times are scrambled, the edits appear completely wrong in the sequence that we've created.

    There are a variety of options offered when exporting an EDL.
These options need to be considered in conjunction with how the sequence (audio/video layers) is set up and managed for the EDL export.
Source footage management is vital in any EDL workflow, and even more so when it is digital clip based (as opposed to tape-based linear assets): digital clips have "unusual" filenames and are buried deep in folder structures, and filenames can be duplicated in a card-based system (mxf would be an exception).
The EDL is usually used between systems.
In your case you are testing on the same system, and presumably the source footage resides there as well. This is a case for managing the "source" so that the links work.
Example: when I take footage to a Resolve suite out of house, I set up all the source footage on a transfer drive in the same manner (filenames, folder names etc.) as on the original local drive.

  • Separate Distribution Monitor Export and Import Processes on Multiple Machines

    Hi,
Would you kindly let me know whether it is possible (meaning, in an officially supported way) to run Distribution Monitor export and import processes on different machines?
    As per SAP note 0001595840 "Using DISTMON and MIGMON on source and target systems", it says as below.
    > 1. DISTMON expects the export and import to be carried out on the same server
I think it means that the export and import processes for the same tables must be run on the same machine; is that correct? If so, exporting on machine A and then importing the exported data on another machine B is not the officially supported way... (However, I know it is technically possible.)
    Kind regards,
    Yutaka

    Hi Yutaka,
Points 2 & 3 clarify the confusion. However, let me explain it briefly:
Distribution Monitor is basically used for migrations of large SAP systems (databases). It increases the parallelism of export and import by distributing the processes across the available systems.
You have to prepare the system for using DistMon. A common directory needs to be created as "commDir", and if you use multiple systems to run more export and import processes, that "commDir" should be shared across all those systems. This is what point no. 1 in KBA 1595840 refers to. Distribution Monitor will run both the export and import processes from the machine which is prepared for DistMon, and DistMon itself will control the other processes, i.e. MigMon; there is no need to start a separate MigMon.
For example: you are migrating an SAP system from OS: AIX and DB: DB2 to OS: HP-UX and DB: Oracle. You need to perform the export using DistMon, and you have 4 Windows servers which can be used for parallel export/import. Once you have prepared the system hosting the "commDir" for DistMon, you provide the information about the involved host machines in the "distribution_monitor_cmd.properties" file. When DistMon is executed, it will automatically distribute the export and import processes across the systems defined in that file.
    Best regards,
    SUJIT

  • How to export and import "Computer Groups" and "Patch approvals" in WSUS 4.0 ?

    Hi,
    I have a query regarding the export and import options for "Computer Groups" and "Patch Approvals" in WSUS 4.0.
    In WSUS 3.2 once we install WSUS 3.0 API Samples and Tools, we get "WSUSMigrationExport" and "WSUSMigrationImport" tools under
    C:\Program Files\Update Services 3.0 API Samples and Tools\WsusMigrate\ folder. 
Using the 'WSUSMigrationExport' tool we can export the Computer Groups and the Patch Approvals to an XML file, and using the 'WSUSMigrationImport' tool we can import the 'Computer Groups' and the 'Patch Approvals' from that XML file into a different WSUS 3.2 server. We can run the import tool as below:
a. Run command prompt as administrator.
b. In the command prompt, go to C:\Program Files (x86)\Update Services 3.0 API Samples and Tools\WsusMigrate\WsusMigrationImport
c. Type WsusMigrationImport filename.xml TargetGroups None. Press enter; this will import Computer Groups to the WSUS 3.2 server.
   Type WsusMigrationImport filename.xml Approvals None. Press enter; this will import "Patch Approvals" to the WSUS 3.2 server.
    This is easy and useful.
Now, for WSUS 4.0 I did not find "WSUS 4.0 API Samples and Tools". So I installed "WSUS 3.0 API Samples and Tools" on my WSUS 4.0 server and tried to import a valid XML file using the above-mentioned process, but the command returned an error.
The error says the "Microsoft.UpdateService.Administration.dll" file was not found.
I searched the internet further about this issue and found that the "WSUS 3.0 API Samples and Tools" is not supported in WSUS 4.0, as the .NET Framework used by "WSUS 3.0 API Samples and Tools" is 2.0 and WSUS 4.0 uses .NET Framework 4.5.
    So, Here are my questions.
    1. Is it correct that "WSUS 3.0 API Samples and Tools" is not supported in WSUS 4.0?
    2. Is "WSUS 4.0 API Samples and Tools" available?
    3. Is there any alternative way in WSUS 4.0 to export and import XML file consisting "Computer Groups" and "Patch Approvals" configurations?
    I need an urgent reply. Thank you in advance.

    Hi Tapojyoti,
    >>1. Is it correct that "WSUS 3.0 API Samples and Tools" is not supported in WSUS 4.0?
Yes, WSUS 3.0 API Samples and Tools is not supported on Windows Server 2012 R2 by default. We may try to rebuild it on Windows Server 2012 R2. For detailed information about how to rebuild it, please refer to the readme document of the WSUS 3.0 API Samples and Tools.
    >>2. Is "WSUS 4.0 API Samples and Tools" available?
    No, I can't find the WSUS API Samples and Tools for 2012R2.
    >>3. Is there any alternative way in WSUS 4.0 to export and import XML file consisting "Computer Groups" and "Patch Approvals" configurations?
As I mentioned above, because the WSUS 3.0 API Samples and Tools are released with source code, we can try to rebuild them on Windows Server 2012 R2.
    If it doesn't work, as a workaround, we can configure the new WSUS server as the replica server of the existing WSUS server. After the synchronization, change the server mode to stand alone.
    Best Regards.
    Steven Lee
    TechNet Community Support

  • IDM export and import commands

    Hi,
I am trying to export and import some of the objects in IDM using the command-line interface.
E.g.: I have a user object which needs to be exported as an XML file from the command line; I then make some changes and import the user object XML back into IDM, again from the command line.
    I am very new to IDM. Any help regarding this would be useful.
    Thanks,
    Hsotnas
    Edited by: Hsotnas on Sep 18, 2007 7:17 PM

Actually it can. There are a few ways to achieve this, but one is:
1. List Objects > ListFile
2. Read ListFile to write commands to CommandFile, i.e. ListFile might contain:
Configuration SPE
EndUserForm MyForm
then a script (perl or whatever) would convert this to the following in CommandFile:
"getObject Configuration SPE > SPE.xml"
"getObject EndUserForm MyForm > MyForm.xml"
3. Run lh console < CommandFile

  • How to export and import dependent tables from 2 different schema

I have a setup where schema1 has table1 and schema2 has table2, and table1 from schema1 depends upon table2 from schema2.
I would like to export and import these tables only, and not any other tables from these 2 schemas, with all related information such as grants and constraints.
Also, will the same method work for Oracle 10g R1, R2 and Oracle 11g?
http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
Looking at this, for table mode it says:
Also, as in schema exports, cross-schema references are not exported
Not sure what this means.
As I am interested in only 2 tables, I think I need to use table mode. But if I try to run export with both table names, it says table mode supports only one schema at a time. I am not sure, then, how the constraints would get exported in that case.
    -Rohit

It worked the first time I tried:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
    Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    Current user changed to DBADMIN
    . . exporting table                          TEMP1         10 rows exported
    EXP-00091: Exporting questionable statistics.
    Current user changed to SCOTT
    . . exporting table                            EMP         14 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.

  • Exporting and Importing of Schema

    The Export and Import of Schema is not working consistently in SP4.
When I create a new repository from the schema, all the sub-tables get created, but the main table has only one field in it; it reads Category.
    The other fields in the main table are missing and are not imported during the Import Schema operation.
    Has anyone encountered the same issue?
If so, please let me know how you overcame this erroneous behaviour.

    Hi Adhappan
I have tried importing without success. The import is not consistent: the same schema creates different sets of tables & fields each time. Sometimes, if you re-import, some more tables which were not set up the first time get selected for import.
    I did a simple exercise to test the import / export functionality.
    1. I unarchived a repository from standard content provided by SAP
    2. I exported the schema
    3. I tried importing the schema without any modifications to schema
    4. The results were highly inconsistent.
    Arvind
    Pls reward useful answers
