Exp & Imp Error

Hi,
I face this problem when I try to export a specific user from my database.
Error:
EXP-00008: ORACLE error 4068 encountered
ORA-04068: existing state of packages has been discarded
ORA-04063: package body "SYS.LT_EXPORT_PKG" has errors
ORA-06508: PL/SQL: could not find program unit being called
ORA-06512: at line 1
EXP-00083: The previous problem occurred when calling SYS.LT_EXPORT_PKG.schema_info_exp
. exporting statistics
Export terminated successfully with warnings.
I did this execution:
catexp.sql
catproc.sql
After that I recompiled the package SYS.LT_EXPORT_PKG, but errors came with it.
Show errors:
SQL> show errors package body lt_export_pkg;
Errors for PACKAGE BODY LT_EXPORT_PKG:
LINE/COL ERROR
64/12 PL/SQL: SQL Statement ignored
64/19 PLS-00201: identifier 'WMSYS.WM$UP_DEL_TRIG_NAME_SEQUENCE' must be declared
65/12 PL/SQL: SQL Statement ignored
65/19 PLS-00201: identifier 'WMSYS.WM$INSTEADOF_TRIGS_SEQUENCE' must be declared
66/12 PL/SQL: SQL Statement ignored
66/19 PLS-00201: identifier 'WMSYS.WM$LOCK_SEQUENCE' must be declared
67/12 PL/SQL: SQL Statement ignored
67/19 PLS-00201: identifier 'WMSYS.WM$VTID' must be declared
68/12 PL/SQL: SQL Statement ignored
68/19 PLS-00201: identifier 'WMSYS.WM$ADT_SEQUENCE' must be declared
69/12 PL/SQL: SQL Statement ignored
69/19 PLS-00201: identifier 'WMSYS.WM$VERSION_SEQUENCE' must be declared
70/12 PL/SQL: SQL Statement ignored
70/19 PLS-00201: identifier 'WMSYS.WM$ROW_SYNC_ID_SEQUENCE' must be declared
71/12 PL/SQL: SQL Statement ignored
71/19 PLS-00201: identifier 'WMSYS.WM$UDTRIG_DISPATCHER_SEQUENCE' must be declared
192/10 PL/SQL: Statement ignored
192/18 PLS-00905: object SYS.LTADM is invalid
SQL>
So what else can I do to export my DB successfully?
Thanks,
John Smith

"Export terminated successfully with warnings." means your export operation completed successfully, just with some warnings.
Import your dump file and check it.
Also run both scripts (catexp.sql and catproc.sql) before exporting your DB.
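All of the PLS-00201 errors point at WMSYS (Workspace Manager) objects, and SYS.LTADM is invalid, so recompiling LT_EXPORT_PKG on its own will keep failing until those dependencies are fixed. A minimal diagnostic sketch, assuming you can connect as SYSDBA in SQL*Plus (a starting point, not a guaranteed fix):
-- List invalid objects under the two owners named in the errors.
SELECT owner, object_type, object_name
FROM   dba_objects
WHERE  owner IN ('SYS', 'WMSYS')
AND    status = 'INVALID'
ORDER  BY owner, object_type, object_name;
-- Recompile all invalid objects, then re-check the package body.
@?/rdbms/admin/utlrp.sql
SHOW ERRORS PACKAGE BODY SYS.LT_EXPORT_PKG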

Similar Messages

  • Export / import  exp / imp commands Oracle 10gXE on Ubuntu

    I have Oracle 10gXE installed on Linux 2.6.32-28-generic #55-Ubuntu, and I need some help on how to export / import the database with the exp / imp commands. The commands seem to be installed in the */usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin* directory but I cannot execute them.
    The error message I got:
    No command 'exp' found, did you mean:
    Command 'xep' from package 'pvm-examples' (universe)
    Command 'ex' from package 'vim' (main)
    Command 'ex' from package 'nvi' (universe)
    Command 'ex' from package 'vim-nox' (universe)
    Command 'ex' from package 'vim-gnome' (main)
    Command 'ex' from package 'vim-tiny' (main)
    Command 'ex' from package 'vim-gtk' (universe)
    Command 'axp' from package 'axp' (universe)
    Command 'expr' from package 'coreutils' (main)
    Command 'expn' from package 'sendmail-base' (universe)
    Command 'epp' from package 'e16' (universe)
    exp: command not found
    Is there something I have to do?

    Hi,
    You have not set the environment variables correctly.
    http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm#BABDGCHH
    And of course that script has a small hiccup, so see
    http://ubuntuforums.org/showpost.php?p=7838671&postcount=4
    Regards,
    Jari
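    For reference, a minimal sketch of the environment the exp/imp binaries need, assuming the default XE install path mentioned above (XE also ships an oracle_env.sh script in that same bin directory that should set this up for you):
    # Set the Oracle XE environment by hand for the current shell session.
    export ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
    export ORACLE_SID=XE
    export PATH=$ORACLE_HOME/bin:$PATH
    export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
    # Or source the bundled environment script instead of exporting by hand:
    # . /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin/oracle_env.sh
    exp help=y   # should now print the export utility's help screen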

  • Full Database Exp & Imp

    Hi,
    I am trying to Exp & Imp a full database. I am working on Oracle 9i & 10g on Solaris 9. I am not using Data Pump. Can anyone please help me with the following:
    1. I am performing the full export using SYSTEM user.
    1.a Does a Full export include (or backups up) the DATA DICTIONARY of the database?
    1.b Does a Full export include the backup of SYS and SYSTEM objects?
    I am using the following command to export
    exp system/system@testdb file=$HOME/testdbfullexp.dmp full=y statistics=none
    I have tried importing the FULL export to another database and i did see that SYS and SYSTEM objects were also being imported ( got some errors regarding constraints and inconsistencies).
    I would like to ask like what are the ideal steps to follow to copy a database from DB1 to DB2 using EXP and IMP
    Any information will be of a great help
    Thanks,
    Harris.

    1.a) No, the data dictionary will be automagically recreated by implicit SQL when the target database is created.
    This means any non-dictionary objects under SYS will be lost.
    1.b) As above. SYSTEM, however, is a normal user.
    Any %SYS user will NOT be exported (CTXSYS, MDSYS, etc.).
    On import into SYSTEM there will always be errors, as SYSTEM is non-empty after initial database creation.
    Sybrand Bakker
    Senior Oracle DBA
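    To make the usual flow concrete, here is a hedged sketch of the two commands for copying DB1 to DB2 with classic exp/imp; connect strings, passwords, and file names are placeholders, and the target database (with its tablespaces and application users) should already exist:
    # 1. Full export from the source database as a DBA user.
    exp system/password@db1 full=y file=db1_full.dmp log=db1_full_exp.log statistics=none
    # 2. Full import into the pre-created target database.
    imp system/password@db2 full=y ignore=y file=db1_full.dmp log=db1_full_imp.log
    As noted above, errors against SYS/SYSTEM objects during the import are expected and can usually be ignored.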

  • Full database exp/imp  between RAC  and single database

    Hi Experts,
    We have a RAC database, Oracle 10gR2 with 4 nodes on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
    Both databases are the same version. During the import, I needed to create 4 undo tablespaces to keep the imp process going.
    How can I keep just one undo tablespace in the single-instance database?
    Does anyone have experience of exp/imp from a RAC database into a single-instance database to share with me?
    Thanks
    Jim

    Jim,
    I also want to know: can we add exclude=tablespace on the impdp command for a full database exp/imp?
    You can't use exclude=tablespace on exp/imp. It is for Data Pump expdp/impdp only.
    I am very interested in your recommendation.
    But for a full database impdp, how do I exclude a table during the full database imp? May I have an example for this case?
    I used expdp for a full database exp, but I got an error in the expdp log: ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link.
    Having long columns in a table means that it can't be exported/imported over a network link. To exclude this, you can use the exclude expression:
    expdp user/password exclude=TABLE:"= 'SALES'" ...
    This will exclude all tables named sales. If you have that table in schema scott and then in schema blake, it will exclude both of them. The error that you are getting is not a fatal error, but that table will not be exported/imported.
    The final message was:
    Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
    F:\ORACLEBACKUP\SALEFULL091113.DMP
    Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26Yes, the fact that it did not export one table does not make the job fail, it will continue on exporting all other objects.
    . I drop database that gerenated a expdp dump file.
    and recreate blank database and then impdp again.
    But I got lots of errors, such as:
    ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ......ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
    ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" >already exists
    and the last line was:
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57Yes, even though you think you have an empty database, if you have installed any apps or anything, it may create tables that could exist in your dumpfile. If you know that you want the tables from the dumpfile and not the existing ones in the database, then you can use this on the impdp command:
    impdp user/password table_exists_action=replace ...
    If a table that is being imported exists, DataPump will detect this, drop the table, then create the table. Then all of the dependent objects will be created. If you don't then the table and all of it's dependent objects will be skipped, (which is the default).
    There are 4 options with table_exists_action
    replace - I described above
    skip - default, means skip the table and dependent objects like indexes, index statistics, table statistics, etc
    append - keep the existing table and append the data to it, but skip dependent objects
    truncate - truncate the existing table and add the data from the dumpfile, but skip dependent objects.
    Hope this helps.
    Dean
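    Putting that together, a hedged sketch of the re-run import for this case; the directory object and password are placeholders, not values taken from the thread:
    # Re-run the full import, replacing any tables that already exist in the target.
    impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=SALEFULL091113.DMP logfile=salefull_imp.log table_exists_action=replace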

  • IMP errors on LINUX

    I am trying to import into a database running on Linux a *.DMP file that I got from another database running on a Windows server. The export ran successfully and I used SCP to copy the file over to the Linux box. I keep getting the following:
    IMP-00019: row rejected due to ORACLE error 1400
    IMP-00003: ORACLE error 1400 encountered
    ORA-01400: cannot insert NULL into ("DEMO"."ADMINISTRATIVE_CHECKLIST"."MEMBER")
    Column 1
    Column 2
    Column 3
    Column 4
    Column 5
    Column 6
    Column 7
    Column 8
    Column 9
    Column 10
    for seemingly all the columns, on and on until the next row.
    I have already successfully done an export and import of a different user's schema between these two databases previously.
    Any ideas? Is it just a bad file, or do I have some other issue with character set conversion between Linux and Windows?

    Yes, I checked: the table has a NOT NULL constraint for that column, but when I query the source database table there are no nulls - just good numbers. I tried this with another *.dmp and I'm getting the same thing on a different schema; the common thing is that both NOT NULL columns are numbers. There must be something hosed in the *.dmp file itself, and I've got it on not one but two *.dmp files now. This has to be some sort of character set translation issue between Windows and Linux, or just a bad set of files. I did NOT FTP them; I used a flash disk to physically bring those files to my own Windows machine from the Windows server behind the firewall, and then did SCP to copy them up to the target Linux server.
    Strange thing is that I managed to get a good exp/imp between these databases using the exact same method on yet a third *.dmp file.
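    One way to test the character set theory is to compare the two databases directly; a minimal sketch, run on both the Windows source and the Linux target:
    -- A mismatch here (together with the client's NLS_LANG at export/import time)
    -- is where conversion surprises usually come from.
    SELECT parameter, value
    FROM   nls_database_parameters
    WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');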

  • Exp imp of full db

    I want to take a full exp and imp it into a new DB. I tried one imp but ended up with errors.
    What would be the command for a full DB exp and imp with data that ends without warnings?

    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting triggers
    EXP-00056: ORACLE error 1422 encountered
    ORA-01422: exact fetch returns more than requested number of rows
    ORA-06512: at "XDB.DBMS_XDBUTIL_INT", line 55
    ORA-06512: at line 1
    EXP-00056: ORACLE error 1422 encountered
    ORA-01422: exact fetch returns more than requested number of rows
    ORA-06512: at "XDB.DBMS_XDBUTIL_INT", line 55
    ORA-06512: at line 1
    EXP-00000: Export terminated unsuccessfully
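    The failing calls are inside XDB.DBMS_XDBUTIL_INT, so a reasonable first check before re-running the export is whether the XML DB component itself is healthy; a minimal diagnostic sketch (not a guaranteed fix):
    -- Component status, plus any invalid objects owned by XDB.
    SELECT comp_id, comp_name, version, status FROM dba_registry;
    SELECT object_type, object_name
    FROM   dba_objects
    WHERE  owner = 'XDB' AND status = 'INVALID';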

  • EXP-00015: error while exporting a table !!!

    Hi
    I was trying to take a table backup by the exp and imp method.
    I got the following error.
    Please advise.
    EXP-00015: error on row 7548444 of table LOADER_POS, column VERSIONDATE, datatype 12
    From
    P

    Jaffar,
    Appreciate your response.
    There is enough space.
    When you say "analyze table structure", are you talking about updating the statistics?
    Please explain.
    Thanks again.
    From
    P
    By "analyze table structure", you can see if you have any block corruption:
    analyze table table_name validate structure;
    or
    analyze index index_name validate structure;
    Nicolas.

  • Exp/imp related problem

    Hi all,
    I have a problem in Oracle 9.2.0.8 on RHEL4. I want to use exp/imp to update one table from one server to another.
    Suppose I have a table abc on server A with around 1 million records, and server B has the same number of records; some of the records have been changed on server A and I want to imp those same records to server B. Is there any parameter in IMP for this?
    Please suggest.

    Hi,
    Kamran Agayev A. wrote:
    user00726 wrote:
    but how would I know which table has been updated or modified?
    Using the MERGE statement, you'll merge the two tables. This command will update changed rows and insert missing rows into the second table, making it the same as the first table:
    SQL> create table azar(pid number,sales number,status varchar2(20));
    Table created.
    SQL> create table azar01(pid number,sales number,status varchar2(20));
    Table created.
    SQL> insert into azar01 values(1,12,'CURR');
    1 row created.
    SQL> insert into azar01 values(2,13,'NEW');
    1 row created.
    SQL> insert into azar01 values(3,15,'CURR');
    1 row created.
    SQL> insert into azar values(2,24,'CURR');
    1 row created.
    SQL> insert into azar values(3,0,'OBS');
    1 row created.
    SQL> insert into azar values(4,42,'CURR');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select * from azar01;
    PID SALES STATUS
    1 12 CURR
    2 13 NEW
    3 15 CURR
    SQL> select * from azar;
    PID SALES STATUS
    2 24 CURR
    3 0 OBS
    4 42 CURR
    SQL> merge into azar01 a using azar b on (a.pid=b.pid) when matched
    2 then update set a.sales=a.sales + b.sales, a.status=b.status
    3 delete where a.status='OBS'
    4 when not matched
    5 then insert values(b.pid,b.sales,'NEW');
    3 rows merged.
    SQL> select * from azar01;
    PID SALES STATUS
    1 12 CURR
    2 37 CURR
    4 42 NEW
    Hello Sir, is this correct?
    Regards
    S.Azar
    DBA

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the imp/exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used Data Pump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox that I should be considering?
    Thanks
    brian

    Hi,
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities
    Data Pump will be faster for a couple of reasons. It uses direct path to unload the data. Data Pump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting the data. In 11g, you can also compress the dumpfiles as you are exporting. (Both data and metadata compression are available in 11g; I think metadata compression is available in 10.2.) This will remove your zip step.
    As far as transportable tablespaces go, yes, they are an option. There are some requirements, but if they work for you, all you will be exporting is the metadata and no data. The data is copied from the source to the target by way of the datafiles. One of the biggest requirements is that the tablespaces need to be read-only while the export job is running. This is true for both exp/imp and expdp/impdp.
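    To make that concrete, a hedged sketch of a parallel Data Pump refresh on 10gR2; the schema name, directory object, and degree of parallelism are placeholders to adjust:
    # Parallel export; %U expands into one dump file per worker process.
    expdp system/password schemas=APP_SCHEMA directory=DATA_PUMP_DIR dumpfile=app_%U.dmp logfile=app_exp.log parallel=4
    # Matching parallel import on the test database.
    impdp system/password schemas=APP_SCHEMA directory=DATA_PUMP_DIR dumpfile=app_%U.dmp logfile=app_imp.log parallel=4 table_exists_action=replace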

  • Exp/Imp content Area Portlet..?

    How do I Exp/Imp a "Content area folder published as a portlet" which resides in "WWSBR_SITEBUILDER_PROVIDER"?
    Anybody from Oracle..?
    Thanks.
    Rakesh

    I assume you are using the 309 version of portal. It is possible to export out an entire content area or a page but not the granular items. By granular I mean just the folder in this context. If you wish to export/import the entire content area/page, you can run the pageexp/pageimp/contexp/contimp scripts which reside in the WWU directory.

  • Exp/Imp In Oracle 10g client

    Hi All,
    I want to take a schema export from an Oracle 10g client. I am new to Oracle 10g.
    In Oracle 9i, we could use the exp/imp commands for export and import from the client itself.
    I heard that in Oracle 10g, we can use expdp/impdp on the server only. Is there any possibility to take exp/imp from the client as well?
    Please help me...
    Cheers,
    Moorthy.GS

    To add to that, regarding expdp from an Oracle client:
    NETWORK_LINK
    Default: none
    Purpose
    Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
    Syntax and Description
    NETWORK_LINK=source_database_link
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
    The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
    If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
    Restrictions
    When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
    The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
    Example
    The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
    expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log

  • Database upgrade from 8i to 10g using exp/imp

    Dear Friends,
    Please provide the steps to upgrade from 8i to 10g using exp/imp.
    Thanks,
    Rathinavel

    Hi,
    Please also see the cold backup option:
    How to migrate from 8i to 10g to new server using cold backup [ID 742108.1]
    Also see:
    Upgrading from 8i to 10g with import utility
    http://searchoracle.techtarget.com/answer/Upgrading-from-8i-to-10g-with-import-utility
    Regards,
    Helios

  • Running OMBPlus and EXP/IMP in mixed version environment

    OWB Mixed Environment Gurus,
    Current environment:
    OWB Client: 10.1.0.2.0 on Windows XP Professional
    OWB Server side: 10.1.0.2.0 on UNIX (AIX 5.2)
    Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    UNIX Listener: 9.2.0.4 on UNIX (AIX 5.2)
    Runtime Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    I call this a mixed environment since my OWB stuff is 10g and my database stuff is 9.2.
    Issues:
    1- I can't get the command line exp.sh script to connect to the repository; it returns the famous 'ORA-12154, TNS:listener does not currently know of service requested in connect descriptor'. It looks like the 'owbsetenv.sh' script is changing the value of $ORACLE_HOME to point to the 10g areas. Could that then be causing the system to look for a 10g listener, which doesn't exist since all my databases are 9.2.0.4?
    2- I have the same issue trying to run OMBPlus.sh.
    I am ultimately trying to set up a promotion process using the UNIX command line programs (exp/imp and OMBPlus) to get objects from the TEST environment into the PRODUCTION environment which is a separate repository and target schema on a different machine.
    Any advice on how to successfully operate in this 'mixed' environment is most welcomed.
    Many thanks!
    Gary

    Well it looks like I did it again!
    Total brain fart.
    The problem turned out to be that I wasn't specifying the entire SERVICE_NAME for the repository database. I had been leaving off the domain information. Must be a habit from not having to use it in the TNSNAMES.ORA files.
    I was able to complete my test export and connect to OMBPlus, and will now try my test import.
    Sorry to clutter the forum but if it helps anyone else with the same affliction I seem to have frequently, I guess that's a small reward.
    Until next time.
    Gary
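    For anyone else who hits this, a hedged sketch of the kind of fully qualified entry that avoids the problem; the host, port, and names below are made up:
    # tnsnames.ora entry -- note the SERVICE_NAME carries the domain suffix.
    OWBREP =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = aixhost01)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = owbrep.example.com))
      )
    # exp.sh / OMBPlus then connect with, e.g., rep_owner/password@OWBREP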

  • Best way of using exp/imp

    Dear all,
    I want to migrate a database from 8i to 11g (8.1.5 to 11.1.0). I am going for the exp/imp method. Which is the best way of doing this task? I mean a full export and import, or schema-wise export and import? Is there any chance of missing objects or rows while doing this task? If yes, how do I avoid it? Please help me make the best decision. I don't want any problems after migration.
    The approach is (take an exp of 8.1.5 and imp into 9.2.0), then (exp of 9.2.0 and imp into 11.1.0.6).
    OS is HP Unix
    Nishant Santhan

    Have you not yet completed this task? We already answered your question a couple of days back.
    Take a look at the similar duplicate thread created by you.
    Re: Migrating from 8i to 11g
    Regards,
    Sabdar Syed.

  • Urgent EXP 00028 Error

    Hi All,
    While taking an export I am getting the EXP-00028 error.
    I have typed:
    C:\>exp userid=apregs_admin/apregs_admin file='D:\DB_Backup_08052007\fulldump\08052007.DMP' log='D:\DB_Backup_08052007\fulldump\08052007.log' full=Y;
    Export: Release 10.2.0.1.0 - Production on Tue May 8 13:08:20 2007
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    EXP-00028: failed to open D:\DB_Backup_08052007\fulldump\08052007.log for write
    EXP-00000: Export terminated unsuccessfully
    Please help, it's urgent.

    Sorry, I found my mistake.
    I did:
    C:\>exp userid=apregs_admin/apregs_admin file='D:\DB_Backup_08052007\fulldump\08052007.DMP' log='D:\DB_Backup_08052007\fulldump\08052007.log' full=Y;
    I have to use:
    C:\>exp userid=apregs_admin/apregs_admin file='D:\DB_Backup_08052007\fulldump08052007.DMP' log='D:\DB_Backup_08052007\fulldump08052007.log' full=Y;
