EXP -- IMP but tablespace of source schema increased!

Hello,
I wanted to make a copy of a schema called ELT (default tablespace = ELT) into another schema on the same database (version 9.2.0.4).
So I created a new schema ELT2 with a new tablespace, ELT2, as its default tablespace.
I exported ELT as user SYSTEM with rows=y consistent=y owner=ELT,
and then imported, also as user SYSTEM, with the parameters fromuser=ELT touser=ELT2,
but the import doubled the size of tablespace ELT while tablespace ELT2 stayed empty! It should have been the opposite, and I'm sure I did everything correctly; I've checked and re-checked.
I don't understand what happened! Can somebody explain this?
Many thnx.
Cheers,
Simon

IMHO:
1/ You granted RESOURCE to user ELT2 -> hence ELT2 was granted UNLIMITED TABLESPACE.
2/ You did the export: the objects were exported with "TABLESPACE ELT" in their DDL statements.
3/ When you import, the objects are therefore created, as their DDL says, in tablespace ELT.
If you want those objects to be created in the ELT2 tablespace, revoke UNLIMITED TABLESPACE from ELT2, ensure tablespace ELT2 is user ELT2's default tablespace, and then reimport.
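A minimal sketch of that fix (assuming SYSTEM privileges and the names from Simon's post):
    REVOKE UNLIMITED TABLESPACE FROM elt2;
    ALTER USER elt2 DEFAULT TABLESPACE elt2 QUOTA UNLIMITED ON elt2 QUOTA 0 ON elt;
Then drop whatever landed in tablespace ELT during the first import and run the import again.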
Regards,
Yoann.

Similar Messages

  • Exp/imp of tablespace

    hi all
    could we use exp/imp of a tablespace on the same database?
    Say we have a list of objects in a tablespace and we want to shrink the tablespace (as there are not many transactions). Can we take an export of this tablespace only, drop the tablespace, create a new tablespace with the same name but a smaller size, and import the objects back from the export dump file into this tablespace?
    thanks
    kedar

    So basically you want to shrink the tablespace but you've got objects scattered around in there and it won't coalesce.
    There are several ways to do this, not least the one you've mentioned.
    An alternative would be to rebuild the objects into a new (smaller) tablespace. If you haven't got the disk space to accommodate both a new tablespace and the existing one then, yes, export/import will do the job, for example via the sequence below.
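    A rough sketch of that export/import route (the names, path and size are illustrative, the TABLESPACES export mode assumes 9i or later, and you should verify the dump before dropping anything):
        exp system/manager tablespaces=MYTS file=myts.dmp log=myts_exp.log
        -- then, in SQL*Plus:
        DROP TABLESPACE myts INCLUDING CONTENTS AND DATAFILES;
        CREATE TABLESPACE myts DATAFILE '/u01/oradata/db/myts01.dbf' SIZE 500M;
        -- a full import of the dump puts the objects back; they return to MYTS because their DDL names it:
        imp system/manager full=y file=myts.dmp log=myts_imp.log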

  • Exp/Imp with tablespace autocreate?

    Hello dear community,
    I've got a question about backup with exp/imp. It's not about RMAN, but nevertheless I hope it's the right board.
    Is there a command to tell imp to automatically create a specified tablespace from a dumpfile?
    For exp I use:
    exp.exe user/pass tablespaces=test file=FILE.DMP
    For imp I use:
    imp.exe user/pass tablespaces=test file=FILE.DMP
    Now, if I drop the tablespace between exp and imp, imp can't restore it; I have to manually create the specified tablespace first. It would be nice if there were a way for imp to create the tablespace when it is not present.
    Is there maybe some parameter like "autocreate" and "use datafile=..." and so on?
    Thanks for help,
    best regards,
    Ronny

    Transportable Tablespaces is a separate feature whereby you take a physical copy of the tablespace datafiles and "plug" them into the target database. That is different from taking a logical export dump.
    With export dumps, the only way to get the tablespace created is in a FULL export and import. The CREATE TABLESPACE commands are written into the dmp file only when a FULL export is done. At import time, you can choose to import a specific schema, in which case the CREATE TABLESPACE commands are not executed by import (the tablespaces must be precreated). So CREATE TABLESPACE is executed only when you do a FULL import as well.
    Note that this {obviously} creates the same datafile names (including physical paths) and sizes as exist in the source database. That may or may not meet your requirements on occasion (e.g. different filesystem structure, different sizes planned in the target database).
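    So the practical answer is to precreate the tablespace before running imp; a minimal sketch for Ronny's example (the datafile path and size are assumptions):
        -- in SQL*Plus, before the import:
        CREATE TABLESPACE test DATAFILE 'C:\oradata\db\test01.dbf' SIZE 100M;
        -- then the original command works again:
        imp.exe user/pass tablespaces=test file=FILE.DMP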

  • Exp/imp, but packages are missing

    Hi there,
    I exported everything from a schema in an O8 database into a dump file using the exp command. Afterwards I tried to import the schema into my local O10 XE database. Tables, triggers and so on were imported fine, but the packages are missing (and since the triggers call functions in these packages, compiling them fails). Is this behaviour normal?? I'm pretty sure that the packages were exported into the dump file, but I can't figure out a way to import them explicitly.
    So if anyone has an idea, I would be glad to know...
    Regards,
    Daniel

    Is this behaviour normal??
    I don't think so. Apart from checking Metalink Note:132904.1 - Compatibility Matrix for Export & Import Between Different Oracle Versions, you could try to repeat the import operation using the ROWS=N and IGNORE=Y options.
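    A sketch of that retry (the dump file and user names are placeholders for Daniel's own; rows=n skips the table data, ignore=y tolerates the objects that already exist):
        imp user/pass file=schema.dmp fromuser=SRCUSER touser=DSTUSER rows=n ignore=y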

  • Tablespace exp imp 11G

    I have problem,
    I want to imp a file.dmp into a schema user.
    The schema user has default tablespace USERS.
    I exported the schema user from my PC into file.dmp.
    Then, when I imp it on another PC,
    I have prepared a new user, schemausernew, with default tablespace tbspace.
    When I imp the data, I am surprised that my default tablespace tbspace is not used;
    the import takes place in USERS instead.
    So my question is: how do I imp automatically into my default tablespace tbspace?
    I don't want to have to alter the tablespace of each table, and each index, one by one.
    Thanks for supporting.

    See "Exp/Imp with tablespace autocreate?" above. I tried:
        imp user/pass tablespaces=dbtothis FULL=Y
    Why does it still not go into tablespace dbtothis?
    It still lands in tablespace USERS.
    Thanks for helping me.
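    The usual cause is the one described in the very first post above: if the target user holds the UNLIMITED TABLESPACE privilege, imp creates objects in the tablespace recorded in the dump rather than in the user's default. A quick check (a sketch; SCHEMAUSERNEW is a placeholder for the real target user):
        SELECT privilege FROM dba_sys_privs
        WHERE grantee = 'SCHEMAUSERNEW' AND privilege = 'UNLIMITED TABLESPACE';
        SELECT tablespace_name, max_bytes FROM dba_ts_quotas
        WHERE username = 'SCHEMAUSERNEW';
    If the first query returns a row, revoke that privilege and grant a quota only on tbspace before re-importing.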

  • Refresh database through exp/imp

    Version 10.2.0.3 on Windows (old db on Solaris, 8.1.7.0).
    I have to refresh the database data; the number of users/schemas is 400+. The fastest way to do that would be a full exp/imp,
    but first I have to drop the current users cascade (any command to drop all users in one go? see the sketch below),
    and then validate that all tables/schemas are the same and up to date (any suggestion on validating this efficiently?). I am thinking of checking the full exp logs on the old and new db, but that could take forever, manually going through thousands of tables.
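    One way to drop all non-system users in one go is a PL/SQL loop; a hedged sketch (review and extend the exclusion list for your own dictionary and application accounts before running anything like this):
        BEGIN
          FOR u IN (SELECT username FROM dba_users
                    WHERE username NOT IN ('SYS','SYSTEM','OUTLN','DBSNMP','SYSMAN'))
          LOOP
            EXECUTE IMMEDIATE 'DROP USER "' || u.username || '" CASCADE';
          END LOOP;
        END;
        /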

    First, make sure that statistics are properly gathered in both the old and the new DB.
    Next, use num_rows column in dba_tables. Since statistics are typically gathered by sampling, they would not necessarily match. They need to be quite close though.
    After you are sure that all objects were transferred, you can issue a query to find all tables with an "invalid" number of records.
    The query can look something like this:
    select d.owner, d.table_name, d.num_rows
    from dba_tables d
    where d.owner not in ('SYS','SYSTEM',...)
    and not exists
               (select * from dba_tables@old_db s
                where d.owner = s.owner
                and d.table_name = s.table_name
                and s.num_rows between 0.9*d.num_rows and 1.1*d.num_rows
               )
    You need to take care of some special cases, such as num_rows being NULL, partitioned tables, etc.
    Iordan Iotzov
    http://iiotzov.wordpress.com/

  • Using exp dump file, how to know schema/tablespace/database/table....

    Hi
    I have an export dump file and no information about it other than that.
    1. How do I know what version of exp was used (or the database version)?
    2. What does the export contain: is it a table-, schema-, tablespace-, or database-level export?
    3. Is it possible to use the impdp command to import a file which was exported using the "exp" command?

    1. Sorry for my ignorance, but in the first case how do I know whether I have to use imp or impdp?
    As discussed in the above-mentioned thread, running strings against the dump file might reveal something, but I'm not sure. Otherwise you can always confirm by trying to run it.
    And assuming I imported the dump (by trial and error), how do I know at what level the dump file was created, i.e.
    how do I know which tablespace and schema in the source database the imported object belonged to (question 2 of "Sidhu")?
    When you import it back into a new database, everything will be the same as in the original database. All objects will belong to their respective tablespaces, schemas and so on (where they were in the original database).
    Sidhu
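    As a rough illustration of the strings suggestion above: the header of a conventional exp dump normally begins with a version banner, so something like this often answers question 1 (FILE.DMP is a placeholder, and the exact output varies by release):
        strings FILE.DMP | head -3
        # typically shows a banner such as EXPORT:V09.02.00, followed by the exporting user and the export mode
    A dump produced by expdp will not show that banner, which is a hint that impdp rather than imp is needed.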

  • EXP & IMP only restores to same tablespace(s)

    Hi All...
    I created a new user with a new tablespace and imported a dmp into the new user,
    but the import utility imported into the same old tablespace (increasing the old datafile size).
    Even after I dropped the new user, the old tablespace (datafile) size was not reduced?
    Sudhir

    Normal behavior.
    Using conventional exp/imp, the correct method to 'relocate' a user is:
    - make sure the target user doesn't have unlimited tablespace privilege by revoking it
    - make sure the user doesn't have quota on the old tablespace and does have quota on the new tablespace
    alter user <target user> default tablespace <new tablespace> quota unlimited on <new tablespace> quota 0 on <old tablespace>
    - Now import with indexes=n
    - run a second import using the parameter indexfile=<any filename>
    - Edit the resulting file, changing all tablespaces
    - now run the file in sqlplus
    - this should be it
    Using expdp and impdp, remap_tablespace on impdp is sufficient.
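    A sketch of those two paths (the file, user and tablespace names are illustrative):
        imp system/pass file=user.dmp fromuser=OLDUSER touser=NEWUSER indexes=n
        imp system/pass file=user.dmp fromuser=OLDUSER touser=NEWUSER indexfile=cr_idx.sql
    (edit cr_idx.sql, changing the old tablespace names to the new one, then run it in SQL*Plus)
    With Data Pump the relocation is a single step:
        impdp system/pass dumpfile=user.dmp remap_schema=OLDUSER:NEWUSER remap_tablespace=OLDTS:NEWTS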
    Hth
    Sybrand Bakker
    Senior Oracle DBA

  • Schema Refresh using exp/imp.

    Hello All,
    I want to perform a schema refresh of the SAMPLE user from production to the testing environment using export/import.
    Could you please tell me the command to perform it?
    Also, could anyone please tell me whether the same user (SAMPLE) in the test environment gets dropped before the import is done?
    Can I perform the exp/imp using sys/system or user SAMPLE?

    tvenkatesh07 wrote:
    Hello All,
    I want to perform a schema refresh of the SAMPLE user from production to the testing environment using export/import.
    Could you please tell me the command to perform it?
    Also, could anyone please tell me whether the same user (SAMPLE) in the test environment gets dropped before the import is done?
    Can I perform the exp/imp using sys/system or user SAMPLE?
    If you're running 10g, then use Data Pump and read the documentation:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#i1007466
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007653
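    A minimal Data Pump sketch of such a refresh (the directory, file names and password are illustrative; run it as SYSTEM rather than as SAMPLE, and note that the target user is not dropped automatically, you drop or overwrite it yourself):
        expdp system/pass schemas=SAMPLE directory=DATA_PUMP_DIR dumpfile=sample.dmp logfile=sample_exp.log
        impdp system/pass schemas=SAMPLE directory=DATA_PUMP_DIR dumpfile=sample.dmp table_exists_action=replace logfile=sample_imp.log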
    My Oracle Video Tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/

  • Exp/Import database schema vs Ch 10 Exp/Imp Content

    Hi, I'm using Portal 10.1.2.02. I'm a DBA tasked with migrating a Dev portal into Test, then Production.
    There seem to be a number of ways to achieve this, and I was hoping to clarify which is best for me.
    I've run through note 330391.1 (copying the Portal schema), which details running a perl script to export and import onto a newly installed server. This process worked well.
    Chapter 10 of the Portal guide details a fairly complex process of creating transport sets, migrating these over and importing.
    My question is: if I want the entire Portal copied, is there any difference between these processes? I.e. do you end up with the same result?
    thanks in advance of any advice :-)

    The cloning model is not rerunnable; it's not granular, and conditional migration is not possible.
    You can use cloning only if you want to take a copy of the entire setup and rewire it to a new midtier.
    The Portal exp/imp model gives you a granular, conditional and rerunnable way of moving portal objects.
    Apart from that, it comes with readily available prechecks to flag what's going wrong during the process.

  • Refresh using exp/imp

    Hi All,
    I need to refresh one of my development databases using the exp/imp utility. I have done refreshes by copying datafiles but am new to this approach; I have gone through this forum and searched Google too. The part I am not getting: suppose my development db was already created with the same tablespace names and sizes as the source, contains every schema of the source, and holds data from an earlier refresh. Do I need to wipe out everything and start from scratch (create db, tablespaces, users), or just wipe out the tables and import with ignore=y? Any feedback would be appreciated.
    Thanks

    <p class="MsoNormal">Hi,</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">>> do i need to wipe out everything and do it right from
    scratch(create db,tablespace,users)</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">Well you have choices, and you can do whatever you want,
    but there are some “best practices”.  I’m assuming that you are copying your
    PROD database into DEV (a great practice to have a full-sized production system
    the TEST and DEV):</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">- You can restore PROD into TEST using RMAN.</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">- You can use the
    <a style="color: blue; text-decoration: underline; text-underline: single" href="http://download-east.oracle.com/docs/html/B16227_02/oui7_cloning.htm">
    Oracle Universal Installer</a> to clone a database.</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">- You can “<a style="color: blue; text-decoration: underline; text-underline: single" href="oracle_tips_db_copy.htm">clone
    the whole database</a>”, moving the current datafiles and recreating the
    instance</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">- You can export from PROD, truncate the tables in DEV
    (after a backup) and import IGNORE=Y.  This can be very slow, especially for a
    large database.  However, there are some
    <a style="color: blue; text-decoration: underline; text-underline: single" href="http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:1240595435323">
    tips to speed-up imports</a>.</p>
    <p class="MsoNormal"> </p>
    <p class="MsoNormal">Check the
    <a style="color: blue; text-decoration: underline; text-underline: single" href="http://search.oracle.com/">
    Oracle docs</a>.  Also, Dave Moore has a book on
    <a style="color: blue; text-decoration: underline; text-underline: single" href="http://www.amazon.com/Oracle-Utilities-Programs-Oradebug-Dbverify/dp/0972751351">
    Oracle Utilities</a> that you might like.</p>
    Hope this helps . . . .
    Don Burleson
    <p>
    www.dba-oracle.com

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the imp/exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox I should be considering?
    Thanks
    brian

    Hi,
    I haven't used datapump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities
    Datapump will be faster for a couple of reasons. It uses direct path to unload the data. DataPump also supports parallel processes, so while one process is exporting metadata, the other processes can be exporting the data. In 11, you can also compress the dumpfiles as you are exporting. (Both data and metadata compression is available in 11; I think metadata compression is available in 10.2.) This will remove your zip step.
    As for transportable tablespaces, yes, they are an option. There are some requirements, but if it works for you, all you will be exporting is the metadata, not the data. The data is copied from the source to the target by way of the datafiles. One of the biggest requirements is that the tablespaces need to be read only while the export job is running. This is true for both exp/imp and expdp/impdp.
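    As a hedged illustration of the Data Pump features mentioned above (compression=all assumes 11g; the schema, directory and file names are placeholders, and %U lets each parallel worker write its own file):
        expdp system/pass schemas=APP directory=DATA_PUMP_DIR parallel=4 dumpfile=app_%U.dmp compression=all logfile=app_exp.log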

  • Exp/Imp In Oracle 10g client

    Hi All,
    I want to take an export of one schema from the Oracle 10g client. I am new to Oracle 10g.
    In Oracle 9i, we could use the exp/imp commands for export and import from the client itself.
    I have heard that in Oracle 10g, expdp/impdp can be used on the server only. Is there any possibility to run exp/imp from the client as well?
    Pls help me...
    Cheers,
    Moorthy.GS

    To add to the above, here is the relevant expdp parameter from the documentation:
    NETWORK_LINK
    Default: none
    Purpose
    Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
    Syntax and Description
    NETWORK_LINK=source_database_link
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
    The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
    If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
    Restrictions
    When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
    The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
    Example
    The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
    expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log

  • Full database exp/imp between RAC and single database

    Hi Experts,
    we have a RAC database, Oracle 10gR2 with 4 nodes, on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
    Both databases are the same version. During the import, I had to create 4 undo tablespaces to keep imp processing.
    How can I keep just one undo tablespace in the single-instance database?
    Does anyone have experience of exp/imp from a RAC database into a single-instance database to share with me?
    Thanks
    Jim
    Edited by: user589812 on Nov 13, 2009 10:35 AM

    Jim,
    I also want to know: can we add exclude=tablespace on the impdp command for a full database exp/imp?
    You can't use exclude=tablespace on exp/imp. It is for datapump expdp/impdp only.
    I am very interested in your recommendation.
    But for a full database impdp, how do I exclude a table during the full database imp? May I have an example for this case?
    I used expdp for a full database export, but I got an error in the expdp log: ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link
    Having long columns in a table means that it can't be exported/imported over a network link. To exclude this, you can use the exclude expression:
    expdp user/password exclude=TABLE:"= 'SALES'" ...
    This will exclude all tables named sales. If you have that table in schema scott and then in schema blake, it will exclude both of them. The error that you are getting is not a fatal error, but that table will not be exported/imported.
    The final message was:
    Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
    F:\ORACLEBACKUP\SALEFULL091113.DMP
    Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26Yes, the fact that it did not export one table does not make the job fail, it will continue on exporting all other objects.
    I dropped the database that generated the expdp dump file,
    recreated a blank database, and then ran impdp again.
    But I got lots of errors such as:
    ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ...
    ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
    ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" already exists
    and the last line was:
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57
    Yes, even though you think you have an empty database, if you have installed any apps or anything, they may have created tables that also exist in your dumpfile. If you know that you want the tables from the dumpfile and not the existing ones in the database, then you can use this on the impdp command:
    impdp user/password table_exists_action=replace ...
    If a table that is being imported already exists, DataPump will detect this, drop the table, then create the table; then all of the dependent objects will be created. If you don't, the table and all of its dependent objects will be skipped (which is the default).
    There are 4 options for table_exists_action:
    replace - as described above
    skip - default, means skip the table and dependent objects like indexes, index statistics, table statistics, etc
    append - keep the existing table and append the data to it, but skip dependent objects
    truncate - truncate the existing table and add the data from the dumpfile, but skip dependent objects.
    Hope this helps.
    Dean

  • Running OMBPlus and EXP/IMP in mixed version environment

    OWB Mixed Environment Gurus,
    Current environment:
    OWB Client: 10.1.0.2.0 on Windows XP Professional
    OWB Server side: 10.1.0.2.0 on UNIX (AIX 5.2)
    Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    UNIX Listener: 9.2.0.4 on UNIX (AIX 5.2)
    Runtime Repository: Oracle 9.2.0.4 on UNIX (AIX 5.2)
    I call this a mixed environment since my OWB stuff is 10g and my database stuff is 9.2.
    Issues:
    1- I can't get the command-line exp.sh script to connect to the repository; it returns the famous 'ORA-12154, TNS:listener does not currently know of service requested in connect descriptor'. It looks like the 'owbsetenv.sh' script is changing the value of $ORACLE_HOME to point to the 10g areas. Could that then be causing the system to look for a 10g LISTENER, which doesn't exist since all my databases are 9.2.0.4?
    2- I have the same issue trying to run OMBPlus.sh.
    I am ultimately trying to set up a promotion process using the UNIX command line programs (exp/imp and OMBPlus) to get objects from the TEST environment into the PRODUCTION environment which is a separate repository and target schema on a different machine.
    Any advice on how to successfully operate in this 'mixed' environment is most welcomed.
    Many thanks!
    Gary

    Well it looks like I did it again!
    Total brain fart.
    The problem turned out to be that I wasn't specifying the entire SERVICE_NAME for the repository database; I had been leaving off the domain information. Must be a habit from not having to use it in the TNSNAMES.ORA files.
    I was able to complete my test export and connect to OMBPlus, and will now try my test import.
    Sorry to clutter the forum but if it helps anyone else with the same affliction I seem to have frequently, I guess that's a small reward.
    Until next time.
    Gary
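    For anyone else who hits this: the cure amounts to using the fully qualified service name, including the domain, in the connect descriptor. A sketch of a tnsnames.ora entry (the host, port and names are illustrative):
        OWBREP =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = aixhost)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = owbrep.mycompany.com))
          )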
