Schema Export

Hi,
I have a schema export taken with expdp which I want to import. I'm planning to drop the user and then import the data. If I do that, will anything be invalidated, such as synonyms, grants etc.? Or will it all come back when I import the data? If it will affect anything like grants, is there any script I can use to gather that information before I start the import? My database is 10gR2 on Unix.
Thanks

Rock2 wrote:
Hi,
I'm planning to drop the user and then import the data. If I do that, will anything be invalidated, such as synonyms, grants etc.? Or will it all come back when I import the data?
Grants on the schema's own objects, and (when the export was taken by a privileged user) the system and role grants made to the user, are part of a schema-mode Data Pump export, so the import normally re-creates them. What the dump does not carry are object grants the user received from other schemas; capture those before you drop the user (a sketch of the queries follows below). Public synonyms pointing at the schema's objects are owned by PUBLIC, so they are neither exported nor dropped; they resolve again once the objects are back, and dependent objects in other schemas revalidate on first use.
It should work fine.
SS
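Not from the reply above: a minimal sketch, assuming the schema is called SCOTT and you can query the DBA views, of what to capture before dropping the user, since a schema-mode dump may not bring all of it back.

-- Object grants received by the schema from other owners (not in a schema-mode dump)
SELECT grantor, owner, table_name, privilege, grantable
FROM   dba_tab_privs
WHERE  grantee = 'SCOTT';

-- System and role grants, in case the user has to be re-created manually
SELECT privilege, admin_option FROM dba_sys_privs WHERE grantee = 'SCOTT';
SELECT granted_role, admin_option FROM dba_role_privs WHERE grantee = 'SCOTT';

-- Synonyms (including PUBLIC ones) that point at the schema's objects
SELECT owner, synonym_name, table_owner, table_name
FROM   dba_synonyms
WHERE  table_owner = 'SCOTT';

Spooling these to a file before the drop gives you something to compare against, or to re-grant from, after the import.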

Similar Messages

  • Huge size schema export in standard edition

    Hi,
    Source:-
    Oracle 10.2.0.3 standard edition on Linux.
    Destination:-
    Oracle 11gR2 - enterprise edition.
    I have to export a schema of 250+ GB, and it is taking a long time since we don't have parallelism in Standard Edition.
    Is there any way I can make the export and import faster?
    The constraint is that expdp of the schema takes 30+ hours. If I use transportable tablespaces, is there any compatibility problem between the source and destination versions and editions?
    And what is the procedure?
    Thanks.

    Hemant K Chitale wrote:
    Can I use 11gR2 binaries to perform TTS of a 10g Standard Edition database?
    You could concurrently run multiple export sessions with table lists, but you wouldn't get data consistency if the tables are being updated.
    Thanks for your information.
    This question has now moved beyond TTS; I'm asking about expdp/impdp in general.
    I had posted this question in the Export/Import section of the Database forum but got no quick responses, so I moved it to Database - General.
    Solomon Yakobson mentioned that we can use 11gR2 binaries to perform a schema export of a 10g database, in the thread linked below.
    Huge size schema export in standard edition
    Hope this will work. Any more suggestions on this? (For keeping concurrent exports consistent, see the sketch below.)
    Edited by: Autoconfig on Oct 17, 2011 6:32 AM
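    Not part of the original thread: a hedged sketch of the "multiple concurrent export sessions" idea with a consistency point added. The directory, dump file names and table lists are placeholders; the key detail is that every session uses the same FLASHBACK_SCN (taken once, e.g. from SELECT current_scn FROM v$database), so the pieces stay consistent with each other even while the tables are being updated.

        # Run several exports in parallel, each with its own table list,
        # all anchored to the same SCN (1234567 stands for the value captured above)
        expdp system/password directory=DATA_PUMP_DIR dumpfile=part1.dmp \
              tables=APP.BIG_TABLE1,APP.BIG_TABLE2 flashback_scn=1234567 &
        expdp system/password directory=DATA_PUMP_DIR dumpfile=part2.dmp \
              tables=APP.BIG_TABLE3,APP.BIG_TABLE4 flashback_scn=1234567 &
        wait

    Transportable tablespaces remain the faster route for 250+ GB if the edition and platform constraints allow it, but that is a separate procedure.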

  • Schema export/Import

    Will schema export/import work if the database is not open?

    user503988 wrote:
    Will schema export/import work if the database is not open?
    No! Both exp/imp and Data Pump need to connect to an open database, so the export or import cannot run while the database is only mounted or closed.

  • Schema export failing with error

    Hi,
    I am doing a schema export and it is failing with the error below.
    EXP-00008: ORACLE error 6550 encountered
    ORA-06550: line 1, column 13:
    PLS-00201: identifier 'SYS.LT_EXPORT_PKG' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    EXP-00083: The previous problem occurred when calling SYS.LT_EXPORT_PKG.schema_info_exp
    Please advise how to solve this.
    Regards

    Hi,
    I think you hit the wrong forum: this forum is for Oracle Berkeley DB, not Oracle DB.
    Best regards,
    Rucong
    Oracle Berkeley DB
    Edited by: rucong.zhao on Dec 24, 2010 1:22 AM
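    The reply above only points out that the post landed in the wrong forum. As an aside that is not from the thread: SYS.LT_EXPORT_PKG belongs to Oracle Workspace Manager, and this EXP-00008 / EXP-00083 pattern usually appears when that component is missing or invalid. A hedged first check, assuming DBA access:

        -- Is Workspace Manager registered and valid?
        SELECT comp_id, version, status FROM dba_registry WHERE comp_id = 'OWM';

        -- Does the package the export calls exist, and is it valid?
        SELECT owner, object_name, object_type, status
        FROM   dba_objects
        WHERE  object_name = 'LT_EXPORT_PKG';

    If the component or the package shows up as missing or INVALID, revalidating or reinstalling Workspace Manager (per Oracle's support notes) is the usual next step.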

  • Run Schema Exporter on password protected Access 97 mdb

    I'm trying to generate the Microsoft Access XML file required by OMWB. The MS-ACCESS 97 database that I want to migrate is password protected. When I attempt to use the MDB, I must put in an id/pw. When I try to run the "Database Schema Exporter" (version 10.1.0.2), I receive "Error 3033: You don't have necessary permissions to use path\to\access\mdb".
    There is no way to enter an id/pw from the Database Schema Exporter. Is there any way around this? Any help is greatly appreciated. Thanks!

    What a Muppet!
    I was thinking along the same lines as your post: if it thinks the username or password is wrong, maybe it is!
    Anyway, what I have found out is (wait for it)...
    Access passwords are limited to 20 characters, but there is no feedback to show you have reached the limit. So if you look at the keyboard while you type (like I do), you are blissfully unaware that the last four characters of your password have not been accepted!
    Therefore, as the exception caught by my application said, my password was wrong!
    Maybe this will help others in the future, although I have a sinking feeling that I am the only one here who could be so dumb!
    Doh!

  • Using expdp to do a schema export is taking extremely long time

    I am using expdp in schema mode to export a 300 gigabyte database.
    The job status reported 99% complete after about 2 hours.
    But now the job has been running for 30 hours and has not finished.
    I can see that it is exporting the domain indexes and has been exporting
    the last index for the last 5 hours. Something is not working, because I
    looked at the table the index is built on and it has no data. So why is it taking
    so long to export an index that has no data?
    Can someone tell me if there is a way to bypass exporting indexes, and an easy way
    to recreate the indexes if you do?
    I am using Oracle 11g and the expdp utility.

    I checked the log file and there are no errors in the file.
    There are no ORA- xxxx error messages.
    The last line in the log file is as follows:
    "Processing object type schema_export/table/index/domain_index/index "
    I just checked the export job this morning and it is still on the same
    index object "A685_IX1". This is a spatial index. According to the job status
    it has been sitting on this same object for at least 24 hours.
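    Not from the thread: a hedged sketch of skipping the indexes during the export and pulling their DDL back out later. EXCLUDE=INDEX keeps index definitions (including the domain/spatial ones) out of the dump, and a SQLFILE run against a dump that does contain them, or DBMS_METADATA.GET_DDL, produces a script to recreate them. Schema, directory and file names are placeholders.

        # Export the schema without any index definitions
        expdp system/password schemas=APP directory=DATA_PUMP_DIR \
              dumpfile=app_noindex.dmp logfile=app_noindex.log exclude=index

        # From a dump that does contain the indexes, write their DDL to a script
        # instead of running it (nothing is imported when SQLFILE is used)
        impdp system/password directory=DATA_PUMP_DIR dumpfile=app_full.dmp \
              include=index sqlfile=recreate_indexes.sql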

  • Schema Export using DBMS_DATAPUMP is extremely slow

    Hi,
    I created a procedure that duplicates a schema within a given database by first exporting the schema to a dump file using DBMS_DATAPUMP and then imports the same file (can't use network link because it fails most of the time).
    My problem is that a regular schema datapump export takes about 1.5 minutes whereas the export using dbms_datapump takes about 10 times longer - something in the range of 14 minutes.
    here is the code of the procedure that duplicates the schema:
    CREATE OR REPLACE PROCEDURE MOR_DBA.copy_schema3 (
                                              source_schema in varchar2,
                                              destination_schema in varchar2,
                                              include_data in number default 0,
                                              new_password in varchar2 default null,
                                              new_tablespace in varchar2 default null
                                            ) as
      h   number;
      js  varchar2(9); -- COMPLETED or STOPPED
      q   varchar2(1) := chr(39);
      v_old_tablespace varchar2(30);
      v_table_name varchar2(30);
    BEGIN
       /* open a new schema level export job */
       h := dbms_datapump.open ('EXPORT',  'SCHEMA');
       /* attach a file to the operation */
       DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.NEXTVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
       /* restrict to the schema we want to copy */
       dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
       /* apply the data filter if we don't want to copy the data */
       IF include_data = 0 THEN
          dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
       END IF;
       /* start the job */
       dbms_datapump.start_job(h);
       /* wait for the job to finish */
       dbms_datapump.wait_for_job(h, js);
       /* detach the job handle and free the resources */
       dbms_datapump.detach(h);
       /* open a new schema level import job */
       h := dbms_datapump.open ('IMPORT',  'SCHEMA');
       /* attach a file to the operation */
       DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
       /* restrict to the schema we want to copy */
       dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
       /* remap the importing schema name to the schema we want to create */     
       dbms_datapump.metadata_remap(h,'REMAP_SCHEMA',source_schema,destination_schema);
       /* remap the tablespace if needed */
       IF new_tablespace IS NOT NULL THEN
          select default_tablespace
          into v_old_tablespace
          from dba_users
          where username=source_schema;
          dbms_datapump.metadata_remap(h,'REMAP_TABLESPACE', v_old_tablespace, new_tablespace);
       END IF;
       /* apply the data filter if we don't want to copy the data */
       IF include_data = 0 THEN
          dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
       END IF;
       /* start the job */
       dbms_datapump.start_job(h);
       /* wait for the job to finish */
       dbms_datapump.wait_for_job(h, js);
       /* detach the job handle and free the resources */
       dbms_datapump.detach(h);
       /* change the password as the new user has the same password hash as the old user,
       which means the new user can't login! */
       execute immediate 'alter user '||destination_schema||' identified by '||NVL(new_password, destination_schema);
       /* finally, remove the dump file */
       utl_file.fremove('LOCAL_DATAPUMP_DIR','COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL|| '.DMP');
    /*EXCEPTION
       WHEN OTHERS THEN    --CLEAN UP IF SOMETHING GOES WRONG
          SELECT t.table_name
          INTO v_table_name
          FROM user_tables t, user_datapump_jobs j
          WHERE t.table_name=j.job_name
          AND j.state='NOT RUNNING';
          execute immediate 'DROP TABLE  ' || v_table_name || ' PURGE';
          RAISE;*/
    end copy_schema3;
    /
    The import part of the procedure takes about 2 minutes, which is the same time a regular Data Pump import takes on the same schema.
    If I disable the import completely it (the export) still takes about 14 minutes.
    Does anyone know why the export using dbms_datapump takes so long?
    thanks.

    Hi,
    I did a tkprof on the DM trace file and this is what I found:
    Trace file: D:\Oracle\diag\rdbms\instanceid\instanceid\trace\instanceid_dm00_8004.trc
    Sort options: prsela  execpu  fchela 
    count    = number of times OCI procedure was executed
    cpu      = cpu time in seconds executing
    elapsed  = elapsed time in seconds executing
    disk     = number of physical reads of buffers from disk
    query    = number of buffers gotten for consistent read
    current  = number of buffers gotten in current mode (usually for update)
    rows     = number of rows processed by the fetch or execute call
    SQL ID: bjf05cwcj5s6p
    Plan Hash: 0
    BEGIN :1 := sys.kupc$que_int.receive(:2); END;
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        3      0.00       0.00          0          0          0           0
    Execute    229      1.26     939.00         10       2445          0          66
    Fetch        0      0.00       0.00          0          0          0           0
    total      232      1.26     939.00         10       2445          0          66
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: SYS   (recursive depth: 2)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      wait for unread message on broadcast channel
                                                    949        1.01        936.39
    ********************************************************************************
    What does "wait for unread message on broadcast channel" mean, and why did it take 939 seconds (more than 15 minutes)?
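    No answer is recorded in this digest. As a hedged aside that is not from the thread: "wait for unread message on broadcast channel" is an idle-style wait that the Data Pump control process sits in while it waits for its worker processes, so the 939 seconds usually point at whatever the worker is doing rather than at the queue itself. A minimal sketch of watching the job from another session (job names are whatever DBA_DATAPUMP_JOBS reports):

        -- Which Data Pump jobs exist and what state they are in
        SELECT owner_name, job_name, operation, job_mode, state
        FROM   dba_datapump_jobs;

        -- Sessions attached to those jobs and what they are currently waiting on
        SELECT s.sid, s.serial#, s.event, s.seconds_in_wait
        FROM   v$session s
               JOIN dba_datapump_sessions d ON d.saddr = s.saddr;

    If the worker session is spending its time on a single object or on I/O, that is where the 14 minutes are going; the broadcast-channel wait in the trace is mostly the coordinator idling.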

  • How to automate the schema export task

    Hi,
    I take an online backup by exporting the schema 3 times a day.
    I want to automate this process.
    How can this be done with expdp and a Linux job schedule, so that the job repeats daily?
    Regards
    Prabhaker

    schedule using DBMS_JOB
    Hmmmm... no :-)
    Exporting is done through the exp/expdp binaries, so you can't submit it via an Oracle job, except if you use Java to launch an O.S. program, but that's another journey.
    Use cron or at instead (or Task Scheduler if you're on Windows); a sketch follows below.
    Yoann.
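    Not from the reply: a hedged sketch of the cron route. The paths, ORACLE_HOME, SID, schema and credentials are placeholders; the point is a date stamp in the file names so that three runs a day don't overwrite each other, and a wrapper script so the crontab line stays simple.

        #!/bin/sh
        # /home/oracle/scripts/export_schema.sh (assumed wrapper script)
        export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
        export ORACLE_SID=ORCL
        STAMP=$(date +%Y%m%d_%H%M)
        $ORACLE_HOME/bin/expdp system/password schemas=APP directory=DATA_PUMP_DIR \
            dumpfile=app_${STAMP}.dmp logfile=app_${STAMP}.log

        # crontab entry: run at 06:00, 12:00 and 18:00 every day
        0 6,12,18 * * * /home/oracle/scripts/export_schema.sh >> /tmp/export_schema.out 2>&1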

  • Schema export from 10g to 11g

    Dear all,
    I want to export a schema from 10g and import it into 11g. Please brief me on the steps I should follow to complete this job.

    You can use the expdp utility for that purpose (a short sketch follows below).
    Look at this step-by-step guide:
    http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
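    A hedged outline of the usual sequence, not taken from the guide; schema, directory and file names are placeholders. Importing a 10g dump into 11g works directly, since Data Pump reads dumps from lower versions without a VERSION parameter.

        # On the 10g source: schema-mode export
        expdp system/password schemas=APP directory=DATA_PUMP_DIR \
              dumpfile=app.dmp logfile=app_exp.log

        # Copy app.dmp to the 11g server's Data Pump directory, create the target
        # tablespace/user if needed, then import. REMAP_SCHEMA / REMAP_TABLESPACE
        # are only required if the names differ on the target.
        impdp system/password schemas=APP directory=DATA_PUMP_DIR \
              dumpfile=app.dmp logfile=app_imp.log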

  • Schema export via Oracle data pump with Database Vault enabled question

    Hi,
    I have installed and configured Database Vault on Oracle 11gR2 (11.2.0.3) to protect a specific schema (SCHEMA_NAME) via a realm. I have followed this document:
    http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
    I.e. I have granted to sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
    I have also created a second realm on the same schema (SCHEMA_NAME) to allow SYS and SYSTEM to maintain indexes for realm-protected tables. This separate realm was created for all the index types (Index, Index Partition, and Indextype), and SYS and SYSTEM have been authorized as OWNER in it.
    However, when I run an Oracle Data Pump export of the schema, I get two errors directly after the following line in the export log:
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    The export completes, but with these errors.
    Any help, suggestions or pointers would be very welcome at this stage.
    Thank you

    Hi Srini,
    Thank you very much for your help. Unfortunately, after following the instructions in the document I am still getting the same errors.
    Nonetheless, thank you for your input.
    I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum, as I may have posted it in the wrong place: it appears to be a Database Vault issue rather than an imp/exp problem.
    Edited by: zooid on May 20, 2012 10:33 PM
    Edited by: zooid on May 20, 2012 10:36 PM

  • Schema export dump needs to be imported to primary and standby

    Hi All,
    I have one production DB with a standby. I have taken a schema-level export from another database, and it needs to be imported into production as well as into the standby.
    I can manage the downtime.
    So, can I import it into production first and then import it into the standby? And later on, can I synchronize the primary with the standby? How?
    Request your advice on this.
    Regards,

    SS! wrote:
    So, can I import it into production first and then import it into the standby? Later on, can I synchronize the primary with the standby?
    You only need to import the data into the primary database. The changes are then carried to the standby through the archived redo logs: redo apply on the standby replays the import, so you never import into the standby directly (see the check below).
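    Not from the reply: a hedged way to confirm the standby has caught up after the import, assuming a physical standby fed by archived redo logs.

        -- On the primary: which recent logs have been archived and applied per destination
        SELECT dest_id, sequence#, archived, applied
        FROM   v$archived_log
        WHERE  first_time > SYSDATE - 1
        ORDER  BY sequence#;

        -- On the standby: the gap between the last received and last applied sequence
        SELECT process, status, sequence# FROM v$managed_standby;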

  • Jobs in schema export

    Hi,
    I am doing an export and import of a schema in Oracle 10g. After doing the import I noticed that all the jobs are missing in the imported schema. Any suggestions on this?
    Thanks

    You can create an expdp dump file and then generate a SQL script of the jobs using impdp.
    See more details here:
    http://www.dba-oracle.com/job_scheduling/export_scheduler.htm
    You can even use the include=job parameter to generate SQL for the jobs only (a sketch follows below)...
    Edited by: Kecskemethy on Jan 17, 2011 3:19 AM
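    Not from the reply: a hedged example of the include/SQLFILE combination it describes. Nothing is imported when SQLFILE is used; the job DDL is only written to a script you can review and run. Names are placeholders, and the exact object path for jobs can vary by version, so check the log if the filter matches nothing.

        # Write the job DDL from the dump to a script instead of importing it
        impdp system/password directory=DATA_PUMP_DIR dumpfile=schema.dmp \
              include=JOB sqlfile=recreate_jobs.sql logfile=jobs_sqlfile.log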

  • Schema export to SQL

    Hi all,
    we're running an RDS instance (that's the Amazon Web Services flavour of a hosted database) with Oracle 11.2.0.2.5 on it. On AWS we don't have access to the file system of the database server, so we can't pick up dump files created on it.
    Now I'm trying to find a way, and a tool, to export a complete schema with its objects and data to a SQL file. Do you know of any tool for that?
    I've been looking in SQL Developer but I haven't found a way to do it...
    Thanks and regards,
    David

    Hi,
    there are probably other tools on the market that can do this; we use PL/SQL Developer, which seems much like SQL Developer and doesn't do what you want either.
    Another alternative (if you have an Oracle DB locally at your site) is to use Data Pump with a NETWORK_LINK to create a local file on your machine from the remote database; I blogged about that here:
    http://dbaharrison.blogspot.de/2013/05/expdp-creating-file-on-my-local-machine.html
    Can you create a DB link to AWS and do this? (A short sketch follows below.)
    Cheers,
    Harry
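    Not from the blog post or the reply, just a hedged illustration of the NETWORK_LINK idea: expdp connects to the local database, pulls the schema over a database link to RDS, and writes the dump file into a local directory. Link, directory, schema and credential names are all placeholders. If plain SQL is really needed, an impdp SQLFILE run against that dump yields the DDL, though the row data itself stays in the dump.

        -- On the local database: a link pointing at the RDS instance
        CREATE DATABASE LINK rds_link
          CONNECT TO app_user IDENTIFIED BY app_password
          USING '//myinstance.example.eu-west-1.rds.amazonaws.com:1521/ORCL';

        # Export the remote schema through the link into a local dump file
        expdp local_user/local_password network_link=rds_link schemas=APP \
              directory=LOCAL_DATAPUMP_DIR dumpfile=app_from_rds.dmp logfile=app_from_rds.log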

  • Access 2000 schema export failing

    I'm running Windows XP Sp2
    MS Access 2003 (DB is 2000 file-format)
    Exporter for Microsoft Access Release 10.1.0.4.0
    I just downloaded and tried to run the omwb access export utility, and it keeps giving me the following error whenever I click "export database schema":
    "error #2516 - XMLExporter - Method of 'Run' of object '_Application' failed.
    I tried deleting all of the _Oracle references, and the "exporter" reference in VB, left over after it screws up, and I even tried converting the database to 2003 format (and tried the 2000, 2002, and 2003 converters on both versions), but no luck. Any suggestions?

    If you have MS Access 2003 installed, I think the mdb file should also be in 2003 format. Also, which exporter are you running? It should be omwb2003.mde.
    Also check out a previous post with a similar issue and the suggestions Hilary made:
    Errors when exporting Access schema
    Donal

  • Oracle 10g - SCHEMA Export Import

    Hi
    I have a query: whenever there are changes to the database schema for an application release (say, new columns are added, some new sequences are created, and sundry other changes), what is the best way to take a backup of the DDL so that I can bring the database and its objects back to the same position they were in before the release?
    Regards
    Kapil

    You can take an export of the schema; it does not require any downtime (a metadata-only sketch follows below).
    You can also take a cold backup and restore it to another server.
    Or clone the database with RMAN duplication and restore it to another server before making the changes.
    Regards
    Asif Kabir
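    Not from the reply: a hedged sketch of a DDL-only snapshot taken before the release, using CONTENT=METADATA_ONLY so no row data is written. Names are placeholders; an impdp SQLFILE run turns the dump into a reviewable script you can diff against the post-release state.

        # Capture just the DDL of the schema before the release
        expdp system/password schemas=APP content=metadata_only \
              directory=DATA_PUMP_DIR dumpfile=app_ddl_prerelease.dmp logfile=app_ddl_prerelease.log

        # Later, turn that dump into a plain SQL script for comparison or rebuild
        impdp system/password directory=DATA_PUMP_DIR dumpfile=app_ddl_prerelease.dmp \
              sqlfile=app_ddl_prerelease.sql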
