Take a Schema Dump of an 11g DB Located in an Encrypted Tablespace

Hi All,
I have a schema where all the tables reside in an encrypted tablespace.
When I try to take a schema dump, it gives me an error that the table resides in an encrypted
tablespace and will not be exported.
How will I be able to take the schema dump? Please advise.
Thanks in Advance,
Krishna

Hi Madrid,
Yeah, you are correct, TDE is being used. I have the encryption key and the wallet password. Can you point me to the syntax to take the schema dump using these parameters? Also, who would be the privileged user to take the dump?
I tried to search but it was in vain.
Thanks ,
Krishna
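The legacy exp utility cannot export tables that live in encrypted tablespaces (which is exactly the error you are hitting), but Data Pump can. A minimal sketch, assuming your schema is called KRISHNA and a directory object DPUMP_DIR already exists; the names and passwords here are placeholders, and the TDE wallet must be open on the server:
$ expdp system/password SCHEMAS=krishna DIRECTORY=dpump_dir DUMPFILE=krishna.dmp LOGFILE=krishna_exp.log ENCRYPTION_PASSWORD=MySecretPwd
As far as I recall, without ENCRYPTION_PASSWORD the data is still exported but written to the dump file in clear text, with a warning in the log, and encrypting the dump file itself can depend on your edition/options (Advanced Security). Any user with the EXP_FULL_DATABASE (or DATAPUMP_EXP_FULL_DATABASE) role, or the schema owner exporting their own schema, can run the job, and the same ENCRYPTION_PASSWORD has to be supplied again at import time.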

Similar Messages

  • How can i take a schema offline

    Hi, my problem is that I have a schema which has been migrated somewhere else, and I don't want the users to connect to it anymore, but I can't change the password.
    How can I take it offline, or is there any other solution (so that no one but SYS and SYSTEM can access that schema)?
    thanks in advance
    Edited by: merope on May 5, 2009 8:58 AM

    merope wrote:
    Hi, my problem is that I have a schema which has been migrated somewhere else, and I don't want the users to connect to it anymore, but I can't change the password.
    How can I take it offline, or is there any other solution (so that no one but SYS and SYSTEM can access that schema)?
    thanks in advance
    Edited by: merope on May 5, 2009 8:58 AM
    Once again we get into confusion over the difference between a "schema" and a "user". Remember, a "schema" is a collection of objects owned by a "user". You can't take a schema offline, and users do not connect to schemas - they connect to the database with a user account and can access whatever that account is allowed to access, within that account's schema or within other schemas.
    Are you wanting to disable the user account? Or are you wanting to prevent users from accessing objects in the schema?

  • I want the hr schema dump file

    hi all,
    please, I want the HR schema dump file so I can import it into my DB.
    thank you

    You can use the Export and Import utilities in Oracle.
    http://www.orafaq.com/wiki/Import_Export_FAQ
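    If all you need is the standard HR sample schema, you usually do not need a dump at all: the creation scripts ship with the database under $ORACLE_HOME/demo/schema/human_resources (hr_main.sql), assuming the sample schemas were installed with your release. If someone does hand you a classic export dump of HR, a minimal import sketch (file name and passwords are placeholders) would be:
    $ imp system/password file=hr.dmp fromuser=hr touser=hr log=hr_imp.log
    Create the HR user on the target database first, since fromuser/touser expects the target user to exist.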

  • Can ImageSnapshot take images of a website at a certain location?

    hey guys... so I have a quick question: I need to make slides of a website. The only problem is that all the snapshots I need to take are of SWFs on the site, and the SWFs get all their information from the site. So I was wondering if there is a way I could load the website first and then load the SWF to get all the data? Or is it at all possible to take snapshots of a website at a certain location?
    any help is greatly appreciated!!!
    thanks in advance

    I was thinking that, but the problem is the website is made entirely in Flash, and when you go to a URL it loads an FLA that has sub-components in it; when you click on those sub-components the URL doesn't change at all...
    I also tried doing an HTTPRequest and got errors back - not sure if I'm doing it correctly, most likely not... the code I was trying to use is:
    import mx.rpc.http.HTTPService;
    import mx.rpc.events.ResultEvent;
    import mx.graphics.ImageSnapshot;
    import flash.display.IBitmapDrawable;
    import flash.utils.ByteArray;
    private function init():void {
        var http:HTTPService = new HTTPService();
        http.url = "http://test-www.apxinsider.com/#/office/1384";
        // register the result handler before sending the request
        http.addEventListener(ResultEvent.RESULT, gothtpresult);
        http.send();
    }
    private function gothtpresult(event:ResultEvent):void {
        // note: an HTTPService result is text/XML data, not a display object, so this cast comes back null
        swfLoader.load(takeSnapshot(event.result as IBitmapDrawable));
    }
    private function takeSnapshot(source:IBitmapDrawable):ByteArray {
        var imageSnap:ImageSnapshot = ImageSnapshot.captureImage(source);
        var imageByteArray:ByteArray = imageSnap.data as ByteArray;
        return imageByteArray;
    }

  • Script to take out schema DDL

    Hi,
    I'm looking for a script that extracts a given schema from a database into a file, just the DDL and not the data, so it can then be recreated anywhere.
    It should write, for example, the grants, the creates (table, synonym, ...) and the create user.
    Is there any "official" script to do that?
    Thanks

    Hi..
    I'm using 9i and 10g (better for this one), and the aim is to take that DDL out into a readable text script where I can see the statements (create user, table, grants, ...). Just like the exp utility, but in editable text file(s).
    You can use:
    set pagesize 0
    set long 90000
    select DBMS_METADATA.GET_DDL('TABLE',object_name,'&&owner') from dba_objects where owner=upper('&&owner') and object_type ='TABLE' ;
    For INDEX, VIEW, SYNONYM etc. replace the object_type and the first argument of GET_DDL. For example:
    select DBMS_METADATA.GET_DDL('INDEX',object_name,'&&owner') from dba_objects where owner=upper('&&owner') and object_type ='INDEX' ;
    select dbms_metadata.get_ddl('USER','&&OWNER') from dual;
    select dbms_metadata.get_granted_ddl('ROLE_GRANT','&&OWNER') from dual;
    select dbms_metadata.get_granted_ddl('SYSTEM_GRANT','&&OWNER') from dual;
    select dbms_metadata.get_granted_ddl('OBJECT_GRANT','&&OWNER') from dual;
    HTH
    Anand

  • How to take regular heap dumps using HPROF

    Hi Folks,
    I am using Oracle App Server as my application server. I found that the memory grows gradually and gets maxed out within 1 hour. I am using 1 GB of heap.
    I definitely feel this is a memory leak issue. Once the heap usage reaches 100%, I start getting full GCs, my whole server hangs and nothing works. Sometimes the JVM even crashes and restarts.
    I didn't find an OutOfMemory exception in any of my logs either.
    I came to know that we can use HPROF to deal with this.
    I use the below as my JVM args:
    -agentlib:hprof=heap=all,format=b,depth=10,file=$ORACLE_HOME\hprof\Data.hprof
    I ran my load run for 10 mins, and my heap usage has now grown to some extent.
    My Questions:
    1. Why are there 2 files generated, one with the name Data.hprof and another with Data.hprof.tmp? Which is which?
    2. How do I get a dump at 2 different points in time, so that I can compare the 2 dumps and say which object is growing more?
    I downloaded HAT, and if I open this Data.hprof file with HAT I get the error below. This error comes up if I open the file without stopping the JVM process.
    java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readFully(DataInputStream.java:152)
    at hat.parser.HprofReader.read(HprofReader.java:202)
    at hat.parser.Reader.readFile(Reader.java:90)
    at hat.Main.main(Main.java:149)
    If I stop the JVM process and then open it through HAT, I get this error:
    Started HTTP server on port 7000
    Reading from hprofData.hprof...
    Dump file created Wed Dec 13 02:35:03 MST 2006
    Warning: Weird stack frame line number: -688113664
    java.io.IOException: Bad record length of -1551478782 at byte 0x0008ffab of file.
    at hat.parser.HprofReader.read(HprofReader.java:193)
    at hat.parser.Reader.readFile(Reader.java:90)
    at hat.Main.main(Main.java:149)
    The JVM version I am using is Sun JVM 1.5.0_06.
    I am seriously fed up with this memory leak issue... Please help me out, folks... I need this as early as possible.
    I hope I get early replies...
    Thanks in advance...

    First, the suggestion of using jmap is an excellent one, you should try it. On large applications, with the hprof agent you have to restart your VM, and hprof can disturb your JVM process, so you may not be able to see the problem as quickly. With jmap, you can get a heap snapshot of a running JVM when it is in the state you want to understand better, and it's really fast compared to using the hprof agent. The hprof dump file you get from jmap will not have the stack traces of where objects were allocated, which was a concern of mine a while back, but all indications are that these stack traces are not critical to finding memory leak problems. The allocation sites can usually be found with a good IDE or search tool, like the NetBeans 'Find Usages' feature.
    On hprof, there is a temp file created during heap dump creation; ignore the .tmp file.
    The HAT utility has been added to JDK6 (as jhat) and many problems have been fixed. But most importantly, this JDK6 jhat can read ANY hprof dump file, from JDK5 or even JDK1.4.2. So even though the JDK6 jhat is using JDK6 itself, the hprof dump file it is given could have come from pretty much anywhere, including jmap. As long as it's a valid hprof binary dump file.
    So even if it's just to have jhat handy, you should get JDK6.
    Also, the Netbeans profiler (http://www.netbeans.org) might be helpful too. But it will require a restart of the VM.
    -kto
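    To make the jmap/jhat suggestion concrete, here is a minimal sketch using the JDK 6 tools mentioned above; the pid (12345) and file names are placeholders, and on JDK 5 the rough equivalent is "jmap -heap:format=b <pid>", which writes a heap.bin file instead:
    $ jmap -dump:format=b,file=heap1.hprof 12345
    $ jmap -dump:format=b,file=heap2.hprof 12345
    $ jhat -port 7000 heap2.hprof
    Take the first snapshot early in the load run and the second one after the heap has grown, then compare the instance counts in jhat's histogram view to see which classes keep accumulating; if I remember correctly, jhat also has a -baseline option for comparing a dump against an earlier one.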

  • Unable to import master repository

    Hi...
    I just installed Oracle 10g on my computer and tried to import the master repository using the "Master Repository Import Wizard", but I got the error message below:
    java.lang.ClassNotFoundException: com.sunopsis.dwg.DwgExportSummary
    Could somebody help me?
    Thank you,
    Baharin

    One more way to import the master repository is to take a schema dump and import it into the target database,
    after creating the same schema name there...
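    To illustrate that approach, a rough sketch with the classic export/import tools, assuming the master repository lives in a schema called SNPM; the schema name, passwords and file names are placeholders, not defaults you can rely on:
    $ exp snpm/password@source_db owner=snpm file=master_rep.dmp log=master_exp.log
    # on the target database, create the same schema (and tablespace quota) first, then:
    $ imp snpm/password@target_db fromuser=snpm touser=snpm file=master_rep.dmp log=master_imp.log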

  • 10g Full Database Dump Takes more than 2 Hours Still not finished

    Hi all,
    The full database dump takes more than 2 hours and is still not finished.
    Version - Oracle 10g 10.1.0.2.0
    The database size is around 160 GB.
    I used the command below to take the full database dump:
    expdp user/pwd@10gdb full=y directory=test_dir dumpfile=curent10g.dmp logfile=expdpcurent10g.log;
    It takes more than 1 hour just processing the functions, i.e.:
    Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/FUNCTION
    And the Log File:
    Export: Release 10.1.0.2.0 - Production on Friday, 04 May, 2012 10:17
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    FLASHBACK automatically enabled to preserve database integrity.
    Starting "EXPTEST"."SYS_EXPORT_FULL_02": exptest/********@curentdb full=Y directory=test_dir dumpfile=curent10g.dmp logfile=expdpcurent10g.log;
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 4.574 GB
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/DE_SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/ROLE
    Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLESPACE_QUOTA
    Processing object type DATABASE_EXPORT/RESOURCE_COST
    Processing object type DATABASE_EXPORT/SCHEMA/DB_LINK
    Processing object type DATABASE_EXPORT/TRUSTED_DB_LINK
    Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/SEQUENCE
    Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/GRANT/DE_S_SEQ_OWNER_OBJGRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/DIRECTORY/DIRECTORY
    Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/CONTEXT
    Processing object type DATABASE_EXPORT/SCHEMA/DE_PUBLIC_SYNONYM/SYNONYM
    Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
    Processing object type DATABASE_EXPORT/SCHEMA/TYPE/TYPE_SPEC
    Processing object type DATABASE_EXPORT/DE_SYSTEM_PROCOBJACT/DE_PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/DE_SYSTEM_PROCOBJACT/DE_POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/PRE_TABLE_ACTION
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/DE_S_TABLE_OWNER_OBJGRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/DE_S_TABLE_NOTWGO_OBJGRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/COMMENT
    Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/PACKAGE_SPEC
    Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/GRANT/DE_S_PACKAGE_OWNER_OBJGRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/FUNCTION
    Help me.

    Export is Still in progress
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Job: SYS_EXPORT_FULL_02
    Owner: EXPTEST
    Operation: EXPORT
    Creator Privs: FALSE
    GUID: 94AB0C8AA824462AA96C8481CF0F676F
    Start Time: Friday, 04 May, 2012 10:18
    Mode: FULL
    Instance: curentdb
    Max Parallelism: 1
    EXPORT Job Parameters:
    Parameter Name Parameter Value:
    CLIENT_COMMAND userexp/********@curentdb full=Y directory=test_dir dumpfile=curent10g.dmp logfile=expdpcurent10g.log;
    DATA_ACCESS_METHOD AUTOMATIC
    ESTIMATE BLOCKS
    INCLUDE_METADATA 1
    LOG_FILE_DIRECTORY TEST_DIR
    LOG_FILE_NAME expdpcurent10g.log;
    TABLE_CONSISTENCY 0
    State: EXECUTING
    Bytes Processed: 0
    Current Parallelism: 1
    Job Error Count: 0
    Dump File: C:\ORACLE DMP\CURENT10G.DMP
    bytes written: 4,096
    Worker 1 Status:
    State: EXECUTING
    Object Schema: PUBLIC
    Object Type: DATABASE_EXPORT/SCHEMA/PACKAGE/GRANT/DE_S_PACKAGE_OWNER_OBJGRANT/OBJECT_GRANT
    Completed Objects: 1
    Total Objects: 1
    What does this mean? Please suggest.
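    If you want to keep an eye on the job while it runs, you can attach to it from another session and ask for its status; a sketch, assuming the same EXPTEST account and job name shown in your log:
    $ expdp exptest/password@curentdb attach=SYS_EXPORT_FULL_02
    Export> status
    You can also watch DBA_DATAPUMP_JOBS and V$SESSION_LONGOPS to see whether the worker is actually progressing or is stuck on one object type.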

  • How to take a DB dump for "virtual columns" enabled env in Oracle 11g

    Hi,
    Could you please let me know the procedure/steps to take a DB dump of a "virtual columns" enabled environment in Oracle 11g?
    I am not able to take the database dump using the 'exp' tool.
    Thanks,
    Satya Aditham

    Wrong forum, this is a Secure Backup specific forum, not an RDBMS/RMAN forum.
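    On the original question, for what it's worth: the classic exp utility does not understand several 11g-only features, and as far as I know it skips tables with virtual columns, so Data Pump is the tool to use; the virtual column definition travels as table metadata and its values are recomputed rather than unloaded. A minimal sketch with placeholder names:
    $ expdp system/password SCHEMAS=myschema DIRECTORY=dpump_dir DUMPFILE=myschema.dmp LOGFILE=myschema_exp.log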

  • Dbms_datapump in PL/SQL - the dump file deleted after session disconnect

    Hi all
    I created a test procedure to export and import a table to a dump file. It works, but when I close the session (re-login to PL/SQL Developer under other schemas) the dump file is automatically deleted. How can I make dbms_datapump not auto-delete the dump file when the session disconnects?
    declare
    handle1 NUMBER;
    handle2 number;
    name1 VARCHAR2(30);
    name2 VARCHAR2(30);
    jobState user_datapump_jobs.state%TYPE;
    begin
    name1:='abc_tbl1'|| to_char(SYSDATE,'YYYYMMDDHH24MI')||'.dmp';
    name2:='abc_tbl2'|| to_char(SYSDATE,'YYYYMMDDHH24MI')||'.dmp';
    handle1 := dbms_datapump.open('EXPORT','TABLE');
    dbms_datapump.add_file(handle1,name1,'DUMP');
    dbms_datapump.metadata_filter(handle1,'SCHEMA_EXPR','IN (''ABC'')');
    dbms_datapump.metadata_filter(handle1,'NAME_EXPR','IN (''abc_tbl1'')');
    dbms_datapump.set_parallel(handle1,4);
    dbms_datapump.start_job(handle1,0);
    dbms_datapump.detach(handle1);
    end;
    thank

    Thanks for your help, I fixed this one, just changed the code like this :)
    declare
    handle1 NUMBER;
    handle2 number;
    name1 VARCHAR2(30);
    name2 VARCHAR2(30);
    jobState user_datapump_jobs.state%TYPE;
    r_stat1 VARCHAR2(1);
    r_stat2 VARCHAR2(1);
    date1 DATE;
    date2 DATE;
    fcc_date DATE;
    begin
    --take name of dump
    name1:='abc_tbl1'|| to_char(SYSDATE,'YYYYMMDDHH24MI')||'.dmp';
    name2:='abc_tbl2'|| to_char(SYSDATE,'YYYYMMDDHH24MI')||'.dmp';
    ---take the date
    handle1 := dbms_datapump.open('EXPORT','TABLE');
    dbms_datapump.add_file(handle1,'lampp.log','DUMP',NULL,dbms_datapump.ku$_file_type_log_file);
    dbms_datapump.add_file(handle1,name1,'DUMP');
    dbms_datapump.metadata_filter(handle1,'SCHEMA_EXPR','IN (''ABC'')');
    dbms_datapump.metadata_filter(handle1,'NAME_EXPR','IN (''ABC_TBL1'')'); --- input with upper character
    dbms_datapump.set_parallel(handle1,4);
    dbms_datapump.start_job(handle1,0,0);
    dbms_datapump.wait_for_job(handle1, jobState);
    dbms_datapump.detach(handle1);
    end;

  • Getting Datapump Export Dump file to the local machine

    I apologize to everyone as this is a duplicate post.
    Re: Getting Datapump Export Dump file to the local machine
    My initial thread (started yesterday) was in 'Database General' and didn't get much response. Where do I post questions on the EXPORT/IMPORT utilities?
    Anyway, here is my problem:
    I want to take an export dump of the itemrep schema in the orcl database (on a remote machine). I have an Oracle server (10g Rel 2) running on my local Windows machine. I have created a user john with the necessary EXPORT/IMPORT privileges in my local DB. Then I created a directory object, i.e. a folder named datapump on my local hard drive, and granted READ, WRITE privileges to john.
    So john, who is a user in my local machine's Oracle DB, is going to run the expdp utility:
    expdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep directory=datapump logfile=itemrepexp.log
    The above command will fail because it will look for the itemrep schema inside my local DB, not the remote DB where itemrep is actually located. And you can't qualify the schema name with its DB in the SCHEMAS parameter (like SCHEMAS=itemrep@orcl).
    Can anyone provide me a solution for this?

    I think you can initiate the Data Pump export utility from your client machine to export a schema in a remote database. But upon execution, Oracle looks for the directory in the remote database and not on your local machine.
    You're invoking expdp from a client (local DB) to export data from a remote DB.
    So, with this method, you can create the dump files only on the remote server and not on the local machine.
    You can perform a direct import instead of an export using the NETWORK_LINK option.
    Create a DB link from your local DB to the remote DB and verify the connection.
    Then initiate impdp from your local machine's DB using the parameter network_link=<db_link of the remote DB> to import the schema.
    The advantage of this option is that it eliminates the dump file creation on the server side.
    There are no dump files during the import process; the data is imported directly into the target schema.
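    A sketch of that NETWORK_LINK approach, assuming a DB link named ORCL_LINK pointing at the remote database and the local directory object DATAPUMP already granted to john; the link name and password are placeholders:
    SQL> create database link orcl_link connect to itemrep identified by password using 'orcl';
    $ impdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep NETWORK_LINK=orcl_link DIRECTORY=datapump LOGFILE=itemrep_imp.log
    Add REMAP_SCHEMA=itemrep:john if the objects should land under john's local schema, and note that the directory object is only needed for the log file here, since no dump file is written.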

  • Best way to import a 200GB single dump file

    I was given a single 200GB dump file containing a full export of a schema. Can anyone please tell me the best way to import such a huge dmp file? I need to get this import done ASAP in QA for testing, which will let us solve some production issues. Step-by-step instructions, if possible, would be really helpful so I can complete my task quickly.
    Thanks in Advance,
    krim.

    Hi Krim,
    Our dump files are normally never that big, so you may be facing some other issue here.
    If your dump was a full DB schema dump like:
    $ exp username/password file=full.dmp parameter-list
    then the import should first drop the user in the target system:
    SQL> drop user username cascade;
    This drops the existing schema before importing. Then recreate the user according to your requirements and run:
    $ imp username/password file=full.dmp full=y commit=y ignore=y
    I don't know which environment you have to run this in, but in our case, for instance, using an 8 x 1.6 GHz Itanium2 Montecito with an EMC CLARiiON disk array, a 14 GB dump takes about a couple of hours to import. It's also true that Oracle imp (did you use exp or expdp?) is, as far as I understand, not able to achieve parallelism like impdp, which could speed up the import time in the case of multiple huge tables.
    Another thing you may want to check is whether you have archive logging on, since the import will log there, consuming time.
    Cheers,
    Mike
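    If the dump was actually produced with expdp rather than exp, a Data Pump import with parallel workers is usually the faster route for a file this size; a rough sketch with placeholder names (PARALLEL requires Enterprise Edition and works best when the export was written to multiple dump files):
    $ impdp system/password DIRECTORY=dpump_dir DUMPFILE=full.dmp LOGFILE=full_imp.log SCHEMAS=myschema PARALLEL=4
    If it really is a classic exp dump, the usual imp tuning knobs are a large BUFFER value, COMMIT=Y, and importing with INDEXES=N and building the indexes afterwards from an INDEXFILE script.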

  • DB Dump very slow

    Hi,
    We have a customer that uses a 10g Oracle DB server. We normally support only the hardware and the operating system, but he asked me to help.
    The problem is that the backup script (a USER dump) takes over 24 hours.
    Is there any backup software that does this job faster?
    Or can I compress or reorganize the database? Would that help?
    Why does it take longer and longer? The database is not much bigger than it was 2 weeks ago.
    Regards

    1005989 wrote:
    Hi,
    We have a customer that uses a 10g Oracle DB server. We normally support only the hardware and the operating system, but he asked me to help.
    The problem is that the backup script (a USER dump) takes over 24 hours.
    Is there any backup software that does this job faster?
    Or can I compress or reorganize the database? Would that help?
    Why does it take longer and longer? The database is not much bigger than it was 2 weeks ago.
    Regards
    Welcome to the forum!!
    Please read How do I ask a question on the forums?
    SQL and PL/SQL FAQ
    From what you posted I understand that you are trying to take a schema backup using exp or expdp.
    If you are taking a user backup, use exp or expdp depending on the Oracle version you use.
    RMAN is the best tool for taking database backups, and it is totally free, as are exp and expdp.
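    To make that concrete: if the nightly job really is a logical user dump, Data Pump with parallel workers and multiple dump files is usually much faster than the old exp; a rough sketch with placeholder names (PARALLEL requires Enterprise Edition, and the %U template spreads the output over several files):
    $ expdp system/password SCHEMAS=appuser DIRECTORY=dpump_dir DUMPFILE=appuser_%U.dmp PARALLEL=4 LOGFILE=appuser_exp.log
    For a real backup of the whole database, though, RMAN is the right tool, e.g. a plain "BACKUP DATABASE PLUS ARCHIVELOG" run from the rman client.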

  • Limitation on the number of tables in a Database Schema!

    Hi All,
    Is there a limitation on the number of tables in a schema that can be fetched in Designer while creating a universe?
    The customer is using an Oracle schema which contains 22,000 tables, and while trying to insert tables in Designer XI R2 (or trying to open the table browser), Designer hangs.
    In BO 6.5 there are no issues retrieving the tables from the schema in Designer.
    Is there a way to retrieve only a certain number of tables in Designer XI R2?
    Thanks for your help!

    Hi Ganesh,
    Following are the answers regarding your queries:
    Query 1:
    There is no limitation on the number of components (objects, classes, tables, joins, hierarchies, LOVs, etc.) in a universe. But of course, as the number of components increases, you could run into performance problems.
    This depends on the available RAM and the processing speed.
    Query 2:
    There is no such option to select the number of tables to be automatically inserted into the universe. Suppose you have 22,000 tables in the DB and want only 1,000 of them: if you entered 1,000 as the value, how would Designer know which tables you want to take from the schema to build the universe?
    It all depends on the DBA and the universe designer which tables are important for the organization's reporting needs.
    When you create a connection to the DB, the connection will fetch all tables from the database, and we can't limit that DB data retrieval.
    I hope this helps...
    Thanks...
    Pratik

  • Open PO / SA Dump in SAP 4.7

    Hi All,
    Please suggest the best way to take a dump of open POs. The open POs are to be converted into new POs with a new material master. Which is the best option? Please let me know (SAP 4.7).
    ME2M with selection parameter WE101 in an ALV grid, or can we take a dump from the tables?

    closed
