Question: schema export via Oracle Data Pump with Database Vault enabled

Hi,
I have installed and configured Database Vault on an Oracle 11gR2 (11.2.0.3) database to protect a specific schema (SCHEMA_NAME) via a realm. I followed this document:
http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
That is, I granted SYS and SYSTEM the following:
execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
To allow SYS and SYSTEM to maintain indexes for realm-protected tables, I have also created a second realm on the same schema (SCHEMA_NAME). This separate realm covers all three index object types (Index, Index Partition, and Indextype), and SYS and SYSTEM have been authorized as OWNER in it, as sketched below.
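For completeness, the realm definition was along these lines (a sketch reconstructed from the description above using the 11.2 DBMS_MACADM API; the realm name is an assumption):
BEGIN
  DBMS_MACADM.CREATE_REALM(
    realm_name    => 'SCHEMA_NAME Index Realm',   -- assumed name
    description   => 'Lets SYS/SYSTEM maintain indexes on realm-protected tables',
    enabled       => DBMS_MACUTL.G_YES,
    audit_options => DBMS_MACUTL.G_REALM_AUDIT_FAIL);
  -- cover all three index object types in the protected schema
  DBMS_MACADM.ADD_OBJECT_TO_REALM('SCHEMA_NAME Index Realm', 'SCHEMA_NAME', '%', 'INDEX');
  DBMS_MACADM.ADD_OBJECT_TO_REALM('SCHEMA_NAME Index Realm', 'SCHEMA_NAME', '%', 'INDEX PARTITION');
  DBMS_MACADM.ADD_OBJECT_TO_REALM('SCHEMA_NAME Index Realm', 'SCHEMA_NAME', '%', 'INDEXTYPE');
  -- authorize SYS and SYSTEM as realm owners
  DBMS_MACADM.ADD_AUTH_TO_REALM('SCHEMA_NAME Index Realm', 'SYS', NULL, DBMS_MACUTL.G_REALM_AUTH_OWNER);
  DBMS_MACADM.ADD_AUTH_TO_REALM('SCHEMA_NAME Index Realm', 'SYSTEM', NULL, DBMS_MACUTL.G_REALM_AUTH_OWNER);
END;
/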
However, when I try to run an Oracle Data Pump export of the schema, I get two errors directly after the following line in the export log:
Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
The export completes, but with these errors.
Any help, suggestions, or pointers would be very welcome at this stage.
Thank you

Hi Srini,
Thank you very much for your help. Unfortunately, after following the instructions in the doc I am still getting the same errors.
Nonetheless, thank you for your input.
I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum, as I feel I may have posted in the wrong place: this appears to be a Database Vault issue rather than an imp/exp problem.

Similar Messages

  • Oracle data pump vs import/export utility

    Hello all,
    What is the difference between Oracle Data Pump and the Import/Export utility? Which one is faster?

    What is the difference between Oracle Data Pump and Import/Export utility?
    Unwilling or unable to Read The Fine Manual yourself?
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
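    For quick orientation, the two tools are invoked quite differently (a minimal sketch; the directory object DP_DIR for expdp is an assumption). The original exp/imp utilities run client-side, while Data Pump runs server-side and is generally much faster for large jobs:
    $ exp scott/tiger owner=scott file=scott.dmp
    $ expdp scott/tiger schemas=scott directory=DP_DIR dumpfile=scott.dmp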

  • Using Data Pump when the database is read-only

    Hello
    I used Flashback Database to take my database back to a past point in time, then opened it read-only.
    Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-16000: database open for read-only access
    but I could export that schema with the old exp utility.
    My question is: can't I use Data Pump while the database is read-only? Or do you know of any resolution for the issue?
    thanks

    You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read-only database over a database link:
    SYSTEM@db_rw> create database link db_r_only
      2   connect to system identified by oracle using 'db_r_only';
    $ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
    but I tried it with 10.2.0.4 and found an error:
    Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-02054: transaction 1.36.340 in-doubt
    ORA-16000: database open for read-only access
    ORA-02063: preceding line from DB_R_ONLY
    ORA-39097: Data Pump job encountered unexpected error -2054
    I found bug 7331929 in Metalink, which is solved in 11.2! I haven't tested this procedure with prior versions or with 11g, so I don't know whether the bug affects only 10.2.0.4 or also 10.x and 11.1.x.
    HTH
    Enrique
    PS. If your problem was solved, consider marking the question as answered.

  • Data Pump with parallel data ending up in 1st file!

    Oracle 10g 10.2.0.3 EE
    Ran the following command on a 16 core HPUX PA-RISC machine:
    expdp normaluser/password@RDSPOC FULL=y directory=DMPDIR parallel=12 dumpfile=exp_RDSPOC_2nd_%U.dmp logfile=exp_RDSPOC_2nd.log
    Database size: approx 900 GB of data.
    All things looked good at first: all cores close to 100% utilized, disk also at 100% utilized.
    1h23m later I had 11 files, all about the same size (around 40 GB each).
    The first dump file continued to grow. After another 60 hours the first file is approx 500 GB and growing (status says 85% complete).
    One core is running at max, and disk utilization is about 15%.
    Note: I am not SYS but a normal user with the full export privilege (if that could make a difference).
    How do I get it to keep the machine running all cores and disks as hard as possible?
    Thanks

    Hi,
    Metadata is never unloaded in parallel, but it is sometimes loaded in parallel.
    See Parallel Capabilities of Oracle Data Pump (Doc ID 365459.1).
    Also check the status of expdp: if there is, for example, one big table, only one worker will still have data to pump.
    HTH,
    Peter
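    One way to see what each worker is doing is to attach to the running job and ask for status (a sketch; the job name is whatever your expdp banner reports, e.g. SYS_EXPORT_FULL_01):
    $ expdp normaluser/password@RDSPOC attach=SYS_EXPORT_FULL_01
    Export> status
    The STATUS output lists every worker and the object it is currently unloading, so you can quickly tell whether the whole job is waiting on one large table.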

  • Oracle Data Pump

    What are the prerequisites for using Oracle Data Pump Export and Import? For example: configuration parameters in the initialization files, permissions, and other to-dos.
    Thanks

    No, nothing like that.
    You need to have Oracle 10g and a directory object created to hold the dump files.
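    For example (a minimal sketch; the path, directory name, and user are assumptions):
    SQL> CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dumps';
    SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;
    $ expdp scott/tiger schemas=scott directory=dp_dir dumpfile=scott.dmp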

  • How to recreate enterprise manager with database vault

    I'm testing the Oracle Database Vault option on database version 11.1.0.7, but some things do not work correctly in the test. One of them is that I am not able to recreate the Enterprise Manager repository. After probing several ways with the Database Vault option enabled, I decided to disable it. With Database Vault disabled I recreated Enterprise Manager OK, but after enabling Database Vault again, the Database Vault Administrator page no longer loads for me:
    Firefox reports an error with the resource /dva.
    I hope you can help me.

    When you have Vault on, do you get errors in the realm audit reports?
    Or are you trying to create an OEM repository in a Vault-enabled database?

  • Grant role DBA with Database Vault

    Hi all,
    I need help granting the DBA role to a user with the Database Vault option installed. I created a user account, and I need this user to be able to do everything that the regular DBA role can do. I can't find a way to do this in Database Vault; any help will be appreciated.
    Thanks!

    SYSDBA can issue powerful statements such as CREATE USER, DROP USER, ALTER USER, CREATE PROFILE, and so on only if this is allowed by modifying the 'Can Maintain Accounts/Profiles' rule set.
    You can also log in with the DVSYS account, but that account is locked after installation, so unlock it with the command ALTER USER username ACCOUNT UNLOCK;. Be aware that ANY system privileges are blocked in protected schemas. You can try granting the following Database Vault roles: DV_OWNER, DV_REALM_OWNER, DV_REALM_RESOURCE, DV_ADMIN, DV_PUBLIC, DV_ACCTMGR, DV_SECANALYST.
    The following queries can help you:
    SELECT TABLE_NAME, OWNER, PRIVILEGE FROM DBA_TAB_PRIVS WHERE GRANTEE = 'DV_ACCTMGR';
    SELECT PRIVILEGE FROM DBA_SYS_PRIVS WHERE GRANTEE = 'DV_ACCTMGR';
    Regards
    Karan

  • An Oracle data pump situation that I need help with

    Oracle 10g running on Sun Solaris:
    I have written a Unix shell script that exports data into a dump file (a Data Pump export using expdp). Similarly, I have an import script that imports the data from the dump file (using impdp). These are not schema exports. In other words, we have logically divided our schema into 4 different groups based on their functionality (group1, group2, group3 and group4). Each of these groups consists of about 30-40 tables. For expdp, there are 4 parfiles: group1.par, group2.par, group3.par and group4.par. Depending on what parameter you pass while running the script, the respective par file will be picked and an export will be done.
    For example,
    While running,
    exp_script.ksh group1
    will pick up group1.par and will export into group1.dmp.
    Similarly,
    exp_script.ksh group3
    will pick up group3.par and will export into group3.dmp.
    My import script also needs the parameter to be passed to pick the right dmp file.
    For example,
    imp_script.ksh group3
    will pick up group3.dmp and import every table that group3.dmp has. (Here the import process does not need par files; all it needs is the right dmp file.)
    I am now faced with a difficulty where I must use Oracle's DBMS_DATAPUMP API to achieve the same thing (and not expdp and impdp). I haven't used this API before. How best can I use this API to stick with my above strategy? That is, I eventually need group1.dmp, group2.dmp, group3.dmp and group4.dmp, and I will need to pass the table names that each group contains. How do I do this using this API? Can someone point me to an example, or perhaps suggest an approach?
    Thanks

    Or at least, how do you do a table export using DBMS_DATAPUMP?
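    A table-mode export with the API looks roughly like this (a minimal sketch, assuming a directory object DMPDIR and hypothetical table names; in your script you would build the NAME_EXPR list from each group's table list):
    DECLARE
      h         NUMBER;
      job_state VARCHAR2(30);
    BEGIN
      -- open a TABLE-mode export job (the API equivalent of a tables= export)
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE', job_name => 'GROUP1_EXPORT');
      -- dump file and log file both go to the DMPDIR directory object
      DBMS_DATAPUMP.ADD_FILE(h, 'group1.dmp', 'DMPDIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE(h, 'group1.log', 'DMPDIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- restrict the job to the tables that belong to this group
      DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR', 'IN (''TABLE_A'',''TABLE_B'',''TABLE_C'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
      DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
    END;
    /
    The import side is symmetrical: open with operation => 'IMPORT' and point ADD_FILE at the existing dump file.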

  • How to run Oracle data pump export in silent mode

    When I run the expdp command for a data export, I get the following informational messages:
    Export: Release 10.2.0.1.0 - 64bit Production on Wednesday, 26 March, 2008 15:33:49
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "TEST"."SYS_EXPORT_TRANSPORTABLE_01": test/********@mqa dumpfile=data.dmp directory=export TRANSPORT_TABLESPACES = client_tables,client_indexes logfile=export.log
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
    Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
    Master table "TEST"."SYS_EXPORT_TRANSPORTABLE_01" successfully loaded/unloaded
    Dump file set for TEST.SYS_EXPORT_TRANSPORTABLE_01 is:
    /export/home/oracle/TQAImport/DataFiles/data.dmp
    Job "TEST"."SYS_EXPORT_TRANSPORTABLE_01" successfully completed at 15:34:53
    Can I avoid these messages and run it in silent mode?

    I am not really sure about this, but try using the LOGFILE option. I can't recall what happens when it is used, whether the output goes to the logfile only or to both places, but in addition to Paul's solution, which is more accurate, you can give this a try as well.
    http://download-west.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#sthref133
    Aman....
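    For what it's worth, a common shell-level approach (a sketch, not from this thread) is to keep the LOGFILE for the details and redirect the console output away:
    $ expdp test/password@mqa parfile=export.par logfile=export.log > /dev/null 2>&1
    The messages still land in export.log; only the terminal stays quiet.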

  • Data pump with flashback_scn or flashback_time

    Dear Gurus,
    The Oracle database version in use is 11gR2. We don't have Flashback enabled for the database. However, to run a Data Pump export with consistency, can we turn on FLASHBACK_SCN or FLASHBACK_TIME?
    Best
    rac110g

    How to set the parameter:
    The logical conclusion, and a field-tested best practice approved by Oracle Support, is to set the UNDO_RETENTION parameter to at least the estimated duration of the Data Pump job. Don't forget to size your UNDO tablespace accordingly, since the retention only works as long as there is enough undo space available.
    Note:
    IMPDP uses flashback technology (flashback table) on the source database to achieve consistency, so the UNDO tablespace there is worth a glance as well.
    Check this link for expdp/impdp undo requirements and possible problems:
    http://www.usn-it.de/index.php/2010/05/05/oracle-impdp-ora-1555-and-undo_retention/
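    To illustrate the parameter itself (a minimal sketch; directory, schema, and timestamp are assumptions, and a parfile avoids shell-quoting trouble):
    $ cat exp_consistent.par
    schemas=hr
    directory=dp_dir
    dumpfile=hr.dmp
    flashback_time="TO_TIMESTAMP('2013-03-26 22:00:00','YYYY-MM-DD HH24:MI:SS')"
    $ expdp system/password parfile=exp_consistent.par
    Every table is then exported as of that single point in time, which is what makes the dump consistent.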

  • How to access two Oracle databases without a DB link

    Hi,
    I have two database schemas, one held in Oracle 10g and the other in 11g. Currently I am using a DB link to access both databases; I am accessing around 70 tables over the link.
    Per a new requirement I have to remove the DB link. Is there any other way I can access both databases without a DB link?
    Thanks,

    malarkandy wrote:
    I have two database schemas, one held in Oracle 10g and the other in 11g. Currently I am using a DB link to access both databases; I am accessing around 70 tables over the link. Per a new requirement I have to remove the DB link. Is there any other way I can access both databases without a DB link?
    Yes. But that needs another network and application interface instead of a database link.
    The target database can implement PL/SQL web services. These can be called via HTTP.
    The local database can use UTL_HTTP to make a call to the target database's web service for accessing target database data.
    Both databases can use AQ (Advanced Queues). This can be integrated with an application server's JMS queues.
    This enables one database to enqueue a request to the other database, and that database to respond by sending data.
    There are numerous such type of interfaces that can be used. Some primitive. Some advanced. Some simple. Some complex.
    The easiest and most flexible mechanism is, however, a database link. And it does not make sense to disallow database links for security and management reasons while allowing JMS, for example. The network infrastructure is the same for both methods. The network protocol is the same. The security issues and concerns are the same, except that JMS, for example, involves significantly more moving parts and makes configuration and security a lot more complex than a database link.
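    To make the web-service option concrete, the caller side can be as small as this (a minimal sketch; the URL and service are hypothetical, and SET SERVEROUTPUT ON is assumed):
    DECLARE
      req  UTL_HTTP.REQ;
      resp UTL_HTTP.RESP;
      buf  VARCHAR2(32767);
    BEGIN
      -- call the remote database's PL/SQL web service over HTTP
      req  := UTL_HTTP.BEGIN_REQUEST('http://remote-host:8080/ws/get_customer?id=42');
      resp := UTL_HTTP.GET_RESPONSE(req);
      BEGIN
        LOOP
          UTL_HTTP.READ_TEXT(resp, buf);  -- read the response body in chunks
          DBMS_OUTPUT.PUT_LINE(buf);
        END LOOP;
      EXCEPTION
        WHEN UTL_HTTP.END_OF_BODY THEN
          UTL_HTTP.END_RESPONSE(resp);
      END;
    END;
    /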

  • "Oracle Data Integrator" with "Total Recall"

    Hi all,
    We are planning to use Oracle Data Integrator 11g to perform ELT in an Oracle 11g database. We are also planning to enable the Total Recall (flashback) technology and house all our tables on it.
    The question I have in mind right now is: will ODI and Total Recall work well together?
    Background
    Say we have an interface defined with the target data store on a tablespace with flashback enabled. Say there are 100 rows in the source, of which 10 rows violate a check constraint. The "bad" data violating the constraint will be moved to the E$ table while the remaining 90 rows are loaded into the target.
    Questions:
    1) If our business rule dictates zero tolerance for errors and a ROLLBACK is issued, what will happen to the data in the E$ table?
    2) Say we have committed the 90 rows and want to use a flashback transaction query to undo the changes; how will it affect the E$ table?
    3) Will the rows be deleted from the E$ table along with the rolling back of the changes in the target?
    4) If the errors in E$ are recycled and this interface is restarted after the rollback is performed, will the I$ table contain 110 rows, i.e. source data plus data from E$?
    5) How does ODI handle recycling/reprocessing of the violations in the E$ table?
    Please advise.
    Thank you.
    CC

    1.) The data in E$ will remain there
    2.) The data in E$ will remain there
    3.) The data in E$ will remain there
    4.) 90 rows. The recycled rows will still error out.
    5.) To me the recycling feature is pretty lame. You need to fix the errors in the E$ table, and then recycling will load the data.

  • Oracle Data Pump - Table Structure change

    Hi,
    We have a daily partitioned table, and for backup we are using Data Pump (expdp). Our policy is to drop each partition after it is backed up (archived).
    We have archived dump files going back one year. A few days ago a developer changed the table structure and added a new column.
    Now we are unable to restore old partitions. Is there a way to restore a partition if a new column has been added to, or dropped from, the current table?
    Thanks
    Sachin

    If a new column has been added to the table, you can import only the data from the old structure into the new structure. Use the parameter CONTENT=DATA_ONLY.
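    A minimal sketch of such a restore (directory, dump file, and table names are assumptions; the new column must be nullable or have a default, since the old dump carries no values for it):
    $ impdp system/password directory=dp_dir dumpfile=part_archive.dmp tables=sachin.daily_tab content=DATA_ONLY table_exists_action=append
    TABLE_EXISTS_ACTION=APPEND keeps the rows already in the table and adds the archived rows back in, leaving the new column NULL for them.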

  • Oracle Data Access with Delphi - Please help

    Hello: We would like to access Oracle Applications data using a third-party tool. I understand we can use the Delphi development environment to access Oracle data. Can somebody help with how we do this? Any online tutorial or white paper would be useful. Basically, we want some simple UI screens to fetch Oracle Applications data. Thanks in advance for your help.

    Google: "Delphi" and "Oracle" and "connection"

  • Data Pump using 11g2 and 11g1 question during migration

    Our DBE put our test database on 11gR2 and our production on 11gR1. Due to an ingest failure we wanted to move the data in test (11.2) to production (11.1) using Data Pump; however, I was told that you cannot go from 11.2 to 11.1. I was also told that, because the database contained public synonyms, I would have to recreate all the public synonyms. He said it had something to do with lbascys. Can someone clarify this for me?

    user11171364 wrote:
    ... Can I still use these parameters during the import ONLY without having used them during the export?
    Nope, read the restriction: during the import, "This parameter is valid only when the NETWORK_LINK parameter is also specified."
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref299
    Consequently, you cannot use it within your dumpfile.
    Nicolas.
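    On the original 11.2-to-11.1 question: the usual route is to run the export with the VERSION parameter so the dump file is written in a format the older release can read (a sketch; schema and directory names are assumptions):
    $ expdp system/password schemas=app_schema directory=dp_dir dumpfile=app.dmp version=11.1
    An 11.1 impdp can then import that file.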
