EUL owner or SYSADMIN for export/import?

Hi
I read in the Oracle user guide that it is better for the SYSADMIN user to do the import/export rather than the EUL owner.
Can anyone advise on this? Which is better for handling Discoverer administration duties, EUL_US or SYSADMIN?
It looks like the previous implementation used EUL_US.
thanks
kp

Hi kp
In an Apps mode EUL it is better to do all management using either the SYSADMIN account or some other applications account specifically set up for the purpose. Being an Apps account it will have full access to other Apps accounts and responsibilities. The database user EUL_US will not have this access.
Best wishes
Michael

Similar Messages

  • Export/Import with apps user or EUL schema owner ?

    Hi,
    I am working with a migration plan to move Discoverer 9 to Discoverer 10.
    It is an apps mode EUL, where business areas and workbooks have been granted to Apps Users & Responsibilities.
    Customer has only maintained the EUL with the database user of the EUL owner.
    What is recommended for exporting/importing the EUL from 9 to 10.
    To connect to Administrator as the EUL owner, or to connect as an apps user ( SYSADMIN ? )
    Will all the grants to apps users/responsibilities work when importing as schema owner ?
    I tried to create an entire EUL export file connected as the EUL owner, and for the workbooks the export log contained:
    1234#.Workbook name1
    2345#.Workbook name 2
    1234 is the user_id for the workbook owner ( All workbooks have been created by an apps user )
    Will the import process manage to convert this user_id, and set the correct owner for the imported workbooks ?
    ( I will use the option Only take ownership if original owner not found )
    I am not sure, but I think I have seen the following syntax in other projects when exporting an apps mode EUL
    SYSADMIN.Workbook name 1
    SYSADMIN.Workbook name 2

    >
    What is recommended for exporting/importing the EUL from 9 to 10.
    To connect to Administrator as the EUL owner, or to connect as an apps user ( SYSADMIN ? )
    Will all the grants to apps users/responsibilities work when importing as schema owner ?
    Hi,
    The best I know is that you should export using the database user, so that you will not have problems with workbooks or business areas that you are not granted access to (or any other grants or privileges).
    The import should be done using the Apps user (the super user you use).
    That way, if you import a workbook whose owner cannot be found, you take ownership of it and can deal with it after the migration.
    If you do that with the database (EUL owner) user, after migration you will not have access to those workbooks.
    In any case, regarding the workbooks, I suggest you also save them as .DIS files in case the import of the workbooks or the owner association for them fails.
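    For what it's worth, a rough sketch of that split (export as the database EUL owner, import as the Apps super user) from the Discoverer Administrator command line might look like the two commands below. The user names, passwords, TNS alias and file names are placeholders, and the exact switches should be checked against the Administration Guide for your Discoverer version.
    Export the whole EUL while connected as the database EUL owner:
    dis51adm /connect eul_owner/password@db /export full_eul.eex /all /log export.log
    Import into the target EUL while connected as an Apps super user, so that any workbook whose owner is not found falls to that account:
    dis51adm /connect sysadmin/password@db /apps_user /apps_responsibility "System Administrator" /import full_eul.eex /log import.log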

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2 and the OS is Solaris on Oracle databases.
    I am unfortunately not experienced with Discoverer and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed this was a memory problem or a slow network issue causing the delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server to try to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev 11 system, but I dismissed this as the dev reports were not working and the only reliable EUL is the prod one. I managed to get a prod eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a database dump of EUL_US and importing this into my R12 dev environment. I managed this, but had to export the database file using SYS, which alleviated a privilege problem when logging in. I have reports that run and my user can see the reports, but reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the SYSADMIN ones.
    I refreshed the BAs using a shell script I made up which uses the java command with parameters.
    After some re-reading I tried selecting all the options in the Validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum is: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges, and also some of the imported tables appeared in the target environment under the APPS schema (21 tables) even though the EUL_US export had been 48 tables.
    I also had a problem on the database with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the EUL_US database privileges and was unable to resolve this initial privilege error even though the privileges were granted to EUL_US.
    The DBA managed to export as SYSTEM and then import it with the full=y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made a list of the business areas, and wrote a shell script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in, select a business area and grant access to the users. The reports return data.
    Then one of the users said she couldn't see all her reports. When I opened Desktop I noticed some sitting there prefixed with a hash and her version 11 user_id.
    So back to the manuals, and in the Discoverer Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the database dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick
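    For anyone following the same route, the database-level copy plus the re-identification step Nick describes might look roughly like this. It uses an owner-level export rather than the full=y export the DBA ended up with; all connect strings, TNS aliases and file names are placeholders, and eul5_id.sql is the script shipped with Discoverer Administrator.
    exp system/password@source file=eul_us.dmp log=eul_us_exp.log owner=EUL_US
    imp system/password@target file=eul_us.dmp log=eul_us_imp.log fromuser=EUL_US touser=EUL_US
    sqlplus eul_us/password@target @eul5_id.sql
    After that, the business areas can be refreshed with the java eulbuilder command shown above.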

  • Copy Folder with Joins. Export/Import Folder with joins. In EUL.

    Ok, I've got a custom folder which has been made up by dragging items from 2 or 3 other folders into it.
    It then has some joins of its own, quite a few.
    When trying to create a workbook from it, it takes 9 mins to run a query.
    I need to work out what is slowing it down. If I create the same workbook against the folder which has the majority of the items in the custom folder, it runs instantly.
    So I suspect it is one of the joins causing it.
    My plan was to duplicate the folder, then remove joins until I find out which one is causing it.
    However, if I cut n paste the folder, I get a copy without the joins.
    If I export the folder and import it I get a copy without the joins.
    Question then - how can I get a copy of a folder WITH the joins ?
    I'm slightly concerned that when I export my EUL from the dev database and import it into the live database I'm not going to get any joins, since the export and import into the dev database is not retaining the joins.
    Anyone ?

    Hi,
    The preferences for Disco Plus are set in the pref.txt file on the apps server and for Disco Desktop in the Windows Registry. I think the defaults are set on so unless you have changed them this is unlikely to help.
    I think I read somewhere that the 11g optimiser will remove unused outer joins or where there is a foreign key constraint. I may have made that last bit up as I cannot find a reference to it, but it may be worth exploring.
    To speed things up you could look at why this join is slowing things down. It could be that you need an index on the join column.
    The join actually is used, in that it has to check in the other table that a record exists. This is why Discoverer cannot remove the join from the complex folder query. If it did, and there were no matching records in the other table, then you would get a different result.
    Rod West
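    If you want to check whether the join column is indexed before pulling joins out one by one, a dictionary query along these lines (run as the table owner, or against ALL_IND_COLUMNS with an owner filter added) is a quick start; the table and column names here are hypothetical placeholders.
    sql> select index_name, column_name from all_ind_columns where table_name = 'ORDER_LINES' and column_name = 'ORDER_ID';
    If nothing comes back, an index on the join column may be worth testing.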

  • Suggestions needed for Export/Import Business Area Problem

    I'm trying to export/import a Business Area from one instance to another, and I get the following error during the import process.
    Database Error - ORA-00001: unique constraint (EUL4_US.EUL4_EXP_PK) violated
    The Business Area I'm trying to import doesn't exist in the new instance. In the import wizard I checked refresh existing objects and match by identifiers.
    I think my target instance has objects with the same identifiers, so I went ahead and changed the identifiers in my source, but the problem still persists. How can I correct this error?
    Thanks

    Did you happen to copy the EUL from one database to another before you were importing/exporting the BA?
    Discoverer recognizes EULs using unique reference numbers. However, if you use the database export and import utilities to copy an EUL, the new EUL (including its reference number) will be identical to the original EUL. When EULs have the same reference number, EUL consistency issues can arise if you do both of the following:
    * If you modify objects in both the original EUL and in the new EUL
    * Having modified objects in both EULs, if you then attempt to copy objects between the two EULs using the Discoverer Export Wizard and Import Wizard (or the Discoverer /export and /import commands)
    To avoid potential EUL consistency issues, run the eul5_id.sql script as the owner of the new EUL. The eul5_id.sql script gives a new reference number to the new EUL and thereby avoids any potential EUL consistency issues.
    Reference
    * How to import an EUL using the standard database import utility
    http://download-west.oracle.com/docs/html/B13916_04/maintain_eul.htm#i1008632
    For 4i, you will need to run:
    sql> update EUL4_VERSIONS set VER_EUL_TIMESTAMP =TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS');
    sql> commit;
    Some similar EUL inconsistencies are documented in:
         Note.267476.1     Discoverer Workbook Fails With 'Cannot join tables.Item dependency "" not found in EUL' After EUL Migration, Importing Or Cloning
    You might also want to check whether the EUL version timestamps are the same.
    ~Steve.
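    To compare the version timestamps Steve mentions, a quick check in each EUL owner's schema could look like this; it assumes the same EUL4_VERSIONS table used in the update statement above (for a 5.x EUL the corresponding table is EUL5_VERSIONS).
    sql> select ver_eul_timestamp from eul4_versions;
    If the value comes back identical in both EULs, that points to the copied-EUL situation described above, and the eul5_id.sql / update step is what gives the copy its own identity.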

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utilities so that we can keep the current 8i database intact. The 8i and 9i databases will also reside on the same machine. Our 8i database size is around 300GB.
    We plan to follow below steps :
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
    Please let me know if the par file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

    I recommend you change some parameters and remove others:
    BUFFER=560000
    COMPRESS=y              -- this gives a better storage structure (it is good)
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n                -- if you set this parameter to yes you can have problems with some objects
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y                -- this value is the default (it is not necessary)
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y     -- start the database in restricted mode and do not set this param
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE     -- this value is the default (it is not necessary)
    TRIGGERS=y              -- this value is the default (it is not necessary)
    TTS_FULL_CHECK=TRUE
    You can see which parameters are not needed by running this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
    Joel Pérez
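    Putting those suggestions together, the trimmed export parfile might end up looking like the lines below; the dump file names are the original poster's placeholders, and the self-referential PARFILE line is dropped because it only points back at the file itself.
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    INDEXES=y
    LOG=export.log
    ROWS=y
    TTS_FULL_CHECK=TRUE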

  • Export/import to clone database -- please advise

    Hi All,
    Need help, especially from experts who are strong in export/import.
    Can someone advise me how to use export/import to clone a database?
    I know how to clone database using cold backup and hot backup.
    But wish to know the full steps in using export/import to do cloning.
    I'm going to do a full export on my "TEST" database, drop the database, create a database and import using the full export.
    After doing a full database export (full=y), what are the steps to follow? I have a few doubts about it.
    1) export the database (full=y)
    2) drop the database
    3) re-create the database, will be using the same name "TEST".
    4) create the users (IDs created by the DBA, e.g. schema owners), tablespaces and datafiles --> import doesn't do this, right?
    5) do a full import (full=y). In this step, will I encounter any problems? As I understand it, all the Oracle-owned objects (for example SYSTEM, SYS, AUX) are already in the newly created database, so will import full=y cause any problems?
    6) do I have to re-create the roles and synonyms and grant role, SYS and object privileges to IDs not created by Oracle (for example the schema owner)?
    In short, I would like to know what EXPORT will and will not export,
    and what IMPORT will and will not import.
    Please advise.
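    For reference, the export/import pair implied by steps 1 and 5 might look roughly like this from the command line; the connect strings and file name are placeholders, and the drop/re-create of the database, tablespaces and datafiles (steps 2-4) still has to happen between the two commands.
    exp system/password@TEST file=test_full.dmp log=test_full_exp.log full=y
    imp system/password@TEST file=test_full.dmp log=test_full_imp.log full=y ignore=y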

    Hi zetabouy,
    thanks for your input, definitely helpful for me.
    Just to confirm: am I right to say that only a full database-level (full=y) export/import will export/import roles as well as public synonyms? I ask because I have also tried exporting a user from the database and importing that user into a different database, but public synonyms and roles were not imported.
    One more question:
    For example, we are going to port data from the production to the UAT database using export and import for only the schema.
    After exporting from production with syntax such as (owner=OWNER01),
    is it better to drop all objects belonging to the schema owner in UAT before importing (touser=OWNER01, ignore=y) using the production dump file?
    I'm asking this question as I'm afraid that if we do not drop all the PL/SQL objects in the UAT database, the updated procedures/functions/triggers exported from the production database will not get imported into the UAT database, as the UAT database has the same procedure/function/trigger names (but not the updated code).
    Will profiles be exported and imported too during a full database-level exp/imp?
    Please kindly advise.
    thanks
    Message was edited by:
    chew

  • Business Area Export/Import  Conditions Error

    Hi,
    I'm working on exporting one business area from one schema X to another Y.
    Well, I'm exporting the business area from the X schema using the Administrator and following these steps:
    [Menu]
    File/Export...
    [Export wizard Dialog]
    Selected objects in the EUL / selecting Business Area definitions, folders, workbooks.
    I'm saving the resulting file as "BusinesArea.eex".
    So I'm importing the "BusinesArea.eex" file into the Y schema, using the command line:
         ..\Dis51adm /connect usr/[email protected] /import BusinesArea.eex
    All seems to work fine, but when I try to open one of the exported workbooks I get the following error:
    1) First a dialog box appears asking me to "Open the workbook in the current db account."
    2) When I accept the previous option, I get an error dialog box with the "Missing Item condition" text.
    Apparently the conditions are not exported to the file.
    Does anyone know how I could export all of the X schema's business area conditions in order to import them into the Y schema and avoid this error?
    Thanks in advance.

    Hi,
    Thanks for your answer Skunitha,
    Well, just to clarify my question a little bit more:
    I'm trying to open the workbooks using the Discoverer Desktop client (9.0.2.39.01), and the conditions were created in Desktop, not in the Administrator.
    Well, the process is: I run a script to delete the EUL in the Y schema, create a new EUL (in the Y schema), and import the BA exported from the X schema into the Y schema.
    All works fine, and when I open the recently imported business area in Y in the Administrator I can see all the folders and their items. But when I open Desktop to select one of the workbooks, the workbook list appears empty.
    I did what you said, Skunitha, and validated the folders, and next to the folders I get the following error:
    ORA-02019: connection description for remote database not found
    Does anybody know why I'm getting this?

  • Export & Import data in Oracle (Urgent)

    I just wonder whether Oracle 8i has the 'Export & Import data' feature in its DBA administration tools.
    Inside DBA Studio, I found an option to export/import data to a text file, but we must connect to Oracle Management Server (OMS) first before we can use that feature. I found the same feature available in Oracle 7.3.3 in Oracle Data Manager.
    How can I make sure that I have an OMS installed on my server? (I purchased Oracle 8i Standard Edition; does it include OMS?)
    Can we export from a table in database A to a table in database B, or can we only do this through a dump file?

    With every installation of an Oracle DB you get the exp(ort) and imp(ort) utilities. You can use them to move data from one user to another.
    Run them from the DOS prompt like:
    exp parfile=db_out.par
    imp parfile=db_in.par
    with db_out.par=
    file=db.dmp
    log= db_out.log
    userid=system/?????
    owner=???
    constraints=y
    direct=n
    buffer=0
    feedback=100
    and db_in.par=
    file=db.dmp
    log= db_in.log
    userid=system/???
    touser=??
    fromuser=???
    constraints=y
    commit=y
    feedback=100

  • Export Import single table...

    Gurus...
    I am working on a single table which needs to be exported from prod and imported into test.
    As I understand it, I need to follow the steps below:
    1. Test - export table abc dump as backup
    2. Prod - Export single table abc
    3. Test - Drop table abc cascade constraints
    4. Test - Import abc into test
    Export par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= expdp_abc.log
    content= all
    tables=user.abc
    exclude=statistics,object_grant, tablespace_quota
    flashback_time= systimestamp
    Import par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= impdp_abc.log
    content= all
    tables=user.abc
    table_exist_action=replace
    transform=segment_attributes:N
    My doubts:
    1. Is my process flow correct?
    2. Are the export and import par files correct, or are any parameters missing?
    3. What happens to all objects connected to the table, will they also be imported?
    4. Do I need to lock the user during this process?
    5. Is there any script to check whether all objects connected to the table exist in test after the import?

    Hi,
    Process for table export & import:
    1. Create a database directory in test as well as production:
    --> create or replace directory directory_name as 'physical_path';
    --> grant access on that directory to the user.
    2. Back up the table in the test environment (in case you need the old data in your test env):
    --> Create table BKP_table_name as select * from table_name; (the table you want to import)
    3. Take an export backup in the production database:
    --> expdp dumpfile=file_name.dmp logfile=file_name.log directory=directory_name tables=Owner.table_name
    4. On the test server do the following activity:
    a. Just check for dependencies on that table using DBA_DEPENDENCIES.
    b. Truncate the table you want to import.
    c. Import the table:
    impdp dumpfile=file_name.dmp tables=owner.table_name logfile=file_name_imp.log directory=directory_name table_exists_action=APPEND
    d. Check the dependencies on that table again using DBA_DEPENDENCIES.
    You can also use table_exists_action=replace.
    This will also import all dependent objects of the table.
    Regards,
    Edited by: XBOX on Dec 14, 2012 10:15 PM
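    For steps 4a and 4d above, a quick way to list what depends on the table is a dictionary query along these lines; the owner and table names are placeholders for the real ones.
    sql> select owner, name, type from dba_dependencies where referenced_owner = 'OWNER' and referenced_name = 'ABC';
    Running it in prod and again in test after the import, and comparing the two lists, also gives a rough answer to question 5 in the original post.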

  • Best way to Export/Import

    I am developing my application in a company environment. Whenever I posted in the forums, helpers asked for credentials on apex.oracle.com. Roel mentioned creating a SQL file with all your data definitions (tables, packages, etc.) and data, but he did not have a chance to show how and I could not figure it out.
    I found the export option:
    Home>Application Builder>Application> Export/Import. When I exported it to my local computer, I did not get any error message.
    However, I got an error message when I imported it:
    Expecting p_company or wwv_flow_company cookie to contain security group id of application owner.
    Error ERR-7621 Could not determine workspace for application (:) on application accept.
    I really need help with the best way of exporting and importing APEX applications, with a step-by-step procedure.
    Thanks in advance,
    Sam

    Dear Mr. Charlie McNeil,
    I would like to clarify concerning my post.
    We also have DEV, QA environments.
    We can export using Export:
    Home>Application Builder>Application Number>Export / Import
    We can import application using:
    Home>Application Builder>Import
    Within our own environment, there is no problem exporting/importing.
    Let me restate my post.
    I am looking for help to copy and paste an APEX application onto apex.oracle.com.
    I attempted it several times and got a couple of errors:
    Expecting p_company or wwv_flow_company cookie to contain security group id of application owner.
    Error ERR-7621 Could not determine workspace for application (:) on application accept.
    failed to parse SQL query:
    ORA-00942: table or view does not exist
    If you have any questions for me, don’t hesitate to be in touch.
    Thanks for your time,
    Sam

  • How to Export/Import in Windows / Oracle 10g Enterprise Manager

    Hi
    How do I export/import on the Windows operating system / Oracle 10g through Enterprise Manager?
    1) Which Oracle user has the privilege to log on to Enterprise Manager for export/import purposes?
    2) After logging on to the Enterprise Manager Maintenance menu, it asks for host credentials; should I give the OS administrator name & password or an Oracle user name & password?
    by
    balamuralikrishnan.s

    Hi
    1) Which Oracle user has the privilege to log on to Enterprise Manager for export/import purposes?
    Normally the owner of the schema has all the privileges to export/import data in his own schema.
    2) After logging on to the Enterprise Manager Maintenance menu, it asks for host credentials; should I give the OS administrator name & password or an Oracle user name & password?
    Give the Oracle user name & password (i.e., the owner of the schema and the password).

  • Export/Import: Discoverer Error

    Has anybody ever come across this error:
    Duplicate element definition: Function Argument with identifier 'P_FOREIGN_KEY_ID'.
    Do you know what it means or why it is caused or how to fix it?
    Thanks

    Did you ever get an answer to this?
    Reason is, that I would love to know the answer.
    My thinking is that when you move an EUL from one machine to another (ie: dev to prod), then you can export, import, etc., save all the reports to disk, bring them in, etc.
    But a big hole in my knowledge is when you set up an EUL for cubes (ie: OLAP), you also have to set up a catalog - not the EUL, but another storage vehicle, correct?
    Therefore, when trying the move it does indeed sound as if there are more steps involved, as you also need to bring over the catalog.
    So, did you ever get an answer to this question?
    Thx.
    Russ
    Message was edited by:
    rproudman

  • GUI Export/Import Dump feature?

    Will SQL Developer ever have a GUI Export/Import wizard? Right now we use the exp/imp DOS commands to achieve this. It would be nice to have a GUI guide you through the process in SQL Developer.
    Thanks
    Alex

    Sorry I could not respond earlier. This week has been absolutely CRAZY 4 me.
    What I'd like to see (in addition to what Tony just mentioned above) is a Wizard kind of interface that lets me export/import an entire database/schema/table etc. I should be able to select the from and to character set. I should have all the options available in the current command line exp/imp utility (but in a good GUI interface). I'm just pasting all the command line options currently available with exp/imp below. Some of these options can be permanent settings which can be specified somewhere in the preferences. The things that change with each export or import like userid etc.. can be options on the wizard.
    In addition to that, if you included SQL*Loader control file support as well, that would make it a great tool indeed!
    exp
    Keyword Description (Default)
    USERID username/password
    FULL export entire file (N)
    BUFFER size of data buffer
    OWNER list of owner usernames
    FILE output files (EXPDAT.DMP)
    TABLES list of table names
    COMPRESS import into one extent (Y)
    RECORDLENGTH length of IO record
    GRANTS export grants (Y)
    INCTYPE incremental export type
    INDEXES export indexes (Y)
    RECORD track incr. export (Y)
    DIRECT direct path (N)
    TRIGGERS export triggers (Y)
    LOG log file of screen output
    STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y)
    PARFILE parameter filename
    CONSISTENT cross-table consistency(N)
    CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    imp
    Keyword Description (Default)
    USERID username/password
    BUFFER size of data buffer
    FILE input files (EXPDAT.DMP)
    SHOW just list file contents (N)
    IGNORE ignore create errors (N)
    GRANTS import grants (Y)
    INDEXES import indexes (Y)
    ROWS import data rows (Y)
    LOG log file of screen output
    FULL import entire file (N)
    FROMUSER list of owner usernames
    TOUSER list of usernames
    TABLES list of table names
    RECORDLENGTH length of IO record
    INCTYPE incremental import type
    COMMIT commit array insert (N)
    PARFILE parameter filename
    CONSTRAINTS import constraints (Y)
    DESTROY overwrite tablespace data file (N)
    INDEXFILE write table/index info to specified file
    SKIP_UNUSABLE_INDEXES skip maintenance of unusable indexes (N)
    FEEDBACK display progress every x rows(0)
    TOID_NOVALIDATE skip validation of specified type ids
    FILESIZE maximum size of each dump file
    STATISTICS import precomputed statistics (always)
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    COMPILE compile procedures, packages, and functions (Y)
    STREAMS_CONFIGURATION import streams general metadata (Y)
    STREAMS_INSTANTIATION import streams instantiation metadata (N)
    The following keywords only apply to transportable tablespaces
    TRANSPORT_TABLESPACE import transportable tablespace metadata (N)
    TABLESPACES tablespaces to be transported into database
    DATAFILES datafiles to be transported into database
    TTS_OWNERS users that own data in the transportable tablespace set
    Thanks for the quick response,
    Regards
    Alex

  • 10g export/import utility

    Hi,
    I imported a couple of tables using the export and import utilities. Here is a breakdown of the process.
    -Exported SH schema as SH.
    -Created a new user 'HT' as sys, granted it resource and connect roles.
    -Ran import utility as HT and imported only 2 tables (sales & products) from the schema export .dmp file.
    -The import runs successfully, saying the tables have been imported.
    -I run a select against user_tables as HT to find the imported tables, which returns no rows selected.
    Why is this?

    Hello,
    How did you export the tables?
    If you are cloning the entire schema and using the conventional export/import utilities, then do the following:
    exp system/manager owner=userA file=userA.dmp log=userAexport.log
    imp system/manager fromuser=userA touser=userB file=userA.dmp log=userBimport.log
    And if you are importing just a couple of tables, then do the following:
    exp username/password tables=table1,table2 file=tables.dmp log=tableexport.log
    imp username/password tables=table1,table2 file=tables.dmp log=tableimport.log rows=Y ignore=Y
    # With tables (make sure the table doesn't already exist in the target schema):
    imp username/password tables=table1,table2 file=tables.dmp log=tableimport.log
    You can also use Data Pump, but it has different options and I recommend you refer to the Oracle documentation for using Data Pump.
    Regards
    Edited by: OrionNet on May 25, 2009 2:27 PM
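    In the original poster's case (two SH tables going into HT), the import would also need the fromuser/touser pair to redirect the tables, roughly like this; the dump file name and the SYSTEM password are placeholders.
    imp system/password fromuser=SH touser=HT tables=SALES,PRODUCTS file=sh.dmp log=sh_imp.log ignore=Y
    Without fromuser/touser the tables stay associated with the schema recorded in the dump file, which could explain why user_tables under HT came back empty.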
