GUI Export/Import Dump feature?

Will SQL Developer ever have a GUI Export/Import wizard? Right now we use the exp/imp commands from the DOS prompt to achieve this. It would be nice to have a GUI guide you through the process in SQL Developer.
Thanks
Alex

Sorry I could not respond earlier. This week has been absolutely crazy for me.
What I'd like to see (in addition to what Tony just mentioned above) is a wizard-style interface that lets me export/import an entire database, schema, table, etc. I should be able to select the from and to character sets, and I should have all the options available in the current command-line exp/imp utilities (but behind a good GUI). I'm pasting all the command-line options currently available with exp/imp below, with a short example invocation after each list. Some of these options could be permanent settings specified somewhere in the preferences; the things that change with each export or import, like the userid, can be options on the wizard.
On top of that, if you added SQL*Loader control file support as well, that would make it a great tool indeed!
exp
Keyword Description (Default)
USERID username/password
FULL export entire file (N)
BUFFER size of data buffer
OWNER list of owner usernames
FILE output files (EXPDAT.DMP)
TABLES list of table names
COMPRESS import into one extent (Y)
RECORDLENGTH length of IO record
GRANTS export grants (Y)
INCTYPE incremental export type
INDEXES export indexes (Y)
RECORD track incr. export (Y)
DIRECT direct path (N)
TRIGGERS export triggers (Y)
LOG log file of screen output
STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y)
PARFILE parameter filename
CONSISTENT cross-table consistency(N)
CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
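To illustrate how a few of these keywords combine, here is a minimal sketch of a table-level export driven by a parameter file (the usernames, passwords, file names, and table names are placeholders, not values from this thread):
exp parfile=exp_emp.par
where exp_emp.par contains:
userid=scott/tiger
tables=(emp,dept)
file=emp_dept.dmp
log=exp_emp_dept.log
rows=y
grants=y
indexes=y
feedback=1000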
imp
Keyword Description (Default)
USERID username/password
BUFFER size of data buffer
FILE input files (EXPDAT.DMP)
SHOW just list file contents (N)
IGNORE ignore create errors (N)
GRANTS import grants (Y)
INDEXES import indexes (Y)
ROWS import data rows (Y)
LOG log file of screen output
FULL import entire file (N)
FROMUSER list of owner usernames
TOUSER list of usernames
TABLES list of table names
RECORDLENGTH length of IO record
INCTYPE incremental import type
COMMIT commit array insert (N)
PARFILE parameter filename
CONSTRAINTS import constraints (Y)
DESTROY overwrite tablespace data file (N)
INDEXFILE write table/index info to specified file
SKIP_UNUSABLE_INDEXES skip maintenance of unusable indexes (N)
FEEDBACK display progress every x rows(0)
TOID_NOVALIDATE skip validation of specified type ids
FILESIZE maximum size of each dump file
STATISTICS import precomputed statistics (always)
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
COMPILE compile procedures, packages, and functions (Y)
STREAMS_CONFIGURATION import streams general metadata (Y)
STREAMS_INSTANTIATION import streams instantiation metadata (N)
The following keywords only apply to transportable tablespaces
TRANSPORT_TABLESPACE import transportable tablespace metadata (N)
TABLESPACES tablespaces to be transported into database
DATAFILES datafiles to be transported into database
TTS_OWNERS users that own data in the transportable tablespace set
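And a matching import sketch, again with placeholder names, that brings the same tables into a different schema:
imp parfile=imp_emp.par
where imp_emp.par contains:
userid=system/manager
file=emp_dept.dmp
log=imp_emp_dept.log
fromuser=scott
touser=scott_test
tables=(emp,dept)
ignore=y
commit=y
feedback=1000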
Thanks for the quick response,
Regards
Alex

Similar Messages

  • Export & Import data in Oracle (Urgent)

    I just wonder whether Oracle 8i has the 'Export & Import data' feature in its DBA administration tools.
    Inside DBA Studio, I found an option to export/import data to a text file, but we must connect to Oracle Management Server (OMS) before we can use that feature. I found the same feature available in Oracle 7.3.3 in Oracle Data Manager.
    How can I make sure that I have OMS installed on my server? (I purchased Oracle 8i Standard Edition; does it include OMS?)
    Can we export from a table in database A to a table in database B, or can we only do this through a dump file?

    With every installation of an Oracle DB you get the exp(ort) and imp(ort) utilities. You can use them to move data from one user to another.
    Run them from the DOS prompt like:
    exp parfile=db_out.par
    imp parfile=db_in.par
    with db_out.par=
    file=db.dmp
    log= db_out.log
    userid=system/?????
    owner=???
    constraints=y
    direct=n
    buffer=0
    feedback=100
    and db_in.par=
    file=db.dmp
    log= db_in.log
    userid=system/???
    touser=??
    fromuser=???
    constraints=y
    commit=y
    feedback=100

  • Scripting oracle export and import dumps through PLSQL stored procedures

    Hello,
    I would like to know if it is possible to script Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
    I would really appreciate it if someone could provide me with examples.
    Or is there an off-the-shelf solution for what I am trying to achieve?
    Many Thanks.

    Hello,
    there are many ways to do this:
    - Java with PL/SQL wrapper,
    - call C code as external procedure from PL/SQL,
    - DBMS_SCHEDULER has some features related to this as well (see the sketch after this reply),
    - or write your own logic: for example creating a new file with UTL_FILE could be the trigger of the export.
    Franky
    Edited by: Franky on Aug 10, 2009 4:25 AM - extended
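    As a rough sketch of the DBMS_SCHEDULER route Franky mentions (the job name and script path are hypothetical, and the database must be configured to run external jobs):
    BEGIN
      -- Hypothetical job that calls an OS shell script wrapping exp/expdp.
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'NIGHTLY_SCHEMA_EXPORT',
        job_type        => 'EXECUTABLE',
        job_action      => '/u01/app/oracle/scripts/run_export.sh',  -- placeholder path
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE,
        comments        => 'Shell wrapper around the export utility');
    END;
    /
    Copying the resulting dump file to another location would typically be handled at the OS level inside the same wrapper script.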

  • Scripting Oracle export and import dumps

    Hello,
    I would like to know if it is possible to script Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
    I would really appreciate it if someone could provide me with examples.
    Thanks.

    Yes, there's DBMS_JOB on 9i, it's sort of DBMS_SCHEDULER's 'grampa' ;)
    Check the 9i docs (http://tahiti.oracle.com) and do some searches here and on http://asktom.oracle.com for more info.
    However, in order to execute OS commands, you'll probably need a Java wrapper.
    DBMS_JOB cannot do that.
    Examples can be found here:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:952229840241
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:16212348050
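    A minimal sketch of that DBMS_JOB approach on 9i, assuming a hypothetical PL/SQL procedure run_export that wraps a Java stored procedure which shells out to exp (see the AskTom links above for the Java part):
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT(
        job       => l_job,
        what      => 'run_export;',
        next_date => SYSDATE,
        interval  => 'TRUNC(SYSDATE) + 1 + 2/24');  -- every day at 02:00
      COMMIT;
    END;
    /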

  • Access Connection export/import feature

    Hi all,
    Is it possible to export/import WiFi profile Pre-Shared key ?
    Thank you
    Mat

    You mean besides just exporting and importing the wireless profile that uses that key?!?
    You can also create a deployment file that will contain the key.
    Just remember that profile exports/deployments are operating system and version dependent, i.e. don't try to import a profile in v5.31 for Vista from a v4.42 version for XP.

  • Export / Import ---  Encrypt the data

    Hi,
    Is it possible to encrypt some data in a table during export/import in an Oracle 9i database (not in 10g)?
    Best regards
    RajaBaskar

    "The main reason why people assume there are undocumented features is because they can't be bothered to adhere to the Forums Etiquette and read the documentation prior to asking their doc question."
    I do not see a reason to be so rude about this. If every person read all the documentation, used all the features of Support services, or were even aware of them, then we would not need these forums at all.
    I mean no offence, and I agree with you in a lot of other cases where people just come up with unnecessary posts, but this one I see as a legitimate issue which a lot of DBAs run into in their normal work life when they have to send an export dump to a partner outside their network and the export contains sensitive data.
    Thanks,
    Ankit.

  • Issue with Oracle Export/Import

    Hello Oracle,
    I am using the Oracle Import utility to import data, and I am controlling this utility through a C#.NET application.
    But I want to show the estimated time required to complete the export/import process in advance, just like the Windows feature when we are copying files from one place to another.
    How can we get a time estimate in advance of importing the dump file, or after analysis of the dump file?
    Does Oracle provide any information regarding the estimated time required to import the dump into the database?
    Please guide me.
    Thanks.
    Regards
    Tarun

    "But I want to show the time estimation required to complete the Export/Import process in advance just like the windows feature when we are copying files from one place to another."
    So you want to show the user that there are another 10 minutes left to complete the export/import, and just when he/she thinks it is nearly complete, the time remaining is reported as some 20+ digit number? ;)
    The export and import utilities do not have a method to report the time remaining to complete that I have seen.

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version on both environments is 10.1.2 and the OS is Solaris, on Oracle databases.
    I am unfortunately not experienced with Discoverer and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed this was a memory problem or slow network issues causing the delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server to try to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev 11 system, but I dismissed this as the dev reports were not working and the only reliable EUL is the Prod one. I managed to get a Prod .eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a db dump of EUL_US and importing this into my R12 dev environment. I managed this but had to export the db file using SYS, which alleviated a privilege problem when logging in. I have reports that run and my user can see the reports, but reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the SYSADMIN ones.
    I refreshed the BAs using a shell script I made up which uses the java command with parameters.
    After some re-reading I tried selecting all the options in the Validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum is which would be the best approach to migrate the entire eul from a R11 instance to a R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges, and also some of the imported tables appeared in the target environment under the APPS schema (21 tables) even though the EUL_US export had contained 48 tables.
    I also had a problem on the db with "eul_us has insufficient privileges on table space discoverer" type errors.
    I checked the eul_us db privileges and was unable to resolve this initial privilege error even though the privileges were granted to eul_us.
    The dba managed to exp as system and then import it with the full=y flag in the import command which seems to bring in the privileges.
    Then I ran the eul5_id.sql and then made up a list of the business areas and made a sh script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in select business area and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. If I opened Desktop, I noticed some sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals: in the Discoverer Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about missing items. I assume this is because the item identifiers brought across in the db dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Flex Builder 3 - Export Release Build Feature

    I just bought the Flex Builder 3 professional version and imported my project from Flex Builder 2 to 3. Everything compiled OK and I see all the .html wrappers and .swf files (debug and release versions) in the bin folder. However, when I run the "Export Release Build" feature, only index.swf (my main application file), along with its wrapper index.html and my images directory, gets copied to the "bin-release" folder. I am not sure why the remaining .swf files corresponding to the .mxml files are not copied. Do I need to copy these manually?
    Appreciate any help with this.
    Thanks in advance.

    The problem is that not all the images appear in the "Select the output files to include in the exported AIR or AIRI file" section of the Export Release Build. I see only 30 files, with the rest of the files missing.
    As a workaround, what I did is manually copy all the resources/images from the bin-debug to the bin-release directory and then make the export release build. In this AIR package, after installation, I now see all the images.
    So the original problem of Flex Builder not being able to include all the images in the release build still persists.
    thanks,
    Sunil

  • Import dump with greek characters

    Hi
    The client provided us a dump file created with the expdp command. We have imported it with the impdp command. However, the data contains Greek characters. The database character set is the same in both the source and target databases (UTF8).
    Currently it seems the Greek characters were not imported properly, as the import gives an error when creating an index on the columns holding such data; it seems the data is not being interpreted properly.
    Is it necessary to set the NLS language parameter at export time, or can it work if specified only at import time?
    Please guide us on this.
    Thanks in advance

    Hi,
    Are you sure they are '????' - a lot of the time it's just because the client isn't displaying them properly - check some of the things I mention here:
    Oracle Database Blog 2.0: Working with non-english characters in linux and oracle
    Also - are you sure the index is 'ok' in the original database - there hasn't been something funny done with novalidate or anything like that?
    Can you do a count(*) in the original and a count(distinct) and see if they are the same? (A quick check along these lines is sketched after this reply.)
    Cheers,
    Rich
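    As a sanity check along the lines Rich suggests, something like the following can be run in both the source and target databases; the table and column names are hypothetical placeholders for the indexed column:
    SELECT COUNT(*), COUNT(DISTINCT ename)
    FROM   scott.emp;
    -- If the distinct count differs between the two databases, the Greek
    -- characters have probably been mangled (e.g. collapsed to '?') on the way in.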

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utility so that we can keep the current 8i database intact. Also, the 8i and 9i databases will reside on the same machine. Our 8i database size is around 300GB.
    We plan to follow the steps below:
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
    Please let me know if the par file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

    I recommend you to change some parameters and remove others:
    BUFFER=560000
    COMPRESS=y -- this will give a better storage structure (it is good)
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n -- if you set this parameter to yes you can have problems with some objects
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y -- this value is the default (it is not necessary)
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y -- (start the database in restricted mode and do not set this parameter)
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE -- this value is the default (it is not necessary)
    TRIGGERS=y -- this value is the default (it is not necessary)
    TTS_FULL_CHECK=TRUE
    You can see which parameters are not needed if you run this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
    Joel Pérez

  • Is time to export-import location dependent

    I have a doubt about whether the export/import process depends on the location from where it is performed. I will try to explain the scenario. An export/import has to be done on a server in Africa, as a migration to a new server is required. My question is: does it make a difference to the time the export/import takes whether the SecureCRT session is logged on from here in Africa itself, or whether the process is run from a far-off location, say India, since some of our DBA team is in India?
    So, the doubt is: does the export/import process depend on the location from where it is performed?
    I hope my question is clear.
    Please help in resolving this doubt.
    regards

    You need to clarify from an Oracle perspective what the client machine is in this scenario (i.e. the machine that is connecting to the Oracle database). What machine is the export utility running on and what machine is the export utility writing to?
    If the export utility is running on the same machine and writing to a dump file on the same machine regardless of where the DBA is sitting, then where the DBA is sitting is irrelevant. If the export utility runs on a different machine or writes to a dump file on a different machine depending on where the DBA is sitting, then where the DBA is sitting is relevant.
    Justin

  • Export/import in 904 versus 10.1.2

    We have two 10G AS 904 installs (dev and prod - same O/S too). From time to time we move our production portal content to the dev server. We sometimes get corruption and have to use the Schema validation Utility scripts to fix the import to the dev server. Not a show stopper but this adds another manual step to the process.
    I see the 10.1.2 version has improved import/export checks. Can anyone give feedback on the improvements in the export/import process?
    Thanks
    Kevin

    Kevin,
    Careful with this approach. Passing from DEV to PROD is OK, but to then go back to DEV again, what I suggest is that you have a clean start on DEV, i.e. clean the DEV machine and later do the same to PROD. Portal Export/Import works only one way, not both ways (this basically has to do with the checks we make and with possible conflicts between the objects on both sides). Also check the available documentation, which is all compiled in Metalink Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues.
    As to the improvement of the process, please check the New Features papers (search for "export" - it is easier to pick out the references):-
    10.1.2 -- http://www.oracle.com/technology/products/ias/pdf/1012_nf_paper.pdf
    10.1.4 -- http://www.oracle.com/technology/products/ias/portal/pdf/portal_1014_new_features.pdf
    I hope it helps...
    Cheers,
    Pedro.

  • Extents in Export/Import files

    I work on Oracle 8.1.7 on HP-UX. I need to create a schema in QA with exactly the same tables as in Prod (the data in QA is a subset of Prod).
    Tables in prod have very big initial extents.
    I created a dump file by exporting the prod env.
    I imported the dump file (from Prod) with the INDEXFILE option and got the SQL file. However, each 'create table' statement has the initial extent size included in it. I would have to manually change the statements for 600 tables so that the initial extent is smaller for the QA environment. I would like to use the default initial extent size for each table.
    I tried using the COMPRESS=Y and N options and could not spot any difference.
    Is there a way to completely avoid the extent or storage parameters in the export/import process?
    Appreciate any help on this
    Thanks
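    For reference, the kind of INDEXFILE run described above looks roughly like this (the connect string and file names are placeholders):
    imp userid=system/manager file=prod_full.dmp full=y indexfile=create_ddl.sql
    The generated create_ddl.sql then contains the CREATE INDEX statements plus the CREATE TABLE statements (the latter commented out), including the STORAGE clauses that would otherwise have to be edited by hand.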

    Hi Tushar,
    1. How do I pass this internal table to the function module?
       I assume you are creating your own Y/Z FM. Pass it through the TABLES parameter.
    2. When I am creating the function module in SE37, where do I define this internal table type?
       Define this in the TABLES interface. What type? The same type that was defined when passing it in the user-exit function module; if you look at the FM of the user-exit, you will see it.
    3. Where do I define the error structure type (which is returned by the function module to the main program)? Is it in the Export or Table parameter during function module creation?
       Define it in the TABLES interface (not in export/import), since what you are going to return is an internal table. You can use, for example, BDCMSGCOLL, or you can create your own Y/Z structure for the same purpose (or use the structure type T100).
    I hope it helps.
    Regards,
    Amit M.

  • Export Import single table...

    Gurus...
    I am working on a single table which needs to be exported from prod and imported into test.
    As I understand it, I need to follow the steps below:
    1. Test - export table abc dump as backup
    2. Prod - Export single table abc
    3. Test - Drop table abc cascade constraints
    4. Test - Import abc into test
    Export par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= expdp_abc.log
    content= all
    tables=user.abc
    exclude=statistics,object_grant, tablespace_quota
    flashback_time= systimestamp
    Import par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= impdp_abc.log
    content= all
    tables=user.abc
    table_exist_action=replace
    transform=segment_attributes:N
    My doubts:
    1. Is my process flow correct?
    2. Are the export and import par files correct, or are any parameters missing?
    3. What happens to all the objects connected to the table? Will they also be imported?
    4. Do I need to lock the user during this process?
    5. Is there any script to check whether all objects connected to the table exist in test after the import?

    Hi,
    Process for table export & import:
    1. Create a database directory in test as well as production:
    --> create or replace directory directory_name as 'physical_path';
    --> grant access on that directory to the user.
    2. Back up the table in the test environment (in case you need the old data in your test env):
    --> Create table BKP_table_name as select * from table_name; (the table you want to import)
    3. Take an export backup in the production database:
    --> expdp dumpfile=file_name.dmp logfile=file_name.log directory=directory_name tables=Owner.table_name
    4. On the test server do the following activity:
    a. Check the dependencies on that table using DBA_DEPENDENCIES (see the query sketched below).
    b. Truncate the table you want to import.
    c. Import the table:
    impdp dumpfile=file_name.dmp tables=owner.table_name logfile=file_name_imp.log directory=directory_name table_exists_action=APPEND
    d. Check the dependencies on that table again using DBA_DEPENDENCIES.
    You can also use table_exists_action=replace; this will also import all dependent objects of the table.
    Regards,
    Edited by: XBOX on Dec 14, 2012 10:15 PM
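    A minimal sketch of the DBA_DEPENDENCIES check from steps (a) and (d), using the placeholder owner/table names from this thread (replace OWNER and ABC with the real names):
    SELECT owner, name, type
    FROM   dba_dependencies
    WHERE  referenced_owner = 'OWNER'
      AND  referenced_name  = 'ABC'
      AND  referenced_type  = 'TABLE';
    -- Run it before the truncate/import and again afterwards, then compare the two
    -- result sets to confirm that nothing referencing the table has gone missing.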
