Shell Script For Export And Import Of Table Records

Hello,
We have production and test instances, and for ongoing testing we need to copy data from production to the test or development environment.
At the moment we export and import the table records manually. At times this can be very tedious, as we may need
to repeat the exercise a couple of times a day.
Is it a good idea to do this with a shell script? If so, how could I do it? If it is not a good idea, what are the best alternatives?
Any input is highly appreciated.
Thanks

Ah I see, your company prefers stupidity over efficiency. It would be possible to do it in a controlled environment, wouldn't it? The test database could also be restricted to SELECT only.
So disallowing it is just plain stupid.
To the second question: do you use hard-coded passwords in shell scripts?
Don't you think that poses a security risk?
Don't you think that is a bigger risk than a database link, properly set up?
In my book it is!
Sybrand Bakker
Senior Oracle DBA
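
For illustration, the database-link approach argued for above might look like the following sketch; the link name, read-only account, and TNS alias are assumptions:

CREATE DATABASE LINK prod_ro CONNECT TO readonly_user IDENTIFIED BY secret USING 'PROD';
-- refresh a test table from production over the link
INSERT INTO my_table SELECT * FROM my_table@prod_ro;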

Similar Messages

  • Shell script for export and import

    Hi,
    I want to run the exp command in the background, since I need to export a 40 GB schema of another database;
    if I don't use &, my session will die.
    Any input is appreciated.
    I need to run this line from a shell script:
    /oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
    bash-2.05$ more export.sh
    #!/bin/sh
    /oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
    bash-2.05$ sh export.sh &
    [10] 13352
    bash-2.05$
    Export: Release 8.1.7.0.0 - Production on Fri Dec 4 22:51:09 2009
    (c) Copyright 2000 Oracle Corporation. All rights reserved.
    Password: pin
    bash: pin: command not found
    [10]+ Stopped sh export.sh
    input appreciated
    thanks
    Prakash

    Hi,
    /oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
    should be
    /oracle/bin/exp pin/pin@voipdbstg file=voip.dmp owner=pin log=voip.log
    Also, to run the script in the background and still be able to log out of the shell, run the command with nohup:
    $ nohup export.sh &
    Regards
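
    Putting the two fixes together, a minimal sketch (credentials and TNS alias are taken from the post; hard-coding the password in a script is a security risk, so a protected parameter file is preferable):

    #!/bin/sh
    # export.sh - corrected connect string, so no interactive password prompt
    /oracle/bin/exp pin/pin@voipdbstg file=voip.dmp owner=pin log=voip.log

    Run it so it survives logout and keeps its output:
    $ nohup ./export.sh > export.out 2>&1 &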

  • Export and Import of table

    Dears,
    I need to export two tables from PRD and import them into QAS.
    The tables are 1 GB and 3 GB in size.
    Please confirm:
    1. Do I need any downtime for the export and import of the tables?
    2. How much time will the export process take?
    Please suggest.
    Shivam

    I cannot tell you how much time it will take. Are you still using Pentium 3 processors? Are you on 4 quad-core CPUs? Are you on IDE disks? SATA disks? On a fast and big, powerful SAN?
    That being said, I would say roughly 10 minutes, but as you understand, this is just a wild guess.
    As for downtime, you did not tell us the table names. That being said, try to export during a low-usage period.
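
    For reference, a minimal sketch of the two commands (credentials, owner, and table names are placeholders; add consistent=y to the export if the two tables must be read-consistent with each other):

    exp system/manager@PRD file=two_tables.dmp tables=owner.tab1,owner.tab2 log=exp.log
    imp system/manager@QAS file=two_tables.dmp fromuser=owner touser=owner ignore=y log=imp.log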

  • How to export and import dependent tables from 2 different schema

    I have a setup where schema1 has table1 and schema2 has table2, and table1 in schema1 depends on table2 in schema2.
    I would like to export and import only these tables, not any other tables from these 2 schemas, with all related information such as grants and constraints.
    Also, will the same method work for Oracle 10g R1, R2, and Oracle 11g?
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
    Looking at this, for table mode it says:
    "Also, as in schema exports, cross-schema references are not exported"
    Not sure what this means.
    As I am interested in only 2 tables, I think I need to use table mode. But if I try to run the export with both table names, it says table mode supports only one schema at a time. I am not sure how the constraints would get exported in that case.
    -Rohit

    It worked the first time I tried it:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
    Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    Current user changed to DBADMIN
    . . exporting table                          TEMP1         10 rows exported
    EXP-00091: Exporting questionable statistics.
    Current user changed to SCOTT
    . . exporting table                            EMP         14 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
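
    For Data Pump the same idea is sketched below; as far as I recall, expdp table mode in 10g required all tables to be in a single schema, while 11g allows tables from multiple schemas for a suitably privileged user:

    expdp system/manager tables=dbadmin.temp1,scott.emp directory=DATA_PUMP_DIR dumpfile=two_tables.dmp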

  • Shell script for export backup in oracle 11g

    Hi,
    Oracle version: 11.2.0
    OS: AIX
    How do I write a shell script for a full export backup in Oracle 11g? I also need to remove backups older than 2 days.
    Regards,
    Raju

    How to write shell script for export full backup in oracle 11g
    Do you mean that export is your backup strategy? Is your database running in NOARCHIVELOG mode? If so, why? If not, why not RMAN?
    need to remove 2 days of old backup.
    If that means removing files older than 2 days, you can use something like this:
    $ find <absolute directory path> -mtime +2 -exec rm {} \;
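
    A minimal sketch combining the two (paths, credentials, and the dump/log naming are assumptions):

    #!/bin/sh
    # full_exp.sh - nightly full export plus 2-day cleanup
    BACKUP_DIR=/backup/exports
    STAMP=`date +%Y%m%d`
    exp system/manager full=y file=$BACKUP_DIR/full_$STAMP.dmp log=$BACKUP_DIR/full_$STAMP.log
    find $BACKUP_DIR \( -name '*.dmp' -o -name '*.log' \) -mtime +2 -exec rm {} \;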

  • Scripting Oracle export and import dumps

    Hello,
    I would like to know if it is possible to script Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
    I would really appreciate it if someone could provide me with examples.
    Thanks.

    Yes, there's DBMS_JOB on 9i; it's sort of DBMS_SCHEDULER's 'grampa' ;)
    Check the 9i docs (http://tahiti.oracle.com) and do some searches here and on http://asktom.oracle.com for more info.
    However, in order to execute OS commands, you'll probably need a Java wrapper;
    DBMS_JOB cannot do that.
    Examples can be found here:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:952229840241
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:16212348050
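
    On 10g and later there is also DBMS_DATAPUMP, which runs the export entirely inside PL/SQL. A minimal sketch, assuming the DATA_PUMP_DIR directory object exists and SCOTT is the schema to export:

    DECLARE
      h NUMBER;
    BEGIN
      -- create a schema-mode export job and attach dump and log files
      h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
      DBMS_DATAPUMP.add_file(h, 'scott.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h, 'scott.log', 'DATA_PUMP_DIR',
          filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
      DBMS_DATAPUMP.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.detach(h);
    END;
    /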

  • Scripting oracle export and import dumps through PLSQL stored procedures

    Hello,
    I would like to know if it is possible to script Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
    I would really appreciate it if someone could provide me with examples.
    Or is there an off-the-shelf solution for what I am trying to achieve?
    Many Thanks.

    Hello,
    there are many ways to do this:
    - Java with PL/SQL wrapper,
    - call C code as external procedure from PL/SQL,
    - DBMS_SCHEDULER has some features related to this as well,
    - or write your own logic: for example, creating a new file with UTL_FILE could be the trigger for the export.
    Franky
    Edited by: Franky on Aug 10, 2009 4:25 AM - extended
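
    As an illustration of the DBMS_SCHEDULER route, a minimal sketch of an external job (the script path is an assumption; on 11g an OS credential is normally attached as well):

    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name   => 'RUN_EXPORT',
        job_type   => 'EXECUTABLE',
        job_action => '/u01/app/oracle/scripts/export.sh',
        enabled    => TRUE);
    END;
    /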

  • Exporting and importing just table definitions

    Hi,
    I have this production database that has a huge amount of data in it. I was asked to set up a test database based on the exact same schema as the live database. When I tried an export (from live) and import (to test) with the parameters rows=n and compress=y, the datafiles in the target (test) database still grow enormously, presumably because of the huge number of extents already allocated to the tables in the live database (compress=y folds all allocated extents into one large initial extent). My test database, of course, has limited hard-disk space.
    Is there a way to export and import the table definitions without the target database experiencing a huge growth in tablespace size?
    Thanks,
    Chris.

    If an export with compress=n still creates initial extents that are too large, you can still build from the import file, but it will take a little work.
    Run imp with indexfile=somefile.sql.
    When imp is finished, edit somefile.sql:
    1. Remove all the REM statements.
    2. Remove all the storage clauses (tables and indexes).
    Make sure your tablespaces have a small (say 1K) default initial extent.
    Run imp again with rows=n.
    All your tables and indexes will be created with the tablespace's default initial extent.
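
    For what it's worth, on 10g and later Data Pump can do this in one step; a hedged sketch, assuming a Data Pump dump file and directory object:

    impdp system/manager directory=DATA_PUMP_DIR dumpfile=prod.dmp content=metadata_only transform=segment_attributes:n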

  • Shell Script  for Startup and Shutdown the database

    Hi,
    I want a shell script to start up and shut down the database on Solaris.
    Could anyone help me find such a script, or send it to me at [email protected]?
    Thanks & Regards,
    Gangi reddy

    SHUTDOWN [ABORT | IMMEDIATE | NORMAL | TRANSACTIONAL [LOCAL]]
    Shuts down a currently running Oracle instance, optionally closing and dismounting a database.
    Terms
    Refer to the following list for a description of each term or clause:
    ABORT
    Proceeds with the fastest possible shutdown of the database without waiting for calls to complete or users to disconnect.
    Uncommitted transactions are not rolled back. Client SQL statements currently being processed are terminated. All users currently connected to the database are implicitly disconnected and the next database startup will require instance recovery.
    You must use this option if a background process terminates abnormally.
    IMMEDIATE
    Does not wait for current calls to complete or users to disconnect from the database.
    Further connects are prohibited. The database is closed and dismounted. The instance is shutdown and no instance recovery is required on the next database startup.
    NORMAL
    NORMAL is the default option which waits for users to disconnect from the database.
    Further connects are prohibited. The database is closed and dismounted. The instance is shutdown and no instance recovery is required on the next database startup.
    TRANSACTIONAL [LOCAL]
    Performs a planned shutdown of an instance while allowing active transactions to complete first. It prevents clients from losing work without requiring all users to log off.
    No client can start a new transaction on this instance. Attempting to start a new transaction results in disconnection. After completion of all transactions, any client still connected to the instance is disconnected. Now the instance shuts down just as it would if a SHUTDOWN IMMEDIATE statement was submitted. The next startup of the database will not require any instance recovery procedures.
    The LOCAL mode specifies a transactional shutdown on the local instance only, so that it only waits on local transactions to complete, not all transactions. This is useful, for example, for scheduled outage maintenance.
    Usage
    SHUTDOWN with no arguments is equivalent to SHUTDOWN NORMAL.
    You must be connected to a database as SYSOPER, or SYSDBA. You cannot connect via a multi-threaded server. For more information about connecting to a database, see the CONNECT command earlier in this chapter.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a90842/ch13.htm#1013607
    Joel Pérez
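
    For the script itself, a minimal sketch (it assumes the oracle OS user with ORACLE_HOME and ORACLE_SID already set; note that Oracle also ships dbstart and dbshut in $ORACLE_HOME/bin for this purpose):

    #!/bin/sh
    # db_ctl.sh start|stop - start up or shut down the local instance
    case "$1" in
      start) CMD="startup" ;;
      stop)  CMD="shutdown immediate" ;;
      *)     echo "usage: $0 start|stop"; exit 1 ;;
    esac
    sqlplus -s / as sysdba <<EOF
    $CMD
    EOF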

  • Guide me for exporting and importing data from oracle 10g to oracle 9i

    I have been trying the export and import options in Oracle, but I am not very clear about the procedure.
    Also, when I log in as scott/tiger I am facing an error:
    ORA-01542: tablespace 'USER' is offline and cannot allocate space in it. How do I resolve it?

    Hello,
    I see two questions.
    ORA 1542 - tablespace 'USER' is offline and cannot allocate space in it. how to resolve it
    About the error ORA-01542: you have to bring the tablespace USERS online:
    http://www.error-code.org.uk/view.asp?e=ORACLE-ORA-01542
    If you cannot bring it online, you may have to recover it first.
    I have been trying export and import options in oracle but m not very clear about the procedure..
    About the export from 10g and import into 9i (9.2 or 9.0.1?): you have to export with the Export utility of the target database (the lowest database version here, 9.0.1 or 9.2), then import into the target database using its Import utility (9.0.1 or 9.2).
    The following Note of MOS may guide you:
    Compatibility Matrix for Export And Import Between Different Oracle Versions [Video] [ID 132904.1]
    NB: You cannot use Data Pump here.
    Hope this helps.
    Best regards,
    Jean-Valentin
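
    A sketch of both steps (TNS aliases and credentials are placeholders; per the note above, the exp/imp binaries must come from the 9i installation):

    SQL> ALTER TABLESPACE users ONLINE;
    $ exp system/manager@orcl10g file=scott.dmp owner=scott log=exp_scott.log
    $ imp system/manager@orcl9i file=scott.dmp fromuser=scott touser=scott log=imp_scott.log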

  • How row migration get eliminated by export and import the table.

    Hi ,
    Please let me know how row migration is eliminated by exporting and importing the table.
    Another method is deleting the migrated rows and re-inserting them from a copied table;
    I think the concept behind this method is that the rows get inserted into new blocks.
    Please correct me if I am wrong.
    Thanks,
    Kumar.

    Hi!
    You can also use ALTER TABLE MOVE command, or if you are on 9i you can use DBMS_REDEFINITION package.
    If you have a maintenance window, the easiest way would be to use ALTER TABLE MOVE.
    Regards,
    PP
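
    A sketch of the MOVE approach (schema and index names are placeholders; a move leaves the table's indexes UNUSABLE, so rebuild them afterwards):

    ALTER TABLE scott.emp MOVE;
    ALTER INDEX scott.pk_emp REBUILD;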

  • Export and import XMLType table

    Hi ,
    I want to export one table containing an XMLType column from Oracle 11.2.0.1.0 and import it into version 11.2.0.2.0.
    I got the following error when I tried the exp and imp utilities:
    EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
    Then I tried Data Pump export and import. The export works; the following is the log:
    Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ;;; Legacy Mode Active due to the following parameters:
    ;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
    ;;; Legacy Mode has set reuse_dumpfiles=true parameter.
    Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 13.23 GB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
    Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
    E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
    Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
    I got an error when I imported the dump using the following command:
    impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
    The error is:
    KUP-11007: conversion error loading table "CC_DBA"."XXXX"
    ORA-01403: no data found
    ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
    Please help me to get solution for this.

    CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
    The above index was created by us because we are storing files like:
    <?xml version="1.0" encoding="UTF-8"?>
    <xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
    <context id="Company_Current_ForPeriod"> ...
    I tried the Data Pump export with and without DATA_OPTIONS=XML_CLOBS. Both times the export succeeded, but the import hit the same KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL" error.
    I tried the import in different ways:
    1. Create the table first, then import data only (CONTENT=DATA_ONLY)
    2. Import the whole table
    In both cases the table DDL is created successfully, but the import fails on the data. The following is the log from the import:
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
    Starting "aaaaa"."SYS_IMPORT_TABLE_02":  aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
    bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
    ORA-01403: no data found
    ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-26062: Can not continue from previous errors.
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26

  • SHELL SCRIPT FOR EXPORT AUTOMATICALLY

    Hi all,
    I need a shell script that will automatically take a logical backup (export) on the Solaris machine. I have 5 databases and 10 schemas for which I want to take the logical backup.
    thanks
    vicky

    If you're using 10g, try to do it with DBMS_DATAPUMP and DBMS_SCHEDULER.
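
    If the classic exp utility is acceptable, a cron-driven sketch might look like this (the SID list, schema names, credentials, and paths are all assumptions):

    #!/bin/sh
    # nightly_exp.sh - logical backup of several databases
    STAMP=`date +%Y%m%d`
    for SID in db1 db2 db3 db4 db5; do
      ORACLE_SID=$SID; export ORACLE_SID
      exp system/manager file=/backup/${SID}_${STAMP}.dmp \
          owner=schema_a,schema_b log=/backup/${SID}_${STAMP}.log
    done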

  • Export and Import Of Tables having BLOB and Raw datatype

    Hi Gurus,
    I had to export one schema in one database and import it into another schema in another database.
    However, my current database contains RAW and BLOB datatypes. I have exported the whole database with the following command:
    exp SYSTEM/manager FULL=y FILE=jbrms_full_19APR2013.dmp log=jbrms_full_19APR2013.log GRANTS=y ROWS=y
    My question is whether all the tables with RAW and BLOB columns have been exported properly. I have done one more thing: after taking the export, I imported it into a local DB and checked that the row counts in both environments are the same. I have not tested with the application to confirm.
    I am using Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production.
    I am not able to attach the complete log file, but below is the part for the JBRMS schema, which has the BLOB and RAW datatypes.
    Please let me know if you see any potential concerns with the export of the BLOB and RAW data.
    . about to export JBRMS's tables via Conventional Path ...
    . . exporting table FS_FSENTRY 8 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table FS_WS_DEFAULT_FSENTRY 2 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PM_WS_DEFAULT_BINVAL 60 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PM_WS_DEFAULT_BUNDLE 751 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PM_WS_DEFAULT_NAMES 2 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table PM_WS_DEFAULT_REFS 4 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VERSIONING_FS_FSENTRY 1 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VERSIONING_PM_BINVAL 300 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VERSIONING_PM_BUNDLE 11654 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VERSIONING_PM_NAMES 2 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    . . exporting table VERSIONING_PM_REFS 1370 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.

    You could see the 'QUESTIONABLE STATISTICS' warning for a couple of reasons. I don't remember them all, but:
    1. The target and source character sets are different.
    2. System-generated names (I think?).
    The best solution, if you don't need the exact statistics that are on your source database, would be to add
    statistics=none
    to your imp command and then regather statistics when the imp command is done.
    Dean
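
    A sketch of that approach (credentials are placeholders):

    imp system/manager file=jbrms_full_19APR2013.dmp full=y statistics=none log=imp_nostats.log
    -- then, in SQL*Plus, regather the statistics:
    EXEC DBMS_STATS.GATHER_SCHEMA_STATS('JBRMS');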

  • Export and import a table data

    Hi All,
    There is a table abc that has approximately 40 lakh (4 million) records and no partitions (and we have a dmp of this data). Due to space constraints I want to compress the data and put it in an archive schema.
    My questions are:
    1) Can we import the data into another schema (say an archive schema) in compressed mode? If so, could you please let me know the syntax for that?
    2) Can we import the data into some temporary table (say temp) in the same schema?
    3) If a BLOB column exists in the table, can we import the data without the BLOB column data (supposing the export dump has the BLOB data)?
    Is that possible, or would we have to export without the BLOB column and import that into the other table?
    Please help me out.
    Edited by: user11942774 on 4 Dec, 2011 10:33 PM

    Thanks Mahir,
    Could you please let me know the syntax, with examples if possible?
    Is it possible to drop a column, say the BLOB column, after the table is in compressed mode?
    Edited by: user11942774 on 4 Dec, 2011 10:53 PM
    Edited by: user11942774 on 4 Dec, 2011 10:54 PM
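
    On question 1, a hedged sketch (schema, table, and file names are assumptions; basic table compression only applies to direct-path operations, so rows loaded by conventional-path imp get compressed here by the final MOVE):

    CREATE TABLE archive.abc COMPRESS AS SELECT * FROM owner.abc WHERE 1 = 0;
    imp system/manager file=abc.dmp fromuser=owner touser=archive ignore=y log=imp_abc.log
    ALTER TABLE archive.abc MOVE COMPRESS;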
