Any MaxL Script to Export and Import a Partition

Hi All,
Is there any MaxL or ESSCMD command that will do an export or an import of a partition?
Regards,
Krishna.

Hi Krishna,
There are two things here.
First, creation of the partition, which is possible through MaxL scripts. Once the partition is made, an XML file is created (which is what garycris was mentioning in his post).
Go to any application -> database -> partitions -> right-click, and you will see "Import partition"; the file is XML. When you look at this window, you will understand. I am not sure the same functionality exists in MaxL.
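For the creation side, here is a rough sketch of a MaxL script run through essmsh. All names, hosts, credentials, and area specs below are placeholders, and the exact "create partition" grammar should be checked against the MaxL reference for your Essbase version:

#!/bin/sh
# rough sketch: create a replicated partition between two sample cubes
# (names, host, credentials, and area specs are placeholders)
essmsh <<'EOF'
login admin identified by 'password' on localhost;
create or replace replicated partition sampeast.east
    area '@IDESCENDANTS("East")'
    to samppart.company at localhost
    as admin identified by 'password'
    area '@IDESCENDANTS("East")';
logout;
EOF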
Hope this info helps
Sandeep Reddy Enti
HCC

Similar Messages

  • Scripting Oracle export and import dumps

    Hello,
I would like to know if it is possible to script the Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
I would really appreciate it if someone could provide me with examples.
    Thanks.

Yes, there's DBMS_JOB on 9i; it's sort of DBMS_SCHEDULER's 'grampa' ;)
Check the 9i docs (http://tahiti.oracle.com) and do some searches here and on http://asktom.oracle.com for more info.
However, in order to execute OS commands, you'll probably need a Java wrapper;
DBMS_JOB cannot do that.
    Examples can be found here:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:952229840241
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:16212348050
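    Whichever wrapper you choose, the OS-side commands it ultimately runs could be as simple as this sketch (the credentials, paths, and hosts are placeholders):
    #!/bin/sh
    # sketch: export one schema, then copy the dump across the network
    exp scott/tiger@orcl owner=scott file=/u01/exports/scott.dmp log=/u01/exports/scott.log
    scp /u01/exports/scott.dmp backupuser@remotehost:/backups/scott.dmp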

  • Scripting oracle export and import dumps through PLSQL stored procedures

    Hello,
I would like to know if it is possible to script the Oracle export and import dump commands in a PL/SQL package rather than at the command prompt. Also, how can I copy the export dump files across the network to a specific location?
I would really appreciate it if someone could provide me with examples.
OR
Is there an off-the-shelf solution for what I am trying to achieve?
    Many Thanks.

    Hello,
    there are many ways to do this:
    - Java with PL/SQL wrapper,
    - call C code as external procedure from PL/SQL,
    - DBMS_SCHEDULER has some features related to this as well,
- or write your own logic: for example, creating a new file with UTL_FILE could be the trigger for the export (see the sketch below).
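    As a sketch of that last idea (all names and paths are placeholders), an OS-side watcher could poll for the flag file written via UTL_FILE and kick off the export:
    #!/bin/sh
    # sketch: wait for a flag file created from PL/SQL with UTL_FILE,
    # run the export, then clear the flag
    while true; do
        if [ -f /u01/flags/run_export.flg ]; then
            exp scott/tiger@orcl owner=scott file=/u01/exports/scott.dmp log=/u01/exports/scott.log
            rm -f /u01/flags/run_export.flg
        fi
        sleep 60
    done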
    Franky
    Edited by: Franky on Aug 10, 2009 4:25 AM - extended

  • Shell Script For Export And Import Of Table Records

    Hello,
    We have production and test instances and for constant testing we need to copy data from production to test or development environment.
At the moment what we do is manually export and import the table records. At times this can be very tedious, as we may need
to do this exercise a couple of times in a day.
Is it a good idea to do this using a shell script? If so, how could I do it? If it is not a good idea, what are the best alternatives?
    Any input is highly appreciated.
    Thanks

    Ah I see, your company prefers stupidity over efficiency. It would be possible to do it in a controlled environment, wouldn't it? Also the test database would be allowed to select only.
    So the non-allowance is just plain stupid.
    To the second question: do you use hard-coded passwords in shell scripts?
    Don't you think that poses a security risk?
    Don't you think that is a bigger risk than a database link, properly set up?
    In my book it is!
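    One common way to keep a hard-coded password out of the script itself is a parameter file readable only by the oracle account; a minimal sketch (all names are placeholders):
    #!/bin/sh
    # sketch: credentials live in a parfile with mode 600, not in the script
    exp parfile=/secure/exp_records.par
    # /secure/exp_records.par might contain:
    #   userid=scott/tiger@proddb
    #   tables=(emp,dept)
    #   file=/u01/exports/emp_dept.dmp
    #   log=/u01/exports/emp_dept.log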
    Sybrand Bakker
    Senior Oracle DBA

  • Shell script for export and import

    Hi,
I want to run the exp command in the background, since I need to export a 40 GB database from another database;
if I don't use &, my session will die.
Any input is appreciated.
I need to run this line from a shell script.
    /oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
    bash-2.05$ more export.sh
    #!/bin/sh
    /oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
    bash-2.05$ sh export.sh &
    [10] 13352
    bash-2.05$
    Export: Release 8.1.7.0.0 - Production on Fri Dec 4 22:51:09 2009
    (c) Copyright 2000 Oracle Corporation. All rights reserved.
    Password: pin
    bash: pin: command not found
    [10]+ Stopped sh export.sh
    input appreciated
    thanks
    Prakash

Hi,
/oracle/bin/exp pin@voipdbstg/pin file=voip.dmp owner=pin log=voip.log
should be
/oracle/bin/exp pin/pin@voipdbstg file=voip.dmp owner=pin log=voip.log
With the password missing from the connect string, exp prompted for it, and a background job is stopped the moment it needs terminal input; that is why your job showed as Stopped. Also, to run the script in the background and still be able to log out of the shell, run the command in nohup mode:
$ nohup export.sh &
Regards
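    Putting both corrections together (the paths and credentials are those from the post above):
    #!/bin/sh
    /oracle/bin/exp pin/pin@voipdbstg file=voip.dmp owner=pin log=voip.log
    and invoke it with
    $ nohup sh export.sh > export.out 2>&1 &
    Redirecting stdout/stderr keeps nohup from writing to nohup.out and makes failures easier to spot.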

  • Script to Export and Import Keywords and Metadata

I have a requirement to mass upload and download keywords and various metadata fields (i.e. File Name, Date Created, City, Country, Document, Title, etc.) between Adobe Bridge and an external database. Ideally it would be compatible with .txt, .csv, or a similar database format. Currently I have to accomplish this task one file at a time. I have an immediate need to upload 1000-plus pictures and various text fields to my website to share with others and to be able to sell my pictures online.
My immediate need is for exporting the metadata and keywords.
Does anyone know of a script that is available?
    Arnold

    John,
Thanks for your quick response. Below is the script I modified this morning. When looking in the keyword field, it returns the word "undefined" for all records.
I tried that yesterday and it didn't work. When looking at the Advanced file info in Adobe Bridge, I see the format below, which has two levels. All the others I used had only one. Since I am not a programmer, I need help on that part of the script.
    ==========================
    EXIF Properties (exif,http://ns.adobe.com/exif/1.0/)
    - exif:Flash
    - exif:Fired: True
    ====================================
Any luck determining how to obtain the file size?
Again, thank you so much for supporting me on my project,
    Arnold
    =====================
#target bridge
if (BridgeTalk.appName == "bridge") {
    // Let's create our menu
    var menu = MenuElement.create("command", "Export CSV File", "at the end of Tools");
    menu.onSelect = function (m) {
        try {
            // Let's ask what the name of the output file should be
            var f = File.saveDialog("Export file list to:", "Comma delimited file:*.CSV");
            // Write the column headings
            f.open("w");
            f.writeln("Seq Number,New File Name,New File Name Path,Original File Name,Org Dt & Tm,ISO,Exposure Time,F Stop,EV,Exposure Program,Meter Mode,Focal Length,Flash,Lens,Author,Author's Position,City,Country,Description,Title,Orientation,Width,Height,Rating,Label");
            // Let's get a list of all the visible thumbnails
            var items = app.document.visibleThumbnails;
            for (var i = 0; i < items.length; ++i) {
                var item = items[i];
                f.writeln(i + 1, ",\"", item.name, "\",\"", item.path.replace(/\"/g, "\"\""), "\",\"", ListMetadata(item), "\"");
            }
            f.close();
        } catch (e) { }
    }
    menu.onDisplay = function (m) {
        m.enabled = app.document.contentPaneMode == "filesystem" && app.document.visibleThumbnails.length > 0;
    }
}
function ListMetadata(tn) {
    var md = tn.metadata;
    md.namespace = "http://ns.adobe.com/photoshop/1.0/";
    var varAuthor = md.Author + "\",\"" + md.AuthorsPosition + "\",\"" + md.City + "\",\"" + md.Country + "\",\"";
    md.namespace = "http://ns.adobe.com/exif/1.0/aux/";
    var varLens = md.Lens + "\",\"";
    md.namespace = "http://purl.org/dc/elements/1.1/";
    var vartitle = md.title + "\",\"" + md.description + "\",\"";
    md.namespace = "http://ns.adobe.com/xap/1.0/mm/";
    var varPreservedFileName = md.PreservedFileName + "\",\"";
    md.namespace = "http://ns.adobe.com/exif/1.0/";
    var varDateTimeOriginal = md.DateTimeOriginal + "\",\"" + md.ISOSpeedRatings + "\",\"" + md.ExposureTime + "\",\"" + md.FNumber + "\",\"" + md.ExposureBiasValue + "\",\"" + md.ExposureProgram + "\",\"" + md.MeteringMode + "\",\"" + md.FocalLengthIn35mmFilm + "\",\"" + md.Flash + "\",\"";
    md.namespace = "http://ns.adobe.com/tiff/1.0/";
    var varOrientation = md.Orientation + "\",\"" + md.ImageWidth + "\",\"" + md.ImageLength + "\",\"";
    md.namespace = "http://ns.adobe.com/xap/1.0/";
    var varRating = md.Rating + "\",\"" + md.Label + "\",\"";
    md.namespace = "http://ns.adobe.com/photoshop/1.0/";
    var varKeywords = ListKeywords(md);
    return varPreservedFileName + varDateTimeOriginal + varLens + varAuthor + vartitle + varOrientation + varRating + varKeywords;
}
function ListKeywords(md) {
    var varKeywords = "";
    for (var i = 0; i < md.Keywords.length; ++i) {
        varKeywords = varKeywords + md.Keywords[i] + ", ";
    }
    // strip off the final comma and space
    varKeywords = varKeywords.substring(0, varKeywords.length - 2);
    return varKeywords;
}
    ==============================

  • Export and import the windows phone contacts as .vcf file to the application

Hi, I need to export the Windows Phone contacts as a .vcf file and import it into my Windows Phone application. I'm unable to import the full details of a contact. Is there any sample code to export and import phone contacts programmatically? Please help if there is any solution. Thanks.
    Nikitha

    Found this on the internet:
    http://www.lumisoft.ee/lswww/download/downloads/Net/
If you download Lumisoft.Net.zip and extract it, it looks like there is a vCard folder with some classes that may help you.
You will need to use something like that, or search a bit more and find a vCard C# helper library.
    Bret Bentzinger (MSFT) @awehellyeah

  • Please provide me unix shell script (export and import of database schema)

Please, I am new to Unix.
Please give me a sample Unix shell script (export and import of a database schema).

Please, I am new to Unix. Please give me a sample Unix shell script (export and import of a database schema).
Instead of providing you with a ready-made Unix shell script for your requirement, I will give you hints so you can prepare it on your own.
    Create a file with .sh extension.
    # Specify and set all required Environment variables.
    ORACLE_HOME, PATH, NLS_LANG, etc.,
# Use the exp command with the required clauses
exp scott/tiger@orcl file=scott.dmp log=scott_emp.log
    # Compress it.
    compress scott.dmp
    Refer any unix/linux documents for scripting basics.
    http://www.bijoos.com/ora7/oracle_unix.htm
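    Putting those hints together, a minimal script might look like this (a sketch; the paths, SID, and credentials are placeholders):
    #!/bin/sh
    # Set the required environment variables (placeholders)
    ORACLE_HOME=/u01/app/oracle/product/9.2.0; export ORACLE_HOME
    ORACLE_SID=orcl; export ORACLE_SID
    PATH=$ORACLE_HOME/bin:$PATH; export PATH
    # Export one schema
    exp scott/tiger@orcl file=scott.dmp log=scott_emp.log
    # Compress the dump
    compress scott.dmp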
    Regards,
    Sabdar Syed.

  • Export and Import of mappings/process flows etc

    Hi,
We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD, and in fact I suspect we have a corrupt repository as a result. I plan to split the repository into three separate databases so that we have a separate design repository for DEV/UAT and PROD.

To control code migrations between these, I plan to use metadata export and subsequent import into UAT, and then into PROD once tested. I have used this successfully before on a project, but am worried about inherent bugs with metadata export/import (I have been bitten before with Oracle Portal).

So can anyone advise what pitfalls there may be with this approach, and in particular whether anyone has experienced loss of metadata between export and import? We have a complex warehouse with hundreds of mappings, process flows, SQL*Loader flat-file loads, etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
    Thanks for any comments,
    Brandon

This should do the trick for you: it looks for "PARALLEL", so it only removes the APPEND PARALLEL hint and leaves other hints as they are.
#set current location
set path "C:/TMP"
# Project parameters
set root "/MY_PROJECT"
set one_Module "MY_MODULE"
set object "MAPPINGS"
# OMBPLUS and tcl related parameters
set action "remove_parallel"
set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
set timestamp [clock format [clock seconds] -format %Y%m%d-%H:%M:%S]
set ext ".log"
set sep "_"
set ombplus "OMBplus"
set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
set OMBLOG $omblogname
set logname $path/$one_Module$sep$object$sep$datetime$ext
set log_file [open $logname w]
set word "PARALLEL"
set i 0
#Connect to OWB Repository
OMBCONNECT .... your connect string
#Ignore errors that occur in any command that is part of the script and move on to the next command.
set OMBCONTINUE_ON_ERROR ON
OMBCC "'$root/$one_Module'"
#Search the mappings for Parallel in loading/extraction operators
puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
foreach mapName [OMBLIST MAPPINGS] {
    # TABLE operators
    foreach opName [OMBRETRIEVE MAPPING '$mapName' GET TABLE OPERATORS] {
        foreach prop1 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (LOADING_HINT)] {
            if { [regexp $word $prop1] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (LOADING_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
        foreach prop2 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (EXTRACTION_HINT)] {
            if { [regexp $word $prop2] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (EXTRACTION_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
    }
    # DIMENSION operators
    foreach opName [OMBRETRIEVE MAPPING '$mapName' GET DIMENSION OPERATORS] {
        foreach prop1 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (LOADING_HINT)] {
            if { [regexp $word $prop1] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (LOADING_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
        foreach prop2 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (EXTRACTION_HINT)] {
            if { [regexp $word $prop2] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (EXTRACTION_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
    }
    # CUBE operators
    foreach opName [OMBRETRIEVE MAPPING '$mapName' GET CUBE OPERATORS] {
        foreach prop1 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (LOADING_HINT)] {
            if { [regexp $word $prop1] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (LOADING_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
        foreach prop2 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (EXTRACTION_HINT)] {
            if { [regexp $word $prop2] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (EXTRACTION_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
    }
    # VIEW operators
    foreach opName [OMBRETRIEVE MAPPING '$mapName' GET VIEW OPERATORS] {
        foreach prop1 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (LOADING_HINT)] {
            if { [regexp $word $prop1] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Loading Operator: $opName, Property: $prop1"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (LOADING_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
        foreach prop2 [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES (EXTRACTION_HINT)] {
            if { [regexp $word $prop2] == 1 } {
                incr i
                puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Extraction Operator: $opName, Property: $prop2"
                OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES (EXTRACTION_HINT) VALUES ('');
                OMBCOMMIT;
            }
        }
    }
}
if { $i == 0 } {
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
} else {
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
}
close $log_file
    Enjoy!
    Michel

  • How to do fast export and import

I have a Windows 2003 server with Oracle 10.2.0.3 installed on it.
Here I want to ask how I can speed up my export and import using expdp.

    Hi User,
    user11798002 wrote:
I have a Windows 2003 server with Oracle 10.2.0.3 installed on it.
Here I want to ask how I can speed up my export and import using expdp.
You can parallelize the export when using Data Pump, but for a traditional export use the following.
An export of a database reads the data by running a SELECT statement and generates the DDL needed to perform the import later.
Fast Exports
1) Use direct=y.
2) As long as the disk is not fully utilized, try running exports in parallel.
3) Keep the export file on a different disk than the datafiles.
4) Run the export in two parts rather than one, i.e. first with rows=n, and second with indexes=n rows=y constraints=n.
Fast Imports
1) Use the first of the two files created by the exports; it will insert all the data but will not create any indexes or constraints. Once the data insertion is done, run imp with the indexfile option to extract the index-creation script into a text file. Edit the file to include a parallel clause, set db_file_multiblock_read_count to 128, set sort_area_size to a higher value, and set workarea_size_policy=AUTO.
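    For the Data Pump route, the PARALLEL clause might be used like this (a sketch; the directory object, credentials, and file names are placeholders):
    expdp system/password@orcl schemas=scott directory=DATA_PUMP_DIR dumpfile=scott_%U.dmp parallel=4 logfile=scott_exp.log
    impdp system/password@orcl schemas=scott directory=DATA_PUMP_DIR dumpfile=scott_%U.dmp parallel=4 logfile=scott_imp.log
    The %U in the dump file name lets each parallel worker write to its own file.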
    A R P I T S I N H A
    [oracledba.in]

  • EUL export and import

    Scenario:
    Installation #1: Infrastructure Repository Database installed along with Oracle App Server etc. The EUL resides on the Repository Database (let's say under schema eul1). The Application Data (let's say in schema - app1) for reporting also resides in the Repository database under a different schema. Business areas and worksheets reside in the eul1 schema.
    Installation #2: This was done to separate the application data from the repository data. So now, the application data for reporting resides on another server. A new install of infrastructure/repository/App Server was done. In this installation, the eul1 EUL from installation #1 was exported (using the Oracle exp/imp utility) and imported into the new schema called eul2. The two scripts that the manual suggests were also run. The views that were used to access data in app1 are now pointing appropriately to the new server (where the application data for reporting has been installed).
    The issue:
    Using the Discoverer Administrator, I can point to the new installation and view the Business area which seems to have been migrated over correctly. However, using the Discoverer Desktop, I do not see any worksheets.
    Please tell me if there's anything else I need to do to make the worksheets available. These worksheets are working great in Installation #1 and are meant for end users to use. At this point, only one EUL is being used.
    If there's specific info you would like to have in trying to help me resolve this, please let me know. This is urgent.
    Thanks.

    Hi
    Have you exported and imported the worksheets?
    Do you see them in the Admin (folder dependencies)?
    Ott Karesz
    http://www.trendo-kft.hu

  • Exporting and importing only the Incremental datas

    Dear all,
we are facing a big, time-consuming process when copying Oracle schema data from the live servers to our testing servers.
The scenario is as follows:
1. .dmp files of 1 GB to 3 GB in size are taken from the live server and loaded into the test servers to check live issues
2. same-sized dumps are taken for simulation to rectify issues
This process consumes all the available time in exporting and importing.
Can anyone suggest how the incremental data alone can be exported and imported to the test server?
Hoping I can get a valuable solution for this.
Thanks in advance
S.Aravind

    Hi Aravind,
as Nicolas specified, RMAN would be the best option.
An incremental data refresh is not possible through exp/imp, but it is possible to write a script that automates this refresh process.
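    As a sketch of the RMAN route (the connect string is a placeholder, and a level 0 base backup is assumed to already exist):
    #!/bin/sh
    # sketch: take a level 1 incremental backup of the live database
    rman target sys/password@livedb <<EOF
    BACKUP INCREMENTAL LEVEL 1 DATABASE;
    EOF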

  • How to export and import the Portal Components

    Hi,
I have developed a few components (Forms, Reports, Frames, etc.) in Oracle Portal Release 2. Now I need to make all the components available on my client's machine. I would like to know how to export and import these components.
If anyone can help with this, it will be a great help for me...
    Thanks in Advance...

    The migration of any Portal DB provider components, has the following steps.
    Assume that you are migrating a form component form_X.
    1) Click the export link of the form_x.
    (this will take you to create transport screen).
    2) Specify a name of the transport set which you are going to create. (eg., "My form_X transportset")
(The objects are migrated using a container which is, in other words, called a 'Transport Set'.)
    3) In that select the export now button.
    (this will mark your form component form_X for export).
    4) Verify that the status of the transport is 'Extract Complete'
    (check with the browse transport screen in Export/Import Portlet under Admin tab).
    5) Now, the object information is extracted into transport set tables used by Portal Export/Import Utility.
    6) Go to that particular transport set, (browse transport) and download the script available.
7) That script will lead you through various steps
(creating a dump).
    8) In the target instance, import that dump using the same script again.
    9) Merge the contents of the dump (form_X information) into any of the application.
    (Import option => in the export/import portlet under admin tab).
    10) Verify that the status of the transport set is 'Merge Complete'.
    Now, the object is migrated and ready for use.
(The export/import process is explained in detail in the export/import manuals; see:
    http://portalstudio.oracle.com/pls/ops/docs/FOLDER/COMMUNITY/INTERNALPRODDEVFOLDER/TECHREADINESS/CM_PUBLISHING/EXPIMP/EXPORT_IMPORT_LOB0.PPT

  • URGENT...Help on User Level export and import

    User level export and import -
In my DB there are more than 20 users; I would like to export only 2 users, and they have constraints to other users' tables in the DB.
What precautions do I need to take before exporting?
    1. Disabling all constraints etc, .
    2. Synonymns, Procedures , Triggers etc.
    3. what are the parameters setting I should use for the export.
For importing back into the DB (same users), what is the sequence of steps to follow?
    1. what are the parameters setting I should use for the Import.
    2. Precautions like Disabling Constraints...
    3. Special actions for Synonyms/ Procedures.
    Using Oracle 8.1.7 on Solaris.. Total DB Size is 13.3 GB
    Any help will be appreciated....

See to it that you have the DBA privileges to exp/imp.
1. Grant the export full database / import full database privileges to the user doing this.
2. If this is the first time you are doing this, you have to run the scripts CATEXP.SQL and CATPROC.SQL as user SYS/INTERNAL to create the necessary views and data dictionary entries. These instructions are also mentioned in the comments at the beginning of the scripts.
3. Also create the necessary tablespaces to which you need to direct your import.
4. Select the interactive option, where you can interact with the imp/exp process and proceed selectively for users/tables/partitions and other preferences.
5. Look out for the version compatibility of your .DMP files.
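    As a starting point, a two-user export and re-import might look like this sketch (usernames, connect string, and file names are placeholders):
    exp system/manager@orcl owner=(user1,user2) file=two_users.dmp log=two_users_exp.log grants=y constraints=y
    imp system/manager@orcl fromuser=(user1,user2) touser=(user1,user2) file=two_users.dmp log=two_users_imp.log ignore=y
    Constraints that reference other schemas' tables may fail to validate until those schemas are present, which is one reason to review the import log carefully.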

  • IDM export and import commands

    Hi,
I am trying to export and import some of the objects in IDM using the command-line interface.
E.g., I have a user object which needs to be exported in the form of an XML file from the command line; I then make some changes and import the user object XML back into IDM using the command-line interface.
    I am very new to IDM. Any help regarding this would be useful.
    Thanks,
    Hsotnas
    Edited by: Hsotnas on Sep 18, 2007 7:17 PM

Actually it can. There are a few ways to achieve this, but one is:
1. List objects > ListFile
2. Read ListFile to write commands to CommandFile, i.e. ListFile might contain:
Configuration SPE
EndUserForm MyForm
and a script (perl or whatever) would convert this to the following in CommandFile:
    "getObject Configuration SPE > SPE.xml"
    "getObject EndUserForm MyForm > MyForm.xml"
    3. run lh console < CommandFile
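    A minimal shell sketch of step 2 (assuming ListFile holds one "Type Name" pair per line):
    #!/bin/sh
    # sketch: turn each "Type Name" line into a getObject command,
    # then feed the command file to the IDM console
    while read type name; do
        echo "getObject $type $name > $name.xml"
    done < ListFile > CommandFile
    lh console < CommandFile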
