Using a package to export & import data

Hi!
I'm currently using the exp/imp utilities to export & import data, but I don't want everybody to know the password. My platform is Windows 2000.
I'd like to know: is there a package to do this?
Thanks!

What version of Oracle do you have? If you're on 10.1 or later, the DataPump versions of export and import use the DBMS_DATAPUMP package to do exports & imports. You should be able to use those to run the export from within a database session.
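For example, a minimal sketch of a schema-mode Data Pump export driven from inside a session (the directory object DATA_PUMP_DIR and the schema name SCOTT are assumptions; adjust to your environment):
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump file and log file are written through a directory object on the server
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_exp.dmp', directory => 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_exp.log', directory => 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- limit the job to one schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
  DBMS_OUTPUT.PUT_LINE('Export job finished with status: ' || state);
END;
/
Since the job runs inside the database, no password ever has to appear in a BAT file.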
If your concern relates to people discovering database passwords from the BAT file you're using to invoke the export & import utilities at the command line, you might also consider using externally authenticated users. If you are logged in as a member of the ORA_DBA group, for example, you can connect to Oracle without a password.
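A minimal sketch of that setup for a regular user (the Windows login JSMITH and the default OS_AUTHENT_PREFIX of OPS$ are assumptions; on Windows the external name may also need a DOMAIN\ prefix):
CREATE USER ops$jsmith IDENTIFIED EXTERNALLY;
GRANT CREATE SESSION TO ops$jsmith;
With that in place, sqlplus / connects as the OS user with no password in the BAT file. The ORA_DBA case needs no CREATE USER at all: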
sqlplus / as sysdba
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC

Similar Messages

  • How to export & import data using SQL*Loader

    Hi all,
    How do I export & import data with SQL*Loader? Please give me clear steps.
    Thanks in Advance

    Hi, have you already exported the data from SQL Server? Note that you cannot export data using SQL*Loader; SQL*Loader is only meant for importing data from flat files (usually text files) into Oracle tables.
    To import data into Oracle tables using SQL*Loader, use the steps below:
    1) Create a SQL*Loader control file.
    It looks like the following:
    LOAD DATA
    INFILE 'sample.dat'
    BADFILE 'sample.bad'
    DISCARDFILE 'sample.dsc'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    -- example column list (an assumption); match your own table's columns
    (empno, ename, deptno)
    For other sample control files, search Google.
    2) At the command prompt, issue the following:
    $ sqlldr test/test
    When prompted, enter control = <the control file name you created earlier>
    Debug any errors (if they occur).
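    For reference, a sample.dat matching the control file above could look like this (the three example columns are an assumption, mirroring the column list shown):
    7369,SMITH,20
    7499,ALLEN,30
    7521,WARD,30
    If the control file is saved as emp.ctl, the non-interactive equivalent of the prompt above is: sqlldr test/test control=emp.ctl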

  • Can I use Bridge to export image data into a .txt file?

    I have a folder of images and I would like to export the File Name, Resolution, Dimensions and Color Mode for each file into one text file. Can I use Bridge to export image data into a .txt file?

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder in which to start searching for *.map files, and will then create a CSV file named "out.csv" on the desktop, which you can import into Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;    # stop once all fields are filled
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF"
    Hope this may help,
    H

  • Export & Import data in Oracle (Urgent)

    I just wonder whether Oracle 8i has the 'Export & Import data' feature in its DBA administration tools.
    Inside DBA Studio, I found an option to export/import data to a text file, but we must connect to Oracle Management Server (OMS) first before we can use that feature. I found the same feature available in Oracle 7.3.3 in Oracle Data Manager.
    How do I make sure that I have an OMS installed on my server? (I purchased Oracle 8i Standard Edition; does it include OMS?)
    Can we export from a table in database A to a table in database B, or can we only do this through a dump file?

    With every installation of an Oracle DB you get the exp(ort) and imp(ort) utilities. You can use them to move data from one user to another.
    Run them from the DOS prompt like:
    exp parfile=db_out.par
    imp parfile=db_in.par
    where db_out.par contains:
    file=db.dmp
    log=db_out.log
    userid=system/?????
    owner=???
    constraints=y
    direct=n
    buffer=0
    feedback=100
    and db_in.par contains:
    file=db.dmp
    log=db_in.log
    userid=system/???
    touser=??
    fromuser=???
    constraints=y
    commit=y
    feedback=100
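    (As a hedged aside on the question about copying a table straight from database A to database B: exp/imp always go through a dump file, but the same move can be done in SQL over a database link. A minimal sketch, in which the link name DBB_LINK, the TNS alias DBB and the scott/tiger credentials are all assumptions:)
    CREATE DATABASE LINK dbb_link
      CONNECT TO scott IDENTIFIED BY tiger USING 'DBB';
    -- run in database A: push the rows into the matching table in database B
    INSERT INTO emp@dbb_link SELECT * FROM emp;
    COMMIT;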

  • Materialized View with "error in exporting/importing data"

    My system is 10g R2 on AIX (dev). When I impdp a dump from another box, also 10g R2, there is an error in the import log file about the materialized view:
    ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    ORA-02354: error in exporting/importing data
    Desc mv_emphora
    Name              Null?    Type
    C_RID                      ROWID
    P_RID                      ROWID
    T$CWOC            NOT NULL CHAR(6)
    T$EMNO            NOT NULL CHAR(6)
    T$NAMA            NOT NULL CHAR(35)
    T$EDTE            NOT NULL DATE
    T$PERI                     NUMBER
    T$QUAN                     NUMBER
    T$YEAR                     NUMBER
    T$RGDT                     DATE
    As I checked here and on Metalink, the information I found had little to do with the MV. What was the cause?

    The log has 25,074 lines in total, so I used grep from the OS to get the lines involving the MV. Here they are:
    grep -n -i "TTPPPC235201" impBaanFull.log
    5220:ORA-39153: Table "BAANDB"."TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5845:ORA-39153: Table "BAANDB"."MLOG$_TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent meta data will be skipped due to table_exists_action of truncate
    8503:. . imported "BAANDB"."TTPPPC235201"                     36.22 MB  107912 rows
    8910:. . imported "BAANDB"."MLOG$_TTPPPC235201"               413.0 KB    6848 rows
    grep -n -i "TTCCOM001201" impBaanFull.log
    4018:ORA-39153: Table "BAANDB"."TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5844:ORA-39153: Table "BAANDB"."MLOG$_TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    9129:. . imported "BAANDB"."MLOG$_TTCCOM001201"               9.718 KB      38 rows
    9136:. . imported "BAANDB"."TTCCOM001201"                     85.91 KB     239 rows
    grep -n -i "MV_EMPHORA" impBaanFull.log
    8469:ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    8558:ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    8560:ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"
    25066:ORA-31684: Object type MATERIALIZED_VIEW:"BAANDB"."MV_EMPHORA" already exists
    25072: BEGIN dbms_refresh.make('"BAANDB"."MV_EMPHORA"',list=>null,next_date=>null,interval=>null,implicit_destroy=>TRUE,lax=>FALSE,job=>44,rollback_seg=>NULL,push_deferred_rpc=>TRUE,refresh_after_errors=>FALSE,purge_option => 1,parallelism => 0,heap_size => 0);
    25073:dbms_refresh.add(name=>'"BAANDB"."MV_EMPHORA"',list=>'"BAANDB"."MV_EMPHORA"',siteid=>0,export_db=>'BAAN'); END;
    The number in front of each line is the line number in the import log.
    Here is my import syntax:
    impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE
    Yes, I can create the MV manually, and I have no problem refreshing it manually after the import.
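    Since the MV can be recreated and refreshed manually after the import anyway, one hedged workaround is to keep impdp away from the MV entirely, so it never tries to write into the MV's container table (the ORA-12081 above). A sketch using a parameter file, where the two EXCLUDE filters are the only additions to your command:
    impdp user/pw parfile=exclude_mv.par
    with exclude_mv.par containing:
    schemas=baandb
    directory=baanbk_data_pump
    dumpfile=impBaanAll.dmp
    logfile=impBaanAll.log
    table_exists_action=TRUNCATE
    exclude=TABLE:"IN ('MV_EMPHORA')"
    exclude=MATERIALIZED_VIEW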

  • Export/Import data in SQL Express 2014

    Hello,
    I have SQL Server 2008 on the network server with a 'central' DB. To work around cases when the network is down, I decided to implement a 'local' design on all clients. I have 5 clients: I installed SQL Express 2014 on each client and created an empty database on each, exactly like the 'central' one.
    Now, during a network outage, I'm able to continue collecting data and populating each of the 5 DBs.
    When the network is up again, I have to copy/move the collected data from each client to the 'central' DB. Unfortunately, the 'central' and 'local' DBs are on different networks and can't see each other, hence I have to somehow export a package from each 'local' DB and then import it into 'central'.
    The problem I hit is that SQL Express has no option to save a package; it allows only an immediate export, which does not work for me.
    Do you have any ideas on how to handle this situation?
    Thanks
    EV

    Hi,
    I agree with Ronen and Scott. You can use SQL Server Replication to copy the data.
    Replication is a set of technologies for copying and distributing data and database objects from one database to another and then synchronizing between databases to maintain consistency. Using replication, you can distribute data to different
    locations and to remote or mobile users over local and wide area networks, dial-up connections, wireless connections, and the Internet.
    Replication processing resumes at the point at which it left off if a connection is dropped.
    Please check out the following links for more information about SQL Server Replication:
    SQL Server Replication Step by Step
    http://www.codeproject.com/Articles/715550/SQL-Server-Replication-Step-by-Step
    SQL Server Replication
    http://msdn.microsoft.com/en-us/library/ms151198.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support

  • Export - Import data, important

    Hi,
    Our systems: BPC 7.5 SP04, MS version (32-bit DEV server, 64-bit PROD server)
    I need help ASAP. I finally managed to export data from the DEV server and import it into PROD, but in my opinion there is a corrupted-data issue, since some of the data shows wrong amounts like $300,589,456,345,200...
    Procedure:
    1. On the DEV server, optimized the application and compressed the database
    2. In SQL Server Manager, right-clicked the current database > Tasks > Export Data
    Chose Flat File > chose tblFac2Budget
    3. Obtained the txt file and saved it into a shared folder
    4. On the PROD server, opened BPC Excel
    Accessed the shared folder, right-clicked the txt and made a copy of it
    Used the copied file and changed the dimension COMPAÑIA to COMPANIA (since I have to change this parameter)
    5. In Excel, Data Upload > chose the txt file as source file and destination file
    Ran Package > Import > selected the txt file as import file (txt) and the transformation file (system file: import)
    6. Opened the log and it shows success, 210,440 accepted count (the same # as shown in SQL rows)
    7. Checked different dynamic templates, and some data is not consistent
    8. Opened SQL Server Manager, picked tblFac2Budget and it says "207,449 rows"
    9. Fully processed the application/dimensions but nothing happened; all "parent members" show a different amount, as do base-level members
    10. Opened the DEV server and the data is also corrupted; amounts are nonsense, for example billions of $$
    What do you suggest I do?
    After importing, is it necessary to process the dimensions? Is it also necessary to run all the calcs?
    Does the import only import base-level members?
    If this method is generating corrupted data, what other options do I have left?
    Thanx in advance.
    Velázquez

    An export cannot cause any trouble in the source.
    If you performed the export from the production server into a flat file, then it was something else that was causing the problem with the production data.
    Another process you ran, or another import into production, changed the data.
    I'm 100% sure an export cannot cause any problem in the source.
    When you use an export method, you have to use the same method for the import.
    In your case you used an SSIS package for the export but BPC Import for the import.
    SSIS export doesn't take the account type into consideration; BPC Import does.
    So this can be another reason why you have wrong data, including base members, in the destination system.
    So you have to use standard BPC export and standard BPC import, and then you will not have any issues.
    Or you can use SSIS export, but then you have to do the same on the destination side (even so, this can cause problems if you want to merge data; usually it works only when the target region of the destination table is empty).
    I'm not sure what was causing the problems in the production data, but I am sure the export is not the cause.
    Regards
    Sorin radulescu

  • How to use loadercli for export/import

    Hello,
    I want to migrate my content server data, which is on SAP DB version 7.3 (32-bit), to a new server with MaxDB 7.6 (64-bit).
    I wanted to know:
    1) How do I export and import using loadercli? I want to use the loader of my new server, i.e. the MaxDB loader, to do the export/import. Can anyone tell me the steps and the commands to do this?
    2) I have referred to SAP note 962019, but I need some more help on the steps given in this note for a heterogeneous copy.
    Note: I don't have any SAP system on the server, only the content server, and hence I don't have the SAPR3 user in the database.
    Regards,
    Bhavesh

    > Please check the following (point 7) in note 962019:
    Ok...
    > 7. The procedure described in this note supports the following migrations:
    > and I am dealing with the same situation hence choose the export import rather than normal backup recovery.
    Well, it says: the procedure supports these migrations, which is true.
    It does not say: "please do use the export/import migration in these situations".
    Please do read point 1 of the note, where it says:
    "Homogeneous system copies (backup/recovery) can be executed only in
    systems that fulfill  the following conditions:
    a) The database versions in the source system and the target system
        do not differ.
    b) The processor uses the same byte sorting sequence (Little Endian
        -> Little Endian; Big Endian -> Big Endian).
    *If the specified operating system prerequisites are met in the
    source and target system, you MUST use the homogeneous system copy
    and not the heterogeneous system copy.*"
    So in your case export/import is not even supported!
    > 2) I know CS 6.10 is out of support and needs an upgrade to the new CS version, 6.4, but if I am already installing a new server for the content server with CS 6.4 and MaxDB 7.6, then why can't I use export/import from the older server to the new server?
    > Otherwise I need to upgrade MaxDB, then upgrade CS, and then take the backup and restore. Isn't it more logical to do export/import using the loader, as we have all the prerequisites available?
    1. You don't meet the prerequisites (see above) - you can use backup/restore, so you have to!
    2. Export/import is a terribly slow and clumsy procedure.
    3. Backup/restore is an easy, safe and fast method to perform the system copy.
    If you still really, really want to stick with export/import, feel free to try it out.
    The note contains all the required steps, and the MaxDB documentation even includes tutorials for the loader.
    regards,
    Lars

  • How to export/import data with a different NLS_LANG?

    Hello,
    I am trying to run my forms through Application Server to view them as web forms. That works, but the data appears unreadable because it is Arabic and I am using the WE8ISO8859P1 character set. I changed the NLS setting to an Arabic one, AR8MSWIN1256, but that failed, so I created a new database with the UTF8 character set and NLS_LANG set to AR8MSWIN1256 and tried inserting new data. When I query the data, the new data appears readable but the old data is unreadable. What can I do?
    I think that if I export the data and import it with the new character set and NLS_LANG it will work, but how can I do that?
    Thank you.
    Regards
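    A minimal sketch of one common approach with the exp/imp utilities (scott/tiger and AMERICAN_AMERICA are placeholders; the character-set part of NLS_LANG is what matters): export with NLS_LANG naming the source database character set, so the data leaves unconverted, then import into the new UTF8 database with NLS_LANG still describing the dump, and let the import do the conversion:
    REM on the old WE8ISO8859P1 database:
    set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
    exp scott/tiger file=olddata.dmp owner=scott
    REM importing into the new UTF8 database:
    set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
    imp scott/tiger file=olddata.dmp fromuser=scott touser=scott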


  • Using .ods File to import data

    How do I use an .ods file generated from OpenOffice to import data into Siena?

    Hey Anurag,
    Thanks for the post!
    You could convert the .ods file into .xlsx and use the Excel file in Project Siena. There are many ways to convert the file but here is one option that worked for me:
    Create/Save the OpenOffice Calc file in some location
    Launch Excel and use the File Open option to read the .ods file (in Excel 2013, the file type option reads "OpenDocument Spreadsheet")
    Save the converted file in .xlsx format. Format any table data so it can be recognized in Project Siena.
    Launch Project Siena and use the add data source option to import the .xlsx file.
    Your mileage might vary depending on the formulas/tables in the .ods. Let us know if you run into any issues.
    Thanks!
    -Karthik
    This posting is provided "AS IS" with no warranties and confers no rights.

  • How to export/import data from the BEA content repository

    Hi,
    I want to export the data from the BEA content repository and import it into the Oracle repository. Can anybody please let me know how to do that? Thanks in advance.
    Best regards,
    Venkat.S

    copy the files from
    (username)/library/preferences/AICS3settings/en_us/workspaces
    to
    (username)/library/preferences/AICS5settings/en_us/workspaces

  • I have used the Easy Transfer wizard to export/import data, including TB; do I now move all the Mbox folders between profiles to move the mail contents?

    I installed TB on the Win 8.1 box and ran the Easy Transfer wizard to import my data. The support article showing how to mark up my TB data for inclusion in the Easy Transfer wizard was very useful for this, but it does not describe how to get TB to recognise the imported profile.
    I can see two profiles in the folder: C:\Users\Andrew\AppData\Roaming\Thunderbird\Profiles
    One appears to contain my old data; the files in the other profile are very small by comparison.
    My question is whether I should move (or copy) the mail folders (or all folders/files) from the imported profile to the new one, overwriting files where relevant. Alternatively, is there a way to get TB to recognise and use the imported profile?
    I did look at the profile manager but it does not recognise the imported profile.
    Thanks in advance for any assistance.
    Andrew

    I decided to do the following:
    1. Renamed the profile folder created by the Thunderbird installation to a name like nnnnn.default.bak.
    2. Renamed the imported profile folder to the newly created profile's name, nnnnn.default.
    3. Restarted TB.
    All working fine. All my email for the three email addresses has been transferred from the old computer to the new.
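    For reference, the renaming works because Thunderbird finds profiles through profiles.ini in C:\Users\<name>\AppData\Roaming\Thunderbird. An alternative that avoids renaming folders is to point the existing entry at the imported folder instead; the file looks roughly like this (the profile folder name is an example):
    [Profile0]
    Name=default
    IsRelative=1
    Path=Profiles/nnnnn.default
    Changing Path to the imported profile's folder makes TB use it directly.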

  • Using LDIFDE to Export/Import configuration

    Hi All,
    I need to create a "copy" of a domain using LDIFDE. To do that, I exported the Configuration, Schema, OUs, Users and Groups. I'm having plenty of troubles during the import, so I wanted to clarify a few things:
    1) What order should the data be imported in? I was thinking importing Schema, then Configuration, then OUs, followed by Groups and finally Users
    2) Is there an easy way to remove the extra spaces from the LDIFDE extract? I did a search and replace in Notepad to change the domain from DC=abc,DC=com to DC=xyz,DC=com, but there are lines where the original DC is split, which is a royal pain to clean up!
    3) When doing an import from Schema or Configuration, I get this error
    PS C:\tmp> ldifde -i -f .\schema_export.ldf -v -j C:\tmp\
    Connecting to "mydomain.xyz.com"
    Logging in as current user using SSPI
    Importing directory from file ".\schema_export.ldf"
    Loading entries
    1: CN=Schema,CN=Configuration,DC=xyz,DC=com
    Add error on entry starting on line 1: Unwilling To Perform
    The server side error is: 0x2079 The specified instance type is not valid.
    The extended server error is:
    00002079: SvcErr: DSID-03330968, problem 5003 (WILL_NOT_PERFORM), data 0
    0 entries modified successfully.
    An error has occurred in the program
    I can't figure out why this is happening.
    Any help will be much appreciated :)

    > I need to create a "copy" of a domain using LDIFDE. To do that, I
    > exported the Configuration, Schema, OUs, Users and Groups. I'm having
    > plenty of troubles during the import, so I wanted to clarify a few things:
    Why would you need to import schema and configuration? And why do you
    not simply join a DC to your domain, take it down and do metadata
    cleanup, then start this new DC in its own environment? Much simpler
    than your approach, I believe :)
    > 1) What order should the data be imported in? I was thinking importing
    > Schema, then Configuration, then OUs, followed by Groups and finally Users
    That's what I would think, too.
    > 1: CN=Schema,CN=Configuration,DC=xyz,DC=com
    > Add error on entry starting on line 1: Unwilling To Perform
    "Add" for what? What's the content of this file?
    Martin
    How about reading a GOOD book about GPOs for a change?
    NO, THEY ARE NOT EVIL, if you know what you are doing:
    Good or bad GPOs?
    And if IT bothers me - coke bottle design refreshment :))

  • How to export/import data from Hyperion Performance Scorecard?

    I have a Hyperion Performance Scorecard environment which saves data in an Oracle database. Now I am trying to set up another Scorecard environment which has exactly the same set of data and uses MS SQL Server as the database. What is the best way to dump data from the Oracle one to the SQL Server one? I tried to use the HPS Import/Export Utility to export data from the Oracle Scorecard environment to a .CSV file and then import the .CSV file into the SQL Server Scorecard environment. The export process went well despite some warning messages. However, I keep running into trouble with the import. There are a lot of issues such as inconsistent date formats, data dependencies, missing commas in the .csv file, etc. Could anyone suggest an easier way to do this data dump? All I am trying to achieve is a clone of my HPS environment. Thanks a lot.

    Hi Yang,
    I saw your query regarding Hyperion Performance Scorecard. I feel that you may have some knowledge of this tool, so I am asking you: I need some information regarding the tool, its implementations, its current & future market, and any prerequisites for working with it. Also, if you know anything about Hyperion Essbase & Planning, please let me know how this HPS scorecard compares to them. I would be thankful if you let me know whatever you know about it.
    Regards

  • Export/Import Data Pump

    Hi
    AIX 5.3
    Oracle 11.2.0.1
    Oracle 10.2.0.3
    I have done a FULL export in 10g and imported it into 10g and also 11g using Data Pump, but I could not find any DB links in the target database.
    Does it create DB links during the import?
    Thanks,
    Vishal

    > I have done FULL export in 10g and Import 10g and also 11g using DATA PUMP. But I could not find any DBLINK in target database. Does it create DBLINK while import?
    It does, depending on the options used during the Data Pump import.
    A full database export does export DB_LINKs.
    DATABASE_EXPORT_OBJECTS shows:
    SELECT named, object_path, comments FROM database_export_objects WHERE named='Y';
    Y  DB_LINK  Private and public database links
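    To verify the links after an import, a quick data-dictionary check (DBA privileges assumed):
    SELECT owner, db_link, host FROM dba_db_links;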
    HTH
    -Anantha
