I want the HR schema dump file

Hi all,
please, I want the HR schema dump file so that I can import it into my DB.
Thank you

You can use the Export and Import utilities in Oracle.
http://www.orafaq.com/wiki/Import_Export_FAQ
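If it helps, here is a minimal sketch of producing and loading such a dump with the classic utilities, assuming you have access to some database where the HR sample schema is already installed (user names, passwords, and file names here are illustrative, not canonical):

exp system/password FILE=hr.dmp OWNER=hr LOG=hr_exp.log
imp system/password FILE=hr.dmp FROMUSER=hr TOUSER=hr LOG=hr_imp.log

Alternatively, the HR schema creation scripts ship with the database software under $ORACLE_HOME/demo/schema/human_resources (hr_main.sql), so you can recreate the schema without any dump file at all.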

Similar Messages

  • HOW TO IMPORT DATA MORE THAN ONCE FROM THE SAME EXPORT DUMP FILE?

    Before asking my question, I'd like to mention that I'm a French speaker,
    so my English is a little weak. Sorry for that.
    My problem is: IMPORT.
    How do I import data a SECOND TIME from an export dump file within Oracle?
    My export dump file was made successfully (full export), and then I
    tried to import the data for the first time.
    I got the following messages in my log file (I ADDED SOME COMMENTS):
    Warning: the objects were exported by L1, not by you
    . importing SYSTEM's objects into SYSTEM
    REM ************** CREATING TABLESPACES *****
    REM *********************************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "USER_DATA" DATAFILE 'E:\ORANT\DATABASE\USR1ORCL.ORA' SI"
    "ZE 3145728 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAX"
    "EXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "ROLLBACK_DATA" DATAFILE 'E:\ORANT\DATABASE\RBS1ORCL.ORA"
    "' SIZE 10485760 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS "
    "1 MAXEXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    etc........
    IMP-00017: following statement failed with ORACLE error 1119:
    "CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1048576000 "
    "DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAXEXTENTS 121 PCTIN"
    "CREASE 50) ONLINE PERMANENT"
    IMP-00003: ORACLE error 1119 encountered
    ORA-01119: error in creating database file 'E:\ORADATA\L1.DBF'
    ORA-09200: sfccf: error creating file
    OSD-04002: unable to open file
    O/S-Error: (OS 3) The system cannot find the path specified
    --->etc..........
    The drive E: with the folder E:\ORADATA didn't exist, but after
    all that I created it.
    See below, from before my IMPORT statement:
    REM ********************* CREATING USER *********
    REM ********************************************
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "L1" IDENTIFIED BY VALUES 'A6E0DAA6865E7627' DEFAULT TABLESPACE"
    " "L1" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'L1' does not exist
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "MLCO" IDENTIFIED BY VALUES '56AC6447B7D50467' DEFAULT TABLESPA"
    "CE "MLCO" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'MLCO' does not exist
    ETC.......
    REM ********************* GRANTING ROLES ***********
    REM ************************************************
    IMP-00017: following statement failed with ORACLE error 1917:
    "GRANT ALTER ANY TABLE to "L1" "
    IMP-00003: ORACLE error 1917 encountered
    ORA-01917: user or role 'L1' does not exist
    ETC.........
    IMP-00017: following statement failed with ORACLE error 1918:
    "ALTER USER "L1" DEFAULT ROLE ALL"
    IMP-00003: ORACLE error 1918 encountered
    ORA-01918: user 'L1' does not exist
    -- that is normal, since the creation of the
    tablespace failed !!
    REM******************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE ROLLBACK SEGMENT RB_TEMP STORAGE (INITIAL 10240 NEXT 10240 MINEXTENT"
    "S 2 MAXEXTENTS 121) TABLESPACE "SYSTEM""
    IMP-00015: following statement failed because
    . importing SCOTT's objects into SCOTT
    IMP-00015: following statement failed because the object already exists:
    "CREATE SEQUENCE "EVT_PROFILE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999"
    "999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE"
    ETC............
    importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00003: ORACLE error 1435 encountered
    ORA-01435: user does not exist
    REM *************** IMPORTING TABLES *******************
    REM ****************************************************
    . importing SYSTEM's objects into SYSTEM
    . . importing table "AN1999_BDAT" 243 rows imported
    . . importing table "BOPD" 112 rows imported
    . . importing table "BOINFO_AP" 49
    ETC................
    . . importing table "BO_WHF" 2 rows imported
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLE "DEF$_CALL" ("DEFERRED_TRAN_DB" VARCHAR2(128),
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ROLES" FOR "SYS"."DBA_ROLES""
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ERRORS" FOR "SYS"."DBA_ERRORS""
    IMP-00008: unrecognized statement in the export file:
    . importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00008: unrecognized statement in the export file:
    J
    Import terminated successfully with warnings.
    -------------------------------------
    So after analysing this log file, I created
    the appropriate drives and folders (E:\ORADATA, G:\ORDATA, etc.),
    since the import statement could not see them.
    And I started to IMPORT ONE MORE TIME, with:
    $ IMP73 sys/pssw Full=Y FILE=c:\temp\FOLD_1\data_1.dmp BUFFER=64000
    COMMIT=Y INDEXFILE=c:\temp\FOLD_1\BOO_idx.sql
    LOG=c:\temp\FOLD_1\BOO_log.LOG DESTROY=Y IGNORE=Y;
    After that I could see neither the users nor the
    tables created,
    and the following messages appeared in the log file:
    Warning: the objects were exported by L1, not by you
    . . skipping table "AN1999_BDAT"
    . . skipping table "ANPK"
    . . skipping table "BOAP"
    . . skipping table "BOO_D"
    ETC.....skipping all the tables
    . . skipping table "THIN_PER0"
    . . skipping table "UPDATE_TEMP"
    Import terminated successfully without warnings.
    and only 2 new tablespaces (originally 3) were
    created, without any data in them (I checked that in
    Oracle Storage Manager: the tablespaces exist
    with 0.002 used space; originally 60 MB each!).
    So:
    how do I import data (with the full import option) successfully
    MORE THAN ONE TIME from an export dump file,
    even if we have to overwrite tablespaces, tables, and users?
    Thank you very much.

    The Member Feedback forum is for suggestions and feedback for OTN Developer Services. This forum is not monitored by Oracle support or product teams and so Oracle product and technology related questions will not be answered. We recommend that you post this thread to the appropriate Database forum.
    The main URL is:
    http://forums.oracle.com/forums/index.jsp?cat=18
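    Setting the forum redirect aside, the log above already points at the usual pattern for a repeat full import: pre-create whatever the DDL in the dump could not create (here, the tablespace whose datafile path didn't exist), then re-run the import with IGNORE=Y so that "already exists" errors are skipped. A minimal sketch, reusing the names, paths, and size from the log (this is an illustration, not the poster's verified fix):
    CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1000M;
    IMP73 sys/pssw FULL=Y FILE=c:\temp\FOLD_1\data_1.dmp IGNORE=Y LOG=c:\temp\FOLD_1\BOO_log2.LOG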

  • Is there any way we can check the validity of a dump file on a Unix system

    Hi,
    I have taken an export of 2 tables which are huge in volume. One of the tables seems to have been exported successfully, but the other one failed due to space scarcity. Both tables share a single dump file.
    I wanted to know:
    1. Can we use the same dump file to import the table which was successful, or can it not be used because the second table got an error while exporting?
    2. Is there any way we can check the validity of a dump file on a Unix platform?
    Your reply is highly appreciated.

    784786 wrote:
    > 1. Can we use the same dump file to import the table which was successful, or can it not be used because the second table got an error while exporting?
    No. It is better to take another export of that table.
    > 2. Is there any way we can check the validity of a dump file on a Unix platform?
    Please check the export log. If it says that the export terminated successfully, then the dump may be fine.
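    As an additional sanity check (a standard technique, not from the reply above), the imp utility can read through an entire dump without loading anything via SHOW=Y; if it reaches the end without read errors, the file is at least structurally intact. A sketch, with an assumed file name:
    imp username/password FILE=exp_tables.dmp SHOW=Y FULL=Y LOG=verify.log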

  • How to estimate the size of a dump file

    Hi,
    I would like to export my full database.
    My database size is nearly 50 GB.
    How can I know how much space the export dump will occupy?
    Thanks,
    Kavitha

    SELECT SUM(AVG_ROW_LEN * NUM_ROWS) FROM DBA_TABLES;
    You should then add 10-20% to the result to account for the object definitions in the export file.
    Source: Paul M. @ Re: estimating size of a .dmp file
    HTH
    Girish Sharma
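    If you are on 10g or later and can use Data Pump, expdp can also estimate the dump size directly without writing one; this is a standard expdp option rather than something from this thread:
    expdp system/password FULL=Y ESTIMATE_ONLY=Y ESTIMATE=STATISTICS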

  • "SP2-0734: unknown command..." error while importing a dump file

    Hello,
    I want to import a dump file into my database named orcl, and I come across the same error after executing the import command. My import command is:
    SQL> imp mvdemo/mvdemo file=mvdemo.dmp full=y ignore=y
    I use Oracle 10g Enterprise Edition.
    I created the user mvdemo/mvdemo.
    The error I see on my screen is:
    SP2-0734: unknown command beginning "imp mvde..." - rest of line ignored.
    I haven't used an Oracle DBMS for a long period of time. I remember that I used to use Enterprise Manager for import/export; however, I am not able to find Enterprise Manager among the 10g options now. Do I have to install Enterprise Manager separately in Oracle 10g Enterprise Edition? Might this problem be related to that?
    If someone could answer, I would be grateful.
    Best regards

    Import is run from the command prompt, not the SQL prompt.
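    In other words, exit SQL*Plus and run the same command from the operating-system shell (the prompt shown is illustrative):
    C:\> imp mvdemo/mvdemo file=mvdemo.dmp full=y ignore=y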

  • Is it possible to load a dump file collected from SQL Server into Oracle 10gR2

    Hi,
    As part of a data migration from legacy systems, we need to migrate only the required tables from SQL Server to Oracle 10gR2.
    At present the client is telling us that he will provide it as a dump file.
    My question is: is it possible to load a dump file collected from SQL Server into Oracle 10gR2?
    Thanks in advance.

    Yes, it is.
    Using, for example, SQL Developer Migration Workbench even allows you to create unload scripts for the desired SQL Server tables; it also creates the DDL scripts for you to create Oracle tables similar to the SQL Server tables, AND it also creates the SQL*Loader scripts that let you load the SQL Server-generated dump file into the Oracle DB.
    Here is the link:
    http://www.oracle.com/technology/tech/migration/workbench/index_sqldev_omwb.html
    If you do not want to use this utility, you can also use SQL*Loader to load the content of the SQL Server dump file into the Oracle database; but in that case you have to write all the SQL*Loader control scripts on your own.
    Edited by: kgronau on Sep 16, 2009 1:54 PM
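    For a sense of what writing such a control script by hand involves, here is a minimal hypothetical SQL*Loader control file for a comma-delimited unload of one table (table and column names are invented for illustration):
    LOAD DATA
    INFILE 'customers.dat'
    INTO TABLE customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (customer_id, customer_name, created_date DATE "YYYY-MM-DD")
    You would then run it with something like: sqlldr user/password CONTROL=customers.ctl LOG=customers.log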

  • Help with dump files from another server

    Hello!
    I'm having a problem with reading dump files. The issue is this:
    I'm working with a great number of SQL Servers installed on a large number of virtual/physical servers. The problem is that I can't analyse the .mdmp files (with either WinDbg or Visual Studio) anywhere other than on the server that is having the issue.
    Is there any way I could read the SQL Server dump files off the server (it doesn't matter whether on my computer or another server)? It always shows a compatibility error with sqlservr.exe.
    Also, is there any other tool that could help me read SQL Server dump files?
    Thank you very much :)

    Hello,
    I would suggest you raise a case with Microsoft for an accurate analysis of the dump; I would strongly recommend it. If you still want to proceed yourself, use the link below:
    http://blogs.msdn.com/b/psssql/archive/2012/03/15/intro-to-debugging-a-memory-dump.aspx
    Also, applying the latest service pack gives you a chance of making these dumps subside.
    Hope this helps

  • Where can I see the .mdf or .ldf files for a MySQL database?

    Hi, I have created a database in MySQL Workbench. Now I want the .mdf or .ldf files for my project deployment. Where can I see these files in my folders? I have searched in this folder:
    C:\Program Files\MySQL\MySQL Server 5.6\data
    I couldn't find the data there. Please help. Actually, I don't know whether I am on the right path. Or please tell me how I can create them.

    I think MySQL has a different file structure than MS SQL. Maybe this link will help you:
    http://www.mkyong.com/mysql/where-does-mysql-stored-the-data-in-my-harddisk/
    Satheesh
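    As a quick check (standard MySQL, not specific to this thread), you can also ask the server directly where it keeps its data files:
    SHOW VARIABLES LIKE 'datadir';
    Note that MySQL stores data in its own formats (e.g. .ibd files for InnoDB) under that directory; .mdf and .ldf are SQL Server file types, so you will never find them in a MySQL installation.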

  • How to find the tablespace name and schema name from a dump file

    Good day to all,
    I received a dump file from a client with Oracle 11g, and I need to get the tablespace and schema names from the dump file. How can I do that?
    Thanks.
    Flávio Melo
    Edited by: 933141 on 09/05/2012 07:41

    user12038066 wrote:
    > Use the imp utility; it generates DDL for both tables and indexes, so you will get tablespace info for both.
    > Use the impdp utility; it generates DDL for indexes only, so you will miss tablespace info for tables.
    > Haven't found a workaround for the impdp utility. Anyone know?
    Are you sure you are using the impdp utility correctly? Because if you do, then the SQLFILE should have all the gory details.
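    A minimal sketch of that SQLFILE approach (directory, dump file, and script names are assumptions): impdp writes the DDL it would have executed, including tablespace and schema clauses, into a script you can then search:
    impdp system/password DIRECTORY=dpump_dir DUMPFILE=client.dmp SQLFILE=client_ddl.sql FULL=Y
    grep -i tablespace client_ddl.sql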

  • Getting Datapump Export Dump file to the local machine

    I apologize to everyone, as this is a duplicate post.
    Re: Getting Datapump Export Dump file to the local machine
    My initial thread (started yesterday) was in 'Database General' and didn't get much response today. Where do I post questions on the EXPORT/IMPORT utilities?
    Anyway, here is my problem:
    I want to take an export dump of the itemrep schema in the orcl database (on a remote machine). I have an Oracle server (10g Rel 2) running on my local Windows machine. I have created a user john with the necessary EXPORT/IMPORT privileges in my local DB. Then I created a directory object, i.e. a folder named datapump on my local hard drive, and granted READ and WRITE privileges on it to john.
    So john, who is a user in my local machine's Oracle DB, is going to run the expdp utility:
    expdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep directory=datapump logfile=itemrepexp.log
    The above command will fail because it will look for the itemrep schema inside my local DB, not the remote DB where itemrep is actually located. And you can't qualify the schema name with its DB in the SCHEMAS parameter (like SCHEMAS=itemrep@orcl).
    Can anyone provide me a solution for this?

    You can initiate the Data Pump export utility from your client machine to export a schema in a remote database, but upon execution Oracle looks for the directory in the remote database, not on your local machine.
    You're invoking expdp from a client (local DB) to export data from a remote DB.
    So, with this method, you can create the dump files only on the remote server, not on the local machine.
    Instead, you can perform a direct import using the NETWORK_LINK option:
    create a DB link from your local DB to the remote DB and verify the connection;
    then initiate impdp from your local machine's DB using the parameter network_link=<db_link of the remote DB> to import the schema.
    The advantage of this option is that it eliminates dump file creation on the server side.
    There are no dump files during the import process; the data is imported directly into the target schema.
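    A minimal sketch of that approach, reusing the names from the question (the link name and password are assumptions):
    -- on the local DB, as a user with the CREATE DATABASE LINK privilege
    CREATE DATABASE LINK orcl_remote CONNECT TO itemrep IDENTIFIED BY itemrep_pwd USING 'orcl';
    -- then, from the OS shell on the local machine
    impdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep NETWORK_LINK=orcl_remote DIRECTORY=datapump LOGFILE=itemrepimp.log
    Here DIRECTORY is only used for the log file; no dump file is written on either side.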

  • User & Schema name of Export dump file

    I have an export dump file, but I don't have any log file from that export. I want to import it into another system. How do I find out the user or schema name associated with that dump file?
    Is there any way to know it?

    dbanirav wrote:
    > How do I know the concerned user or schema name of that dump file? Is there any way to know it?
    Made with exp or expdp?
    What OS name & version?
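    Two standard tricks may help here (file name assumed; the first needs a Unix-like OS): the header of a classic exp dump is readable text that usually includes the exporting user and the export command, and imp can list the dump's contents without importing anything:
    strings mydump.dmp | head -20
    imp username/password FILE=mydump.dmp SHOW=Y FULL=Y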

  • Export the database schema / structure to a file server

    I would really appreciate it if any of you could share your experience on the following:
    Database 1: Oracle 8i/9i
    Database 2: SQL Server 6.5/7/2000
    I am looking for a solution to extract the entire database schema/structure to a pre-defined location on a file server. I want to configure this through a CRON job (for Unix) or Microsoft Task Scheduler (for Windows). The solution should also work across databases (it must work with both Oracle and SQL Server). The exported file should not be just a dump file (.dmp), on the assumption that only Oracle can read a .dmp file.
    What are the possible ways to accomplish this?
    I am sure several tools are available for this (like Embarcadero Change Manager), but such a solution may not be cost-effective. Can we write a program to accomplish the same?
    I would really appreciate your inputs.
    Thanks, Madhavi

    I am doing this job using the bcp command on SQL Server, which creates a .dat file; I then transport the file to the Oracle server, to a location where I have created an external table.
    I believe there is server-level compatibility that needs no external program or software. A colleague told me about it once; when I get hold of him, I shall reply.
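    A minimal sketch of that bcp unload step (server, database, table, and credentials are invented for illustration):
    bcp mydb.dbo.customers out customers.dat -c -t, -S sqlhost -U sa -P password
    The -c flag writes plain character data and -t sets the field terminator, producing a flat file that an Oracle external table (or SQL*Loader) can read.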

  • Import few schemas out of a full DB dump file into a new DB

    Hi,
    I have an 11g database, ORCL.
    I have created a new database, TEST, where I want only 3 schemas of the ORCL database to be imported.
    I have taken an export dump file of the 3 schemas (A, B and C), with which the below import command works:
    impdp system/xxxx directory=EXPDP dumpfile=schema3.dmp logfile=schema3.log REMAP_TABLESPACE=A_TBS:USERS,B_TBS:USERS,C_TBS:USERS
    where A_TBS, B_TBS and C_TBS are the tablespaces of the respective schemas A, B and C in the ORCL database,
    and USERS is the default tablespace in the TEST database.
    But if I take a full export dump of the entire database and want to import only the 3 required schemas, is the below command correct?
    impdp system/xxxx directory=EXPDP dumpfile=full_dump.dmp logfile=full_dump.log SCHEMAS=A,B,C REMAP_TABLESPACE=A_TBS:USERS,B_TBS:USERS,C_TBS:USERS
    Please guide.
    Thanks in advance.

    If you do a FULL export and FULL import, the import also attempts to create Tablespaces that do not exist. However, it will attempt to create the tablespaces with the same file names (including directory paths) as present on the source database. That is why it is always preferable to precreate Tablespaces at the new database before running the import -- so that you can control the sizes and names of datafiles, even change tablespace parameters if necessary. REMAP would be used if you want to import to different tablespace names. When precreating tablespaces, the size required really depends on the size of the data.
    At the source you might have a tablespace with datafiles adding up to 5GB but used segments of 3GB only (Again, within the 3GB, 80% of the data might have been deleted before the export !). So the size required at the new database might be 3GB or only 600MB !! It is best to precreate a few datafiles with some reasonable sizes and have them with AUTOEXTEND ON so that the tablespaces grow to the actual size that is really required.
    Hemant K Chitale
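    As an illustration of that precreation advice (tablespace name, path, and sizes are assumptions), on the target database before running the import:
    CREATE TABLESPACE users_data DATAFILE '/u01/oradata/TEST/users_data01.dbf' SIZE 600M AUTOEXTEND ON NEXT 100M MAXSIZE 5G;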

  • ERROR: MyService.jws:715:There are two or more operations with the same schema-element 'ns0:MyNameSpace' on the input message in a web service file or callback interface.

    I have two web service operations that have the same complex type as their input
    parameter. I want to map this type to an existing schema. I can successfully
    do this with the first operation using XQuery, but when I attempt to do this with
    the second operation I get the following error:
    ERROR: MyService.jws:715: There are two or more operations with the same schema-element
    'ns0:MyNamespace' on the input message in a web service file or callback interface.
    ERROR: SUGGESTION: Use different schema-element values for each of those operations.
    How can I use different schema-element values? The input parameters are to be
    mapped to the same schema and the same element, since they are the same for both operations.

    I am having the same problem. How did you resolve this? Could you please tell me the solution?
    Thanks,
    Shari

  • Is it possible to view/get the table data from a dump file

    I have dump files generated using expdp on Oracle 11g:
    expdp_schemas_18MAY2013_1.dmp
    expdp_schemas_18MAY2013_2.dmp
    expdp_schemas_18MAY2013_3.dmp
    Can I use the parameter file given below to get the table data into a SQL file, or is impdp the only option to load the table data into the database?
    vi test1.par
    USERID="/ as sysdba"
    DIRECTORY=DATA
    dumpfile=expdp_schemas_18MAY2013%S.dmp
    SCHEMAS=USER1,USER2
    LOGFILE=user_dump_data.log
    SQLFILE=user_dump_data.sql
    and then run: impdp parfile=test1.par

    Hi,
    To explain more about my situation:
    the target database has the dump files, whereas the source database (cloud) doesn't have access to the target database.
    However, I can request the target DB's DBA team to run the par files and provide me the output, such as a SQL file, which I can take and run in my source database.
    I got the metadata that way, but is there any way I could get the data from the dump files on the target DB without actually accessing it? My question might sound weird, but please let me know.
    <
    1. Export from the source into a dump file. You can do this on the source database and then copy the file over to your local database, or you can do this from a local database if you have a database link defined on the local database that points to the source database. In the second case, your dump file will be created on your local database.
    >
    How can I access the dump using the database link?