Problem regarding database dump import

Hi all, good afternoon.
Can you please tell me how to import a dump taken from a higher version of the database into a lower version? I am explaining the scenario below.
I have a schema named "ABCD" in an Oracle 11g database. I took an export dump of it (ABCD.dmp) and was trying to import that dump into another schema named "PQRS" in an Oracle 10g database. The 11g server is down, and that is why I have to do this to continue the job.
I have transferred the dump ABCD.dmp to the 10g server using sftp.
Now when I run the following command:
imp file=ABCD.dmp fromuser=ABCD touser=PQRS buffer=200000
it asks for the username and password, but when I provide them, it throws the error: not a valid export file, header failed verification.
Please help... thanks.

Oh, I am very sorry about that.
From next time it will be done without fail.
Agustin,
thanks a lot for your helpful reply, but my doubt is still not totally clear. As I am new to impdp and expdp, I want to know the method of creating a directory object, as mentioned by Agustin UN.
Previously I never created a directory to import or export.
Thanks a lot in advance.
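
For context: a dump written by a newer version's export utility generally cannot be read by an older imp, which is why the header fails verification; the usual route is Data Pump with the VERSION parameter. A minimal sketch of the directory-object setup presumably being described (names, paths, and passwords are placeholders, and the export step needs the 11g source up and reachable):

    -- As a DBA user, on the database doing the export or import:
    CREATE OR REPLACE DIRECTORY dump_dir AS '/u01/app/oracle/dumps';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO abcd;

Then, from the operating system:

    expdp abcd/<password> DIRECTORY=dump_dir DUMPFILE=abcd.dmp SCHEMAS=abcd VERSION=10.2
    impdp pqrs/<password> DIRECTORY=dump_dir DUMPFILE=abcd.dmp REMAP_SCHEMA=abcd:pqrs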

Similar Messages

  • Import a database dump

    Hi all,
    I have two databases in my Oracle 10g server, mydb1 and mydb2, and two database dumps, mydb1.dmp and mydb2.dmp. I learned from other forums that to be able to import a dump file, the corresponding database should be started and up. My question is how to start one database, mydb1, while the other, mydb2, stays down, because by default they are both started at the same time.
    BR

    Hi,
    What OS are you using? I am guessing it is Windows. Whenever Windows starts up, there is a registry entry
    HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\oracle_home_name.
    There will be a key
    ORA_SID_AUTOSTART, where SID is your database SID.
    If it is set to true, the database will start automatically; you can change it to false if you do not need that.
    Alternatively, if you navigate to services.msc, you can find the OracleService<SID> service, which has two startup types:
    1) Automatic (whenever Windows starts, it will automatically start the DB)
    2) Manual (you start the DB whenever you need it)
    To start a database, connect to the SQL prompt and issue STARTUP; to shut the DB down, issue SHUTDOWN at the SQL prompt.
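    For example, a minimal session might look like this (a sketch; the SID and the steps between are illustrative):

        C:\> set ORACLE_SID=MYDB1
        C:\> sqlplus / as sysdba
        SQL> STARTUP
        SQL> -- ... run the import against mydb1 ...
        SQL> SHUTDOWN IMMEDIATE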
    Regards,
    Vijayaraghavan K

  • Export/Import full database dump fails to recreate the workspaces

    Hi,
    We are studying the possibility of using Oracle Workspace Manager to maintain multiple versions of our business data. I recently did some tests with the import/export of a dump that contains multiple workspaces. I am not able to import the dump successfully, because it does not recreate the workspaces on the target database instance, which is a freshly created, empty Oracle database. Before launching the import with the Oracle Data Pump utility, I have the "LIVE" workspace in the WMSYS.WM$WORKSPACES_TABLE table. After the import is done, there is nothing in that table...
    The versions of the Oracle database and Oracle Workspace Manager are the same on source and target database:
    Database version (from the V$VERSION view):
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE     10.2.0.4.0     Production
    TNS for 64-bit Windows: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    Workspace Manager version (from the WM_INSTALLATION view):
    OWM_VERSION: 10.2.0.4.3
    In order to recreate the tablespaces successfully during the full import of the dump, I made the directory structure for the tablespaces the same on the target database's computer. I used the instructions given in this document, section "1.6 Import and Export Considerations", page 1-19:
    http://www.wyswert.com/documentation/oracle/database/11.2/appdev.112/e11826.pdf
    Considering that the Oracle database release used is 10.2.0.4, and since the dump doesn't contain the WMSYS schema, I use the following command to import it:
    impdp system/<password>@<database> DIRECTORY=data_pump_dir DUMPFILE=expfull.dmp FULL=y TABLE_EXISTS_ACTION=append LOGFILE=impfull.log
    The only hints I have in the import log file are the following block of lines:
    1st one:
    ==============================
    Traitement du type d'objet DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39083: Echec de la création du type d'objet PROCACT_SYSTEM avec erreur :
    ORA-01843: ce n'est pas un mois valide
    SQL en échec :
    BEGIN
    if (system.wm$_check_install) then
    return ;
    end if ;
    begin execute immediate 'insert into wmsys.wm$workspaces_table
    values(''LIVE'',
    ''194'',
    ''-1'',
    2nd one at the end of the import process:
    ==============================
    Traitement du type d'objet DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    ORA-39083: Echec de la création du type d'objet PROCACT_SCHEMA avec erreur :
    ORA-20000: Workspace Manager Not Properly Installed
    SQL en échec :
    BEGIN
    declare
    compile_exception EXCEPTION;
    PRAGMA EXCEPTION_INIT(compile_exception, -24344);
    begin
    if (system.wm$_check_install) then
    return ;
    end if ;
    execute immediate 'alter package wmsys.ltadm compile';
    execute immediate 'alter packag
    The target operating system is in French... here is a raw translation:
    1st one:
    ==============================
    Treatment of object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39083: Object type creation failed PROCACT_SYSTEM with error :
    ORA-01843: invalid month
    failed SQL :
    BEGIN
    if (system.wm$_check_install) then
    return ;
    end if ;
    begin execute immediate 'insert into wmsys.wm$workspaces_table
    values(''LIVE'',
    ''194'',
    ''-1'',
    2nd one at the end of the import process:
    ==============================
    Treatment of object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    ORA-39083: Object type creation failed PROCACT_SCHEMA with error :
    ORA-20000: Workspace Manager Not Properly Installed
    failed SQL :
    BEGIN
    declare
    compile_exception EXCEPTION;
    PRAGMA EXCEPTION_INIT(compile_exception, -24344);
    begin
    if (system.wm$_check_install) then
    return ;
    end if ;
    execute immediate 'alter package wmsys.ltadm compile';
    execute immediate 'alter packag
    By the way, the computer of the source database runs Vista 64 in English and the destination computer runs Vista 64 in French; do you think this has anything to do with these error messages? The parameters of the NLS_SESSION_PARAMETERS view are the same in the source and target databases...
    Thanks in advance for your help!
    Joel Autotte
    Lead Developer

    I tried to import the full database dump with the "imp" command and I had the same error message, with more detail (translated from French):
    . Importing SYSTEM objects into SYSTEM
    IMP-00017: following statement failed with ORACLE error 1843:
    "BEGIN "
    "if (system.wm$_check_install) then"
    " return ;"
    " end if ;"
    " begin execute immediate 'insert into wmsys.wm$workspaces_ta"
    "ble"
    " values(''LIVE'',"
    " ''194'',"
    " ''-1'',"
    " ''SYS'',"
    " *to_date(''09-DEC-2009 00:00:00'', ''DD-MON-YYYY HH"*
    *"24:MI:SS''),"*
    " ''0'',"
    " ''UNLOCKED'',"
    " ''0'',"
    " ''0'',"
    " ''37'',"
    " ''CRS_ALLNONCR'',"
    " to_date(''06-APR-2011 14:24:57'', ''DD-MON-YYYY HH"
    "24:MI:SS''),"
    " ''0'',"
    " '''')'; end;"
    "COMMIT; END;"
    IMP-00003: ORACLE error 1843 encountered
    ORA-01843: not a valid month
    ORA-06512: at line 5
    The call to the TO_DATE function uses a date format string that is incompatible with the date format and language set in the NLS parameters of my database. Since I know that the "imp" and "exp" commands work client-side (unlike "impdp" and "expdp"), I did a full export of the source database from the computer on which I am trying to import the dump, so that the same language was exported in the dump, and I was able to import that dump successfully.
    But I would like to use the Data Pump tool instead of the old import and export tools. How would it be possible to set the NLS parameters that "impdp" must use before launching the import of the dump?
    The NLS parameters are set to the "AMERICAN" settings in both the source and destination databases when I check with the "NLS_DATABASE_PARAMETERS" view. The dump I made with "expdp" on the source database exported the data with the "AMERICAN" settings as well...
    On the destination database, the computer language is French, and I guess the Data Pump import tool takes that into account, since it writes the log in the language of the computer and tries to do a TO_DATE with the wrong date format, because it works with the "CANADIAN FRENCH" settings...
    Any suggestions? I'll continue to read on my side and re-post if I find the solution.
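    One workaround sometimes suggested for this situation (an untested assumption, not something confirmed in this thread) is to force the date language of the Data Pump worker sessions with a logon trigger, since impdp runs server-side and does not take the client's NLS_LANG into account:

        -- Untested sketch: force NLS_DATE_LANGUAGE for the importing user
        -- so the DD-MON-YYYY literals in the dump parse correctly.
        CREATE OR REPLACE TRIGGER fix_dp_nls
        AFTER LOGON ON DATABASE
        BEGIN
          IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'SYSTEM' THEN
            EXECUTE IMMEDIATE
              'ALTER SESSION SET NLS_DATE_LANGUAGE = ''AMERICAN''';
          END IF;
        END;
        /
        -- Drop the trigger once the import is done.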
    Thanks!

  • From a complete database dump, I need to import 8 schemas

    I have a complete database dump made with Data Pump on Oracle 10g on Linux.
    I need to import only 8 schemas.
    Do I have to use the INCLUDE option in my parfile?
    How can I tell Oracle to import only these schemas?

    Hi,
    What version are you using?
    "Do I have to use the INCLUDE option in my parfile?"
    For your requirement it might not fit, I suppose.
    The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When you use the INCLUDE parameter, only the objects specified by it will be included; EXCLUDE makes sure that all objects except those specified by it are included.
    "How can I tell Oracle to import only these schemas?"
    I would suggest importing the full DB into some test DB, then doing a schema-level export according to your requirement.
    As sybrand had pointed out, the correct approach is the SCHEMAS parameter, which I missed.
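    A minimal sketch of that SCHEMAS approach (schema names, paths, and the directory object are placeholders):

        impdp system/<password> DIRECTORY=data_pump_dir DUMPFILE=full.dmp \
          SCHEMAS=schema1,schema2,schema3 LOGFILE=imp_schemas.log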
    - Pavan Kumar N

  • I have an 8i database dump, and I have to import it into a 9i DB

    I have an 8i database dump, and I have to import it into a 9i DB. What are all the possible ways?
    Please help with this.

    You can run the following command:
    imp username/password file=export.dmp full=yes
    This will import the full dump file. Importing into a higher version is supported; the 9i imp utility can read a dump produced by the 8i exp utility (it is the reverse direction that fails).
    Let me know if you need any further help.

  • SAP: CA certificate missing in database while importing certificate in ABAP

    Hi
    We have a similar problem to the one described in
    CA certificate missing in database while importing certificate in ABAP,
    however we ordered the certificate from SAP.
    The only error is:
    CA certificate missing in database
    Message no. TRUST057
    Diagnosis
    The certificate response can only be processed if the certificate of
    the CA is stored in the database.
    Procedure
    Store the CA certificate in the database.
    We believe we have done it in a similar way before.
    Do you know how to handle this?
    Thank you for your input.
    Best regards
    Flemming Grand

    Hi,
    I created an OSS message on this, which SAP has now responded to.
    The certificate they sent me was wrong - it didn't include the root certificate (and the intermediate, I think).
    When I received the complete certificate, I could import it without any problems.
    Flemming Grand

  • Problem upgrading database: Object 'DF_ODLN_DocType' already exists in database

    Hi all,
    The problem appears during the upgrade, when the progress bar is at the ODLN table.
    The error says:
    [Microsoft][SQL Native Client][SQL Server] Object 'DF_ODLN_DocType' already exists in database. 'Entrega' (ODLN).
    Does someone know in which table the DF... objects are? Which tables does SAP use for this when upgrading?
    How can I repair this database without importing all the data into another newly created database?
    Thank you all again.
    Regards.
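    For reference, DF_* objects in SQL Server are normally DEFAULT constraints, so a query along these lines (a sketch, not something given in this thread; the constraint name is taken from the error above) can locate the offending object and its table:

        -- List the default constraint and the table it belongs to.
        SELECT name, OBJECT_NAME(parent_object_id) AS table_name
        FROM sys.default_constraints
        WHERE name = 'DF_ODLN_DocType';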

    Sorry,
    could somebody move this thread please? I think its place isn't here.
    Sorry again.

  • Interactive report performance problem over database link - Oracle Gateway

    Hello all,
    This is regarding the thread "Interactive report performance problem over database link" that was posted by Samo.
    The issue I am facing is that when I use an Oracle function like apex_item.checkbox, the query slows down by 45 seconds.
    The query looks like this (due to sensitivity issues, I cannot disclose the real table names):
    SELECT apex_item.checkbox(1,b.col3)
    , a.col1
    , a.col2
    FROM table_one a
    , table_two b
    WHERE a.col3 = 12345
    AND a.col4 = 100
    AND b.col5 = a.col5
    table_one and table_two are remote tables (non-oracle) which are connected using Oracle Gateway.
    Now if I run the above query without the apex_item.checkbox function, the response is less than a second, but with apex_item.checkbox the query runs for more than 30 seconds. I have worked around the issue by creating a collection, but that is not good practice.
    I would like to get ideas on how to resolve or speed up the query.
    Any idea how to use sub-factoring for the above scenario? Other methods (creating a view or materialized view) are not an option.
    Thank you.
    Shaun S.

    Hi Shaun
    Okay, I have a million questions (could you tell me whether both tables are from the same remote source? It looks like they're possibly not), but let's just try some things first.
    By now you should understand the idea of what I termed 'sub-factoring' in a previous post, i.e. the WITH blah AS (SELECT... syntax. In most circumstances this 'materialises' the results of the inner SELECT statement: we 'get' the results and then do something with them afterwards. It's a handy trick when dealing with remote sites, as sometimes you want the remote database to do the work. The reason I ask you to use the MATERIALIZE hint for testing is just to force this; in 99.99% of cases it can be removed later. The WITH statement is also handled differently from an inline view like SELECT * FROM (SELECT..., but the same result can be mimicked with a NO_MERGE hint.
    Looking at your case, I would be interested to see what the explain plans and results would be for something like the following two statements (sorry - you're going to have to check them, it's late!):
    WITH a AS
    (SELECT /*+ MATERIALIZE */ *
     FROM table_one),
    b AS
    (SELECT /*+ MATERIALIZE */ *
     FROM table_two),
    sourceqry AS
    (SELECT  b.col3 x
           , a.col1 y
           , a.col2 z
     FROM a
        , b
     WHERE a.col3 = 12345
     AND   a.col4 = 100
     AND   b.col5 = a.col5)
    SELECT apex_item.checkbox(1,x), y, z
    FROM sourceqry

    WITH a AS
    (SELECT /*+ MATERIALIZE */ *
     FROM table_one),
    b AS
    (SELECT /*+ MATERIALIZE */ *
     FROM table_two)
    SELECT  apex_item.checkbox(1,b.col3), a.col1, a.col2
    FROM a
       , b
    WHERE a.col3 = 12345
    AND   a.col4 = 100
    AND   b.col5 = a.col5
    If the remote tables are at the same site, you should get the same results from both. If they aren't, you should still get the same results, just arrived at differently from the original query.
    We aren't being told the real cardinality of the inner selects here, so the explain plan is distorted (this is normal for queries on remote and especially non-Oracle sites). This normally hinders tuning, but I don't think it is your problem at all. How many distinct values do you normally get for the column aliased 'x', and how many rows are normally returned in total? Also, how are you testing response times: in APEX, SQL Developer, Toad, SQL*Plus, etc.?
    Sorry for all the questions but it helps to answer the question, if I can.
    Cheers
    Ben
    http://www.munkyben.wordpress.com
    Don't forget to mark replies helpful or correct ;)

  • A problem regarding setup of Oracle Lite 3.6.0.2.0 on Win 95, with JDK 1.1.8 and Java 2

    A problem regarding setup of Oracle Lite 3.6.0.2.0 on Win 95, with JDK 1.1.8 and Java 2 SDK (ver. 1.3 beta).
    After installing Oracle Lite 3.6.0.2.0 on a laptop (with the Win 95 OS), when I run Oracle Lite Designer from the Start menu, I receive the following error message:
    ====================================
    Invalid class name 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'
    usage: java [-options] class
    where options include:
    -help print out this message
    -version print out the build version
    -v -verbose turn on verbose mode
    -debug enable remote JAVA debugging
    -noasyncgc don't allow asynchronous garbage collection
    -verbosegc print a message when garbage collection occurs
    -noclassgc disable class garbage collection
    -ss<number> set the maximum native stack size for any thread
    -oss<number> set the maximum Java stack size for any thread
    -ms<number> set the initial Java heap size
    -mx<number> set the maximum Java heap size
    -classpath <directories separated by semicolons>
    list directories in which to look for classes
    -prof[:<file>] output profiling data to .\java.prof or .\<file>
    -verify verify all classes when read in
    -verifyremote verify classes read in over the network [default]
    -noverify do not verify any class
    -nojit disable JIT compiler
    Please make sure that JDK 1.1.4 (or greater) is installed in your machine and CLASSPATH is set properly. JAVA.EXE must be in the PATH.
    ====================================
    My ORACLE_HOME is c:\program files\ora95_2, and Oracle Lite is installed under ORACLE_HOME in the LITE\DESIGNER directory.
    The JDK version is 1.1.8 (greater than 1.1.4), installed in c:\program files\jdk1.1.8. My PATH and CLASSPATH are set in AUTOEXEC.BAT as follows:
    set CLASSPATH=c:\Progra~1\jdk1.1.8\lib\classes.zip;c:\progra~1\ora95_2\lite\classes\olite36.jar;c:\progra~1\ora95_2\lite\designer\oldes.jar;c:\progra~1\ora95_2\lite\designer\swingall.jar
    PATH=C:\Progra~1\Ora95_2\bin;.;c:\Progra~1\jdk1.1.8\lib;c:\Progra~1\jdk1.1.8\bin;C:\Progra~1\Ora95_2\lite\Designer;C:\WIN95;C:\WIN95\COMMAND;C:\UTIL
    And I can run JAVA.EXE from any directory at the command prompt.
    With the Java 2 SDK (ver. 1.3 beta) instead of JDK 1.1.8, I get a different error message:
    =============================
    java.lang.NoClassFoundError: 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'
    Please make sure that JDK 1.1.4 (or greater) is installed in your machine and CLASSPATH is set properly. JAVA.EXE must be in the PATH.
    ==============================
    The PATH and CLASSPATH were set accordingly, as with JDK 1.1.8, and there was no classes.zip in the classpath.
    Also, the class or jar file looks weird or wrapped in the error message: 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'.
    Another interesting thing I noticed: if I run oldes.exe from the installation CD, Oracle Lite Designer runs fine without error, and I am able to modify tables in the database on my laptop as well.
    Could someone shed some light on what I am doing wrong here?
    Thanks for the help in advance.
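    (The wrapped jar path in the error hints that the space in C:\PROGRAM FILES is splitting the java command line, so everything after the space is parsed as the class name. A possible check, an assumption rather than anything confirmed in this thread, is to use the 8.3 short path everywhere so no spaces appear in the expanded arguments:)

        rem Sketch: 8.3 short names avoid spaces in expanded paths.
        set ORACLE_HOME=C:\PROGRA~1\ORA95_2
        set CLASSPATH=%ORACLE_HOME%\lite\classes\olite36.jar;%ORACLE_HOME%\lite\designer\oldes.jar;%ORACLE_HOME%\lite\designer\swingall.jar;%CLASSPATH%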
    Regards
    Viral

    On 07/20/2015 06:35 AM, Itzhak Hovav wrote:
    > hi
    > [snip]
    > [root@p22 eclipse]# cat eclipse.ini -startup
    > plugins/org.eclipse.equinox.launcher_1.3.0.v20120522-1813.jar
    > --launcher.library
    > plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.200.v20120913-144807
    >
    > -showsplash
    > org.eclipse.platform
    > --launcher.XXMaxPermSize
    > 256m
    > --launcher.defaultAction
    > openFile
    > -vmargs
    > -Xms40m
    > -Xmx512m
    > [snip]
    Try this: http://wiki.eclipse.org/Eclipse.ini. You should also read the sticky posts at the top of the forum for getting-started help.

  • How to determine who performed the last full database dump before cumulative dump

    Hello,
    ASE 15.7 SP100 allows cumulative backups, and if a cumulative dump is tried before a full dump, ASE shows this error:
    You cannot execute an incremental dump before a full database dump.
    From my experiments, it seems that it does not matter how the full dump is performed (by native isql, or by a 3rd-party API). As long as a full dump has been done on a database, ASE seems to keep track of all the changed pages since that last full dump. So this means that you can perform a full dump using a 3rd-party API, and then in isql, if you run a cumulative dump command, the cumulative dump succeeds, based on the last full dump made by another library!
    So, my question is: is there any way to programmatically determine how (by native isql or a 3rd party) the last full backup was performed? I believe $SYBASE_HOME/ASE-15_0/dumphist contains this information, but it requires 'enable dump history' to be set first, and I am looking for a solution that does not involve checking a disk file.
    Thanks,
    Ali

    Dear Mr Razib,
    I have not explored the feature, but ASE auditing might provide the possibility of accessing information on past database dumps via SQL.
    Apart from that, I am not aware of an SQL interface to the dumphist file (it would be nice to have, I agree).
    Enabling dump history is definitely highly recommended (in my opinion).
    There is yet another feature which might help you prevent a DBA from dumping databases and transactions to various locations.
    When you create a DUMP CONFIGURATION and additionally set the parameter
            enforce dump configuration
    ASE will prevent normal (free-style) DUMP commands and enforce the use of an existing dump configuration. The mechanism is not foolproof (nothing prevents creating yet another dump configuration on the fly), but it is at least something.
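    A minimal sketch of the two settings mentioned above (assuming the usual sp_configure parameter names; verify them against your ASE version):

        -- Record dumps in the dumphist file.
        sp_configure 'enable dump history', 1
        go
        -- Require DUMP commands to use a predefined dump configuration.
        sp_configure 'enforce dump configuration', 1
        go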
    With kind regards
    Tilman Model-Bosch

  • Problem: SELECT from database to get the X highest entries in regard to one field

    Hi Forum,
    I have a problem regarding a (at least for me) complicated SELECT statement. It uses a customer Z-table to read certain data.
    My goal is to read only a certain number of top entries per store.
    The number can change from one run to the next.
    To make it clearer, here is the simplified definition of the database table:
    MANDT
    LOCNR
    DELTAVERS
    LOCNR is a store and DELTAVERS is a version (a simple counter 001, 002, ...) of that entry. Both fields are key fields.
    Of course there are additional fields with values that I am interested in, but they are left out here to keep the focus on the problem.
    My goal is to read a certain number of top DELTAVERS (versions) for each LOCNR (store). I do not know the stores or the number of DELTAVERS in advance.
    The SELECT should produce an internal table showing the X highest DELTAVERS with the related store.
    E.g., a SELECT that shows the 3 highest DELTAVERS per store (LOCNR), if available:
    LOCNR: DELTAVERS:
    N001 0013
    N001 0023
    N001 0024
    B235 0021
    B235 0056
    Note: In this example B235 has only two entries
    Code:
         l_having_by_cond = 'count( DELTAVERS ) le 3'.
           SELECT locnr
                  deltavers
                  FROM z_locnr_version
                  APPENDING CORRESPONDING FIELDS OF TABLE
                  lt_locnr_version
                  GROUP BY locnr deltavers
                  HAVING (l_having_by_cond)
                  ORDER BY locnr deltavers DESCENDING.
    lt_locnr_version should hold the desired result.
    It seems to apply the count to all entries and not just per LOCNR.
    How can I achieve that the internal table holds the three highest DELTAVERS for each LOCNR (store)?
    Or what would be the best solution here? Remember, we are talking about 1000+ stores here.
    Please help!
    Thanks in advance,
    Chris

    Hi Forum,
    me again. I believe that I somewhat misunderstood the concept of the GROUP BY and HAVING clauses.
    Both might prove completely useless in this case.
    Would a simple UP TO N ROWS do the trick, provided that the SELECT is done per store?
    How can I ensure that it always gets the TOP N entries?
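    Something along those lines might look like this (a sketch; the internal table lt_stores holding the distinct LOCNRs, and the limit of 3, are assumptions):

        " Sketch: one SELECT per store, limited to the top 3 versions.
        LOOP AT lt_stores INTO ls_store.
          SELECT locnr deltavers
            FROM z_locnr_version
            APPENDING CORRESPONDING FIELDS OF TABLE lt_locnr_version
            UP TO 3 ROWS
            WHERE locnr = ls_store-locnr
            ORDER BY deltavers DESCENDING.
        ENDLOOP.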
    Thanks,
    Chris

  • Full Database Export/Import

    I was attempting to export a full database from one server (DB files on the N: drive) and import it into a freshly created new database with the same instance name on a new server (DB files on the D: drive).
    I was told to:
    1) Export the full database as sys using full=y
    2) Pre-create all of the tablespaces on the target database that do not exist on the target database
    3) Import the full database as sys using full=y
    =========================================================
    Here are the steps I had to follow:
    1) Export the full database as sys using full=y
    2) Pre-create all of the tablespaces on the target database
    3) Import the full database as sys using full=y - Result: over 1900 invalid objects and thousands of errors
    4) Run Utlrp.sql - Result: 8 to 10 invalid objects
    5) Manually attempt to compile or rebuild objects. Several grants were required to get most of these to work. - Result: After quite a bit of hand crafting, two objects are still invalid.
    6) After spending almost two hours in a Collaborative Conference with an analyst who is working very hard to sold this, there are still two objects that are invalid.
    Note: From my perspective this is not a straight forward process
    =========================================================
    Questions:
    1) Has anyone successfully moved a database as described without all of these annoying steps?
    2) Is there a document that details the full database export/import?
    Thanks, RJZ

    Hello RJZ,
    There is not sufficient information to reply accurately to your queries.
    1. Could you post the error message that you get when you compile these two objects? Do they compile okay on the old server?
    2. What was the reason you chose exp and imp to move your DB to the new server and different disk? Were you migrating to a different version, or was it because of a DB reorg? It would have been possible to simply copy the DB files to the new location and then rename the datafiles in the mount state. That would be simpler, I believe, unless one of the reasons above applies.
    Did you check the import logs? Were there any errors while importing? It should actually have been pretty straightforward (no reason why you would need documentation for exp/imp). Grants should have come across automatically, unless you used the grants=n option.
    I think that with the changes you made in step 5, you might have changed some stuff somewhere. Please post the error message, and tell us whether you have migrated the DB to a higher version.
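    For reference, the copy-and-rename approach mentioned in point 2 might look like this (a sketch; paths are placeholders, and it assumes the datafiles were copied over while the database was shut down):

        -- With the copied files in place on the new server:
        STARTUP MOUNT;
        ALTER DATABASE RENAME FILE 'N:\ORADATA\MYDB\USERS01.DBF'
                                TO 'D:\ORADATA\MYDB\USERS01.DBF';
        -- ...repeat for each datafile and redo log file...
        ALTER DATABASE OPEN;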
    Regards
    Sudhanshu

  • Problem with database connectivity

    Hi guys,
    I'm having a problem with database connectivity. I'm using the MySQL database and the org.gjt.mm.mysql driver.
    I've kept the org folder in the directory where the Database.java program resides.
    My program is as follows:
    import java.sql.*;
    public class Database {
         Connection con;
         public void connect() {
              System.out.println("In connect.");
         }
         public Connection connectDB(String username, String password, String database) {
              try {
                   Class.forName("org.gjt.mm.mysql.jdbc.Driver");
                   String url = "jdbc:mysql:" + database;
                   con = DriverManager.getConnection(url, username, password);
              } catch (SQLException e) {
                   System.out.println("SQL Exception caught:" + e);
              } catch (Exception e) {
                   System.out.println("Exception e caught:" + e);
              }
              return con;
         }
         public static void main(String args[]) {
              try {
                   Connection c;
                   Database db = new Database();
                   c = db.connectDB("root", "", "//192.168.0.2/squid");
                   System.out.println("Connection created.");
                   c.close();
              } catch (Exception e) {
                   System.out.println("Exception caught:" + e);
              }
         }
    }
    It compiles, but gives the following error when run:
    Exception e caught:java.lang.ClassNotFoundException: org.gjt.mm.mysql.jdbc.Driver
    Connection created.
    Exception caught:java.lang.NullPointerException
    I don't know why the class is not found. Please help me rectify my mistake. Is there any other path to be set?
    Thanks.

    You need to run it with the JDBC driver in the classpath:
    java -classpath .;[mySqlJdbcDriver.jar] Database
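    One more thing worth checking (an observation, not part of the original reply): the MM.MySQL driver class is usually org.gjt.mm.mysql.Driver, without the .jdbc segment, which would also explain the ClassNotFoundException:

        // Assumed driver class name for MM.MySQL:
        Class.forName("org.gjt.mm.mysql.Driver");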

  • VKOA Problem - Going to Dump

    Hello all,
    Can anyone suggest how to overcome the problem with the VKOA dump?
    I found a solution which says the problem is caused by some change in the standard program SAPL089C; some include function is missing. That solution was for version 3.0; can anyone provide the details for the ECC 6.0 version?
    Thanks and regards,
    Venu MG

    Dear Venu,
    I think a better explanation could be obtained in forums like SAP NetWeaver Administrator.
    Alternatively, the BASIS admin is the right person to resolve technical issues like this.
    You may also analyse the error in transaction code ST22.
    Best Regards,
    Amit.
    Note: maybe this OSS note helps you:
    OSS Note 107161 - HP SAPKH30F35 IMPORT_PROPER - SAPL089C L089CT00

  • Problem with MVIEWS in import

    I got an export dump from an Oracle 8.1.7.4 database and imported it into a 10gR2 database (2-node RAC on AIX 5.3, 64-bit, 10.2.0.3). All the MVIEWs got created.
    Then I took a [traditional] export of the new database and created it on another machine (10.2.0.3 non-RAC on AIX 5.3, 64-bit). Here came the issue: some of the MVIEWs got created as plain VIEWs.
    The SNAP$ table got created, and then what was supposed to be an MVIEW got created as a VIEW on the SNAP$ table. This happened with 12 MVIEWs out of a total of 32.
    Any ideas what the reason could be, or has anybody faced something similar?
    If any more information is required, please let me know.
    Thanks
    Amardeep Sidhu

    Could you list the steps you are going through in detail?
    I just tried stacking some images in the import window (from the HD, not a card) and they came into the album as closed stacks containing all the images. That doesn't stop it being a bug in your installation, of course, just that it doesn't reproduce on my machine.
    Ian

    I am new to Spry and was trying to work with the XML Data Set feature. I have an XML file with the schema listed below. I wanted to know if it were possible to only grab the data from this XML file if it matches a certain type? For example, grab data