Exp / imp 64 bit - 32 bit

Hi,
We are having problems importing a dmp file (exported on a 64-bit OS) on a 32-bit OS:
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
Can someone help me with a link describing the steps / parameters of the exp tool so that we can import the resulting dmp file on a 32-bit or 64-bit OS?
Regards,
Imre

Hello,
IMP-00010: not a valid export file, header failed verification
what are the steps / parameters of the exp tool so that we can import the resulting dmp file on a 32-bit or 64-bit OS?
There is no specific step or parameter; Export/Import and Data Pump can both be used across different platforms, whether 32-bit or 64-bit.
The error message points more towards a corrupted dump file.
So make sure that you export and import with the same utility. For instance, don't export with Data Pump and then import with the classic Import utility.
If you send the dump by FTP, check that you are in binary mode, for example:
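A minimal sketch with a command-line FTP client (the host name and file name are just placeholders):
ftp target-host
ftp> binary
ftp> put expdat.dmp
ftp> bye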
Hope this helps.
Best regards,
Jean-Valentin

Similar Messages

  • Exp/Imp between different OS

    Hi,
    Could someone please help me clarify whether exp/imp between different operating systems is possible (e.g. exp on Windows 2000 and then imp on Unix)?
    Thanks

    This reply may be a bit late but I just saw your Q:
    The answer is yes, of course you can exp/imp between different operating systems; just make sure to transfer the dmp file (the export output) in binary mode. It is also worth verifying the file after the transfer, for example:
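    A minimal sanity check, assuming a checksum utility such as md5sum is available on both machines (the file name is a placeholder):
    md5sum expdat.dmp
    Run this on the source and on the target; if the checksums differ, the file was corrupted in transit (typically by an ASCII-mode FTP transfer).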

  • EXP/IMP with NCLOB/BLOB

    Hello all,
    I have to move a schema from one database to another that has been recreated like the first one (same schema name, same rights, same tablespaces, same everything). Both are Oracle Database 11.2.0.2 Enterprise 64-bit on Linux 64-bit. Is there anything I have to take into account, as there are objects with NCLOB and BLOB types? I've tried the export and it went fine without any problem, but will there be any issue when I import it into the second database?
    All this is to move the database server from the physical one it is on now to a virtual one (OVM).
    Thanks for your help!!

    It would probably be better if you used Data Pump (expdp/impdp) instead of the original exp/imp. It supports more of the objects and features that you may have used. A schema-level Data Pump run could look like the sketch below.
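    A rough sketch (the directory object, schema name, and file names are placeholders, not taken from your system):
    expdp system/password SCHEMAS=app_owner DIRECTORY=dpump_dir1 DUMPFILE=app_owner.dmp LOGFILE=app_owner_exp.log
    impdp system/password SCHEMAS=app_owner DIRECTORY=dpump_dir1 DUMPFILE=app_owner.dmp LOGFILE=app_owner_imp.log
    NCLOB and BLOB column data is exported along with the rest of the table data, so no special parameters should be needed for those columns.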
    Dean

  • Adobe Flash Player 32-bit/64-bit ActiveX 12.0.0.38 for IE - failing to install (via SCCM/SCUP)

    Hi, I've been using SCCM/SCUP to update Adobe Flash for some time now, and it's been working great. Much better than previous custom scripting methods. However, the most recent version for IE, 12.0.0.38, isn't installing properly on some machines in my workplace (all Windows 7 SP1 32-bit), and I can't figure out why. I can't see any pattern with the machines that have the error.
    The FlashInstall.log looks fine, no errors at all, and "Programs & Features" shows that version installed. However, SCCM keeps trying to reinstall and gives this error message: Installation job encountered some failures. Error = 0x80240022. I've read up on this message from previous Flash versions, but those cases seem to relate to different issues.
    The few machines with Firefox have all successfully installed that version, which, for the first time I can remember, is a slightly different version number: Adobe Flash Player 32-bit/64-bit Plugin 12.0.0.43.
    Has anyone else seen this issue with 12.0.0.38 for IE, and does anyone have any ideas? Thanks

    This may be related to issues with the detection logic, which Adobe is investigating.
    See http://forums.adobe.com/message/6014720#6014720

  • X301 and Vodafone and AT&T Wireless WAN (HSPA) Driver for Windows 7 (32-bit, 64-bit)

    Ok, I (Lenovo) did it again.
    System Update showed me this update: "Vodafone and AT&T Wireless WAN (HSPA) Driver for Windows 7 (32-bit, 64-bit)".
    As I'm using Vodafone and I had an AT&T driver, I decided to update.
    The result is that now the GPS doesn't work.
    Yes, I'm in a place where the GPS signal is present (my house; with the previous driver I was able to get 6/7 satellites).
    What happens? Lenovo TV GPS sits there trying to get a fix on the satellites, but nothing happens. I can't recover the system to a previous state, as I updated Flash Player too, and Windows only shows me the possibility to roll back that update.
    So, apart from advising everyone not to update, I kindly ask:
    - where can I find the previous driver? I searched the web without success...
    - why does Lenovo deploy drivers which can cause problems (I just had one updating the BT driver last week! .msi installer problem, error 1935...)?
    Thanks to everyone
    600, R52, T61p, X301

    Ok, I found the driver 7uw706ww.zip on the net, which is version 6.1.4.2, while the updated version is 6.1.10.5 (7UW708WW).
    I tried the GPS again, and it wasn't working.
    Anyway, before downgrading, I tried to reinstall the new driver, downloading and installing the .exe instead of going through System Update.
    Result: now the GPS is working (same place, same conditions, just 5 minutes later)!
    Don't know where the difference is...
    600, R52, T61p, X301

  • Exp/imp related problem

    Hi all,
    I have a problem with Oracle 9.2.0.8 on RHEL 4. I want to use exp/imp to update one table from one server to another.
    Suppose I have a table abc on server A with around 1 million records, and server B has the same number of records. Some of the records have been changed on server A, and I want to import those changed records into server B. Is there any parameter in IMP for this?
    Please suggest.

    Hi,
    Kamran Agayev A. wrote:
    user00726 wrote:
    but how would I know which table has been updated or modified?
    Using the MERGE statement, you can merge the two tables. It will update the changed rows and insert the missing rows into the second table, making it the same as the first table.
    SQL> create table azar(pid number,sales number,status varchar2(20));
    Table created.
    SQL> create table azar01(pid number,sales number,status varchar2(20));
    Table created.
    SQL> insert into azar01 values(1,12,'CURR');
    1 row created.
    SQL> insert into azar01 values(2,13,'NEW');
    1 row created.
    SQL> insert into azar01 values(3,15,'CURR');
    1 row created.
    SQL> insert into azar values(2,24,'CURR');
    1 row created.
    SQL> insert into azar values(3,0,'OBS');
    1 row created.
    SQL> insert into azar values(4,42,'CURR');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select * from azar01;
           PID      SALES STATUS
    ---------- ---------- --------------------
             1         12 CURR
             2         13 NEW
             3         15 CURR
    SQL> select * from azar;
           PID      SALES STATUS
    ---------- ---------- --------------------
             2         24 CURR
             3          0 OBS
             4         42 CURR
    SQL> merge into azar01 a using azar b on (a.pid=b.pid) when matched
    2 then update set a.sales=a.sales + b.sales, a.status=b.status
    3 delete where a.status='OBS'
    4 when not matched
    5 then insert values(b.pid,b.sales,'NEW');
    3 rows merged.
    SQL> select * from azar01;
           PID      SALES STATUS
    ---------- ---------- --------------------
             1         12 CURR
             2         37 CURR
             4         42 NEW
    Hello Sir, is this correct?
    Regards
    S.Azar
    DBA

  • Exp/Imp alternatives for large amounts of data (30GB)

    Hi,
    I've come into a new role where various test databases are to be 'refreshed' each night with cleansed copies of production data. They have been using the imp/exp utilities with 10g R2. The export process is OK, but what's killing us is the time it takes to transfer, unzip, and import 32GB .dmp files. I'm looking for suggestions on what we can do to reduce these times. Currently the import takes 4 to 5 hours.
    I haven't used Data Pump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities. Are 'Transportable Tablespaces' the next logical solution? I've been reading up on them and could start prototyping/testing the process next week. What else is in Oracle's toolbox that I should be considering?
    Thanks
    brian

    Hi,
    I haven't used Data Pump, but I've heard it doesn't offer much benefit when it comes to saving time over the old imp/exp utilities
    Data Pump will be faster for a couple of reasons. It uses the direct path to unload the data, and it supports parallel processes, so while one process is exporting metadata, the other processes can be exporting data. In 11g you can also compress the dump files as you export them (both data and metadata compression are available in 11g; I think metadata compression is available in 10.2). That would remove your zip step. For example:
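    A rough sketch, assuming 11g and a directory object named dpump_dir1 (the names and the PARALLEL degree are placeholders; on 10.2 you would have to drop COMPRESSION=ALL):
    expdp system/password FULL=Y DIRECTORY=dpump_dir1 DUMPFILE=full_%U.dmp PARALLEL=4 COMPRESSION=ALL LOGFILE=full_exp.log
    impdp system/password FULL=Y DIRECTORY=dpump_dir1 DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=full_imp.log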
    As far as transportable tablespaces go, yes, this is an option. There are some requirements, but if it works for you, all you will be exporting is the metadata and no data. The data is copied from the source to the target by way of the datafiles. One of the biggest requirements is that the tablespaces need to be read only while the export job is running. This is true for both exp/imp and expdp/impdp. A rough outline is sketched below.
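    A rough sketch of the transportable tablespace flow with Data Pump, assuming a tablespace named USERS_DATA and a directory object dpump_dir1 (both names, and the datafile path, are placeholders):
    SQL> alter tablespace users_data read only;
    expdp system/password DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=users_data LOGFILE=tts_exp.log
    Copy the tablespace's datafiles to the target server, then on the target:
    impdp system/password DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u02/oradata/TEST/users_data01.dbf' LOGFILE=tts_imp.log
    SQL> alter tablespace users_data read write;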

  • Will CC&B 2.3.1 work in both 32 bits, 64 bits windows7

    Hi,
    Will CC&B 2.3.1 work on both 32-bit and 64-bit Windows 7? Also, which version of Internet Explorer is supported for CC&B 2.3.1 on Windows 7?

    Windows 7 is supported only in a tier 1 role (browser-based client) as far as I am aware. It's not certified for use with the web app server, or at least not mentioned in the official CC&B 2.3.1 documentation. The only supported configuration is Windows 7/Internet Explorer 8.
    If you need more flexibility but don't have (or don't want to use) Windows 2008 Server, you can run Windows XP SP3 or higher, then you get a bit more variety on the browser side of things. It will support IE 6 SP3, IE 7, and IE 8.
    I hope this helps you at least a little! In my personal opinion, Windows 7 is still somewhat too new to be used comfortably as a "server" for CC&B 2.3.1. This is just my opinion of course, and you know what they say about opinions :)

  • Illustrator CC (32 bit/64 bit) and InDesign CC crashing on windows 8.1

    I have recently installed the trial versions of Illustrator CC (32 bit/64 bit) and InDesign CC on the Windows 8.1 operating system. When I run the applications they both crash. I have updated them but I am still facing the same problem.
    I get the “what’s new” screen in Illustrator and then it crashes. After that, I get the “trial screen” with the number of days left to use. With InDesign, it starts, I get the “trial screen” with the number of days left to use, I click continue, and it crashes.
    I have already lost 6 days of the trial. Please advise on how to fix this problem.
    Thanks,
    Tariq

    Tariq,
    I have alerted someone that may alert the right someone, who will hopefully turn up here soon.
    You may also read this similar thread:
    http://forums.adobe.com/message/6127034#6127034

  • Photoshop Extended CS 4: 32 bit & 64 bit has stopped working with an error 131:4

    Dell XPS 720, quad processor, 8GB memory, running Windows 7.
    Both Photoshop Extended CS4 32-bit and 64-bit have stopped working with error 131:4, which stated that rebooting would solve the problem. After several attempts at rebooting the computer, I received the same error message.
    I was in the process of updating all my Adobe software in order to tackle an Adobe Media Encoder problem: when I sent a stream from the timeline to be converted to an AVI file from Premiere Pro CS4 to Encoder, the window appeared and hung.
    I want to treat the Photoshop problem first. How do I proceed?

    Unable to Re-install Adobe Photoshop cs4
    A few years back my brother bought me a Dell Studio XPS 8100 running Windows 7 pro.
    I have 8 GB of RAM on a system that supports both 32-bit and x64; right now it is enough for my needs and I can easily spare a gig or two for this program.
    I cannot install this program again. I have used iYogi and Adobe tech support, and I am wondering if it makes any difference whether I install it on an x64 or a 32-bit system?
    Why can't I re-install the program? I used it a lot and I do miss it.

  • Export / import  exp / imp commands Oracle 10gXE on Ubuntu

    I have Oracle 10g XE installed on Linux 2.6.32-28-generic #55-Ubuntu, and I need some help on how to export / import the database with the exp / imp commands. The commands seem to be installed in the */usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin* directory, but I cannot execute them.
    The error message I got:
    No command 'exp' found, did you mean:
    Command 'xep' from package 'pvm-examples' (universe)
    Command 'ex' from package 'vim' (main)
    Command 'ex' from package 'nvi' (universe)
    Command 'ex' from package 'vim-nox' (universe)
    Command 'ex' from package 'vim-gnome' (main)
    Command 'ex' from package 'vim-tiny' (main)
    Command 'ex' from package 'vim-gtk' (universe)
    Command 'axp' from package 'axp' (universe)
    Command 'expr' from package 'coreutils' (main)
    Command 'expn' from package 'sendmail-base' (universe)
    Command 'epp' from package 'e16' (universe)
    exp: command not found
    Is there something I have to do?

    Hi,
    You have not set the environment variables correctly.
    http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm#BABDGCHH
    And of course that script has a small hiccup, so see
    http://ubuntuforums.org/showpost.php?p=7838671&postcount=4
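    As a minimal sketch, assuming the default XE paths (adjust them if your installation differs), you can source the environment script that ships with XE and then run exp; the schema and file names below are just placeholders:
    . /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin/oracle_env.sh
    exp system/password owner=myschema file=/tmp/myschema.dmp log=/tmp/myschema_exp.log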
    Regards,
    Jari

  • Exp/Imp content Area Portlet..?

    How do I exp/imp a "content area folder published as a portlet" which resides in "WWSBR_SITEBUILDER_PROVIDER"?
    Anybody from Oracle..?
    Thanks.
    Rakesh

    I assume you are using the 3.0.9 version of Portal. It is possible to export an entire content area or a page, but not the more granular items; by granular I mean just the folder in this context. If you wish to export/import the entire content area/page, you can run the pageexp/pageimp/contexp/contimp scripts which reside in the WWU directory.

  • Exp/Imp In Oracle 10g client

    Hi All,
    I want to take an export of one schema from the Oracle 10g client. I am new to Oracle 10g.
    In Oracle 9i, we could use the exp/imp commands for export and import from the client itself.
    I have heard that in Oracle 10g, expdp/impdp can be used on the server only. Is there any possibility to run exp/imp from the client as well?
    Please help me...
    Cheers,
    Moorthy.GS

    To add to this, here is how you can run expdp from an Oracle client, using the NETWORK_LINK parameter:
    NETWORK_LINK
    Default: none
    Purpose
    Enables an export from a (source) database identified by a valid database link. The data from the source database instance is written to a dump file set on the connected database instance.
    Syntax and Description
    NETWORK_LINK=source_database_link
    The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by the source_database_link, retrieves data from it, and writes the data to a dump file set back on the connected system.
    The source_database_link provided must be the name of a database link to an available database. If the database on that instance does not already have a database link, you or your DBA must create one. For more information about the CREATE DATABASE LINK statement, see Oracle Database SQL Reference.
    If the source database is read-only, then the user on the source database must have a locally managed tablespace assigned as the default temporary tablespace. Otherwise, the job will fail. For further details about this, see the information about creating locally managed temporary tablespaces in the Oracle Database Administrator's Guide.
    Restrictions
    When the NETWORK_LINK parameter is used in conjunction with the TABLES parameter, only whole tables can be exported (not partitions of tables).
    The only types of database links supported by Data Pump Export are: public, fixed-user, and connected-user. Current-user database links are not supported.
    Example
    The following is an example of using the NETWORK_LINK parameter. The source_database_link would be replaced with the name of a valid database link that must already exist.
    expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_database_link DUMPFILE=network_export.dmp LOGFILE=network_export.log

  • Database upgrade from 8i to 10g using exp/imp

    Dear Friends,
    Please provide the steps to upgrade from 8i to 10g using exp/imp.
    Thanks,
    Rathinavel

    Hi;
    Please also see the cold backup option:
    How to migrate from 8i to 10g to new server using cold backup [ID 742108.1]
    Also see:
    Upgrading from 8i to 10g with import utility
    http://searchoracle.techtarget.com/answer/Upgrading-from-8i-to-10g-with-import-utility
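    As a rough sketch of the exp/imp route (user names and file names are placeholders; the usual rule is to run exp with the source 8i database's export utility and imp with the target 10g database's import utility):
    exp system/password FULL=Y FILE=full8i.dmp LOG=full8i_exp.log
    imp system/password FULL=Y FILE=full8i.dmp LOG=full8i_imp.log IGNORE=Y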
    Regards,
    Helios

  • Full Database Exp & Imp

    Hi,
    I am trying to Exp & Imp a full database. I am working on Oracle 9i & 10g on Solaris 9. I am not using Data Pump. Can anyone please help me with the following:
    1. I am performing the full export using SYSTEM user.
    1.a Does a full export include (or back up) the DATA DICTIONARY of the database?
    1.b Does a Full export include the backup of SYS and SYSTEM objects?
    I am using the following command to export
    exp system/system@testdb file=$HOME/testdbfullexp.dmp full=y statistics=none
    I have tried importing the full export into another database and I did see that SYS and SYSTEM objects were also being imported (I got some errors regarding constraints and inconsistencies).
    I would like to ask what the ideal steps are to copy a database from DB1 to DB2 using exp and imp.
    Any information will be of a great help
    Thanks,
    Harris.

    1 a) No, as the data dictionary will be automagically recreated by implicit SQL.
    This means any non-dictionary objects under SYS will be lost.
    1 b) As above. SYSTEM, however, is a normal user.
    Any %SYS user will NOT be exported (CTXSYS, MDSYS, etc.).
    On import of SYSTEM there will always be errors, as SYSTEM is not empty after initial database creation. A rough outline of the copy is sketched below.
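    A rough sketch of the DB1-to-DB2 copy with the classic utilities, assuming DB2 has already been created and its user tablespaces pre-created (connect strings and file names are placeholders):
    exp system/password@db1 FULL=Y FILE=db1full.dmp LOG=db1full_exp.log STATISTICS=NONE
    imp system/password@db2 FULL=Y FILE=db1full.dmp LOG=db1full_imp.log IGNORE=Y
    IGNORE=Y tells imp to ignore "object already exists" errors and keep going; as noted above, the errors against SYS/SYSTEM objects are expected and can usually be ignored.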
    Sybrand Bakker
    Senior Oracle DBA

  • Full database exp/imp  between RAC  and single database

    Hi Experts,
    We have a RAC database, Oracle 10gR2 with 4 nodes, on Linux. I am trying to duplicate the RAC database into a single-instance Windows database.
    Both databases are the same version. During the import, I needed to create 4 undo tablespaces to keep imp running.
    How can I keep only one undo tablespace in the single-instance database?
    Does anyone have any experience of exp/imp-ing a RAC database into a single-instance database to share with me?
    Thanks
    Jim
    Edited by: user589812 on Nov 13, 2009 10:35 AM

    JIm,
    I also want to know whether we can add exclude=tablespace on the impdp command for a full database exp/imp?
    You can't use exclude=tablespace with exp/imp. It is for Data Pump expdp/impdp only.
    I am very interested in your recommendation.
    But for a full database impdp, how do I exclude a table during the full database import? May I have an example for this case?
    I used expdp for a full database export, but I got an error in the expdp log: ORA-31679: Table data object "SALE"."TOAD_PLAN_TABLE" has long columns, and longs can not be loaded/unloaded using a network link
    Having long columns in a table means that it can't be exported/imported over a network link. To exclude the table, you can use an exclude expression:
    expdp user/password exclude=TABLE:"= 'SALES'" ...
    This will exclude all tables named sales. If you have that table in schema scott and then in schema blake, it will exclude both of them. The error that you are getting is not a fatal error, but that table will not be exported/imported.
    The final message was:
    Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
    F:\ORACLEBACKUP\SALEFULL091113.DMP
    Job "SYSTEM"."SYS_EXPORT_FULL_01" completed with 1 error(s) at 16:50:26Yes, the fact that it did not export one table does not make the job fail, it will continue on exporting all other objects.
    I dropped the database that generated the expdp dump file,
    recreated a blank database, and then ran impdp again.
    But I got lots of errors such as:
    ORA-39151: Table "SYSMAN"."MGMT_ARU_OUI_COMPONENTS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ORA-39151: Table "SYSMAN"."MGMT_BUG_ADVISORY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    ...
    ORA-31684: Object type TYPE_BODY:"SYSMAN"."MGMT_THRESHOLD" already exists
    ORA-39111: Dependent object type TRIGGER:"SYSMAN"."SEV_ANNOTATION_INSERT_TR" skipped, base object type VIEW:"SYSMAN"."MGMT_SEVERITY_ANNOTATION" already exists
    and the last line was:
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 2581 error(s) at 11:54:57
    Yes, even though you think you have an empty database, if you have installed any apps or anything, they may create tables that also exist in your dumpfile. If you know that you want the tables from the dumpfile and not the existing ones in the database, then you can use this on the impdp command:
    impdp user/password table_exists_action=replace ...
    If a table that is being imported already exists, Data Pump will detect this, drop the table, then re-create it, and then all of the dependent objects will be created. If you don't, the table and all of its dependent objects will be skipped (which is the default).
    There are 4 options with table_exists_action
    replace - I described above
    skip - default, means skip the table and dependent objects like indexes, index statistics, table statistics, etc
    append - keep the existing table and append the data to it, but skip dependent objects
    truncate - truncate the existing table and add the data from the dumpfile, but skip dependent objects.
    Hope this helps.
    Dean
