Export to DMP
Hi everyone,
I am trying to export the database, but it won't work.
I tried:
exp system/manager file=Wonder.dmp full=yes
I get:
EXP-00056: ORACLE error 1017 encountered
ORA-01017: invalid username/password; logon denied
Username:
I can log on with SQL*Plus using:
sqlplus system/manager@XE as sysdba
and it works.
Can somebody help me?
thx
Hello Matt,
I'm not sure you've hit the right forum, but this problem seems to be simple: it looks like you haven't set your environment to provide the SID for your XE instance. Try giving your export the SID of your database, just as you do in your SQL*Plus call:
exp system/manager@XE file=Wonder.dmp full=yes
For other database-related problems, please post in the forum {forum:id=61}.
Thanks,
Udo
P.S.: I just found out there is a new specialized forum for issues like yours: {forum:id=732}
Edited by: Udo on 17.11.2010 18:04
Similar Messages
-
Your database size is 100 GB; when you take an export, the dump size is 90 GB, but your mount point has only 70 GB
Your database size is 100 GB. When you take an export, the dump file size is 90 GB, but your mount point has only 70 GB of free space and you have no other mount point. How do you take the export?
Answers at your duplicated thread --> "Some inter view Questions Please give prefect answer help me"
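The classic answer is to compress through a named pipe with the legacy exp utility, so the uncompressed dump never lands on disk. A minimal sketch of just the mechanism; paths are placeholders, and printf stands in for exp here (in a real run, exp would be pointed at the pipe with file=/tmp/exp_pipe):

```shell
# Create a named pipe, compress whatever is written into it,
# and have the export write to the pipe instead of a real file.
set -e
PIPE=/tmp/exp_pipe
rm -f "$PIPE" /tmp/full.dmp.gz
mkfifo "$PIPE"                           # equivalent to: mknod /tmp/exp_pipe p
gzip -c < "$PIPE" > /tmp/full.dmp.gz &   # reader: compress data as it arrives
printf 'pretend dump data\n' > "$PIPE"   # writer: exp ... file=$PIPE in real use
wait                                     # let gzip drain the pipe and exit
rm -f "$PIPE"
```

Only the compressed stream ever hits disk, so a 90 GB logical dump can fit a 70 GB mount point if it compresses well enough; the actual ratio depends entirely on the data.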
-
Time required to strip a field and export to .dmp?
Hello,
I am not an Oracle user but need some help with an Oracle-related question. I work for a newspaper's data department, and I am seeking to get records of inspections of day care operations from a state agency. The agency contracts with a third-party vendor.
The database is about 400,000 records and probably eight or nine tables in Oracle 10g. The vendor needs to strip out one field (social security numbers) and export the rest of the database to a .dmp file. The vendor says that this will take 23 hours of work at $140 an hour, for a total of $3,200.
I work with SQL Server and MySQL, and this seems like an outrageous amount of time and money to strip one field and export a file. Could some Oracle users give me their opinion?
Thank you.
Another thing to consider: if the schema design uses social security numbers as a key (generally not a good idea, of course, but not uncommon), then generating an SSN-free dump would require replacing every SSN with some other identifier and making sure that all the referential integrity was still in place, which could require a few hours of development time and make the QA process substantially more complicated.
If there are free-form text fields, the vendor may also need to scrub those fields, which could require a fair amount of manual effort to determine whether strings of digits are SSNs or something else.
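For the stripping itself, one common approach (table and column names below are purely hypothetical) is to copy the table minus the sensitive column and export the copy:

```sql
-- Sketch only: names are made up. Copy every column except the SSN,
-- then export the copy instead of the original table.
CREATE TABLE inspections_export AS
SELECT inspection_id, facility_id, inspection_date, outcome  -- no ssn column
FROM   inspections;
```

Afterwards something like exp user/pass tables=(inspections_export) file=inspections.dmp would dump the cleaned copy; the caveats above about SSNs used as keys or buried in free text still apply.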
Justin -
Generate SQL Insert Statement using EXPORTED dmp file
Dear DBAs,
Kindly inform me whether it is possible to generate a SQL script file from the exported .dmp file.
I am looking for some option in the import utility that generates the script without importing any rows into the database.
Secondly, is there any other way to check the validity of the exported .dmp file's data without importing it back into the database?
Regards,
Asif
Hi Satish,
Yes, you are correct: the INSERT INTO statements won't be available using either the INDEXFILE or the SHOW parameter. I had given those two options as per his first post, where he had mentioned:
>
it possible to generate SQL script file using the exported file dmp, I am looking for some option in import utility that generates script without import any rows in the database
>
So, I thought he needed to see what objects are in the .dmp.
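For reference, both options can be put in a parfile for imp; a rough sketch (file names are placeholders):

```
# imp parfile: list a dump's DDL without importing any rows
userid=scott/tiger
file=export.dmp
full=y
show=y
log=contents.log
# alternatively: indexfile=contents.sql  (writes CREATE statements to a script)
```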
Apologies for my 2nd post.
Anand -
Exporting .dmp file from 9.1.2 to 9.0.1.1
Is it possible for me to load a .dmp file generated by 9.1.2 into Oracle 9.0.1.1?
Dear Carlo & Chris, may I ask you to clarify the comment below, which is mentioned in the release notes of CUCM 10.5?
I don't understand why Cisco has mentioned this if it's not required. I would appreciate any input.
Preupgrade COP File
If you are upgrading to Cisco Unified Communications Manager Release 10.5(1), or later, from a release earlier than Cisco Unified Communications Manager Release 10.0(1), you must download and install ciscocm.version3-keys.cop.sgn on every node in the cluster. This Cisco Options Package (COP) file has the RSA keys that are required to validate the upgrade. Missing RSA-3 keys will result in status errors in the Software Installation/Upgrade window of the Cisco Unified Operating System Administration interface.
Suresh -
What is the best gzip ratio when using a pipe to export?
Hello,
I am trying to use Data Pump to export a core application schema (10g DB, 10.2.0.4).
I used the 'estimate' parameter of expdp before the real export, and the future dump file size
will be 235 GB.
The only issue is that my largest file system is 34 GB, so it seems I have to use a
pipe to zip while exporting...
the action plan is like below:
1. make a Unix pipe
$ mknod /faj29/appl/oracle/exports/scott1 p
2. direct the pipe to .dmp
nohup gzip -5c < scott1 > /faj29/appl/oracle/exports/ODS.dmp.gz & <=== question is here
3. create a parfile:
userid=system/password@dbname
SCHEMAS=xxx
DUMPFILE=scott1
LOGFILE=dpump_emrgods:exp_xxx.log
COMPRESSION=NONE
CONTENT=ALL
ATTACH=xxx.EXPORT.job
nohup expdp PARFILE=exp_xxx.par &
My question, based on the information I provided (34 GB filesystem size vs. 235 GB dump file size):
how much compression should I use?
gzip -5c or gzip -9c?
It seems like -9c is very time-consuming, which is why I don't like to use it, but I don't know
the compression ratio; does anyone know?
Or do we have a better way to do this task?
(It seems I cannot use a parallel export through a pipe; if I can, how?)
thanks a lot
Jerry
Edited by: jerrygreat on May 8, 2009 9:00 AM
Hello,
I was wrong; the pipe won't work with expdp at all.
It is unlike the process with the older exp tool, which you could tell to write to a named pipe so that the data written to the pipe could be compressed, all in one step.
So can I use the 'compression' parameter?
It seems to compress only metadata, not the dump file data.
(But I want to compress my expected 235 GB dump file to fit one 34 GB filesystem.)
Any idea?
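If a pipe is off the table, one expdp-side workaround (directory object names and sizes below are placeholders) is to split the dump across several file systems, since DUMPFILE accepts multiple entries and FILESIZE caps each piece:

```
# each DIR_n is a directory object created on a different mount point
SCHEMAS=xxx
DUMPFILE=dir1:ods_%U.dmp,dir2:ods_%U.dmp,dir3:ods_%U.dmp
FILESIZE=30G
LOGFILE=dir1:exp_xxx.log
```

This still writes roughly 235 GB uncompressed in total, so it only helps if the combined free space across the mount points is sufficient.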
regards,
Jerry -
How to append date in YYYYMMDD format in .par file for export
Hi,
Database Version: 10.2.0.4
OS: AIX
I have an export script which reads .par file and executes "exp" to export schema.
I would like to add date in "YYYYMMDD" format for the dump file like this
owner=scott file=/exports/scott_${`date '+%Y%m%d'`}.dmp feedback=10000
I know above statement will not help, but I am giving it as example on what I want to achieve.
I want the file name as =/exports/scott_20120712.dmp
Thanks
Sarayu
user13312943 wrote:
Hi,
Database Version: 10.2.0.4
OS: AIX
I have an export script which reads .par file and executes "exp" to export schema.
I would like to add date in "YYYYMMDD" format for the dump file like this
owner=scott file=/exports/scott_${`date '+%Y%m%d'`}.dmp feedback=10000
bcm@bcm-laptop:~$ export YYYYMMDD=`date '+%Y%m%d'`
bcm@bcm-laptop:~$ echo $YYYYMMDD
20120712
bcm@bcm-laptop:~$
owner=scott file=/exports/scott_${YYYYMMDD}.dmp feedback=10000 -
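To make that concrete, the parfile can simply be regenerated on each run with the date baked into the file name; a sketch (paths are placeholders):

```shell
# Write today's date into the dump file name, then pass the parfile to exp.
STAMP=$(date '+%Y%m%d')
PARFILE=/tmp/scott_exp.par
cat > "$PARFILE" <<EOF
owner=scott
file=/exports/scott_${STAMP}.dmp
feedback=10000
EOF
cat "$PARFILE"
# a real run would then follow with: exp system/password parfile=$PARFILE
```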
Database Export problem (11.2.0.1)
hi all,
I am trying to export the database with the following command:
exp username/password@DEV FULL=Y File=C:\Export\dev.dmp log=C:\Export\dev.log
The user I am exporting has all the rights and all the objects are valid, yet the export still hangs on the system procedural objects.
However, another user can do the export with the same command in the same database.
Please guide..
thanks
Can you post the screen output you see after you issue the command?
Is there any error message in the alert log, or has any trace file been generated?
Check the status of the running job:
select owner_name,attached_sessions,job_name,state from dba_datapump_jobs;
and check your job details there.
-Regards,
Saha -
Hello,
I have to export three tables out of my database with a query. I started the export with the following command (on a Windows testing machine):
exp.exe userid=ccq/xxx@KDG file=C:\download\export\NEW.dmp tables=(qem_message,qem_messagestatlog,qw_text) query="where qem_message.messagebody > 739409 AND qem_message.pkey = qem_messagestatlog.pkey AND qem_message.messagebody = qw_text.qwkey" log=C:\download\export\NEW.log
I have got the following error:
. . exporting table QEM_MESSAGE
EXP-00056: ORACLE error 904 encountered
ORA-00904: invalid identifier
I think the problem is the query statement. All the column names are spelled correctly. Is there an error in the statement, or is it impossible to execute such a statement in an export?
Hi,
you have to export one table at a time with the QUERY parameter
consider
SQL> host type d:\query.txt
query="where emp.deptno = dept.deptno
and emp.sal > 1000
or dept.deptno = 40"
SQL> host exp scott/tiger file=d:\tt.dmp tables=(EMP,DEPT) PARFILE=d:\query.txt
Export: Release 10.1.0.2.0 - Production on Thu Feb 1 17:11:11 2007
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
tion
With the Partitioning, OLAP and Data Mining options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
. . exporting table EMP
EXP-00056: ORACLE error 904 encountered
ORA-00904: "DEPT"."DEPTNO": invalid identifier
. . exporting table DEPT
EXP-00056: ORACLE error 904 encountered
ORA-00904: "EMP"."SAL": invalid identifier
Export terminated successfully with warnings.
SQL>
regards
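In other words, each exp run's QUERY may only reference the columns of the table being exported, so a cross-table filter has to be split into one export per table. A sketch with hypothetical predicates:

```
# emp.par -- predicate uses only EMP's own columns
tables=(EMP)
query="where sal > 1000"
file=d:\emp.dmp

# dept.par -- predicate uses only DEPT's own columns
tables=(DEPT)
query="where deptno = 40"
file=d:\dept.dmp
```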
Taj -
Export using a QUERY parameter
I am trying to export a subset of data using the following parms:
FILE=C:\EXPORT\ADM_ACCESS_LOG.DMP
GRANTS=Y
INDEXES=Y
DIRECT=N
ROWS=Y
CONSISTENT=N
TRIGGERS=Y
CONSTRAINTS=Y
FEEDBACK=1000
TABLES=(THRESHER.ADM_ACCESS_LOG)
QUERY=\"WHERE LOGIN_DT='31-JUL-2005'\"
The table exists as does the column, however, I get the following error:
LRM-00101: unknown parameter name 'LOGIN_DT'
I have everything spelled correctly and I am logging into export as thresher.
What am I doing wrong?
Thanks
Use the QUERY parameter like this:
QUERY="WHERE to_char(dte,'mm/dd/yyyy')='07/31/2005'"
or
QUERY="WHERE to_char(dte,'DD-MON-YYYY')='31-JUL-2005'"
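LRM-00101 usually means the QUERY value was split at the space before exp ever parsed it. Moving the whole option into a parfile, where the quotes need no backslash escaping, is the usual fix; a sketch (an explicit TO_DATE is also safer than relying on implicit date conversion):

```
# export.par -- inside a parfile the quotes need no escaping
tables=(THRESHER.ADM_ACCESS_LOG)
query="WHERE LOGIN_DT = TO_DATE('31-JUL-2005','DD-MON-YYYY')"
file=C:\EXPORT\ADM_ACCESS_LOG.DMP
```

Then run it as: exp thresher/password parfile=export.par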
--thedba -
Hi
I exported a user on Solaris (Oracle 9) and want to import it on Windows (Oracle 9). The data imports, but the triggers generate an error.
What's wrong ?
Toni
Export is done with
exp userid=md01fr2/avs@sun owner=(md01fr2) direct=n statistics=compute grants=y file=d:\temp\export\fr.dmp
Import is done with
Imp USERID=fr/avs FILE=d:\temp\import\fr.dmp FROMUSER=(md01fr2) TOUSER=(fr) LOG=d:\temp\import\IMP.log COMMIT=Y
IMP-00017: following statement failed with ORACLE error 600:
"CREATE TRIGGER "FR".Troncon_rue_hb before insert or update on Troncon_rue "
"for each row declare maxVersion number; executeTrigger number; begin s"
"elect exec into executeTrigger from tb_history_trigger; if executeTrigger "
"!= 0 then if inserting then :new.f_version := 1; elsif updating then se"
"lect nvl(max(version), 0) into maxVersion from Troncon_rue_h where fid = :n"
"ew.fid; maxVersion := maxVersion + 1; if :old.fid != :new.fid then raise"
"_application_error(-20103, 'Fatal error: Fid can not be changed!'); end if"
"; :new.f_version := maxVersion + 1; end if; end if; end; "
IMP-00003: ORACLE error 600 encountered
ORA-00600: internal error code, arguments: [4814], [5], [0], [0], [], [], [], []
This is likely a DRL issue. Verify DRL is configured correctly and a valid PLM4P user is set up in the setup assistant. In addition, make sure you added the new app in IIS for DRLService (this is a doc bug we are correcting; we failed to include it in the 611 guide). Verify you can attach and then open an attachment on a material spec.
-
Total number of records of partition tables exported using datapump
Hi All,
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
OS: RHEL
I exported a table with partitions using Data Pump, and I would like to verify the total number of records exported across all these partitions. The export has a query on it: WHERE ITEMDATE < TO_DATE('1-JAN-2010'). I need to compare it with the exact number of records in the actual table to check that they match.
Below is the log file of the exported table. It does not show the total number of rows exported, only the count per partition.
Starting "SYS"."SYS_EXPORT_TABLE_05": '/******** AS SYSDBA' dumpfile=data_pump_dir:GSDBA_APPROVED_TL.dmp nologfile=y tables=GSDBA.APPROVED_TL query=GSDBA.APPROVED_TL:"
WHERE ITEMDATE< TO_DATE(\'1-JAN-2010\'\)"
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 517.6 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q3" 35.02 MB 1361311 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q4" 33.23 MB 1292051 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q4" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q1" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q3" 30.53 MB 1186974 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q3" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q1" 30.44 MB 1183811 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q2" 30.29 MB 1177468 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q4" 30.09 MB 1170470 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q2" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q2" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q1" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q3" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q4" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q1" 5.875 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q3" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q4" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q1" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q2" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q3" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q4" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q1" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q2" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q2" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q3" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q4" 0 KB 0 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_MAXVALUE" 0 KB 0 rows
Master table "SYS"."SYS_EXPORT_TABLE_05" successfully loaded/unloaded
Dump file set for SYS.SYS_EXPORT_TABLE_05 is:
/u01/export/GSDBA_APPROVED_TL.dmp
Job "SYS"."SYS_EXPORT_TABLE_05" successfully completed at 12:00:36
Edited by: 831134 on Jan 25, 2012 6:42 AM
Edited by: 831134 on Jan 25, 2012 6:43 AM
I assume you want this so you can run a script to check the count? If not, and this is being done manually, then just add up the individual rows. I'm not very good at writing scripts, but I would think that someone here could come up with a script that sums the row counts for the table's partitions from your log file. This is not something that Data Pump writes to the log file.
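Summing the per-partition counts from such a log is a one-liner; a sketch, with a heredoc standing in for the real log file (the real path would differ):

```shell
# Total the per-partition row counts from a Data Pump log.
LOG=/tmp/expdp_sample.log
cat > "$LOG" <<'EOF'
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q3" 35.02 MB 1361311 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q4" 33.23 MB 1292051 rows
. . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q4" 5.875 KB 0 rows
EOF
# on each ". . exported" line, the row count is the second-to-last field
awk '/\. \. exported/ { total += $(NF-1) } END { print total }' "$LOG"
```

For the sample above this prints 2653362 (1361311 + 1292051 + 0).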
Dean -
Hi there,
I want to know if there is any way to schedule an automatic export for a specific user, and if there is, how can I do this?
Please be specific.
Thanks in advance.
Hi.
If you have a folder to contain all files that are required :
d:\oracle\export
Make a parfile, d:\oracle\export\exp.par, that contains all the input parameters for exp.exe.
Example:
USERID = SYSTEM/****
BUFFER=5000000
FILE=d:\oracle\export\exp_full.dmp
LOG=d:\oracle\export\exp_full.log
CONSISTENT=Y
STATISTICS=none
FULL=Y
Create a bat file d:\oracle\export\exp_full.bat:
set ORACLE_HOME="ORACLE_HOME directory"
set ORACLE_SID="SID name"
%ORACLE_HOME%\bin\exp parfile=d:\oracle\export\exp.par
Open scheduler from your control panel and create a job where the program to run is this bat file.
If you only want particular schemas in the export, replace the FULL=Y parameter
with OWNER=schema_owner1,schema_owner2,...,schema_ownerN.
Hope this helps.
PS: Not tested; ad hoc.
rgds
Kjell -
Version incompatible for export in 10g Oracle DB
Hi,
I have an Oracle DB, version 10.2.0.5.0, on 64-bit Windows.
But when I try to generate the export file using the standard exp utility:
exp MTT2USER/mtt2user@MTT162 file=C:/export/exportMTOfull1.dmp log=C:/export/exportfullMTO1.log full=y rows=y
it gives Oracle error EXP-00904 (MAXSIZE: invalid identifier), which implies we need to use the version 10 export client instead of version 11.
I cannot use the Data Pump export utility expdp since the version is 10g.
Does anybody know how to install the 10.2 export client, or is there another way you perform the export?
Please let me know.
Thanks
Amitava.
Hi,
As Srini says, expdp is available in 10.2.0.5 and is the preferred option; it's much better than exp.
If you do want exp, it will be on the remote server you are connecting to; or, if you do a full admin install of the client, you will get exp installed locally.
Rich -
I'm trying to export a schema (using OS credentials that have batch job privs), and get all the way through the workflow in EM to submit the job, but it returns the following:
Export Submit Failed
Index: 3, Size: 3
followed by
Export Type Schemas
Statistics type Estimate optimizer statistics when data is imported
Direct Path No
Fail jobs only on errors (not on warnings) No
Files to Export C:\KFC\DB Exports\EXPDAT.DMP
Maximum File Size (MB) 2048
Log File C:\KFC\DB Exports\EXPDAT.LOG
Export parameters are:
FILE=C:\KFC\DB Exports\EXPDAT.DMP
LOG=C:\KFC\DB Exports\EXPDAT.LOG
OWNER=SYS
GRANTS=y
INDEXES=y
ROWS=y
CONSTRAINTS=y
CONSISTENT=n
What is this indicative of?
Just to wrap this up: it appears that in order for this to work, you need a local account with the "log on as batch" privilege, and I'm guessing it is the account you tell Oracle about during the installation. I was unable to give this (local admin) account that right, but our MIS folks were able to, and then things worked. I don't know whether both the local and the domain accounts need that right, but there at least has to be a local one that does, and I am guessing that is because the job is run as a batch job using that local user account.
Case closed!