Export 50K tables using exp/expdp command
Hi,
I am trying to export around 52,000 tables using a parfile on an Oracle 10g/Solaris machine,
but I am not able to do it.
Is there a limit on the number of tables that can be specified in a parfile? If so, what is it?
I get the error: LRM-00116: syntax error at '<table_name>' followed by '<table_name>'
If I split the table list across several parfiles, the export works with the same parfile format.
I have tried both exp and expdp, but I get the same error.
Please help.
I assume your version is 10gR2.
When I checked the docs, I found no limit on the number of tables:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref165
see the "Restrictions" section.
Please double-check your table name list.
I would try wildcards in the TABLES definition, or I would export full SCHEMAS instead.
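Since the export works when the list is split, one practical workaround is to generate several smaller parfiles mechanically and run the utility once per chunk. A minimal sketch, assuming the table names sit one per line in a file (the file names and the chunk size here are illustrative, not from the thread):

```shell
# Build one parfile per chunk of table names. tables.txt stands in for
# the real 52,000-name list; the chunk size of 2 is only for the demo
# (use something like 1000 in practice).
printf '%s\n' T1 T2 T3 T4 T5 > tables.txt
split -l 2 tables.txt chunk_
for f in chunk_*; do
  # Join the chunk's names into a comma-separated TABLES clause.
  printf 'TABLES=(%s)\n' "$(paste -sd, "$f")" > "$f.par"
done
cat chunk_aa.par    # TABLES=(T1,T2)
```

Each resulting .par file can then be passed to exp or expdp via PARFILE=, which also keeps each invocation well under any command-line or parser length limits.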
Similar Messages
-
Inserting data in table using XSU's command line interface
I have this XML document to store:
<?xml version="1.0" ?>
- <ROWSET>
- <CustOrder>
<OrderID>1</OrderID>
- <Customer>
<Name>Shirley Cohen</Name>
<Address>2425 skylane, Dallas, TX</Address>
<Phone>615-414-4112</Phone>
</Customer>
- <ItemList>
- <Item num="1">
<ProductID>111</ProductID>
<ProductName>Computer</ProductName>
<Quantity>2</Quantity>
<Price>2000</Price>
</Item>
- <Item num="2">
<ProductID>113</ProductID>
<ProductName>Monitor</ProductName>
<Price>865</Price>
</Item>
</ItemList>
</CustOrder>
</ROWSET>
I have created the table custorder (below) to store this XML document:
create or replace type CustObj as object (
Name varchar2(40),
Address varchar2(70),
Phone varchar2(20)
);
/
create or replace type Item as object (
ProductID number,
ProductName varchar2(50),
Quantity number,
Price number
);
/
create type Items as table of Item;
/
create table CustOrder (
OrderID number,
Customer CustObj,
ItemList Items
) nested table ItemList store as nested_itemlist;
However, when I use the following command in XSU, I get this error:
--THE COMMAND
C:\Documents and Settings\SKOS>java OracleXML putXML -user "samuel/samuel" -fileName "\ProjectDocuments\myXMLStore\temp.xml" "custorder"
--THE ERROR MESSAGE
file:/C:/ProjectDocuments/myXMLStore/temp.xml<Line 1, Column 11>: XML-0109: (Fatal Error) PI names starting with 'xml' are reserved.
oracle.xml.sql.OracleXMLSQLException: PI names starting with 'xml' are reserved.
at oracle.xml.sql.dml.OracleXMLSave.saveXML(OracleXMLSave.java:2173)
at oracle.xml.sql.dml.OracleXMLSave.insertXML(OracleXMLSave.java:1264)
at OracleXML.Put_XML(OracleXML.java:467)
at OracleXML.ExecutePutXML(OracleXML.java:389)
at OracleXML.main(OracleXML.java:177)
Can anyone please let me know what I am doing wrong? Thanks
Please remove the XML declaration, <?xml version="1.0" ?>, from the XML document. -
Oracle export error while using exp.
Hi folks,
Any idea why I am getting this error when I try to take an export?
Error:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP and Data Mining Scoring Engine options
Segmentation Fault (core dumped)
Steps:
I was trying to take an export of db test/test_pass@env1 with the export command below,
and encountered the above error.
exp test/test_pass@env1 FILE=another_test_dmp.dmp OWNER=another_test TRIGGERS=n GRANTS=y ROWS=y COMPRESS=y
The log file was created with 0 bytes.
A core file was created (huge in volume).
another_test_dmp.dmp was created with 0 bytes.
I'd appreciate your help with this.
Thank you.
Edited by: TechMahi.com on May 6, 2010 5:28 AM
Hi,
You should look at your core dump to see more details of your error. If you want, you could post it here to see if we can help you.
You can find it by looking at the parameter CORE_DUMP_DEST.
Regards,
Mario Alcaide
http://marioalcaide.wordpress.com -
Error in updating custom database table using UPDATE SET Command
Hi,
I developed an automated collection system program at one of our clients, simulating the FBE1 (Payment Advice Creation) and F-28 (Posting of Incoming Payments) transaction codes. Upon posting, we encountered an error on the PRD server: for some of the posted items, the STATUS field in our custom table was not successfully updated.
We tried to reverse the clearing document and debug the program; upon reposting, the status was updated successfully. We could not reproduce the error in QAS, since all postings in QAS were successful and all statuses relating to these postings were updated as well. We tried posting multiple times and the program seems to work fine. In PRD, this behavior is random and there is no specific pattern to how the error occurred.
Provided below is the code I use for updating the custom table ZFIACSF28. The STATUS field should be set to '4' once the posting is successful. We are not sure if this is a database-related issue. Seven users are authorized to do the postings in PRD, and they all use the same user account.
CODE:
CALL TRANSACTION 'F-28' USING gt_bdcdata
MODE 'E'
UPDATE 'S'
MESSAGES INTO gt_messtab.
READ TABLE gt_messtab INTO wa_messtab WITH KEY msgtyp = 'S'
msgnr = '312'.
IF sy-subrc EQ 0.
* update status for items cleared bearing the same OR Number
UPDATE zfiacsf28 SET status = '4'
zsapdocument = wa_messtab-msgv1(10)
zruntime = sy-uzeit
zrundate = sy-datum
WHERE zor = gv_zor
AND customer = gv_customer.
COMMIT WORK.
ENDIF.
Hi,
It's not possible that the status is not updated in zfiacsf28 while the date and time are, because the assignment is hard-coded in a single UPDATE statement.
There must be other programs that also change your table zfiacsf28 and maybe clear those fields. Or perhaps the key fields zor and customer do not identify a unique row; what is the meaning of field zor?
Please check the where used list of table zfiacsf28 to find other update programs.
Regards,
Klaus -
Using exp to export part of a table
Hi, can anybody help with how to export a portion of a table using exp?
I've tried the following, but it raises an error.
Table(T) or Partition(T:P) to be exported: (RETURN to quit) > test.table where user = 'john'
Does exp work like this or do i need to export the whole table?
Thanks
Not really. What I posted is all there is to it. I suggest you open the Utilities manual for your version of Oracle and look at the chapter on exp for more information. Note that the QUERY clause applies to every table in the export if you are exporting more than one table.
Actually, if you are on 11g, pulling up the 9.2 manual (available online) might be better, since the current Utilities manual spends most of its coverage on expdp.
The expdp utility also has an ability to filter tables based on a query.
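For completeness, exp does accept a QUERY clause on the command line or in a parfile; since escaping the quotes from the shell is fiddly, a parfile is the easier route. A hedged sketch — the table, column, and file names are stand-ins for the poster's, and running exp itself needs a database, so only the parfile is built here:

```shell
# Writing the QUERY clause into a parfile avoids shell escaping entirely.
cat > partial_exp.par <<'EOF'
TABLES=(test.mytable)
QUERY="WHERE username = 'john'"
FILE=partial.dmp
EOF
grep -c '^QUERY' partial_exp.par    # 1
```

It would then be run as exp user/password PARFILE=partial_exp.par. Note that USER is an Oracle built-in function, so a column literally named user would itself need quoting in the predicate.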
HTH -- Mark D Powell -- -
Sir,
How can I export one table from an Oracle server and import it into another database using the exp/imp commands, as a normal user who has read/write rights?
Ajit
Post in the right thread, and perhaps you should at least attempt to read the documentation. Your answers are there.
Bazza -
Hi,
I am trying to export a table using Datapump export utility but the operation is failing. Firstly, this is the table I am trying to export:
SQL> select owner, table_name from dba_tables where table_name='DEMO1';
OWNER TABLE_NAME
SYS DEMO1
SQL> select * from DEMO1;
ID
1
2
3
This is how the expdp command is run:
$ expdp DUMPFILE=demo1.dmp TABLES=DEMO1
Export: Release 11.1.0.6.0 - Production on Wednesday, 05 November, 2008 12:37:28
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: sys as sysdba
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_TABLE_01": sys/******** AS SYSDBA DUMPFILE=demo1.dmp TABLES=DEMO1
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39166: Object DEMO1 was not found.
ORA-31655: no data or metadata objects selected for job
Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 12:37:37
I have searched for the error ORA-39166 but most of the results talk about either missing table or lack of privileges. I don't think either is the cause of the problem here. Any help is appreciated.
Thanks,
Raghu
sql prompt > host expdp sys/password directory=dir1 dumpfile=test1234.dmp tables=<schema_name>.tablename;
I get the same failure as before. Please note that I passed "SYS" explicitly as the schema this time.
SQL> host expdp DUMPFILE=demo1.dmp TABLES=SYS.DEMO1;
Export: Release 11.1.0.6.0 - Production on Wednesday, 05 November, 2008 13:58:06
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: sys as sysdba
Password: ********
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_TABLE_01": sys/******** AS SYSDBA DUMPFILE=demo1.dmp TABLES=SYS.DEMO1
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39165: Schema SYS was not found.
ORA-39166: Object DEMO1 was not found.
ORA-31655: no data or metadata objects selected for job
Job "SYS"."SYS_EXPORT_TABLE_01" completed with 3 error(s) at 13:58:15
-
Reg: Export and Import using Dbms_datapump
Hi,
I would like to export a table using the dbms_datapump package. I have a procedure to do this (in Oracle 10g R10.2.0.1.0). The procedure has parameters for the schema name and table name, and the given table in that schema should be exported as a dump file.
PROCEDURE PR_EXPORT(PV_SCHEMA IN VARCHAR2,
PV_TABLE VARCHAR2,
PV_STATUS OUT VARCHAR2) AS
l_dp_handle NUMBER;
l_last_job_state VARCHAR2(30) := 'UNDEFINED';
l_job_state VARCHAR2(30) := 'UNDEFINED';
l_sts KU$_STATUS;
l_schema varchar2(256);
l_table varchar2(256);
BEGIN
l_schema := 'IN(''' || PV_SCHEMA || ''')'; --'IN(''VALIDATION'')'
l_table := 'IN(''' || pv_table || ''')'; -- 'IN(''TABLE1'')'
DBMS_OUTPUT .PUT_LINE('SCHEMA ' || L_SCHEMA);
DBMS_OUTPUT .PUT_LINE('TABLE ' || L_TABLE);
l_dp_handle := DBMS_DATAPUMP.open(operation => 'EXPORT',
job_mode => 'TABLE',
remote_link => NULL,
job_name => 'EMP_EXPORT13',
version => 'LATEST');
DBMS_DATAPUMP .add_file(handle => l_dp_handle,
filename => 'SCOTT12.dmp',
directory => 'BACKUP_DIR');
DBMS_DATAPUMP .add_file(handle => l_dp_handle,
filename => 'SCOTT12.log',
directory => 'BACKUP_DIR',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_filter(handle => l_dp_handle,
name => 'SCHEMA_EXPR',
VALUE => l_schema); --'IN(''VALIDATION'')'
DBMS_DATAPUMP.metadata_filter(handle => l_dp_handle,
name => 'NAME_EXPR',
VALUE => l_table); -- 'IN(''TABLE1'')'
DBMS_DATAPUMP.start_job(l_dp_handle);
DBMS_DATAPUMP.detach(l_dp_handle);
END PR_EXPORT;
Sometime the above procedure correctly creating the dump file. But sometimes, it is showing the below error:
The following error has occurred:
ORA-26535: %ud byte row cache insufficient for table with rowsize=%ud
Please help me on this.
Thanks and Regards,
Vijay
The only information I could find so far is this [http://ora-26535.ora-code.com/].
I could not find out how to change the buffer size - there does not seem to be an option in [DBMS_DATAPUMP|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm#i1007277]. Maybe you have to search the [Advanced Replication documentation|http://download.oracle.com/docs/cd/B19306_01/server.102/b14226/toc.htm].
HTH -
Export - only tables with data
Hai Everybody!
I have an Oracle database with hundreds of tables. When I export the database, all the tables are exported into the .dmp file.
I want to export only the tables that have at least one row.
Can anyone help by providing a solution?
Thanks
JD
JayaDev(JD) wrote:
version 10g
In 11g, there is a feature called "deferred segment creation": if you create a new table and do not insert any data into it, Oracle does not create a segment for that table, hence it is not displayed in dba_segments.
That is why, if you try to export such a table with the exp utility, it is not exported. -
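On 10g, one way to build such a list yourself (a sketch, not from the thread) is to spool owner.table names and row counts from the dictionary (e.g. dba_tables.num_rows after gathering statistics) and filter the spool with plain text tools; rowcounts.txt below is a hypothetical stand-in for that spool output:

```shell
# Keep only tables with at least one row and join them into a
# comma-separated list suitable for an exp TABLES clause.
printf '%s\n' 'SCOTT.EMP 14' 'SCOTT.BONUS 0' 'SCOTT.DEPT 4' > rowcounts.txt
awk '$2 > 0 {print $1}' rowcounts.txt | paste -sd, - > keep.txt
cat keep.txt    # SCOTT.EMP,SCOTT.DEPT
```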
Tables used in standard t-code
Hi Experts,
Can anyone tell me how to find the tables used in standard transactions?
Thanks,
Swarna.
Hi,
You can find all tables in the DD02L table.
Check this program; it lists all the tables used in a given program.
So for your standard transaction, go to System -> Status, note the program name, and run it through this report.
*& AS : ALV report to display the dictionary objects
*& (tables/structures/views of all types of delivery classes)
*& used by a program. *
REPORT ZALV_TABLESPROG .
*ALV type pools declarations
TYPE-POOLS : slis.
*Internal table and work area declarations for dd02l and dd02t
DATA : it_dd02l TYPE STANDARD TABLE OF dd02l,
wa_dd02l TYPE dd02l,
it_dd02t TYPE STANDARD TABLE OF dd02t,
wa_dd02t TYPE dd02t.
*DATA DECLARATIONS FOR PROGRAM NAMES
DATA : progname LIKE sy-repid.
DATA : prognames(60) TYPE c.
*Structure for output
TYPES : BEGIN OF ty_output,
tabname LIKE dd02l-tabname,
tabclass(20) TYPE c,
contflag(80) TYPE c,
text LIKE dd02t-ddtext,
END OF ty_output.
*Internal table and work area declarations for output
DATA : it_output TYPE STANDARD TABLE OF ty_output,
wa_output TYPE ty_output.
*Structure for table names
TYPES : BEGIN OF ty_names,
name LIKE dd02l-tabname,
END OF ty_names.
*Internal table and work area declarations for table names
DATA : it_names TYPE STANDARD TABLE OF ty_names.
*data declarations for ALV
DATA: it_layout TYPE slis_layout_alv,
wa_fieldcat TYPE slis_fieldcat_alv,
it_fieldcat TYPE slis_t_fieldcat_alv.
*********************** SELECTION SCREEN ************************
PARAMETERS : program LIKE sy-repid.
************************ INITIALIZATION **********************
INITIALIZATION.
********************** START OF SELECTION ***********************
START-OF-SELECTION.
*Select to check if the program exists
select single name from trdir into prognames where name = program.
*If Program does not exist message is thrown
IF sy-subrc <> 0.
MESSAGE 'PROGRAM DOES NOT EXIST' TYPE 'I'.
EXIT.
ENDIF.
*Calling FM to get the tables associated with the program
progname = program.
CALL FUNCTION 'GET_TABLES'
EXPORTING
progname = progname
TABLES
tables_tab = it_names.
*Check if there are tables in the internal table
IF it_names IS INITIAL.
MESSAGE 'DATA DOES NOT EXIST' TYPE 'I'.
EXIT.
ELSE.
*Subroutine to get the table details
PERFORM TABLES_IN_PROGRAM.
ENDIF.
*output display
PERFORM alv_output.
*& Form TABLES_IN_PROGRAM
FORM TABLES_IN_PROGRAM.
*To fetch Tables and their features
IF it_names[] IS NOT INITIAL.
SELECT tabname tabclass contflag FROM dd02l
INTO CORRESPONDING FIELDS OF TABLE it_dd02l
FOR ALL ENTRIES IN it_names
WHERE tabname EQ it_names-name.
ENDIF.
*To fetch the texts for the table
IF it_dd02l[] IS NOT INITIAL.
SELECT tabname ddtext FROM dd02t INTO CORRESPONDING FIELDS OF TABLE it_dd02t
FOR ALL ENTRIES IN it_dd02l WHERE tabname EQ it_dd02l-tabname AND ddlanguage = 'E'.
ENDIF.
*If no data is selected throw message
IF sy-subrc <> 0.
MESSAGE 'DATA DOES NOT EXIST' TYPE 'I'.
EXIT.
ENDIF.
*Appending values to the output table
LOOP AT it_dd02l INTO wa_dd02l.
wa_output-tabname = wa_dd02l-tabname.
wa_output-tabclass = wa_dd02l-tabclass.
wa_output-contflag = wa_dd02l-contflag.
READ TABLE it_dd02t INTO wa_dd02t WITH KEY tabname = wa_dd02l-tabname.
wa_output-text = wa_dd02t-ddtext.
APPEND wa_output TO it_output.
CLEAR wa_output.
ENDLOOP.
*modifying the values in the output table for texts
LOOP AT it_output INTO wa_output.
AT NEW tabname.
READ TABLE it_dd02l INTO wa_dd02l WITH KEY tabname = wa_output-tabname.
CASE wa_dd02l-contflag.
WHEN 'A'.
wa_output-contflag = 'Application table (master and transaction data)'.
WHEN 'C'.
wa_output-contflag = 'Customizing table, maintenance only by cust., not SAP import '.
WHEN 'L'.
wa_output-contflag = 'Table for storing temporary data, delivered empty'.
WHEN 'G'.
wa_output-contflag = 'Customizing table, protected against SAP Upd., only INS all'.
WHEN 'E'.
wa_output-contflag = 'Control table, SAP and customer have separate key areas '.
WHEN 'S'.
wa_output-contflag = 'System table, maint. only by SAP, change = modification'.
WHEN 'W'.
wa_output-contflag = 'System table, contents transportable via separate TR objects '.
WHEN ' '.
wa_output-contflag = 'Delivery class not available'.
ENDCASE.
CASE wa_dd02l-tabclass.
WHEN 'TRANSP'.
wa_output-tabclass = 'Transparent table'.
WHEN 'INTTAB'.
wa_output-tabclass = 'Structure'.
WHEN 'CLUSTER'.
wa_output-tabclass = 'Cluster table'.
WHEN 'POOL'.
wa_output-tabclass = 'Pooled table'.
WHEN 'VIEW'.
wa_output-tabclass = 'General view structure '.
WHEN 'APPEND'.
wa_output-tabclass = 'Append structure'.
ENDCASE.
MODIFY it_output FROM wa_output TRANSPORTING contflag
tabclass
WHERE tabname EQ wa_output-tabname.
CLEAR : wa_output , wa_dd02l.
ENDAT.
ENDLOOP.
ENDFORM. " TABLES_IN_PROGRAM
*& Form ALV_OUTPUT
FORM alv_output.
*Fieldcatalogue
PERFORM build_fieldcat.
*Layout
PERFORM build_layout.
*Display
PERFORM alv_display.
ENDFORM. "ALV_OUTPUT
*& Form build_fieldcat
*Field catalogue
FORM build_fieldcat.
CLEAR wa_fieldcat.
wa_fieldcat-row_pos = '1'.
wa_fieldcat-col_pos = '1'.
wa_fieldcat-fieldname = 'TABNAME'.
wa_fieldcat-tabname = 'IT_OUTPUT'.
wa_fieldcat-seltext_m = 'TABLENAME'.
APPEND wa_fieldcat TO it_fieldcat.
CLEAR wa_fieldcat.
wa_fieldcat-row_pos = '1'.
wa_fieldcat-col_pos = '2'.
wa_fieldcat-fieldname = 'TABCLASS'.
wa_fieldcat-tabname = 'IT_OUTPUT'.
wa_fieldcat-seltext_m = 'CATEGORY'.
APPEND wa_fieldcat TO it_fieldcat.
CLEAR wa_fieldcat.
wa_fieldcat-row_pos = '1'.
wa_fieldcat-col_pos = '3'.
wa_fieldcat-fieldname = 'TEXT'.
wa_fieldcat-tabname = 'IT_OUTPUT'.
wa_fieldcat-seltext_m = 'DESCRIPTION'.
APPEND wa_fieldcat TO it_fieldcat.
CLEAR wa_fieldcat.
wa_fieldcat-row_pos = '1'.
wa_fieldcat-col_pos = '4'.
wa_fieldcat-fieldname = 'CONTFLAG'.
wa_fieldcat-tabname = 'IT_OUTPUT'.
wa_fieldcat-seltext_m = 'Delivery Class'.
APPEND wa_fieldcat TO it_fieldcat.
ENDFORM. "build_fieldcat
*& Form build_layout
*Layout
FORM build_layout.
it_layout-zebra = 'X'.
it_layout-colwidth_optimize = 'X'.
ENDFORM. "build_layout
*& Form alv_display
*ALV output
FORM alv_display.
CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
exporting
i_callback_program = sy-repid
i_callback_html_top_of_page = 'HTML_TOP_OF_PAGE'
it_fieldcat = it_fieldcat
is_layout = it_layout
TABLES
t_outtab = it_output.
ENDFORM. "alv_display
*       FORM html_top_of_page                                         *
FORM html_top_of_page USING top TYPE REF TO cl_dd_document.
DATA tstring TYPE sdydo_text_element.
tstring = program.
CALL METHOD top->add_text
EXPORTING
text = 'Tables used in the program'
sap_style = 'heading'.
CALL METHOD top->add_text
EXPORTING
text = tstring
sap_style = 'heading'.
ENDFORM. "html_top_of_page
-
Hi,
We need your help.
We want to update a table using the MODIFY command, but it does not replace the previous entry; instead, it INSERTs a new one.
Scenario:
1. Upon calling the Screen for Editing the entry, the previous entry must be deleted
2. When SAVEd, the new entry must replace the previous entry.
Make sure that in the definition of your table you have specified the key. Otherwise, the system may "think" that all the fields are part of the key, and therefore that all lines are different.
This is an extract from SAP's documentation:
"The line to be changed is determined by the fact that the content of the table key matches the content of the corresponding components in the wa work area. For tables with a key that is not unique, the first entry that is found is changed. "
Regards. -
Incremental Export using EXP Command
Hi all,
I am trying to take an incremental backup using the Oracle 9i exp command, but the backup contains the full data of every table in which changes were made. An incremental backup should contain only the data changed since the last backup.
Please help me: how can I take a real incremental backup using the exp command?
Thanks
Khalil
should take a backup only the data last changes made
In order to be able to do that, Oracle would have to track rows that have changed since the last export. Think of a database with a few hundred to a few thousand tables, and tables with tens to tens of millions of rows.
In a table, a single column in a single row may have been updated.
However, in another table 40 columns in 3million rows may have been updated since the last export.
In another table, some rows might be deleted. (how would "incremental" compute the difference between "there were 1000 rows yesterday, there are 942 rows today")
In a fourth table, 3million rows have been inserted since the last export.
Can you conceive of the scale of "tracking" that Oracle would have to do if you want "only the last changes" to be exported ? Next, how would you "merge" data from two exports when you import them -- e.g. if the non-primarykey columns have been updated ; or primarykey has been updated ; or the table has no primarykey ?
The only way an "incremental" change can be identified is by tracking whether any change (whether 1 column in 1 row or all the columns in all the rows) has been made to the table since the last export. If a single byte has changed in the table, the whole table is exported.
Are you sure you were able to use Incremental Export in 9i ? Incremental Export is documented in 8.1.7 but hasn't appeared in the 9.2 documentation.
The 8.1.7 documentation says "+An incremental Export backs up only tables that have changed since the last incremental, cumulative, or complete Export. An incremental Export exports the table definition and all its data, *not just the changed rows*. Typically, you perform incremental Exports more often than cumulative or complete Exports.+ "
Oracle had announced the end of Error Correction Support for the Incremental Export effective 31-Dec-1999. It ended Extended Assistance Support 31-Dec-2002. See Oracle Note#170483.1 on the support site.
Hemant K Chitale
http://hemantoracledba.blogspot.com -
How to export a table with a mixed case name using EXP untility
I'm trying to export a table whose name mixes upper and lower case. The command I'm trying is:
exp USER/password TABLES=("MyTableName") FILE=ExportCNCS1.dat
and it does not work.
Error:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
EXP-00011: USER.MYTABLENAME does not exist
Export terminated successfully with warnings.
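A likely culprit is the shell consuming the double quotes before exp ever sees them; echo can stand in for exp to show what the utility actually receives (a sketch in a POSIX shell, not from the thread; the Windows shell shown in the post has its own quoting rules):

```shell
# Unescaped quotes are stripped by the shell; escaped ones survive.
echo TABLES="MyTableName"      # prints TABLES=MyTableName
echo TABLES=\"MyTableName\"    # prints TABLES="MyTableName"
```

So the invocation needs something like TABLES=\"MyTableName\" for the quotes, and with them the mixed-case name, to reach exp.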
I presume it's saying the table doesn't exist because it's looking for MYTABLENAME rather than MyTableName, which happens because it ignores the double quotes. Any ideas?
Or, let the export prompt you for the table name:
C:\Temp>exp
Export: Release 9.2.0.7.0 - Production on Mon Aug 14 11:33:28 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: scott/tiger
Connected to: Oracle9i Enterprise Edition Release 9.2.0.7.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.7.0 - Production
Enter array fetch buffer size: 4096 >
Export file: EXPDAT.DMP >
(2)U(sers), or (3)T(ables): (2)U > T
Export table data (yes/no): yes >
Compress extents (yes/no): yes > no
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
Table(T) or Partition(T:P) to be exported: (RETURN to quit) > MyTableName
EXP-00011: SCOTT.MYTABLENAME does not exist
Table(T) or Partition(T:P) to be exported: (RETURN to quit) > "MyTableName"
. . exporting table MyTableName 14 rows exported
Table(T) or Partition(T:P) to be exported: (RETURN to quit) >
Export terminated successfully with warnings.
C:\Temp> -
Error while exporting a table - EXP-00091
I am doing an export of a table that has 1,000,838 rows. After the export completed,
I checked the log; it said:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
With the Partitioning option
JServer Release 9.2.0.4.0 - Production
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
. . exporting table FIDA_LABEL 1000838 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
Export terminated successfully with warnings.
I looked in the Oracle error messages document and found this out ---
EXP-00091 Exporting questionable statistics
Cause: Export was able to export statistics, but the statistics may not be useable. The statistics are questionable because one or more of the following happened during export: a row error occurred, client character set or NCHARSET does not match with the server, a query clause was specified on export, only certain partitions or subpartitions were exported, or a fatal error occurred while processing a table.
Action: To export non-questionable statistics, change the client character set or NCHARSET to match the server, export with no query clause, or export complete tables. If desired, import parameters can be supplied so that only non-questionable statistics will be imported, and all questionable statistics will be recalculated.
And this how my export command looks like -
exp vincent/passphr query=\"where state in \(\'MD\',\'CA\',\'WI\'\)\" file=$EXPDIR/fida_label_9i.dmp tables=vincent.fida_label log=$LOGDIR/fida_label_exp.log
Of course, I am using the query clause because I really need it, and it always worked in our Oracle 8i environment. We recently moved to 9i, and this happens only in the 9i version.
And I certainly do not want to specify import parameters to ignore the questionable statistics, as no changes are desired in that area (my hands are tied).
What could "a fatal error occurred while processing a table" mean? How can this be traced and debugged? How can I find out whether any row errors occurred? And if required, how do I check the character sets and related settings? (I have no idea in this area.)
Thanks. All I needed was to get around this error. Your suggestions/responses would be highly appreciated.
What version of Oracle 9i are you using? Do you have a standard NLS_LANG environment variable set on client machines? Or do you set it to different values on different machines?
Here is one of way you could get around it.
Could you specify the export parameter 'STATISTICS=NONE' while exporting the table data?
Try this and see.
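The suggestion above can be captured in a parfile; a hedged sketch with the names carried over from the original command (building the parfile needs no database, so that is all that is shown here):

```shell
# Same export as before, plus STATISTICS=NONE to skip exporting
# the questionable statistics.
cat > fida_exp.par <<'EOF'
TABLES=vincent.fida_label
QUERY="WHERE state IN ('MD','CA','WI')"
STATISTICS=NONE
FILE=fida_label_9i.dmp
EOF
grep '^STATISTICS' fida_exp.par    # STATISTICS=NONE
```

Run as exp vincent/passphr PARFILE=fida_exp.par; the parfile also spares you the backslash escaping of the QUERY clause.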
If this is successful, you could use the import utility as usual. You could always compute or estimate statistics on the table after import. -
EXPORT ONLY TABLES IN A SCHEMA USING DATA PUMP
Hi folks,
Good day. I would appreciate a Data Pump command to export only the TABLE objects in a specific schema.
The server is a 4 node RAC with 16 CPU per node.
Thanks in advance
If all you want is the table definitions, you can use something like:
expdp user/password directory=my_dir dumpfile=my_dump.dmp tables=schema1.table1,schema1.table2,etc content=metadata_only include=table
This will export just the table definitions. If you want the data too, remove content=metadata_only; if you want the dependent objects, like indexes and table statistics, remove include=table.
Dean