EXP 11.2.0.1 exports only tables with at least 1 row
I have an Oracle 11.2.0.1 database and use the export utility (exp) to export tables and row data from the database to a .dmp file.
I have noticed that it now only extracts tables with at least 1 row and does not export empty tables,
so when I execute an import I lose a lot of tables...
Very dangerous...
Is there an explanation?
You may get this effect in release 11.2.0.1 if the table has no segment. You probably have deferred_segment_creation=true. This query may help identify the tables without segments:
select table_name from user_tables
minus
select segment_name from user_segments where segment_type='TABLE';
(you will of course have to modify the query to handle more complex table structures).
Data Pump does not have this problem.
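As a hedged sketch (the table name here is hypothetical), one workaround with classic exp is to force segment creation for the empty tables before exporting:

```sql
-- Stop new tables from being created without a segment
ALTER SYSTEM SET deferred_segment_creation = FALSE;

-- Force a segment for an existing empty table (repeat for each table
-- returned by the MINUS query above)
ALTER TABLE my_empty_table ALLOCATE EXTENT;
```

On 11.2.0.2 and later, DBMS_SPACE_ADMIN.MATERIALIZE_DEFERRED_SEGMENTS reportedly materializes deferred segments in bulk; on 11.2.0.1 itself, per-table ALLOCATE EXTENT is the usual approach.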
John Watson
Oracle Certified Master DBA
Similar Messages
-
Exporting Table with CLOB Columns
Hello All,
I am trying to export a table with CLOB columns, with no luck. It fails with EXP-00011 (table does not exist).
I can query the table, and the owner is the same as the one I am exporting from.
Please let me know.
An 8.0.6 client definitely changes things. Other posters have already posted links to information on which versions of exp and imp can be used to move data between versions.
I will just add that if you were using a client to do the export, then if the client version is lower than the target database version, you can upgrade the client or, better yet, if possible, use the target database's export utility to perform the export.
I will not criticize the existence of an 8.0.6 system, as we had a parent company dump a brand new 8.0.3 application on us less than two years ago. We have since been allowed to update the database and Pro*C modules to 9.2.0.6.
If the target database is really 8.0.3, then I suggest you consider using DBMS_METADATA to generate the DDL, if needed, and SQL*Plus to extract the data into delimited files that you can then reload via SQL*Loader. This would allow you to move the data, with some potential adjustments for any 10g-only features in the code.
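For reference, the DBMS_METADATA route might look like this (schema and table names are hypothetical):

```sql
-- Generate the CREATE TABLE DDL for one table; LONG must be large
-- enough to hold the full statement when spooling from SQL*Plus
SET LONG 100000
SET PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('TABLE', 'MY_TABLE', 'MY_SCHEMA') FROM DUAL;
```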
HTH -- Mark D Powell -- -
Hi!
I have to export a table with a LOB column (3 GB is the size of the LOB segment) and then drop that LOB column from the table. The table has about 350k rows.
(I was thinking) - I have to:
1. create new tablespace
2. create copy of my table with CTAS in new tablespace
3. alter new table to be NOLOGGING
4. insert all rows from original table with APPEND hint
5. export copy of table using transport tablespace feature
6. drop newly created tablespace
7. drop lob column and rebuild original table
DB is Oracle 9.2.0.6.0.
UNDO tablespace limited on 2GB with retention 10800 secs.
When I tried to insert rows into the new table with the /*+ APPEND */ hint, the operation was very, very slow, so I canceled it.
How much time should I expect this operation to take?
Is my UNDO sufficient to avoid ORA-01555 (snapshot too old)?
What do you think?
Thanks for your answers!
Regards,
Marko Sutic
I've seen that document before I posted this question.
Still, I don't know what I should do. Look at this document: Doc ID 281461.1.
From that document:
FIX
Although the performance of the export cannot be improved directly, possible
alternative solutions are:
1. If not required, do not use LOB columns.
or:
2. Use Transport Tablespace export instead of full/user/table level export.
or:
3. Upgrade to Oracle10g and use Export DataPump and Import DataPump.
I just have to speed up the CTAS a little more somehow (maybe using parallel processing).
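For what it's worth, a parallel NOLOGGING CTAS might be sketched like this (table names are placeholders; the degree of parallelism depends on available CPU and I/O):

```sql
CREATE TABLE copy_of_table
  NOLOGGING
  PARALLEL 4
AS
SELECT /*+ PARALLEL(t, 4) */ *
  FROM original_table t;
```

Note that in 9.2 the LOB segment itself may not benefit from parallelism as much as the scalar columns do.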
Anyway thanks for suggestion.
Regards,
Marko -
How can I implement the equivalent of a temporary table with "on commit delete rows"?
hi,
I have triggers on several tables. During a transaction, I need to gather information from all of them, and once one of the triggers has all the information, it creates some data. I can't rely on the order of the triggers.
In Oracle and DB2, I'm using temporary tables with "ON COMMIT DELETE ROWS" to gather the information - They fit perfectly to the situation since I don't want any information to be passed between different transactions.
In SQL Server, there are local temporary tables and global. Local temp tables don't work for me since apparently they get deleted at the end of the trigger. Global tables keep the data between transactions.
I could use global tables and add some field that identifies the transaction, and in each access to these tables join by this field, but didn't find how to get some unique identifier for the transaction. @@SPID is the session, and sys.dm_tran_current_transaction
is not accessible by the user I'm supposed to work with.
Also with global tables, I can't just wipe data when "operation is done" since at the triggers level I cannot identify when the operation was done, transaction was committed and no other triggers are expected to fire.
Any idea which construct I could use to achieve the above: passing information between different triggers in the same transaction, while keeping the data visible to the current transaction?
(I saw similar questions but didn't see an adequate answer, sorry if posting something that was already asked).
Thanks!
This is the scenario: if changes (CRUD) happen to both TableA and TableB, then log some info to TableC. The logic looks something like this:
Create Trigger TableA_C After Insert on TableA {
  If info in temp tables available from TableB
    Write info to TableC
  else
    Write to temp tables info from TableA
}
Create Trigger TableB_C After Insert on TableB {
  If info in temp tables available from TableA
    Write info to TableC
  else
    Write to temp tables info from TableB
}
So each trigger needs info from the other table, and once everything is available, the info is written to TableC. The info is only from the current transaction.
The order of the triggers is not defined. Also, there's no guarantee that both triggers will fire: changes can happen to only TableA / B, and in that case I don't want to write anything to TableC.
The part that gets and sets info to temp table is implemented as temp tables with "on commit delete rows" in DB2 / Oracle.
What do you think? As I've mentioned, I could use global temp tables with a field that would identify the transaction, but didn't find something like that in SQL Server. And, the lifespan of local temp tables is too short. -
SQL: Find table with max no. of rows
I have a table containing a list of table names for each owner, as follows:
## Table: db_tables
OWNER TABLE_NAME
a ta_1
a ta_2
a ta_3
b tb_1
b tb_2
c tc_1
Now, I want to know the table with the max. number of rows for each owner.
Please, can anyone give me a solution for the above?
Assuming 10g and above:
SQL> SELECT owner,
            MAX(table_name) KEEP (DENSE_RANK FIRST ORDER BY XMLQUERY (t RETURNING CONTENT).getnumberval() DESC) table_name,
            MAX(XMLQUERY (t RETURNING CONTENT).getnumberval()) cnt
       FROM (SELECT owner, table_name, 'count(ora:view("' || table_name || '"))' t
               FROM all_tables
              WHERE owner IN ('MICHAEL','SCOTT'))
      GROUP BY owner;
OWNER TABLE_NAME CNT
MICHAEL SERVICE_ZIP 1000000
SCOTT EMP 14
2 rows selected. -
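If optimizer statistics are reasonably fresh, a simpler alternative is to use NUM_ROWS directly; note the counts are only as accurate as the last statistics gather:

```sql
SELECT owner,
       MAX(table_name) KEEP (DENSE_RANK FIRST ORDER BY num_rows DESC NULLS LAST) table_name,
       MAX(num_rows) cnt
  FROM all_tables
 WHERE owner IN ('MICHAEL', 'SCOTT')
 GROUP BY owner;
```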
Export tables with lowercase names
Has anyone ever done a table export on a table with a lower case name?
I have set up a parameter file with the entry:
tables=USER."table_name"
but I get a:
EXP-00011: USER."table_name" does not exist
The table DOES exist, and if I ever find the person who designed an entire schema with lower case table names I'll strangle him...
The syntax, with the double quotes, is what's shown in the Docs I have to hand.
Anyone?
Yes, so have I. But they still translate to uppercase when the export actually happens.
I actually have a schema full of tables, all with lowercase names, and I can't export them explicitly.
I have tried it with 8.1.5 and 8.1.6 (Solaris).
I'll try it with 9i and 8.0.6 when I get the instances up, but I'd really like it if someone else could try this and tell me I'm hallucinating. (I'd like to see your parameter syntax too of course!)
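For comparison, here is the syntax I would expect to need; whether it works seems to vary by exp version and operating system, so treat it as an assumption to test. In a parameter file:

```
tables=USER."table_name"
```

and on a Unix shell command line, where the double quotes must themselves be escaped from the shell:

```
exp user/password tables=USER.\"table_name\" file=expdat.dmp
```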
BTW, if you do try experimenting with lowercase table names, be prepared for things like TOAD and some of the OEM software to screw up too. -
InDesign CS6 ePub Export : Tables with header and footer in HTML
Hey there,
Does anyone know whether InDesign CS6 also exports table headers and footers correctly into the XHTML file of the ePub?
What I mean is: are the elements <thead> and <tfoot> created?
Or is it only possible to steer this via the CSS class names which can be given in the table formats?
Generally, I think it would be better if the user had the chance to map other export tags to elements than just p, em, strong, h1-h6.
It would be useful to also put in other elements by hand.
Best regards.
Magnolee2 wrote:
Does anyone know whether InDesign CS6 also exports table headers and footers correctly into the XHTML file of the ePub?
What I mean is: are the elements <thead> and <tfoot> created?
By "also", do you mean the behavior is changed with respect to CS5/CS5.5? In those, thead and tfoot are created correctly. (Although, quite disconcerting, in the order "thead / tfoot / tbody". ePub renderers based on Webkit display them correctly nevertheless, but others do not. An extremely annoying free interpretation of the W3C rules.) -
EA3 export table with SDO types
Hi!
I've tried to export a table from a user (not MDSYS) with an mdsys.sdo_geometry column (build 1.5.0.53.04, Tools > Export Wizard, all checks except storage).
I got this in the log:
SEVERE 40 0 oracle.dbtools.db.DBUtil
ORA-31603: object "SDO_ELEM_INFO_ARRAY" of type TYPE not found in schema "MDSYS" ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105 ORA-06512: at "SYS.DBMS_METADATA", line 2805 ORA-06512: at "SYS.DBMS_METADATA", line 4333 ORA-06512: at line 1
SEVERE 41 641 oracle.dbtools.db.DBUtil ORA-31603: object "SDO_GEOMETRY" of type TYPE not found in schema "MDSYS" ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105 ORA-06512: at "SYS.DBMS_METADATA", line 2805 ORA-06512: at "SYS.DBMS_METADATA", line 4333 ORA-06512: at line 1
SEVERE 42 94 oracle.dbtools.db.DBUtil ORA-31603: object "SDO_ORDINATE_ARRAY" of type TYPE not found in schema "MDSYS" ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105 ORA-06512: at "SYS.DBMS_METADATA", line 2805 ORA-06512: at "SYS.DBMS_METADATA", line 4333 ORA-06512: at line 1
SEVERE 43 3093 oracle.dbtools.raptor.ddl.DDLGenerator getDDL - null
In the SQL script I got some strange results (see below):
-- File created - Wednesday-April-09-2008
DROP TABLE "TEST" cascade constraints;
DROP SEQUENCE "MDRS_C0A4$";
-- DDL for Sequence MDRS_C0A4$
CREATE SEQUENCE "MDRS_C0A4$" MINVALUE 1 MAXVALUE 999999999999999999999999999 INCREMENT BY 1 START WITH 101 CACHE 100 ORDER NOCYCLE ;
-- DDL for Type SDO_ELEM_INFO_ARRAY
-- Unable to Render DDL with DBMS_METADATA using internal generator.
CREATE TYPE SDO_ELEM_INFO_ARRAY
AS VARRAY (1048576) of NUMBER
GRANT EXECUTE ON "SDO_ELEM_INFO_ARRAY" TO PUBLIC WITH GRANT OPTION;
GRANT EXECUTE ON "SDO_ELEM_INFO_ARRAY" TO PUBLIC WITH GRANT OPTION;
-- DDL for Type SDO_GEOMETRY
-- Unable to Render DDL with DBMS_METADATA using internal generator.
CREATE TYPE MDSYS.SDO_GEOMETRY;
CREATE TYPE SDO_GEOMETRY AS OBJECT
( SDO_GTYPE NUMBER,
SDO_SRID NUMBER,
SDO_POINT SDO_POINT_TYPE,
SDO_ELEM_INFO SDO_ELEM_INFO_ARRAY,
SDO_ORDINATES SDO_ORDINATE_ARRAY,
MEMBER FUNCTION GET_GTYPE RETURN NUMBER DETERMINISTIC,
MEMBER FUNCTION GET_DIMS RETURN NUMBER DETERMINISTIC,
MEMBER FUNCTION GET_LRS_DIM RETURN NUMBER DETERMINISTIC,
MEMBER FUNCTION GET_WKB RETURN BLOB DETERMINISTIC,
MEMBER FUNCTION GET_WKT RETURN CLOB DETERMINISTIC,
MEMBER FUNCTION ST_CoordDim RETURN SMALLINT DETERMINISTIC,
MEMBER FUNCTION ST_IsValid RETURN INTEGER DETERMINISTIC,
CONSTRUCTOR FUNCTION SDO_GEOMETRY(wkt IN CLOB, asrid IN INTEGER DEFAULT NULL)
RETURN SELF AS RESULT,
CONSTRUCTOR FUNCTION SDO_GEOMETRY(wkt IN VARCHAR2, asrid IN INTEGER DEFAULT NULL)
RETURN SELF AS RESULT,
CONSTRUCTOR FUNCTION SDO_GEOMETRY(wkb IN BLOB, asrid IN INTEGER DEFAULT NULL)
RETURN SELF AS RESULT
GRANT EXECUTE ON "SDO_GEOMETRY" TO PUBLIC WITH GRANT OPTION;
GRANT EXECUTE ON "SDO_GEOMETRY" TO PUBLIC WITH GRANT OPTION;
-- DDL for Type SDO_ORDINATE_ARRAY
-- Unable to Render DDL with DBMS_METADATA using internal generator.
CREATE TYPE SDO_ORDINATE_ARRAY
AS VARRAY(1048576) OF NUMBER
GRANT EXECUTE ON "SDO_ORDINATE_ARRAY" TO PUBLIC WITH GRANT OPTION;
GRANT EXECUTE ON "SDO_ORDINATE_ARRAY" TO PUBLIC WITH GRANT OPTION;
-- DDL for Table TEST
CREATE TABLE "TEST"
( "NUM" NUMBER,
"GEOLOC" "SDO_GEOMETRY"
-- Constraints for Table TEST
ALTER TABLE "TEST" MODIFY ("NUM" NOT NULL ENABLE);
ALTER TABLE "TEST" ADD PRIMARY KEY ("NUM") ENABLE;
-- DDL for Index TEST_SX
CREATE INDEX "TEST_SX" ON "TEST" ("GEOLOC")
INDEXTYPE IS "MDSYS"."SPATIAL_INDEX" PARAMETERS ('SDO_INDX_DIMS=2 LAYER_GTYPE="COLLECTION"');
-- DDL for Index SYS_C004552
CREATE UNIQUE INDEX "SYS_C004552" ON "TEST" ("NUM")
Why does it create all the grants twice?
Why does it try to recreate type SDO_GEOMETRY twice, once in the MDSYS schema and once in the user's schema?
Why does it recreate types SDO_ELEM_INFO_ARRAY and SDO_ORDINATE_ARRAY only once, in the user schema?
Why does it create all the grants twice?
Why does it try to recreate type SDO_GEOMETRY twice, once in the MDSYS schema and once in the user's schema?
Why does it recreate types SDO_ELEM_INFO_ARRAY and SDO_ORDINATE_ARRAY only once, in the user schema?
All of these seem to be issues with the older internal DDL generator. While I will log a bug on this, we do not actually own the code, so I am not able to fix it. The easiest way to make this work correctly would be to fix your DBMS_METADATA setup so it works for those other users' objects...
Try granting your export user SELECT_CATALOG_ROLE.
Although I'm not sure why you are including the dependencies for that table, as you know it's just going to get you the MDSYS objects; try unchecking the "Automatically Include Dependent Objects" option on the first page of the wizard. -
Problem Export tables with data pump
Hi,
I want to export 300 tables with Data Pump, but I received this message:
ORA-39001: invalid argument value
ORA-39071: Value for TABLES is badly formed.
When I put in only half of the tables, I don't have any problem and it works.
Does anyone know this problem? I need to include all of these tables in the export.
Thanks in advance.
The file (*.dat) resembles this:
DIRECTORY=PUMPDIR
DUMPFILE=MyFile.dmp
LOGFILE=MylogFile.log
TABLES= User.Mytbale:partition1_0
User.Mytbale:partition1_1
User.Mytbale:partition1_2
User.Mytbale:partition1_3
OtherUser.Table:partition1_1
on the order of 300 tables in total -
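One likely cause of ORA-39071 here is that the table entries are not comma-separated. As a hedged sketch (reusing the names from the post), a well-formed parameter file would normally look like:

```
DIRECTORY=PUMPDIR
DUMPFILE=MyFile.dmp
LOGFILE=MylogFile.log
TABLES=User.Mytbale:partition1_0,
       User.Mytbale:partition1_1,
       User.Mytbale:partition1_2,
       User.Mytbale:partition1_3,
       OtherUser.Table:partition1_1
```

Depending on the version, expdp may also refuse a TABLES list that spans more than one schema; if so, split the job into one export per schema.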
How to export a table with half a million rows?
I need to export a table that has 535,000 rows. I tried to export to Excel and it exported only 65,535 rows. I tried to export to a text file and it said it was using the clipboard (?) and 65,000 rows was the maximum. Surely there has to be a way to export the entire table. I've been able to import much bigger CSV files than this, millions of rows.
What version of Access are you using? Are you attempting to copy and paste records, or are you using Access's export functionality from the menu/ribbon? I'm using Access 2010 and just exported a million-record table to both a text file and to Excel
(.xlsx format). Excel 2003 (using the .xls 97-2003 format) does have a limit of 65,536 rows, but the later .xlsx format does not.
-Bruce -
AMDP exporting table with fields from various sources
I am writing an AMDP whose output table is a list of employees and their attributes. These attributes have various sources with various keys. If I cannot, with any practicality, construct this table with a single SELECT statement (which, given my search criteria, I probably can't), must I break my output table into multiple output tables, each of which can be created with a single SELECT?
I declare the structure of output table et_employees in the class as something like:
BEGIN OF ty_employee,
emp_id(12) TYPE c,
emp_name(80) TYPE c,
org_unit(8) TYPE c,
region(5) TYPE c,
country(3) TYPE c,
jb_prf_as TYPE c,
sol_gr_as TYPE c,
snippet(1000) TYPE c,
score TYPE integer,
END OF ty_employee,
tt_employees TYPE TABLE OF ty_employee.
As far as I can tell, I cannot do an UPDATE on et_employees in the method to modify individual fields; I can only do the SELECT statement.
I see my choices as
elaborate select statement that I may or may not be able to construct (haven't thought this through but it may be doable)
more than one output table
create multiple local tables and then do a join on them for the output table.
I am seeing the last one as my best option.
To throw in an unrelated issue: the AMDP procedure cannot seem to cope with a table whose name begins with a slash, e.g., /MRSS/D_SQP_ESTR. The USING statement is OK, but any access in a SELECT or inner join gets an error.
Hi Deborah,
let me do some assumptions on your problem and then try to help you:
Assumption A: You only want to query data from tables which, I simply assume, are available in the ABAP Data Dictionary. In this case, I don't think there is a performance gain with AMDPs compared to Open SQL, so just use Open SQL and do joins on the relevant tables, leading to exactly one result set in the output.
Assumption B: You need the AMDP because you have a good reason, and you want to query data from tables employee_source_a and employee_source_b for your result set. If so, you can e.g. use "temporary" tables (you don't declare them explicitly), with a construct like:
lt_employee_source_a = select ... from employee_source_a ...;
lt_employee_source_b = select ... from employee_source_b ...;
et_employee = select ... from :lt_employee_source_a ... (inner/left outer) join :lt_employee_source_b on ...;
Or you could use the CE_JOIN function if that suits your SQLScript development better.
To elaborate a query statement for et_employee without the lt_xxx tables is hard to judge from your question; it should be possible if there are no nasty aggregations/calculations which prevent it :-).
There's no need to use two result sets in the output if that's not what you need in the application.
The option to have several result sets is rather a feature of DB procedures, which allow for several result sets, while a view/Open SQL query can only give you one result set.
Conclusion: you answered your question yourself; the last option seems to be the best one :-).
Concerning the "slash issue": guessing around, I'd propose quoting the table name, like "/MRSS/D_SQP_ESTR"... but it's just a guess. Could you please post the error message or open a second discussion on the issue?
Cheers,
Jasmin -
Tables with buttons to add rows
Morning
I am having trouble with a table that has buttons to add, show, and hide rows. There are 2 rows in my table which I need to be able to add, depending on which button is clicked. I've managed to get the first button to add an instance using Table.Row1.instanceManager.addInstance(1); but I can't get row 2 to add to the bottom of the table with a button. I've tried amending the script to fit the second row, but it doesn't work.
Table.Row2.instanceManager.addInstance(2);
I'd appreciate some help
I can send a sample if need be.
Many thanks
Ben
The correct syntax is addInstance(1) (or addInstance(true)).
As long as the row is set to repeat (Object > Binding > Repeat Row for Each Data Item), it should work. If the row doesn't exist yet, then try using the underscore shortcut for the Instance Manager: Table._Row2.addInstance(1); -
Table with a sub-heading row - under only two columns
Hi all,
Using TCS2 on Win 7 64-bit. This is likely a silly question but I can't for the life of me figure it out.
I want to create a table as follows:
-one table, with 4 columns and 5 rows
-I would like the header row to span all 4 columns, but only be divided into 3 pieces, so that in the first row beneath the header, I can further sub-divide the two right-most columns
I've attached a little screen diagram to try and give a sense of what I'm looking to do.
I'm sure this is a simple thing but I just can't figure it out!
Any help is greatly appreciated.
Thanks,
Adriana
adrianaharper wrote:
Hi all,
Using TCS2 on Win 7 64-bit. This is likely a silly question but I can't for the life of me figure it out.
I want to create a table as follows:
-one table, with 4 columns and 5 rows
-I would like the header row to span all 4 columns, but only be divided into 3 pieces, so that in the first row beneath the header, I can further sub-divide the two right-most columns
I've attached a little screen diagram to try and give a sense of what I'm looking to do.
I'm sure this is a simple thing but I just can't figure it out!
Any help is greatly appreciated.
Thanks,
Adriana
Select the two right-most cells in the top header row and choose Table > Straddle.
HTH
Regards,
Peter
Peter Gold
KnowHow ProServices -
How to Capture a Table with large number of Rows in Web UI Test?
HI,
Is there any possibility to capture a DOM table with a large number of rows (say, more than 100) in a Web UI test?
Or is there any bug?
Hi,
You can try following code to capture the table values.
To store the table values in a CSV file:
web.table(xpath_of_table).exportToCSVFile("D:\\exporttable.csv", true);
To store the table values in a string:
String tblValues = web.table(xpath_of_table).exportToCSVString();
info(tblValues);
Thanks
-POPS -
ORA-00904 error while export table with CLOB
All,
I'm trying to export a specific Oracle 9i R2 schema using an Oracle Client 8.0.4, but this error appears. The error is related to tables that have CLOB field types, because schemas whose tables have no CLOB columns can be exported with no error. I've already run the catexp.sql script, but it hasn't solved the problem.
Can anyone help me?
Thanks,
Davi
You can try performing the import of the dump to see if it would work with the 8i client or the 8.0.4 client.
If not, you may not be able to use this method to move data into an 8.0.4 database, which is no longer supported by current tools.
You may then want to try other techniques, like dumping tables into flat files and then using SQL*Loader to load them into 8.0.4.
http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:88212348059#14506966201668
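As a sketch of that flat-file route (table and column names are hypothetical), the data could be spooled out with SQL*Plus:

```sql
-- spool delimited rows out of the source database
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON
SPOOL my_table.dat
SELECT id || '|' || name || '|' || TO_CHAR(created, 'YYYY-MM-DD') FROM my_table;
SPOOL OFF
```

and then reloaded into the 8.0.4 database with a minimal SQL*Loader control file:

```
LOAD DATA
INFILE 'my_table.dat'
INTO TABLE my_table
FIELDS TERMINATED BY '|'
(id, name, created DATE "YYYY-MM-DD")
```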