ROWID - data type export import
Hi all,
I have a table which has a column of data type ROWID. The size of the table is 5GB. Exporting it takes 40 minutes, but the import takes 7 hours to complete.
I feel it is taking more time than it is supposed to. Is it because of the ROWID data type? Is it doing something extra? Can someone throw some light on this?
Thanks in Advance,
Jaggyam
No, it will not be because of the ROWID column data type.
It is probably because of:
- creation of indexes on the table
- creation of constraints (PK/UK/FK/CHECK) on the table
These actions take time, and they do not happen when you export a table.
Btw:
Are you aware that the rowid values in that particular column will (very likely, depending a bit on your case) have been rendered unusable after the import?
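If the column is used that way, one hedged sketch (all table and column names here are hypothetical, not from the original post) of repairing it after an import is to re-derive the stored rowids from a stable business key:

```sql
-- Hypothetical schema: T_AUDIT stores SRC_ROWID for rows of T_SOURCE,
-- with SRC_ID as the stable business key. A ROWID encodes the physical
-- address (file/block/row) of a row, so after an import the rows get new
-- addresses and the stored values no longer point at the right rows.
UPDATE t_audit a
SET    a.src_rowid = (SELECT s.rowid
                      FROM   t_source s
                      WHERE  s.src_id = a.src_id);
COMMIT;
```

This only works if there is a real key to join on; if the rowid column was the only link, the association is lost in the import.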
Similar Messages
-
Hi All,
I pass a rowid from a SELECT statement to an UPDATE statement in PL/SQL, so the update is quicker. There is a bunch of IF/ELSE statements before the update is issued.
The question is: I declared v_rowid as VARCHAR2 and it gives me a numeric error. Should I declare it with the ROWID data type? Do I have to worry about the size?
Thanks for any help
Vissu
I am not sure; for me it is working:
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
SQL> declare
2 r rowid;
3 begin
4 select rowid into r from t where rownum<2;
5 end;
6 /
PL/SQL procedure successfully completed.
SQL>
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
With the Partitioning option
JServer Release 9.2.0.4.0 - Production
SQL> ed
Wrote file afiedt.buf
1 declare
2 r rowid;
3 begin
4 select rowid into r from t1 where rownum<2;
5* end;
SQL> /
PL/SQL procedure successfully completed.
SQL> r
1 declare
2 r rowid;
3 begin
4 select rowid into r from t1 where rownum<2;
5* end;
PL/SQL procedure successfully completed.
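For what it's worth, the select-then-update-by-rowid pattern the original poster describes can be sketched like this (the table and column names are hypothetical); the key point is to declare the variable as ROWID rather than VARCHAR2, and there is no size to worry about:

```sql
DECLARE
  v_rowid ROWID;            -- declare as ROWID, not VARCHAR2; no size needed
  v_sal   emp.sal%TYPE;     -- hypothetical table and columns
BEGIN
  SELECT rowid, sal
    INTO v_rowid, v_sal
    FROM emp
   WHERE empno = 7839;

  -- ... any number of IF/ELSE checks on v_sal here ...

  IF v_sal < 3000 THEN
    UPDATE emp
       SET sal = sal * 1.1
     WHERE rowid = v_rowid;  -- single-row access by rowid, no index lookup
  END IF;
END;
/
```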
SQL>
Message was edited by:
devmiral -
Hi friends,
I need a new data type. The structure is just like a BAPI structure. I would not like to write everything by hand;
I would like to import the XSD of the BAPI.
Is it possible? If yes, how?
Hi,
Hey just checked,
If you import the RFC and then open the Request and Response, then in the menu on the right side,
select TOOLS --> EXPORT XSD. This XSD can then be made an External Definition, and then you can use it as a message type for mapping.
Is this what you need, or am I just missing the point?
Regards
Vijaya -
Using Complex Data Types in Import JavaBean Model
Hi,
I have searched and read forums and weblogs related to the Import JavaBean Model feature,
but I am not clear on how to use complex data types like ArrayList, Collection etc. in my Java bean.
If I use these complex data types in my bean, when creating the model in WDF it displays the complex data elements in a model relation. I don't know how to use this model relation in my WD project.
Can anyone please explain a<b> step by step solution</b> for using a complex data type (used in the bean) in a WD project?
Thanks,
Krishna Kumar
Hi Krishna,
Valery`s blog contains sample code ( http://www.jroller.com/resources/s/silaev/Employees.zip )
Another blogs from this area:
/people/anilkumar.vippagunta2/blog/2005/09/02/java-bean-model-importer-in-web-dynpro
/people/valery.silaev/blog/2005/08/30/javabean-model-import-when-it-really-works
And forum topics:
Import JavaBean Model
Problem Importing JavaBean Model in NetWeaver Developer Studio
Issue on "Import JavaBean Model"
import JavaBean Model: no executable Methods?
JavaBeans Model Import
POLL : JavaBean Model Importer
JavaBean-Model
Invalid Class - Javabean not available for import
WebDynpro Using JavaBean Model ->Please Help
Best regards, Maksim Rashchynski. -
SYSTEM Copy with Data ( SAP Export/Import way)
Hello ,
There is a requirement at my client site to build an SAP system copy from the production system without copying data, but
it should have programs/structures/repository/tables and views etc.
We have thought of building the SAP system the export/import way and then deleting the client from the copied system,
after that running a remote client copy from the source to the target system with the SAP_CUST profile.
But I have heard that with the SAP export/import way we can have the SAP data copy skipped and copy only the structure. If there is any way
of that kind, please help me by letting me know.
Thanks
Deepak Gosain
Hi Deepak,
Kindly refer to the SCN link on the difference between client copy export/import and the remote copy method:
Difference between remote client copy and client Import/Export - Remote client copy steps
BR
SS -
Pre Checks before running Data Pump Export/Import
Hi,
Oracle: 11.2
OS: Windows
Kindly share the pre-checks required for Data Pump export and import that should be followed by a DBA.
Thanks
When you do a tablespace mode export, Data Pump is essentially doing a table mode export of all of the tables in the tablespaces mentioned. So if you have this:
tablespace a contains: table 1, table 2, index 3a (on table 3)
tablespace b contains: index 1a (on table 1), index 2a (on table 2), table 3
and if you expdp tablespaces=a ...
you will get table 1, table 2, index 1a, and index 2a.
My belief is that you will not get table 3 or index 3a. The way I understand the code to work is that you get the tables in the tablespaces you mention and their dependent objects, but not the other way around. You could easily verify this to make sure.
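One way to verify it is to check the dictionary before and after; a sketch (requires DBA views, and the tablespace name 'A' stands in for your real one):

```sql
-- Which tables and indexes physically live in tablespace A:
SELECT segment_name, segment_type
FROM   dba_segments
WHERE  tablespace_name = 'A'
AND    segment_type IN ('TABLE', 'INDEX');

-- And where the dependent indexes of those tables live:
SELECT index_name, tablespace_name
FROM   dba_indexes
WHERE  table_name IN (SELECT segment_name
                      FROM   dba_segments
                      WHERE  tablespace_name = 'A'
                      AND    segment_type = 'TABLE');
```

Comparing the second query's result with what lands in the dump file would confirm whether dependent indexes outside the named tablespace are included.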
Dean -
Oracle 10g - Data Pump: Export / Import of Sequences ?
Hello,
I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from the experienced users.
My question concerns the Data Pump utility and what happens to sequences which were defined in the source database.
I have exported a schema with the following command:
"expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
This worked fine and also the import seemed to work fine with the command:
"impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
It loaded the exported objects directly into the schema of the target database.
BUT:
Something has happened to my sequences. :-(
When I want to use them, all sequences start again with the value "1". Since I have already loaded data with higher values into my tables, I get into trouble with the PKs of these tables, because I sometimes used sequences as primary keys.
My questions are:
1. Did I do something wrong with the Data Pump utility?
2. What is the correct way to export and import sequences so that they keep their actual values?
3. If the behaviour described here is correct, how can I correct the values so that they start again from the last value that was used in the source database?
Thanks a lot in advance for any help concerning this topic!
Best regards
FireFighter
P.S.
It might be that my English does not sound perfect, since it is not my native language. Sorry for that! ;-)
But I hope that someone can understand nevertheless. ;-)
My questions were:
1. Did I do something wrong with the Data Pump utility?
I do not think so. But maybe with the existing schema :-(
2. What is the correct way to export and import sequences so that they keep their actual values?
If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
3. If the behaviour described here is correct, how can I correct the values so that they start again from the last value that was used in the source database?
You can either refresh with the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
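The manual recreation can be sketched like this (the sequence, table, and column names are hypothetical, and 1042 stands in for whatever the first query returns plus one):

```sql
-- Hypothetical names: sequence EMP_SEQ feeds primary key EMPLOYEES.EMP_ID.
-- 1. Find the highest key value already present after the import:
SELECT MAX(emp_id) FROM employees;

-- 2. Drop the stale sequence and recreate it just past that value
--    (replace 1042 with MAX(emp_id) + 1 from the query above):
DROP SEQUENCE emp_seq;
CREATE SEQUENCE emp_seq START WITH 1042 INCREMENT BY 1;
```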
The easier way is to generate a script from the source if you know how to do it -
Unable to send structure data type as import parameter in RFC
I have one RFC to web service scenario.My sending RFC is as follows:
FUNCTION ZIXI_001_EMP_DET.
""Local Interface:
*" IMPORTING
*" VALUE(EMP_DET) TYPE ZIXI_EMP
*" VALUE(OFFER_DT) TYPE ZXI_OFFER_DATA
ENDFUNCTION.
Here ZIXI_EMP is a structure with following details:
EMP_ID INT4 INT4 10 0 Natural Number
NAME STRING128 CHAR 128 0 128 Lowercase and Uppercase Letters
SAL INT4 INT4 10 0 Natural Number
and ZXI_OFFER_DATA is another structure with following details:
REQ_ID STRING128 CHAR 128 0 128 Lowercase and Uppercase Letters
PRICE STRING128 CHAR 128 0 128 Lowercase and Uppercase Letters
QTY STRING128 CHAR 128 0 128 Lowercase and Uppercase Letters
After configuring everything in IR and ID, I called the RFC from R3 system with following data:
EMP_DET:
EMP_ID 15
NAME XYZ
SAL 5.000
OFFER_DT:
REQ_ID A235
PRICE 50
QTY 3
but when I check in sxmb_moni, the payload only contains the first field of each structure, i.e.:
<?xml version="1.0" encoding="UTF-8" ?>
- <rfc:ZIXI_001_EMP_DET xmlns:rfc="urn:sap-com:document:sap:rfc:functions">
- <EMP_DET>
<EMP_ID>15</EMP_ID>
<NAME />
<SAL>0</SAL>
</EMP_DET>
- <OFFER_DT>
<REQ_ID>A235</REQ_ID>
<PRICE />
<QTY />
</OFFER_DT>
</rfc:ZIXI_001_EMP_DET>
Why this is happening I don't know. Is there any constraint in SAP XI 3.0 that we cannot pass values using a structure?
Or does any configuration need to be done?
Please help....
Guess I got it... You should define the structure as a TABLES parameter and not as an IMPORTING parameter...
should be like
FUNCTION ZIXI_001_EMP_DET.
""Local Interface:
*" TABLES
*" EMP_DET STRUCTURE ZIXI_EMP
*" OFFER_DT STRUCTURE ZXI_OFFER_DATA
ENDFUNCTION.
~SaNv...
Edited by: Santhosh Kumar V on Mar 18, 2010 6:35 PM -
Data Manager - export/import
My questions is as follows:
Export from: Oracle 8.1.7 (on Unix)
Import to: Oracle 8.0.5 (Windows 2000)
How should I set the parameters in the Data Manager to make all tables/procedures/triggers etc. come across complete and undamaged?
I have tried many things, but usually the procedures become invalid and no triggers come across.
I have used IMP80 as well for importing, with the same result.
Regards
Olav Torvund
[email protected]
This is exactly why everyone out there should not get too comfortable using OEM to manage their databases. If you know how to do a task using the command line, then by all means do it that way. Use OEM or DBA Studio for things you're not so sure of or if you need a visual to help you do it right. The command line is quicker and more precise because you have total control of what info you're sending to the db.
Others may disagree with the advice I'm giving - which is to NOT use OEM if you know how to do the task via the command line - and further input is welcome. What do others think about this?
Roy -
Data transfer (Export & Import) of Delta Queue between two systems.
We are planning to implement the following scenario for a certain R/3 and BW system.
1. Implement a storage snapshot copy solution for the R/3 database and create a replicated system of production R/3 every day, in order to avoid load on production R/3 from BW directly. That is to say, we will extract business data from the replicated R/3 system and forward it to BW.
2. We need to use the delta queue function in order to feed only the gap data between production R/3 and BW. But, as stated in 1., the source data will be replicated from production R/3 every day, so we cannot keep the delta queue data in R/3.
3. So, we are examining the following method in order to keep the delta data in the replicated R/3 system: after feeding business data from the replicated R/3 to BW, we will extract the following delta queue tables and keep them:
- TRFCQOUT: client-dependent pointer table per queue name and destination
- ARFCSSTATE: link between TRFCQOUT and ARFCSDATA
- ARFCSDATA: compressed data for tRFC/qRFC
Then, the next day, after replicating the R/3 system, we will import that delta data into the replicated R/3 system before starting to extract business data and feed it to BW.
Is such a method realistic? If not, is there any alternative?
Please advise!
Best regards,
I found the following document, and this is the one that I want to implement:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5d51aa90-0201-0010-749e-d6b993c7a0d6
Additionally, I want to execute this scenario repeatedly. Does someone have experience with such an implementation?
Eguchi. -
10G data pumps export / import
Windows XP Pro SP2.
When I run expdp from the command line I get this:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DATA_PUMP_DIR is invalid
What am I missing?
Eric
Maybe you've missed some parameters? expdp always needs some sort of info on what you want to export, e.g.:
expdp hr/hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=y
or
expdp hr/hr PARFILE=exp.par
with parfile contents being:
DIRECTORY=dpump_dir1
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'LOCATIONS', 'REGIONS')"
QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
These examples are from the documentation on Utilities, so you can look up all the possible parameters there. -
Using export/import to migrate data from 8i to 9i
We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utility so that we can keep the current 8i database intact. Also, the 8i and 9i databases will reside on the same machine. Our 8i database size is around 300GB.
We plan to follow below steps :
Export data from 8i
Install 9i
Create tablespaces
Create schema and tables
create user (user used for exporting data)
Import data in 9i
Please let me know if the par file below is correct for the export:
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=y
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE
TRIGGERS=y
TTS_FULL_CHECK=TRUE
Thanks,
Vinod Bhansali
I recommend you change some parameters and remove others:
BUFFER=560000
COMPRESS=y          -- this will give a better storage structure (it is good)
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y            -- this value is the default (it is not necessary)
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y -- start the database in restricted mode and do not set this param
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE -- this value is the default (it is not necessary)
TRIGGERS=y          -- this value is the default (it is not necessary)
TTS_FULL_CHECK=TRUE
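Putting those recommendations together, the trimmed parameter file might look like this (a sketch; file names and sizes are carried over from the original post, so adjust them for your system):

```
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp
FILESIZE=2048GB
FULL=y
INDEXES=y
LOG=export.log
ROWS=y
TTS_FULL_CHECK=TRUE
```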
You can see which parameters are not needed if you run this command:
[oracle@ozawa oracle]$ exp help=y
Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL export entire file (N)
BUFFER size of data buffer OWNER list of owner usernames
FILE output files (EXPDAT.DMP) TABLES list of table names
COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
GRANTS export grants (Y) INCTYPE incremental export type
INDEXES export indexes (Y) RECORD track incr. export (Y)
DIRECT direct path (N) TRIGGERS export triggers (Y)
LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
VOLSIZE number of bytes to write to each tape volume
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
Export terminated successfully without warnings.
[oracle@ozawa oracle]$
Joel Pérez -
hi..
I have a problem creating a data type in XI after importing an IDoc from the R/3 sender.
Please tell me the basic steps for creating a data type in the receiver, i.e. XI.
Thanks in advance
Hi,
You don't need to create a data type after importing the IDoc.
The IDoc itself acts as the message type and message interface.
Check this for creating a data type:
http://help.sap.com/saphelp_nw70/helpdata/en/2d/c0633c3a892251e10000000a114084/frameset.htm
Thanks,
Vijaya. -
Where can I find detailed information about exporting and then importing IFS data
between 2 different computers (database instances)?
Note: the IFS data to export/import contains custom objects, including custom DirectoryUsers.
Valean,
If you look in the 9iFS Administration guide there is a section there on ifsexportcontent and related tools. These give you the ability to export content and also users and groups.
Should be what you are after.
Chris
Serengeti Systems
www.serengeti-systems.com -
Hi experts,
The source database has the data types listed below. Can you please confirm whether there are any restrictions in GoldenGate on replicating data of these types, i.e. whether they are supported or not? DB version: 11.2.0.2, OS: Oracle Exadata machine, OGG 11.2.
TIMESTAMP(6) WITH LOCAL TIME ZONE
UROWID
UNDEFINED
NVARCHAR2
RAW
ROWID
Hi,
About the UNDEFINED data type: do you know what kind of object it belongs to?
I had the same question, but when I queried the database dictionary, the UNDEFINED columns were columns of views with INVALID status.
If this is your case you may prefer to ignore this data type.
About ROWID columns I have the same concern as you. So far I have experimented with an initial load between 2 databases on 11.2.0.3 with identical structure (datafiles, tablespaces), using GoldenGate 11.2.1, and had no problems; the ROWID data type columns loaded correctly.
I'll continue investigating, and if you find out anything, any feedback would be appreciated.
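As a starting point for the investigation, you could inventory which columns actually use the types in question before checking them against the GoldenGate certification matrix; a sketch (the schema name APP_OWNER is hypothetical):

```sql
-- List columns whose data types need checking against the GoldenGate
-- support documentation for your exact version:
SELECT owner, table_name, column_name, data_type
FROM   dba_tab_columns
WHERE  owner = 'APP_OWNER'
AND    data_type IN ('ROWID', 'UROWID', 'UNDEFINED', 'NVARCHAR2', 'RAW',
                     'TIMESTAMP(6) WITH LOCAL TIME ZONE');
```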