Unloading data with ORACLE_DATAPUMP external tables: performance
Hi,
I archive some tables using external tables with the ORACLE_DATAPUMP driver.
How can I increase the performance of the table creation?
I have tried PARALLEL, and I would like to know whether any other parameters or init parameters can improve performance.
Thanks for your feedback.
Elodie
Hi,
take a look here:
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_perf.htm
You will find some hints.
Acr
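To anchor the discussion, here is the general shape of an ORACLE_DATAPUMP unload with PARALLEL; a minimal sketch, assuming a directory object DUMP_DIR and a source table EMP (both hypothetical):

```sql
-- Unload EMP into two dump files in one statement.
-- The effective degree of parallelism is capped by the number of
-- files in LOCATION, so list one file per parallel slave.
CREATE TABLE emp_archive
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY dump_dir
  LOCATION ('emp1.dmp', 'emp2.dmp')
)
PARALLEL 2
AS
SELECT * FROM emp;
```

Matching the number of dump files to the intended DOP is the main lever here, since each parallel slave writes to its own file; beyond that, most of the time goes into the SELECT itself.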
Similar Messages
-
Uploading data from an external table with a where clause
Hi,
How do I insert data from an external table into a table already created in the database, where person_id on the internal table matches person_id on the external table?
Example:
External table
XX_EXTERNAL_TBL (Person_id, emp_number, emp_name)
Internal table
XX_SQL_LOADER_TEST (Person_id, emp_number, emp_name)
and person_id already exists on the internal table (I only want the last two columns to import where the person ids are the same)
Thanks
INSERT
INTO XX_SQL_LOADER_TEST
SELECT *
FROM XX_EXTERNAL_TBL
WHERE Person_id IN (
SELECT Person_id
FROM XX_SQL_LOADER_TEST
);
SY. -
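One caveat on the reply above: that INSERT will add duplicate rows for person_ids already present. If the intent is instead to copy the other two columns into the existing rows, a MERGE is one option; a sketch using the column names from the post:

```sql
MERGE INTO xx_sql_loader_test t
USING xx_external_tbl e
ON (t.person_id = e.person_id)
WHEN MATCHED THEN UPDATE
  -- only the two non-key columns are brought over
  SET t.emp_number = e.emp_number,
      t.emp_name   = e.emp_name;
```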
How to load external data without using external tables
Hi,
I'm asked to develop an ETL for loading external data into a database but, unfortunately, the version is 8i, so I can't use external tables.
I know that it's possible using SQL*Loader. What I want to know is whether there's another way in 8i to load external data from a text file or a CSV file directly with a stored procedure, without having to write a SQL*Loader control file on the server. In other words, can I do everything from within the database, the same way I would write data from a table to a file using, for example, UTL_FILE? Or is the only option to write SQL*Loader control files directly on the server?
Thanks!
Mark1970 wrote:
Thank you very much Karthick
I didn't know I could use UTL_FILE for reading files as well. I've always used it only for writing data to external files.
Yes, you can use UTL_FILE to read. The version of Oracle you are using I last used in 2004 :) I still remember giving the OS path of the file to open the file. That is long gone now. It's surprising that you are still on 8i. -
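A minimal sketch of reading a delimited file with UTL_FILE on 8i (the path, file name, and parsing are hypothetical; on 8i the directory must be listed in the UTL_FILE_DIR init parameter rather than a directory object):

```sql
DECLARE
  f    UTL_FILE.FILE_TYPE;
  line VARCHAR2(1000);  -- keep under the UTL_FILE line-size limit
BEGIN
  f := UTL_FILE.FOPEN('/data/in', 'data.csv', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file reached
    END;
    -- parse "line" with INSTR/SUBSTR and INSERT into the target table
    NULL;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
```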
External Table Performance and Sizing
Hi,
Can anyone tell me anything about best practices for external tables?
I have an application that writes structured log data to flat files. The size of these files can be configured and when the size limit is reached, they are rolled over. The data itself is queryable via an external table in Oracle. Every so often the data is migrated (materialized) to a normal database table so it can be indexed, etc., and to keep the external file size down.
My questions are:
<ol><li> is there an optimum file size for an external table (overall size / number of rows) - by that, I suppose I mean, is there a limit where performance degrades significantly rather than constantly?
</li>
<li>is it better to have one large file mapped to the external table or multiple smaller ones mapped to the same table? e.g. does oracle do some parallel work on multiple smaller files at the same time which might improve things?
</li>
</ol>
If there are any resources discussing these issues, that would be great - or if there is any performance data for external tables in this respect, I would love to see it.
Many thanks,
Dave
Hi Dave
is there an optimum file size for an external table (overall size / number of rows) - by that, I suppose I mean, is there a limit where performance degrades significantly rather than constantly?
AFAIK there is no such limit. In other words, access time is proportional to the size (number of rows).
is it better to have one large file mapped to the external table or multiple smaller ones mapped to the same table? e.g. does oracle do some parallel work on multiple smaller files at the same time which might improve things?
The DOP of a parallel query on an external table is limited by the number of files. Therefore, to use parallel processing, more than one file is needed.
HTH
Chris Antognini
Troubleshooting Oracle Performance, Apress 2008
http://top.antognini.ch -
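Chris's point about DOP can be seen directly in the DDL; a sketch with four location files, so a parallel query on the table can use up to four slaves (all names hypothetical):

```sql
CREATE TABLE log_ext (
  log_ts VARCHAR2(30),
  msg    VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY log_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
  )
  -- One granule per file: the DOP of a query on this table tops out at 4.
  LOCATION ('log1.txt', 'log2.txt', 'log3.txt', 'log4.txt')
)
REJECT LIMIT UNLIMITED
PARALLEL 4;
```

For the log-rollover scenario above, this suggests keeping several rolled-over files mapped to the table rather than concatenating them into one large file.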
Hi All,
I have a flat file which has a date field that looks like '2007-08-09 19:04:03.597000000'. I have to create an external table to read the file. I tried giving these two date formats, 'yyyy-mm-dd hh24:mi:ss.ff' and 'yyyy-mm-dd hh24:mi:ss.ms', but neither of them works. What should the date format be in the external table definition?
The external table definition is like
CREATE TABLE EXT_TEST
( "ID" NUMBER(10,0),
  "DATE_COL" DATE
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "EXT_TAB_DIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    ( ID,
      "DATE_COL" DATE "yyyy-mm-dd hh24:mi:ss.ff"
    )
  )
  LOCATION ( 'test.dat' )
)
REJECT LIMIT UNLIMITED;
Thanks
Thanks for the help,
I have one more date column and the format is like 1/31/2007 4:03:56 PM. The external table definition looks like
CREATE TABLE EXT_TEST
( "ID" NUMBER(10,0),
  "DATE_COL" DATE
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY "EXT_TAB_DIR"
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    ( ID,
      "DATE_COL" char date_format date mask "M/d/yyyy h:mm:ss tt"
    )
  )
  LOCATION ( 'test.dat' )
)
REJECT LIMIT UNLIMITED;
But this again gives an error. Is this the correct date format? -
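For what it's worth, two things look off in the definitions of this thread: a DATE column cannot hold fractional seconds (the first file needs a TIMESTAMP column), and the masks used are not Oracle format models. A sketch of field clauses that should match the two sample values (the CHAR sizes are guesses):

```sql
-- First file: '2007-08-09 19:04:03.597000000'
-- (table column declared TIMESTAMP, not DATE; FF matches the fraction)
"DATE_COL" CHAR(30) date_format TIMESTAMP mask "YYYY-MM-DD HH24:MI:SS.FF"

-- Second file: '1/31/2007 4:03:56 PM'
-- (12-hour clock with meridian indicator, not "h:mm:ss tt")
"DATE_COL" CHAR(25) date_format DATE mask "MM/DD/YYYY HH12:MI:SS AM"
```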
More than one flat files with same external table
Is it possible to create an external table in OWB associated with more than one file, i.e. to generate code like LOCATION (FILE1, FILE2) in the CREATE TABLE DDL?
Hi,
Yes, you can add multiple files by using the configuration of the external table, and create many instances of 'files'. Every 'file' you specify requires you to specify a location and a file name.
For more details, please review chapter 5 of the user's guide (http://www.oracle.com/technology/documentation/warehouse.html), page 5-15.
Hope this helps.
Mark. -
How to join table type data with a real table
Hello,
I want to join a table type with a real table,
but I don't know how to do this.
Can you help me?
I have this pl/sql:
SET LINESIZE 5000;
DECLARE
valor NUMBER;
TYPE reg_tablita IS RECORD
(cuenta B_TB_CTAFACTU.B_CO_CTAFACTU%TYPE,sistema B_TB_CTAFACTU.B_CO_SISTEMA%TYPE);
TYPE Tabla IS TABLE OF reg_tablita INDEX BY BINARY_INTEGER;
I BINARY_INTEGER;
Array Tabla;
BEGIN
Array(1).cuenta:='000001';
Array(1).sistema:='AA';
Array(2).cuenta:='000002';
Array(2).sistema:='BB';
Array(3).cuenta:='000003';
Array(3).sistema:='CC';
Array(4).cuenta:='000004';
Array(4).sistema:='DD';
Array(5).cuenta:='000005';
Array(5).sistema:='EE';
SELECT count(*) into valor FROM
TABLA A, Array WHERE
B_CO_CTAFACTU=Array.cuenta and B_CO_SISTEMA=Array.sistema;
END;
/
Thanks very much!!!!
You need to have the types defined as SQL types (created in the database) in order to be able to use them for joining. PL/SQL types cannot be joined.
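To make that concrete, a sketch of the same logic with schema-level SQL types and a TABLE() join (type and table names are hypothetical; on older versions you may need CAST(arr AS tabla_t) inside TABLE()):

```sql
-- SQL types created at schema level, visible to the SQL engine:
CREATE TYPE reg_tablita_t AS OBJECT (
  cuenta  VARCHAR2(6),
  sistema VARCHAR2(2)
);
/
CREATE TYPE tabla_t AS TABLE OF reg_tablita_t;
/

DECLARE
  valor NUMBER;
  arr   tabla_t := tabla_t(
           reg_tablita_t('000001', 'AA'),
           reg_tablita_t('000002', 'BB'));
BEGIN
  -- TABLE() turns the collection into a row source that can be joined.
  SELECT COUNT(*) INTO valor
  FROM b_tb_ctafactu a, TABLE(arr) t
  WHERE a.b_co_ctafactu = t.cuenta
    AND a.b_co_sistema  = t.sistema;
END;
/
```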
-
How to upload data with dynamic internal table
Hi,
I have to upload the basic, sales and purchasing view data using a BAPI, depending on the check boxes selected for the views.
I filled the field catalog for the selected views, passed the field catalog structure to a dynamic internal table, and
imported it into the field symbol <fs_data>. This field symbol is assigned to <fs_1>.
Then I called 'GUI_UPLOAD', and when I pass the field symbol to the function module I am getting only the first 10 rows of the file.
The flat file has basic, sales and purchase field data.
E.g. I selected the basic view and the purchase view, and on execution I am getting the basic and sales view data.
How can I get only the basic and purchase data?
hi,
please find the code.
CALL METHOD cl_alv_table_create=>create_dynamic_table
EXPORTING
it_fieldcatalog = lt_fieldcatalog
IMPORTING
ep_table = <fs_data>
EXCEPTIONS
generate_subpool_dir_full = 1
OTHERS = 2.
IF sy-subrc <> 0. "#EC NEEDED
ENDIF.
So <FS_1> now points to our dynamic internal table.
ASSIGN <fs_data>->* TO <fs_1>.
Next step is to create a work area for our dynamic internal table.
CREATE DATA new_line LIKE LINE OF <fs_1>.
A field-symbol to access that work area
ASSIGN new_line->* TO <fs_2>.
ASSIGN new_line->* TO <fs_3>.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = p_f
filetype = 'DAT'
TABLES
data_tab = <fs_1>
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF. -
How to insert date with timestamp into table values
hi,
I have a table
create table abc1(dob date);
insert into abc1 values (to_date(sysdate,'RRRR/MM/DD HH24:MI:SS'))
but when I look in the database it shows as a normal date without the time.
Is it possible to insert it into the back end with the time included?
Thanks.
First, SYSDATE is a DATE already, no need to convert it to a DATE using the TO_DATE() function.
The date ALWAYS has a time component; whether or not it is displayed is up to your NLS settings. For example:
SQL> CREATE TABLE ABC1(DOB DATE);
Table created.
SQL> ALTER SESSION SET NLS_DATE_FORMAT='MM/DD/YYYY';
Session altered.
SQL> INSERT INTO ABC1 VALUES(SYSDATE);
1 row created.
SQL> SELECT * FROM ABC1;
DOB
02/04/2010
SQL> ALTER SESSION SET NLS_DATE_FORMAT='MM/DD/YYYY HH24:MI:SS';
Session altered.
SQL> SELECT * FROM ABC1;
DOB
02/04/2010 12:54:57
SQL> DROP TABLE ABC1;
Table dropped. -
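A side note on the demo above: if sub-second precision is actually needed (a DATE only stores down to whole seconds), a TIMESTAMP column is the alternative; a quick sketch:

```sql
CREATE TABLE abc2 (dob TIMESTAMP);
INSERT INTO abc2 VALUES (SYSTIMESTAMP);
-- TO_CHAR controls the display regardless of NLS settings:
SELECT TO_CHAR(dob, 'MM/DD/YYYY HH24:MI:SS.FF3') FROM abc2;
```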
DATE fields and LOG files in context with external tables
I am facing two problems when dealing with the external tables feature in Oracle 9i.
I created an external table with some fields of the DATE data type. There were no issues during the creation part. But when I query the table, the DATE fields are not properly selected even though the data is there in the files. Are there any ideas on how to deal with this?
My next question is regarding the log files. The contents of the log file keep growing when querying the external tables. Is there a way to control this behaviour?
Suggestions / Advices on the above two issues are welcome.
Thanks
Lakshminarayanan
Hi
If you have date datatypes then:
select
greatest(TABCASER1.CASERRECIEVEDDATE, EVCASERS.FINALEVDATES, EVCASERS.PUBLICATIONDATE, EVCASERS.PUBLICATIONDATE, TABCASER.COMPAREACCEPDATE)
from TABCASER, TABCASER1, EVCASERS
where ...-- join and other conditions
1. greatest is good enough
2. to_date creates a date datatype from a string, using the format of the format string ('mm/dd/yyyy')
3. decode(a, b, c, d) is a function: if a = b then return c, else d. NULL means that there is no data in the cell of the table.
4. to format the date for display, use the to_char function with a format model as in the to_date function.
Ott Karesz
http://www.trendo-kft.hu -
Dynamically Update External Table Data
I loaded a data table from an external csv file. But the csv file is an output from a program that appends a new line to the csv file every couple days.
I made a Report with the loaded data, but in a couple days when a few new lines of data appear in the csv file, I want my report to update as well.
Is there a way to create a "Refresh Button" in my application that will load the data from the csv when pressed?
I have Oracle Database 11g Express with APEX 4.1.1.
Hi,
what you describe could certainly be achieved, but it would have to be designed and built; it is not going to be provided "out of the box". I would probably go with an external table based on your file, plus a process that you would build, callable from the button, which would do some sort of merge of the external table data into your database table. Apart from being callable on demand, the process could also be made a scheduled process that loads the data on a regular basis, say once a day or once an hour, depending on your requirements. Using an external table assumes that the file is available on the database server. If the file is available only on a client machine, an on-demand load process would still be possible, but a little more difficult.
Andre -
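As a sketch of the merge process Andre describes (all object names are hypothetical), the APEX button could call something like:

```sql
-- Pull any new lines from the CSV-backed external table into the
-- real table behind the report; safe to call repeatedly on demand
-- or from a scheduler job.
CREATE OR REPLACE PROCEDURE refresh_report_data IS
BEGIN
  MERGE INTO report_data t
  USING csv_ext e
  ON (t.line_id = e.line_id)
  WHEN NOT MATCHED THEN
    INSERT (line_id, payload)
    VALUES (e.line_id, e.payload);
END;
/
```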
Selecting data from external table
Hi there
I was wondering if somebody could assist me. When I try to select data from an external table, no data is displayed, and in my log file I receive the following error:
KUP-04026: field too long for datatype. Please find my external table script below.
CREATE TABLE DEMO_FILE_EXT
( MACODE NUMBER(7),
  MANO NUMBER(7),
  DEPNO VARCHAR2(2 BYTE),
  DEPTYPE NUMBER(5),
  STARTDATE NUMBER(8),
  ENDDATE NUMBER(8),
  OPTIONSTART NUMBER(8),
  BENEFITSTART NUMBER(8),
  STARTSUSPEND NUMBER(8),
  ENDSUSPEND NUMBER(8),
  INITIALS VARCHAR2(5 BYTE),
  FIRSTNAME VARCHAR2(20 BYTE),
  SURNAME VARCHAR2(25 BYTE),
  STR1 VARCHAR2(30 BYTE),
  STR2 VARCHAR2(30 BYTE),
  STR3 VARCHAR2(30 BYTE),
  STR4 VARCHAR2(30 BYTE),
  SCODE VARCHAR2(6 BYTE),
  POS1 VARCHAR2(30 BYTE),
  POS2 VARCHAR2(30 BYTE),
  POS3 VARCHAR2(30 BYTE),
  POS4 VARCHAR2(30 BYTE),
  PCODE VARCHAR2(6 BYTE),
  TELH VARCHAR2(10 BYTE),
  TELW VARCHAR2(10 BYTE),
  TELC VARCHAR2(10 BYTE),
  IDNUMBER VARCHAR2(13 BYTE),
  DOB NUMBER(8),
  GENDER VARCHAR2(1 BYTE),
  EMPLOYER_CODE VARCHAR2(10 BYTE),
  EMPLOYER_NAME VARCHAR2(900 BYTE)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY DEMO_FILES
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY newline
    BADFILE 'Tinusb.txt'
    DISCARDFILE 'Tinusd.txt'
    LOGFILE 'Tinusl.txt'
    SKIP 1
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    ( MACODE, MANO, DEPNO, DEPTYPE, STARTDATE, ENDDATE,
      OPTIONSTART, BENEFITSTART, STARTSUSPEND, ENDSUSPEND,
      INITIALS, FIRSTNAME, SURNAME, STR1, STR2, STR3, STR4,
      SCODE, POS1, POS2, POS3, POS4, PCODE, TELH, TELW, TELC,
      IDNUMBER, DOB, GENDER, EMPLOYER_CODE, EMPLOYER_NAME
    )
  )
  LOCATION (DEMO_FILES:'Test1.txt')
)
REJECT LIMIT UNLIMITED
LOGGING
NOCACHE
NOPARALLEL;
I have the correct privileges on the directory, but the error seems to be on the EMPLOYER_NAME field. The file I am trying to load is in pipe-delimited format. The last field in the file does not have a pipe delimiter at the end. Can this be the problem? Must I go and look for trailing spaces? Can I specify in the external table script how many characters I need for the employer_name field? We receive this file from an external company.
Thank you very much for the help
Ferdie
Common mistake: you gave the field sizes in the
column listing of the table, but not in the file
definition. Oracle does not apply one to the other.
In the file definition section, give explicit field
sizes.
Hi shoblock
Sorry for only coming back to you now. Thank you for your help, I had to give the explicit field size for the last column (employer name).
Thank you once again!!
Ferdie -
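Concretely, the fix shoblock suggests might look like this fragment inside ACCESS PARAMETERS (only the relevant lines shown, size taken from the column definition):

```sql
FIELDS TERMINATED BY '|'
MISSING FIELD VALUES ARE NULL
( MACODE,
  MANO,
  -- ... remaining fields unchanged ...
  -- ORACLE_LOADER sizes delimited fields to CHAR(255) by default;
  -- a 900-byte employer name overflows that and raises KUP-04026,
  -- so size the field explicitly:
  EMPLOYER_NAME CHAR(900)
)
```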
How to clone data with in Table with dynamic 'n' number of columns
Hi All,
I've a table with syntax,
create table Temp (id number primary key, name varchar2(10), partner varchar2(10), info varchar2(20));
And with data like
insert into temp values (sequence.nextval, 'test', 'p1', 'info for p1');
insert into temp values (sequence.nextval, 'test', 'p2', 'info for p2');
And now, I need to clone the data in the TEMP table with name 'test' under the new name 'test1', and here is my script:
insert into Temp select sequence.nextval id, 'test1' name, partner, info from TEMP where name='test';
this query executed successfully and able to insert records.
The PROBLEM is,
if new columns are added to the TEMP table, this query needs to be updated.
How can I clone the data within the table for any number of columns,
with some columns getting dynamic data and the remaining columns copied from the source rows?
Thanks & Regards
PavanPinnu.
Edited by: pavankumargupta on Apr 30, 2009 10:37 AM
Hi,
Thanks for the quick reply.
My scenario is, we have a Game Details table. Whenever a game gets cloned, we need to add new records to that table for the new game.
As id is the primary key, it should be populated from a sequence (that is what we use in our system), and Game Name will be the new game name. The data for the other columns should be the same as the parent game.
Whenever the business needs change, some new columns will be added to the table.
And the existing query,
insert into Temp (id, name, partner, info) select sequence.nextval id, 'test1' name, partner, info from TEMP where name='test'
will successfully add new rows, but the newly added columns will have empty data.
So, is there any way to do this, I mean, some columns with sequence values and other columns with existing values?
One way we can do it is to get the ResultSet metadata (I'm using Java), parse the columns, and prepare a query in the required format.
I'm looking for alternative ways in query form in SQL.
Thanks & Regards
PavanPinnu.
Edited by: pavankumargupta on Apr 30, 2009 11:05 AM
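One way to keep the column list in step with the table, entirely in the database: build the INSERT ... SELECT dynamically from the dictionary. A sketch, assuming the TEMP table and a sequence called seq (hypothetical name), with id and name overridden and all other columns copied:

```sql
DECLARE
  cols VARCHAR2(4000);
BEGIN
  -- Collect every column except the ones we override; new columns
  -- are picked up automatically the next time this block runs.
  FOR c IN (SELECT column_name
              FROM user_tab_columns
             WHERE table_name = 'TEMP'
               AND column_name NOT IN ('ID', 'NAME')
             ORDER BY column_id) LOOP
    cols := cols || ', ' || c.column_name;
  END LOOP;

  EXECUTE IMMEDIATE
    'INSERT INTO temp (id, name' || cols || ') ' ||
    'SELECT seq.nextval, ''test1''' || cols ||
    ' FROM temp WHERE name = ''test''';
END;
/
```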
How to ''give'' error for this case of an EXTERNAL TABLE?
Our external table routine works fine:
-- We have a csv file with 2 cols.
-- When we create the table referring the csv it works fine.
-- Even if the csv has more the 2 cols, the ET command only takes the 2 cols and it works fine.
-- Now, users are saying that if the csv has more than 2 cols, the ET command should give an error
I went through the command but cannot find any clause which will do this.
Is there any other way or workaround?
CREATE TABLE <table_name> (
<column_definitions>)
ORGANIZATION EXTERNAL
(TYPE oracle_loader
DEFAULT DIRECTORY <oracle_directory_object_name>
ACCESS PARAMETERS (
RECORDS DELIMITED BY newline
BADFILE <file_name>
DISCARDFILE <file_name>
LOGFILE <file_name>
[READSIZE <bytes>]
[SKIP <number_of_rows>
FIELDS TERMINATED BY '<terminator>'
REJECT ROWS WITH ALL NULL FIELDS
MISSING FIELD VALUES ARE NULL
(<column_name_list>))
LOCATION ('<file_name>'))
[PARALLEL]
REJECT LIMIT <UNLIMITED | integer>;
Is it possible to use the READSIZE?
Edited by: Channa on Sep 23, 2010 2:28 AM
-- Now, users are saying that if the csv has more than 2 cols, the ET command should give an error
I went through the command but cannot find any clause which will do this.
Is there any other way or workaround?
I looked at Serverprocess' sql*loader script and did not see how that would answer your question - how to raise an error if the file has more than 2 columns. If I missed something, can Serverprocess explain?
I can't think of a direct way to do this with your external table either, but there may be indirect ways. Some brainstorming ideas of perhaps dubious usefulness follow.
Placing a view over the external table can limit results to the first two columns but won't raise an error.
A pipelined function can read the external table, check for data where there shouldn't be any, and raise an exception when you find data in columns where there should not be any.
Similarly, you could ditch the external table and use utl_file to read the file, manually parsing and checking the data. LOTS more work, but more control on your end. External tables are much easier to use :(
Or, first load the external table into a work table before the "real" select. Check the work table for the offending data programmatically and raise an error if data is where it should not be. You could keep the existing external table and not have to do a lot of recoding.
Or, also load the data into an otherwise unneeded global temporary table first. Use a trigger on the load to look for the unwanted data and raise an error if offending data is there.
These ideas are boiling down to variations on validating the data before you use it.
Good luck! -
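One way to implement the "validate before use" idea above without leaving the external table approach: declare one extra column so a third field on any line becomes visible, then fail loudly if it is ever populated. A sketch with hypothetical names:

```sql
-- "extra" stays NULL for well-formed 2-column lines and receives
-- data whenever a line carries a 3rd delimited field.
CREATE TABLE csv_ext (
  col1  VARCHAR2(100),
  col2  VARCHAR2(100),
  extra VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;

-- Before using the data, raise an error if any line had extra columns:
DECLARE
  n NUMBER;
BEGIN
  SELECT COUNT(*) INTO n FROM csv_ext WHERE extra IS NOT NULL;
  IF n > 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      n || ' line(s) in the file have more than 2 columns');
  END IF;
END;
/
```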
External tables in Oracle 11g database is not loading null value records
We have upgraded our DB from Oracle 9i to 11g.
It was noticed that the data load to external tables in 9i rejected records with null columns. However, after upgrading to 11g, records with null values are allowed.
Is there a way to restrict loading of records that have certain columns that are null?
Can you please share whether this is the expected behaviour in Oracle 11g?
Thanks.
Data isn't really loaded into an external table. Rather, the external table lets you query an external data source as if it were a regular database table. To avoid seeing the rows with NULL values, simply filter those rows out with your SQL statement:
SELECT * FROM my_external_table WHERE colX IS NOT NULL;
HTH,
Brian
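Alternatively, the filtering can be pushed into the external table definition itself: ORACLE_LOADER supports a LOAD WHEN clause (and REJECT ROWS WITH ALL NULL FIELDS) that skips records at read time. A sketch of the relevant fragment, with colX standing in for the column that must not be empty:

```sql
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  -- skip records whose colX field is blank, mimicking the 9i behaviour:
  LOAD WHEN (colX != BLANKS)
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)
```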