Load Partitioned Table using OWB
The instructor of my OWB class said that OWB's partition exchange loading (PEL) requires all of a table's partitions to be in the same tablespace. This obviously is not a good thing for partitioned tables.
Does anyone know if this is true for the current version(s) of OWB (I'm using client 9.0.3.35.3)?
What's your work-around for this problem? SQL*Loader flat-file loading?
David
David,
At the moment there are a number of restrictions for Partition Exchange Loading (PEL). One of them is indeed that all partitions have to be in the same tablespace. Partitioning also has to be on a DATE column.
If you want to implement any other type of PEL, you can do so by writing everything into a procedure (using dynamic SQL). Alternatively, use ordinary loading, which of course is slower.
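The dynamic-SQL workaround could look roughly like this; a minimal sketch assuming a target table SALES partitioned by date and a staging table SALES_STAGE with an identical column/index structure (all object names here are hypothetical, not from the thread):

```sql
-- Hypothetical sketch: exchange a pre-loaded staging table into one
-- partition of the target, bypassing OWB's PEL restrictions.
CREATE OR REPLACE PROCEDURE exchange_partition_load (
  p_partition_name IN VARCHAR2
) AS
BEGIN
  -- Callers should validate p_partition_name against
  -- USER_TAB_PARTITIONS before concatenating it into dynamic SQL.
  -- EXCHANGE PARTITION swaps segment pointers in the dictionary,
  -- so it is fast regardless of data volume.
  EXECUTE IMMEDIATE
    'ALTER TABLE sales EXCHANGE PARTITION ' || p_partition_name ||
    ' WITH TABLE sales_stage INCLUDING INDEXES WITHOUT VALIDATION';
END;
/
```

WITHOUT VALIDATION skips the check that every staged row belongs in the named partition, so it is only safe when the load process guarantees that.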
Mark.
Similar Messages
-
Hi Gurus,
I am new to owb and as per requirement we need to load xml files into oracle table using owb.
below is the xml file:
<bookstore>
<book category="COOKING">
<title lang="en">Everyday Italian</title>
<author>Giada De Laurentiis</author>
<year>2005</year>
<price>30.00</price>
</book>
<book category="CHILDREN">
<title lang="en">Harry Potter</title>
<author>J K. Rowling</author>
<year>2005</year>
<price>29.99</price>
</book>
<book category="WEB">
<title lang="en">Learning XML</title>
<author>Erik T. Ray</author>
<year>2003</year>
<price>39.95</price>
</book>
</bookstore>
please help me in loading the above xml file using owb.
You can leverage the XML SQL functions to extract from XML using the database; see the blog post below:
https://blogs.oracle.com/warehousebuilder/entry/leveraging_xdb
For example to extract information from your XML document the following SQL can be generated from OWB;
select extractValue(value(s), '/book/author'),
extractValue(value(s), '/book/year'),
extractValue(value(s), '/book/price') from
( select XMLType('<bookstore>
<book category="COOKING">
<title lang="en">Everyday Italian</title>
<author>Giada De Laurentiis</author>
<year>2005</year>
<price>30.00</price>
</book>
<book category="CHILDREN">
<title lang="en">Harry Potter</title>
<author>J K. Rowling</author>
<year>2005</year>
<price>29.99</price>
</book>
<book category="WEB">
<title lang="en">Learning XML</title>
<author>Erik T. Ray</author>
<year>2003</year>
<price>39.95</price>
</book>
</bookstore>') adoc from dual ) r,
table(XMLSequence(extract(r.adoc, '/bookstore/book'))) s;
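As a side note, extractValue and XMLSequence are deprecated from Oracle 11.2 onward; an equivalent extraction with XMLTABLE would look like this (sketch only, document shortened to one book):

```sql
-- Same extraction via XMLTABLE, the replacement Oracle recommends
-- for the deprecated extractValue/XMLSequence functions.
SELECT x.author, x.year, x.price
FROM XMLTABLE('/bookstore/book'
       PASSING XMLType('<bookstore>
         <book category="WEB">
           <title lang="en">Learning XML</title>
           <author>Erik T. Ray</author>
           <year>2003</year>
           <price>39.95</price>
         </book>
       </bookstore>')
       COLUMNS author VARCHAR2(100) PATH 'author',
               year   NUMBER        PATH 'year',
               price  NUMBER        PATH 'price') x;
```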
Cheers
David -
SQL Loader : Loading multiple tables using same ctl file
Hi ,
We tried loading multiple tables using the same ctl file but the data was not loaded and no errors were thrown.
The ctl file content is summarised below :
LOAD DATA
APPEND INTO TABLE TABLE_ONE
when record_type ='EVENT'
TRAILING NULLCOLS
record_type char TERMINATED BY ',' ,
EVENT_SOURCE_FIELD CHAR TERMINATED BY ',' ENCLOSED BY '"',
EVENT_DATE DATE "YYYY-MM-DD HH24:MI:SS" TERMINATED BY ',' ENCLOSED BY '"',
EVENT_COST INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
EVENT_ATTRIB_1 CHAR TERMINATED BY ',' ENCLOSED BY '"',
VAT_STATUS INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
ACCOUNT_REFERENCE CONSTANT 'XXX',
bill_date "to_date('02-'||to_char(sysdate,'mm-yyyy'),'dd-mm-yyyy')",
data_date "trunc(sysdate)",
load_date_time "sysdate"
INTO TABLE TABLE_TWO
when record_type ='BILLSUMMARYRECORD'
TRAILING NULLCOLS
RECORD_TYPE char TERMINATED BY ',' ,
NET_TOTAL INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
LOAD_DATE_TIME "sysdate"
INTO TABLE BILL_BKP_ADJUSTMENTS
when record_type ='ADJUSTMENTS'
TRAILING NULLCOLS
RECORD_TYPE char TERMINATED BY ',' ,
ADJUSTMENT_NAME CHAR TERMINATED BY ',' ENCLOSED BY '"',
LOAD_DATE_TIME "sysdate"
INTO TABLE BILL_BKP_CUSTOMERRECORD
when record_type ='CUSTOMERRECORD'
TRAILING NULLCOLS
RECORD_TYPE char TERMINATED BY ',' ,
GENEVA_CUSTOMER_REF CHAR TERMINATED BY ',' ENCLOSED BY '"',
LOAD_DATE_TIME "sysdate"
INTO TABLE TABLE_THREE
when record_type ='PRODUCTCHARGE'
TRAILING NULLCOLS
RECORD_TYPE char TERMINATED BY ',' ,
PROD_ATTRIB_1_CHRG_DESC CHAR TERMINATED BY ',' ENCLOSED BY '"',
LOAD_DATE_TIME "sysdate"
Has anyone faced similar errors, or are we going wrong somewhere?
Regards,
Sandipan
This is the info on the discard in the log file:
Record 1: Discarded - failed all WHEN clauses.
Record 638864: Discarded - failed all WHEN clauses.
While some of the records were loaded for one table.
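A likely cause, though not confirmed in the thread: after the first INTO TABLE, SQL*Loader keeps scanning each record from where the previous field list stopped, so the WHEN clauses of later INTO TABLE blocks never see the start of the record. Re-anchoring the first field of each subsequent INTO TABLE with POSITION(1) is the usual fix; a sketch with illustrative table and field names:

```
-- SQL*Loader control-file sketch (names are illustrative).
-- POSITION(1) resets the scan to column 1 for each later INTO TABLE,
-- so its WHEN clause is evaluated against the whole line.
LOAD DATA
APPEND INTO TABLE table_one
WHEN record_type = 'EVENT'
TRAILING NULLCOLS
( record_type CHAR TERMINATED BY ',',
  event_cost  INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"'
)
INTO TABLE table_two
WHEN record_type = 'BILLSUMMARYRECORD'
TRAILING NULLCOLS
( record_type POSITION(1) CHAR TERMINATED BY ',',  -- reset to column 1
  net_total   INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"'
)
```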
Regards,
Sandipan -
ORA-00054 error when loading Oracle table using Data Services
Hello,
we are facing ORA-00054 error when loading Oracle table using BO Data services
(Oracle 10g database, BODS Xi 3.2 SP3)
Test Job performs
1- truncate table
2- load table (tested in standard and bulk load modes)
Scenario when issue happens is:
1- Run loading Job
2- Job ends in error on any Oracle database error
3- When re-running the same Job, Job fails with following error
ORA-00054: resource busy and acquire with NOWAIT specified
It seems that after the first failure, the Oracle session loading the table stays active and locks the table.
To be able to rerun the Job, we are forced to kill the Oracle session manually.
The expected behaviour would be: on error, roll back the modifications made to the table and have BODS stop the Oracle session cleanly.
Can somebody tell me / or point me to any BODS best practice about Oracle error handling to prevent such case?
Thanks in advance
Paul-Marie
The ORA-00054 can occur depending on how the job failed before. If this occurs, you will need the DBA to release the lock on the table in question.
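To find which session still holds the lock, the DBA can query the standard dictionary views; a sketch (the table name is illustrative):

```sql
-- Identify the session holding a lock on the target table,
-- then kill it (DBA privileges required).
SELECT s.sid, s.serial#, s.status, o.object_name
FROM   v$session s
JOIN   v$locked_object l ON l.session_id = s.sid
JOIN   dba_objects o     ON o.object_id  = l.object_id
WHERE  o.object_name = 'MY_TARGET_TABLE';

-- Then, using the values returned above:
-- ALTER SYSTEM KILL SESSION 'sid,serial#' IMMEDIATE;
```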
Or:
AL_Engine.exe on the server creates the lock; you need to kill it or stop it.
This problem occurs when we select the bulk-loading option in Oracle. We also faced the same issue; our admin killed the session, and then everything was alright. -
How to find out the Non-Partitioned Tables using > 2Gb on Oracle
Hi team
how to find out the non-partitioned tables using > 2Gb on Oracle, where the owner is not SYS or SYSTEM
regards
Here's one I made earlier:
set pagesize 999
set linesize 132
col owner format a25
col segment_name format a60
select owner, segment_name, segment_type, (bytes/1024/1024) "MB size"
from dba_segments
where owner not in ('SYS','SYSTEM','XDB','MDSYS','SYSMAN') -- edit for taste
and segment_type = 'TABLE' -- plain tables only; partition segments are 'TABLE PARTITION'
and (bytes/1024/1024) > 2000
order by 4 asc -
Create Partition table using CTAS
Hi there,
Is it possible to create a duplicate partition table from an existing partition table using the CTAS method? If yes, could you explain how? If no, how do I make a duplicate partition table?
Thanks in advance.
Rajesh Marath
Easily:
conn / as sysdba
CREATE TABLESPACE part1
DATAFILE 'c:\temp\part01.dbf' SIZE 50M
BLOCKSIZE 8192
EXTENT MANAGEMENT LOCAL UNIFORM SIZE 256K
SEGMENT SPACE MANAGEMENT AUTO
ONLINE;
CREATE TABLESPACE part2
DATAFILE 'c:\temp\part02.dbf' SIZE 50M
BLOCKSIZE 8192
EXTENT MANAGEMENT LOCAL UNIFORM SIZE 256K
SEGMENT SPACE MANAGEMENT AUTO
ONLINE;
CREATE TABLESPACE part3
DATAFILE 'c:\temp\part03.dbf' SIZE 50M
BLOCKSIZE 8192
EXTENT MANAGEMENT LOCAL UNIFORM SIZE 256K
SEGMENT SPACE MANAGEMENT AUTO
ONLINE;
ALTER USER uwclass QUOTA UNLIMITED ON part1;
ALTER USER uwclass QUOTA UNLIMITED ON part2;
ALTER USER uwclass QUOTA UNLIMITED ON part3;
conn uwclass/uwclass
CREATE TABLE hash_part (
prof_history_id NUMBER(10),
person_id NUMBER(10) NOT NULL,
organization_id NUMBER(10) NOT NULL,
record_date DATE NOT NULL,
prof_hist_comments VARCHAR2(2000))
PARTITION BY HASH (prof_history_id)
PARTITIONS 3
STORE IN (part1, part2, part3);
CREATE TABLE duplicate_hash_part
PARTITION BY HASH (prof_history_id)
PARTITIONS 3
STORE IN (part1, part2, part3) AS
SELECT * FROM hash_part;
Follow the same logic for list and range partitions -
Load Multiple Table using SQLLDR
Hi,
We are trying to Load Multiple Table using SQLLDR in same loading session.
Here's a snippet of our script:
LOAD DATA INFILE '/u07/dd/yyyy_mm_dd.dat'
append INTO TABLE table1
WHEN STORE_NO != ' '
( STORE_NO position(1:5),
  TR_DATE position(6:27) DATE "mm/dd/yy" )
truncate INTO TABLE table2
WHEN STORE_NO != ' '
( STORE_NO position(1:5),
  TR_DATE position(6:27) DATE "mm/dd/yy" )
Now upon execution this gives an error saying truncate shouldn't be present before the 2nd INTO clause. The script runs fine if we do the same operation, say Append / Replace / Truncate, on both tables. My question here is: can we have SQLLDR perform two different operations (Append & Truncate in our case) in one session?
Thanks
Chinmay
PS: Referred to http://www.orafaq.com/wiki/SQL*Loader_FAQ#Can_one_load_data_from_multiple_files.2F_into_multiple_tables_at_once.3F already
No.
Did you read the syntax diagram and the paragraph below it?
"When you are loading a table, you can use the INTO TABLE clause to specify a table-specific loading method (INSERT, APPEND, REPLACE, or TRUNCATE) that applies only to that table. That method overrides the global table-loading method. The global table-loading method is INSERT, by default, unless a different method was specified before any INTO TABLE clauses." -
SQL Loader Exception while loading Partitioned table
Hi,
I have a table EMP with year-wise partitions created on the creation_date column.
Now I am using SQL*Loader to load bulk data from a Java program, but I am getting a SQL*Loader exception. When I drop the partitioning on the table, the same code works fine.
Do I need to do anything extra for the partitioned table?
Please help me.
Thanks
SQL*Loader should produce a log file with error codes in it. Check for that.
-
How to implement "delete" on a table using OWB?
Hi All,
How do I implement a simple delete using OWB?
Assumption:
(1) Table abcd (id number, age number, amt number)
(2) the_number is a variable
(3) the_id is a variable
Want to implement following transformation in OWB?
DELETE FROM ABCD WHERE AMT=0 AND number = the_number AND id = the_id ;
Rgds,
Deepak
We implemented delete mappings and delete flows to be able to reverse a failed load. In my opinion this is a very sound and valid reason for deleting from a data warehouse. If the need is there, it could also be used for deleting old, superfluous data from the data warehouse.
There are a few things to consider: closed records in type II should be opened up (post mapping).
Test, test, test.
It is indeed a bit tricky to realize, but certainly working and possible.
The steps to take are the following:
1) Create a new mapping.
2) Drop the table to delete from onto the mapping (2 times: 1 source, 1 target).
3) Map all fields from source to their corresponding fields in target, except the ones that determine the "where" clause (referred to as filter fields).
4) Either create a select, or a mapping input parameter, which should result in generating the filter values for your delete.
5) Map the above step to the filter fields.
6) Define a delete mapping by altering the target table properties as follows:
6a) Loading Type => Delete
6b) Match by constraint => No constraints
7) Set the properties of each field as follows:
7a) Filter fields: Match column when deleting => Yes
7b) Other fields: Match column when deleting => No
Hope this helps,
Wilco -
Extract Data from XML and Load into table using SQL*Loader
Hi All,
We have a XML file (sample.xml) which contains credit card transaction information. We have a standard SQL*Loader control file which loads the data from a flat file and the control file code is written as position based method. Our requirement is to use this control file as per our requirement(i.e) load the data into the table from our XML file), But we need help in converting the XML to a flat file or Extract the data from the XML tags and pass the information to the control file and in turn it loads the table.
Your suggestion is highly appreciated.
Thanks in advance
Hi,
First of all go to PSA maintanance ( Where you will see PSA records ).
Goto list---> Save-> File---> Spreadsheet (Choose Radio Button)
> Give the proper file name where you want to download and then-----> Generate.
You will get ur PSA data in Excel Format.
Thanks
Mayank -
Create a partitioned table using Round Robin
Hi,
I want to create a partitioned table.
My concern is that if I use round robin now, whether I can create a primary key for this table in the future?
Thanks!
You cannot use Round Robin if your table has primary keys.
But as you must know, we can now create HASH partitioning on tables without a primary key too, so I thought I would rather test once again and comment:
Please find the screenshot below, which confirms that Round Robin cannot be done on a table with primary keys,
whereas on the table without primary keys I am able to use Round Robin. Till here it was fine for me.
Now in the second case, where I created a table with Round Robin and then later added the primary key using an ALTER statement, I am able to have Round Robin on a table with primary keys. (I don't know what that means.)
Column screenshot:
Regards,
Krishna Tangudu -
Loading Flat files with into multiple tables using OWB
Hi,
How to implement the following logic in OWB.
LOAD DATA
INFILE 'myfile.txt'
BADFILE 'myfile.bad'
DISCARDFILE 'myfile.dsc'
APPEND
Into TABLE_A when (1:1) = 'A'
(Col1 Position(1:1) CHAR,
Col2 Position(2:5) CHAR)
Into TABLE_B when (1:1) = 'B'
(Col1 Position(1:1) CHAR,
Col2 Position(2:20) INTEGER EXTERNAL)
Into TABLE_C
when (1:1) = 'C'
(Col1 Position(1:1) CHAR,
Col2 Position(2:20) INTEGER EXTERNAL)
I am using the 10g version of OWB. I tried using the splitter operator.
I am getting the following error when i use the splitter.
An invalid combination of operators prevents the generation of code in a single implementation language (PL/SQL code, or SQL*Loader code, or ABAP code). For example, you may have included a SQL*Loader only operator such as the Data Generator in a mapping with a PL/SQL implementation type. If you designed the mapping to generate PL/SQL code, an invalid combination or sequence of operators prevents the generation of PL/SQL code in any of the operating modes(set based, row based, row based target only). If the mapping contains an operator that generates only PL/SQL output, all downstream dataflow operators must also be implementable by PL/SQL. You can use SQL operators in such a mapping only after loading the PL/SQL output to a target. Detail is as follows:
PL/SQL set based operating mode: Operator trailer_source_txt does not support SQL generation.
PL/SQL row based operating mode: Operator trailer_source_txt does not support SQL and PL/SQL generation.
PL/SQL row based (target only) operating mode: Operator trailer_source_txt does not support SQL and PL/SQL generation.
Both SQL and PL/SQL handlers are not supported by trailer_source_txt as output
SQL*Loader: Operator SPLIT does not support SQL*Loader generation.
ABAP: Operator trailer_source_txt does not support ABAP generation.
Thanks in advance,
VInay
Hi
The splitter can be used in PL/SQL mappings, but if you use a flat file in a mapping, then it will be a SQL*Loader mapping. So I suggest you create a mapping which loads your flat file into a table, and from that table load the data into the three tables with the splitter in a PL/SQL mapping. That is, create two mappings.
Or you can use an external table in a mapping with a splitter.
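The external-table alternative could be sketched like this, reusing the fixed-position layout from the original control file (the directory object and file layout here are illustrative assumptions):

```sql
-- External table over the flat file; OWB can then treat it as an
-- ordinary source in a PL/SQL mapping with a splitter.
CREATE TABLE my_flat_src (
  rec_type CHAR(1),
  payload  VARCHAR2(19)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_data_dir   -- CREATE DIRECTORY ... required first
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS (
      rec_type POSITION(1:1)  CHAR(1),
      payload  POSITION(2:20) CHAR(19)
    )
  )
  LOCATION ('myfile.txt')
)
REJECT LIMIT UNLIMITED;
```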
Ott Karesz
http://www.trendo-kft.hu -
Load Setup table using Sales Document Ranges
Hi,
Is there any way to load the Setup table for Billing using multiple Sales Document ranges or multiple single values, since the field has no option for multiple entries?
I don't want to execute multiple setup-table loads with different ranges.
I have to use the Sales Document number, as it is needed to load the data for a particular period.
Thanks in advance.
SB
Hi SB,
If Sales Document Number is your selection criterion, then as you rightly said, there is no way you can have multiple selections. I don't think you have any option other than loading different, mutually exclusive sales document number ranges. You can schedule these ranges in parallel and save time that way. Hope this makes sense.
Thanks and Regards
Subray Hegde -
Hi All,
I have a unique requirement where the data comes from two different sources; consider the following cases:
- Table A and Table B have common ID's but different information regarding the same entity.
- Table A has the commanding records going into the target table and table B fills in some other details not available in table A.
So in essence I am merging data, which can be accomplished by UPDATE/INSERT. However, I am constrained by the date element: data from table B merges into the target table differently for a given period of time. So what I have done is create a bunch of mappings just to update the table with the logic for the different years, using the UPDATE operator, and the mapping is taking a long time: just to update 122,360 records it has been running for about 2 hours.
Any suggestions on how to better approach this?
Regards,
Malam
Hi Malam,
there is no simple answer to your question. Maybe SQL tuning is the most complex subject for relational databases.
For an example of successful SQL tuning, look at this thread: Hanging deployment
Regards,
Oleg -
Vista, unrecognized partition table use fdisk to fix?
I have a NTFS Vista partition and when I boot into it I get a lot of white type on black screens telling me there is an unknown partition located and to use fdisk to fix.
Vista comes up and works fine but I'd like to get rid of those screens.
Any ideas appreciated ...
Thanks ... Ken
Two years, going back to RC1 in Oct '06, and I've never seen this, nor do I have any idea what you did or have, or whether you have altered or touched the GPT or EFI partitions or a USB device.
And I've got 4 internal NTFS drives, most with two partitions, some flash devices, FireWire 800, along with native SATA external.
And never even seen fdisk mentioned.
Maybe you should click on properties and schedule chkdsk for every partition you can and look in Computer -> Manage.