Dynamic column support in Bulk Load
Hi,
Is there any solution for the following scenario with bulk insert?
"The CSV file that gets processed for a particular table will change its column order periodically; sometimes columns will be deleted and sometimes new columns will be added. I need to map each CSV column to the corresponding table column when the order changes or columns are removed or added."
Hope that is clear.
~Selva
Is there any other solution?
Nope
Other than using a Script Task to build the data flow programmatically.
See the example here:
http://mahiways4dotnet.blogspot.in/2005/10/how-to-programmatically-create-ssis.html
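Until then, the name-based mapping itself is easy to prototype outside SSIS. Below is a minimal Python sketch (the table column set and file layout are hypothetical): it reads the header row and matches file columns to table columns by name, so reordering, dropped columns, and new columns are all handled.

```python
import csv

# Columns the destination table actually has (hypothetical example).
TABLE_COLUMNS = {"id", "name", "job"}

def map_csv_to_table(path):
    """Yield (columns, values) pairs, matching file columns to table
    columns by header name so column order and additions don't matter."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)  # first row is the header
        known = [c for c in reader.fieldnames if c.lower() in TABLE_COLUMNS]
        new = [c for c in reader.fieldnames if c.lower() not in TABLE_COLUMNS]
        if new:
            # In a real package this would trigger an alert or a schema change.
            print("Columns in file but not in table:", new)
        for row in reader:
            yield known, [row[c] for c in known]
```

In an SSIS Script Task the same idea would drive which source columns map to which destination columns at run time.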
Visakh
Similar Messages
-
RichTable dynamic column support with sorting
I have a table of data that has a list of custom properties which is dynamic. They could be text, integer, etc. I would like to display each custom property as a column in the table with the static list of properties as follows:
Name Data Date CustomProp1 CustomProp2
dataname some data 10/10/2010 my prop1 12345
Since properties are dynamic, my collection for the table does not have a fixed set of columns.
Before I get too far, has anyone had experience doing this such that the columns can be sorted and modified and are not part of a static list? For example, can I somehow override the sort function or what would I put for sortProperty for the column?
So far I plan on my table being a collection of my data objects. When defining the columns, I'll use a forEach to get the list of custom columns; for the data, I would call a custom tag function that gets the value for the property/row pair (not sure if this will work or if there's a better way). However, I'm stuck on sorting for sure.
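Setting the ADF specifics aside, the sorting half of that plan (sort rows by a dynamically chosen custom property) can be sketched generically. A small Python illustration with made-up data; the tuple key pushes rows that lack the property to the end so mixed presence doesn't break the comparison:

```python
# Rows with a fixed part plus a dict of dynamic custom properties.
rows = [
    {"name": "a", "props": {"CustomProp1": "my prop1", "CustomProp2": 12}},
    {"name": "b", "props": {"CustomProp1": "zz",       "CustomProp2": 3}},
]

def sort_by_custom_prop(rows, prop, descending=False):
    """Sort rows by a dynamically named property.

    The key is a tuple: rows missing the property get (True, None) and
    therefore sort after rows that have it, whose values are then compared.
    """
    return sorted(
        rows,
        key=lambda r: (prop not in r["props"], r["props"].get(prop)),
        reverse=descending,
    )
```

A custom sort hook in the table layer would do essentially this with the column's property name as `prop`.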
Any help appreciated.
Thanks,
Kris
Frank,
Okay, you confirmed my plan. For displaying the dynamic column data, can I add a hashtable to the table's model as follows? It sounds like this could work:
<af:forEach ...>
<af:column>
<af:outputText value="#{tablevar.hashTable[foreachvar]}"> </af:outputText>
</af:column>
</af:forEach>
At least this is what I'm going to try...
Actually, my first roadblock is what to put into the sortProperty attribute; or is there a way for me to override it and use my own sort method?
<af:column sortable="true" sortProperty="">
</af:column>
Thank you,
Kris
Edited by: KrisFromOhio on Dec 3, 2008 1:37 PM -
OIM 11g R2: Bulk load utility
Hi all,
I'm going to import user data into OIM 11g R2 using the bulk load utility. My question is: does the bulk load utility support user-defined fields (UDFs)?
Reading the documentation I don't find any reference to UDFs or UDF limitations, but I would like to hear about your experience before starting.
Thanks for your support,
Daniele
It should be supported. Bulk load loads the data directly into the database tables using SQL*Loader, so as long as you have the UDF column in the USR table and have specified it in the CSV file, I believe it should work.
-
Hi,
I have a file where fields are wrapped with ".
=========== file sample
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
==========
I have a .NET method to remove the wrap characters and write out a file without them.
======================
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
======================
The .NET code is here:
========================================
public static string RemoveCharacter(string sFileName, char cRemoveChar)
{
    // Build the output path once, so the file written and the path returned
    // use the same GUID (the original code generated a second GUID on return,
    // so the returned path pointed at a file that was never created).
    string sOutFileName = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
    FileStream objInputFile = null, objOutFile = null;
    try
    {
        objInputFile = new FileStream(sFileName, FileMode.Open);
        objOutFile = new FileStream(sOutFileName, FileMode.Create);
        int nByteRead;
        // Copy every byte except the character being removed.
        while ((nByteRead = objInputFile.ReadByte()) != -1)
        {
            if (nByteRead != (int)cRemoveChar)
                objOutFile.WriteByte((byte)nByteRead);
        }
    }
    finally
    {
        if (objInputFile != null) objInputFile.Close();
        if (objOutFile != null) objOutFile.Close();
    }
    return sOutFileName;
}
==================================
However, when I run the bulk load I get this error:
=======================================
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
==========================================
The BULK INSERT statement is as follows:
=========================================
BULK INSERT Temp
FROM '<file name>'
WITH (FIELDTERMINATOR = ',', KEEPNULLS)
==========================================
Does anybody know what is happening and what needs to be done?
PLEASE HELP
Thanks in advance
Vikram
To load that file with BULK INSERT, use this format file:
9.0
4
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 Latin1_General_CI_AS
3 SQLCHAR 0 0 "\",\"" 2 col2 Latin1_General_CI_AS
4 SQLCHAR 0 0 "\"\r\n" 3 col3 Latin1_General_CI_AS
Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
Or, since you already have a .NET program, use a stored procedure with a table-valued parameter instead. I have an example of how to do this here:
http://www.sommarskog.se/arrays-in-sql-2008.html
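As a side note on the original quote-stripping approach: deleting every `"` byte also destroys any quote legitimately embedded inside a field. A CSV-aware rewrite removes only the wrapping quotes; a sketch follows (in Python rather than .NET, purely as an illustration of the technique):

```python
import csv

def unquote_csv(src_path, dst_path):
    """Rewrite a quoted CSV as an unquoted one, parsing fields properly
    instead of deleting every '"' byte from the stream."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        # QUOTE_MINIMAL re-quotes only fields that actually need it
        # (e.g. ones containing the delimiter).
        writer = csv.writer(dst, quoting=csv.QUOTE_MINIMAL)
        for fields in csv.reader(src):
            writer.writerow(fields)
```

The format file above is still the cleaner fix, since it avoids the rewrite pass entirely.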
Erland Sommarskog, SQL Server MVP, [email protected] -
COLUMN DATA VALUE EXCEEDS-REJECTED WHILE BULK LOADING
Hi,
I have a table like:
CREATE TABLE SMLC_CELLDATA
MCC NUMBER,
MNC NUMBER,
LAC VARCHAR2(10 BYTE),
CELL_ID VARCHAR2(10 BYTE),
CELL_NAME VARCHAR2(500 BYTE),
LAT FLOAT(20),
LON FLOAT(20),
ORIENTATION NUMBER,
OPENING NUMBER,
RANGE NUMBER,
BSIC NUMBER,
ARFCN NUMBER,
HANDOVER VARCHAR2(4000 BYTE),
EIRP NUMBER,
ENVIRONMENT VARCHAR2(50 BYTE)
TABLESPACE SYSTEM
PCTUSED 40
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
FREELISTS 1
FREELIST GROUPS 1
BUFFER_POOL DEFAULT
LOGGING
NOCACHE
NOPARALLEL;
When I try to bulk load data into this table, certain rows are rejected, and it says the column HANDOVER value exceeds the limit.
But the data is:
404-80-101-1021 404-80-101-1022 404-80-101-1023 404-80-101-1101 404-80-101-1103 404-80-101-1131 404-80-101-1132 404-80-101-1133 404-80-101-1151 404-80-101-1153 404-80-101-1161 404-80-101-1163 404-80-101-1322 404-80-101-1392 404-80-101-1393 404-80-120-18312
which is 256 bytes. How can it be rejected?
AEMUNATHAN
Edited by: Aemunathan on Sep 26, 2008 6:13 PM
The bad file contains:
404 80 101 1102 110_Hotel_Madura_H2 10.79805556 78.68110833 120 60 3000 38 77 404-80-101-1021 404-80-101-1022 404-80-101-1023 404-80-101-1101 404-80-101-1103 404-80-101-1131 404-80-101-1132 404-80-101-1133 404-80-101-1151 404-80-101-1153 404-80-101-1161 404-80-101-1163 404-80-101-1322 404-80-101-1392 404-80-101-1393 404-80-120-18312 52 u
404 80 105 9012 901_Tansi_Kumbakonam_H2 10.96108056 79.39166667 120 60 3000 34 75 404-80-105-9011 404-80-105-9013 404-80-105-9032 404-80-105-9051 404-80-105-9052 404-80-105-9091 404-80-105-9092 404-80-105-9093 404-80-105-9132 404-80-105-9133 404-80-105-9181 404-80-105-9183 404-80-105-9233 404-80-105-9282 404-80-121-17013 404-80-121-17033 52 u
404 80 107 13012 1301_Attur_H2 11.59705556 78.59446944 100 60 3000 32 77 404-80-107-13011 404-80-107-13013 404-80-107-13023 404-80-107-13041 404-80-107-13043 404-80-107-13112 404-80-107-13113 404-80-107-13121 404-80-107-13122 404-80-107-13123 404-80-107-13441 404-80-107-13443 404-80-107-13471 404-80-107-13473 404-80-232-32842 404-80-232-32843 52 u
404 80 107 13303 1330_Omallur_H2 11.73888889 78.04583056 210 60 3000 36 69 404-80-107-13101 404-80-107-13102 404-80-107-13241 404-80-107-13301 404-80-107-13302 404-80-107-13313 404-80-107-13322 404-80-107-13352 404-80-107-13423 404-80-204-4051 404-80-204-4052 404-80-204-4053 404-80-204-4072 404-80-204-4073 404-80-207-7201 404-80-207-7203 52 u
404 80 107 13423 1342_Yercaud_Support_H1 11.77083056 78.19860833 240 60 3000 32 63 404-80-107-13071 404-80-107-13072 404-80-107-13073 404-80-107-13091 404-80-107-13092 404-80-107-13093 404-80-107-13102 404-80-107-13132 404-80-107-13171 404-80-107-13173 404-80-107-13301 404-80-107-13302 404-80-107-13303 404-80-107-13311 404-80-107-13313 404-80-107-13322 404-80-107-13351 404-80-107-13352 404-80-107-13353 404-80-107-13421 404-80-107-13422 404-80-136-36102 404-80-204-4021 404-80-204-4031 404-80-204-4051 404-80-204-4073 404-80-204-4261 404-80-204-4263 52 u
404 80 109 15152 1515_Paramathy_Exg_H1U 11.15138889 78.02471944 200 60 3000 32 71 404-80-109-15141 404-80-109-15142 404-80-109-15143 404-80-109-15151 404-80-109-15153 404-80-109-15251 404-80-109-15253 404-80-109-15311 404-80-109-15312 404-80-109-15313 404-80-109-15322 404-80-109-15401 404-80-109-15403 404-80-205-5232 404-80-207-7531 404-80-207-7533 52 u
404 80 109 15172 1517_Andalurgate_H1 11.44916667 78.15916667 80 60 3000 39 68 404-80-109-15011 404-80-109-15053 404-80-109-15161 404-80-109-15163 404-80-109-15171 404-80-109-15173 404-80-109-15181 404-80-109-15182 404-80-109-15183 404-80-109-15193 404-80-109-15261 404-80-109-15333 404-80-109-15421 404-80-109-15430 404-80-109-15463 404-80-207-7101 404-80-207-7102 404-80-207-7162 52 u
404 80 112 20012 2001_Attur_TE_H2D 11.59705556 78.59446944 85 60 1500 32 77 404-80-107-13121 404-80-107-13122 404-80-107-13123 404-80-112-20011 404-80-112-20013 404-80-112-20022 404-80-112-20023 404-80-112-20141 404-80-112-20143 404-80-112-20153 404-80-112-20231 404-80-112-20233 404-80-112-20251 404-80-112-20253 404-80-232-32842 404-80-232-32843 52 u
404 80 115 12222 1222_Venkateswara_Ngr_H1 11.11722222 79.66166667 210 60 3000 35 71 404-80-115-12221 404-80-115-12223 404-80-115-12231 404-80-115-12232 404-80-115-12241 404-80-115-12242 404-80-115-12243 404-80-115-12253 404-80-115-12261 404-80-115-12262 404-80-115-12311 404-80-115-12312 404-80-115-12313 404-80-115-12321 404-80-115-12353 404-80-115-12413 404-80-115-12471 52 u
404 80 121 17011 1701_Eravancheri_H1 10.93305556 79.55416667 40 60 3000 34 82 404-80-105-9112 404-80-105-9113 404-80-105-9163 404-80-105-9742 404-80-115-12242 404-80-115-12243 404-80-115-12252 404-80-115-12403 404-80-115-12473 404-80-121-17012 404-80-121-17013 404-80-121-17041 404-80-121-17043 404-80-121-17131 404-80-121-17133 404-80-121-17182 52 u
404 80 121 17442 1744_Exg_Pattukkottai_H2 10.42666667 79.31891667 130 60 3000 32 76 404-80-121-17111 404-80-121-17113 404-80-121-17162 404-80-121-17401 404-80-121-17402 404-80-121-17403 404-80-121-17422 404-80-121-17423 404-80-121-17431 404-80-121-17433 404-80-121-17441 404-80-121-17443 404-80-121-17461 404-80-121-17472 404-80-121-17522 404-80-121-17523 52 u
404 80 130 30022 Coonoor_MW_2 11.34527778 76.81027778 270 65 3000 47 68 404-80-130-30041 404-80-130-30051 404-80-130-30151 404-80-130-30012 404-80-130-30031 404-80-130-30032 404-80-130-30033 404-80-130-30021 404-80-130-30052 404-80-130-30152 404-80-130-30153 404-80-130-30011 404-80-130-30211 404-80-130-30162 404-80-130-30163 404-80-130-30013 404-80-130-30361 404-80-130-30023 404-80-130-30363 404-80-130-30251 404-80-130-30253 52 u
404 80 130 30093 Forest_Land_Ooty_3 11.395 76.70277778 270 65 3000 44 75 404-80-130-30433 404-80-130-30182 404-80-130-30142 404-80-130-30143 404-80-130-30103 404-80-130-30091 404-80-130-30092 404-80-130-30082 404-80-130-30083 404-80-130-30053 404-80-130-30321 404-80-130-30183 404-80-130-30422 404-80-130-30283 404-80-130-30423 404-80-130-30413 404-80-130-30432 404-80-130-30431 52 u
404 80 130 30163 Vandhisolai_3 11.37222222 76.81861111 220 65 3000 47 74 404-80-130-30161 404-80-130-30162 404-80-130-30153 404-80-130-30011 404-80-130-30012 404-80-130-30151 404-80-130-30073 404-80-130-30022 404-80-130-30033 404-80-130-30051 404-80-130-30361 404-80-130-30013 404-80-130-30303 404-80-130-30383 404-80-130-30023 404-80-130-30203 404-80-130-30253 52 u
404 80 132 32731 Collectorate_IBS TNJ 10.81458 79.60622 0 65 3000 47 74 404-80-121-17071 404-80-121-17091 404-80-121-17093 u
404 80 134 34741 Parambikulam_IBS_1 10.390417 76.774889 0 65 3000 42 63 404-80-132-32443 m
404 80 134 34751 TNG_Palayam_IBS2_1 10.390417 76.774889 0 65 1000 40 63 404-80-134-34141 404-80-134-34651 404-80-134-34652
404 80 149 49011 Tirunelveli MSC Switch 8.72777 77.7044 0 360 200
404 80 149 49012 Tirunelveli MSC Switch 8.72777 77.7044 0 360 200 m
404 80 166 6611 Ranga_Pilllai BSNL Pondy Exchange building 11.56 79.5 0 360 200 404-80-160-60171 m
404 80 204 4261 426_Old_Suramangalam_H2 11.67055556 78.10999722 0 60 3000 39 70 404-80-107-13072 404-80-107-13073 404-80-107-13302 404-80-107-13423 404-80-204-4051 404-80-204-4071 404-80-204-4072 404-80-204-4073 404-80-204-4251 404-80-204-4253 404-80-204-4262 404-80-204-4263 404-80-204-4270 404-80-204-4753 404-80-204-4773 404-80-204-4910 52 u
404 80 205 5232 523_Kodumudi_H2 11.07916667 77.87999722 120 60 3000 33 69 404-80-109-15142 404-80-109-15152 404-80-109-15382 404-80-109-15412 404-80-109-15413 404-80-114-11103 404-80-114-11201 404-80-114-11203 404-80-205-5231 404-80-205-5233 404-80-205-5251 404-80-205-5252 404-80-205-5571 404-80-205-5572 404-80-206-6081 404-80-207-7552 52 u
404 80 205 5392 539_V.P.Palayam_Erode_H1 11.29610833 77.66944444 120 60 3600 33 73 404-80-205-5593 404-80-205-5591 404-80-205-5493 404-80-205-5492 404-80-205-5393 404-80-205-5391 404-80-205-5283 404-80-205-5281 404-80-205-5163 404-80-205-5161 404-80-205-5133 404-80-205-5123 404-80-110-16762 404-80-110-16651 404-80-110-16612 404-80-110-16611 52 u
404 80 232 32842 Thalaivasal_2 11.5698056 78.7496389 120 60 3000 43 80 404-80-107-13042 404-80-107-13441 404-80-107-13471 404-80-107-13473 404-80-2003-22253 404-80-232-32841 404-80-232-32843 u
Each row ends with "u".
Edited by: Aemunathan on Sep 26, 2008 9:02 PM -
Hi All,
I was looking at options for bulk uploading and extracting supporting detail into/from Hyperion Planning. The version is 4.0.2.2.
Smartview??
Thanks
You will need to use SQL to bulk load and/or extract supporting detail.
The relevant tables in your specific Planning application database are below:
Table name: HSP_COLUMN_DETAIL
Description: One row per line item detail will exist in this table. This table stores unique numbers which relate to the dimension names and also the detail_id which is a value to find the specific cell detail.
Table name: HSP_OBJECT
Description: This table is used to translate DIM1 – DIM22 ID’s into member names. Object_name is the member name. If the member has an alias the HSP_ALIAS table must be queried.
Table name: HSP_COLUMN_DETAIL_ITEM
Description: One row per line item detail will exist in this table. This table stores unique numbers which relate to the dimension names and also the detail_id which is a value to find the specific cell detail.
Regards,
-John -
Hi there
Just wanted to know whether it is a bug or a feature: if a column store table has lowercase letters in its name, bulk load does not work and performance is ruined?
Mike
It looks like we're having performance issues here, and bulk load is failing because of the connection method being used with Sybase RS. If we use standard ODBC then everything is as it should be, but as soon as we switch to the .NET world nothing happens; single inserts/updates are OK.
So, we have an application written in mixed J2EE/.NET, and we use a HANA appliance as the host for tables, procedures and views.
This issue has been sent to support; I will update as soon as I get something from them. -
How to UPDATE a big table in Oracle via Bulk Load
Hi all,
in a datastore target on Oracle 11g, I have a big table with 300 million records; the structure is one integer key + 10 attribute columns.
In the IQ source I have the same table with the same size; the structure is one integer key + 1 attribute column.
What I need to do is UPDATE that single field in Oracle from the values stored in IQ.
Any idea how to organize the dataflow and the target writing mode efficiently? Bulk load? API?
thank you
Maurizio
Hi,
You cannot use bulk load when you need to UPDATE a field, because all a bulk load does is add records to your table.
Since you have to UPDATE a field, I would suggest going for SCD with:
source > TC > MO > KG > target
Arun -
How to get the dynamic columns in UWL portal
Hi All,
I am working on the UWL portal. I am new to UWL. I have downloaded the uwl.standard XML file and customized it to get the values for the "Select a Subview" dropdown, and I am able to see the values in the dropdown. Now my requirement is to get dynamic columns based on the selected dropdown value.
Can anybody suggest how to get dynamic columns in the UWL portal?
Hi Manorama,
1) If you have already created a portal system as mentioned in following blog
/people/marcel.salein/blog/2007/03/14/how-to-create-a-portal-system-for-using-it-in-visual-composer
2) If not, then try to create the same. Do not forget to give the alias name.
3) After creating a system, log on to the VC and create one iView.
4) Now click on the "Find Data" button in the list on the right side of the Visual Composer screen.
5) After clicking on the "Find Data" button, it will ask for a system. If you have created your system correctly and the alias name is given properly, then the alias name you specified will appear in that list.
6) Select your system alias name and perform a search.
7) It will display all the BAPIs and RFCs in your system.
8) Select the required BAPI and develop the VC application.
Please let me know if you have any further problems.
Thanks,
Prashant
-
Hello All,
I have a requirement to generate dynamic columns in the PDF.
I have got this working using the code below:
HEADER : <?split-column-header:XXX_TIME?><?split-column-width:@width?><?XXX_TIME_FROM?>
DATA : <?split-column-data:XXX_START?><?EMP_NO?>
The issue I am having is that if I generate the output in PDF with a small amount of data, it works, but I am not sure how much data will come at run time. It can sometimes be 50 or more columns; in that case it truncates the data.
How can I handle this?
Please help.
Thanks
Sk
Increase the page size and printable page size, and reduce the column lengths, so that you can accommodate them in a single row.
BIP will render it in a single row, but we also have the option of wrapping onto the next row.
But even if you keep it in a single row, the page size has to support the length of the single row you are making :)..
So you have to increase the page width to the maximum. -
Importing From Flat File with Dynamic Columns
Hi,
I am using SSIS 2008. I have a folder containing four .txt files; each file has 2 columns (ID, NAME). I loaded the 4 files into one destination, but today I received one more .txt file with 3 columns (ID, NAME, JOB). How can I get notified that a new column has arrived in the source, and how can I create an extra column in my destination table dynamically? Please help me.
Hi Sasidhar,
You need a Script Task to read the names and number of columns in the first row of the flat file each time and store them in a variable, then create a staging table dynamically based on this variable and modify the destination table definition if one or more new columns need to be added, and then use the staging table to load the destination table. I am afraid there is no ready-made script for your scenario; you need some .NET coding experience to achieve your goal. Here is an example you can refer to:
http://www.citagus.com/citagus/blog/importing-from-flat-file-with-dynamic-columns/
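The detection step described above can be prototyped in a few lines: compare the file's header row to the destination's known columns and generate an ALTER TABLE for anything new. A rough Python sketch; the `dbo.Destination` table name and the NVARCHAR(255) type are placeholder assumptions, and real code would read the destination's actual metadata before executing anything:

```python
def schema_sync_statements(header_line, existing_columns, table="dbo.Destination"):
    """Return ALTER TABLE statements for columns present in the file
    header but missing from the destination table (case-insensitive)."""
    file_columns = [c.strip() for c in header_line.split(",")]
    existing = {c.lower() for c in existing_columns}
    # NVARCHAR(255) is a placeholder type; a real loader would infer or
    # look up an appropriate type per column.
    return [
        f"ALTER TABLE {table} ADD [{col}] NVARCHAR(255) NULL"
        for col in file_columns
        if col.lower() not in existing
    ]
```

In the Script Task, the returned statements would be run against the destination before loading from the staging table.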
Regards,
Mike Yin
TechNet Community Support -
Hello,
I have one question regarding bulk loading. I have done a lot of bulk loading.
But my requirement is to call a function that does some DML and returns a reference key so that I can insert into the fact table.
I can't call a function that does DML in a SELECT statement (that raises an error). The other way is an autonomous transaction, which I tried and it works, but performance is very slow.
How do I call this function inside the bulk loading process?
Help!
xx_f is a function using an autonomous transaction.
See my sample code:
declare
  cursor c1 is select a, b, c from xx;
  type l_a is table of xx.a%type;
  type l_b is table of xx.b%type;
  type l_c is table of xx.c%type;
  v_a l_a;
  v_b l_b;
  v_c l_c;
begin
  open c1;
  loop
    fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
    exit when v_a.count = 0;  -- c1%notfound here would skip the last partial batch
    forall i in 1..v_a.count
      insert into xxyy (a, b, c)
      values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
    commit;
  end loop;
  close c1;
end;
I just want to call the xx_f function without an autonomous transaction, but with bulk loading. Please let me know if you need more details.
Thanks
yreddyr
Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
Depending on what it does, an alternative could be something like:
DECLARE
CURSOR c1 IS
SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
TYPE l_a IS TABLE OF whatever xx_f returns;
TYPE l_b IS TABLE OF whatever xx_f returns;
TYPE l_c IS TABLE OF whatever xx_f returns;
v_a l_a;
v_b l_b;
v_c l_c;
BEGIN
OPEN c1;
LOOP
FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
BEGIN
FORALL i IN 1..v_a.COUNT
INSERT INTO xxyy (a, b, c)
VALUES (v_a(i), v_b(i), v_c(i));
END;
EXIT WHEN c1%NOTFOUND;
END LOOP;
CLOSE c1;
END;
John
Using an API to run the Catalog Bulk Load - Items & Price Lists concurrent program
Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with the item loading concurrent program. For one thing, the program is stuck in phase code P (Pending) status.
When I run the same concurrent program using the iProcurement Administration page it runs ok.
Has anyone been able to run this program through the backend? If so, any help is appreciated.
Thanks
Hello S.P,
Basically this is what I am trying to achieve.
1. Create a staging table. The columns available for it are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency.
So basically the user can load item details into the database from an Excel sheet.
2. Use the UTL_FILE API to create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items in iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
3. Use the fnd_request.submit_request API to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists'. This is where I am stuck. The process simply says pending, or comes up with an error saying:
oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
I'm wondering if anyone has used my approach to load items before and if so, have they been successful?
Thank you -
CBO madness after bulk loading
This is an extension of my other recent posts, but I felt it deserved its own space.
I have a table of telephone call records, one row for each telephone call made or received by a customer. Our production table has a 10-field PK that I want to destroy. In my development version, the PK for this table is a compound key on LOC, CUST_NO, YEAR, MONTH, and SEQ_NO. LOC is a char(3), the rest are numbers.
After a bulk load into a new partition of this table, a query with these 5 fields in the where clause chooses a second index. That second index includes LOC, YEAR, MONTH, and two other fields not in the PK nor in the query. The production instance does the same thing, and I was certain that having the 5-field PK would be the magic bullet.
Oracle SQL Developer's autotrace shows a "Filter Predicates" on CUST_NO and SEQ_NO, and then the indexed range scan on the other 3 fields in the second index. Still noteworthy is that a query on just LOC, CUST_NO, YEAR and MONTH does use the PK.
Here are the steps I've taken to test this:
1. Truncate the partition in question
2. Drop old PK constraint/index
3. Create new PK constraint/index
4. Gather table stats with cascade=>TRUE
5. Bulk load data (in this case, 1.96 million rows) into empty partition
6. autotrace select query
7. Write to dizwell in tears
This table also has two other partitions for the past two cycles, each with around 30 million rows.
Yes, gathering table stats again makes things behave as expected, but that takes a fair bit of time. For the meantime we've put an index hint in the application query that was suffering the most.
First, the CBO doesn't actually choose a full table scan, it chooses to use a second index.
Depending on the query, of course. If the CBO thinks a partition is empty, I would suspect that it would find it most efficient to scan the smallest index, and the second index, with fewer columns, would be expected to be smaller. If it thinks they are equally costly, I believe it will use the one that was created first, though I wouldn't want to depend on that sort of behavior.
I've lowered the sample percentage to 10% and set CASCADE to FALSE, and it still takes 45 minutes in production. The staging table was something I was considering.
Are statistics included in partition exchange? I've asked that question before but never saw an answer.
Yes, partition-level statistics will be included. Table-level statistics will be automatically adjusted. From the SQL Reference:
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_3001.htm#i2131250
"All statistics of the table and partition are exchanged, including table, column, index statistics, and histograms. Oracle Database recalculates the aggregate statistics of the table receiving the new partition."
You could also just explicitly set table-level statistics, assuming you don't need too many histograms, possibly gathering statistics for real later on.
Justin -
Error when Bulk load hierarchy data
Hi,
While loading P6 Reporting databases, the following error message appears at the step in charge of bulk loading hierarchy data into ODS.
<04.29.2011 14:03:59> load [INFO] (Message) - === Bulk load hierarchy data into ODS (ETL_LOADWBSHierarchy.ldr)
<04.29.2011 14:04:26> load [INFO] (Message) - Load completed - logical record count 384102.
<04.29.2011 14:04:26> load [ERROR] (Message) - SqlLoaderSQL LOADER ACTION FAILED. [control=D:\oracle\app\product\11.1.0\db_1\p6rdb\scripts\DATA_WBSHierarchy.csv.ldr] [file=D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv]
<04.29.2011 14:04:26> load [INFO] (Progress) - Step 3/9 Part 5/6 - FAILED (-1) (0 hours, 0 minutes, 28 seconds, 16 milliseconds)
Checking the corresponding error log file (see below), I see that some records are indeed rejected. The question is: how can I identify the source of the problem and fix it?
SQL*Loader: Release 11.1.0.6.0 - Production on Mon May 2 09:03:22 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: DATA_WBSHierarchy.csv.ldr
Character Set UTF16 specified for all input.
Using character length semantics.
Byteorder little endian specified.
Data File: D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv
Bad File: DATA_WBSHierarchy.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table WBSHIERARCHY, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
PARENTOBJECTID FIRST * WHT CHARACTER
PARENTPROJECTID NEXT * WHT CHARACTER
PARENTSEQUENCENUMBER NEXT * WHT CHARACTER
PARENTNAME NEXT * WHT CHARACTER
PARENTID NEXT * WHT CHARACTER
CHILDOBJECTID NEXT * WHT CHARACTER
CHILDPROJECTID NEXT * WHT CHARACTER
CHILDSEQUENCENUMBER NEXT * WHT CHARACTER
CHILDNAME NEXT * WHT CHARACTER
CHILDID NEXT * WHT CHARACTER
PARENTLEVELSBELOWROOT NEXT * WHT CHARACTER
CHILDLEVELSBELOWROOT NEXT * WHT CHARACTER
LEVELSBETWEEN NEXT * WHT CHARACTER
CHILDHASCHILDREN NEXT * WHT CHARACTER
FULLPATHNAME NEXT 8000 WHT CHARACTER
SKEY SEQUENCE (MAX, 1)
value used for ROWS parameter changed from 64 to 21
Record 14359: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 14360: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 14361: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27457: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 27458: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27459: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38775: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 38776: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38777: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52411: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 52412: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52413: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 114619: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 114620: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 127921: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 127922: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 164588: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 164589: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 171322: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 171323: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 186779: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 186780: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 208687: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 208688: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 221167: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 221168: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 246951: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 246952: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Table WBSHIERARCHY:
384074 Rows successfully loaded.
28 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 244377 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 384102
Total logical records rejected: 28
Total logical records discarded: 0
Run began on Mon May 02 09:03:22 2011
Run ended on Mon May 02 09:04:07 2011
Elapsed time was: 00:00:44.99
Hi Mandeep,
Thanks for the information.
But it still does not seem to work.
Actually, I have Group ID and Group Name as display fields in the hierarchy table.
Group ID I have mapped directly to Group ID.
I have created a split hierarchy of Group Name and mapped it.
I have also made all the option configurations as per your suggestions, but it still does not work.
Can you please help?
Thanks,
Priya.