Load big table (almost 1 billion records)

Hello everybody,
I have a little problem: I'm working in a PRE-PRODUCTION environment (banking sector) and I have created a table with the daily situation of all accounts in the bank. I had to build this table for the first 8 months of 2010 (since 09.2010 this table exists).
In our DBs there is a protocol (which I believe is not only in our case) that re-deploys and overwrites the PRE-PRODUCTION environment (including the DBs) with the PRODUCTION environment. Weekly! That means that everything I have done in this PRE-PRODUCTION DB (packages, tables, etc.) gets overwritten over the weekend.
The package that loads this table is done, but its execution time is huge (almost 24 hours).
This table is used by other reporting applications (still to be deployed), which will take more than 2-3 weeks to develop. That means that (after I have created and loaded this table) I have to export it and re-import it at the beginning of every week in order to use it in the other applications. The problem is that, due to the HUGE number of records in this table, the import time is almost as long as the execution time of the loading procedure. So, at this moment I am in this situation: at the beginning of every week (until the other application is developed, tested and approved) I have to either:
1. load the table by executing the package
or
2. import the table
Both variants take the same time: almost 24 hours.
Are there other possibilities that could help me import or load this table faster? Something like pipelined functions?
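One direction worth testing (my sketch, not something from this thread): pipelined functions mainly help when rows are produced procedurally; for a set-based load, a direct-path parallel insert is usually the bigger win over both a row-by-row PL/SQL load and a conventional import. The target table name and the SELECT below are placeholders for your real load logic, and NOLOGGING/parallel DML need DBA approval in a banking environment:

ALTER SESSION ENABLE PARALLEL DML;

-- Direct-path (APPEND) parallel insert into the target; indexes are best
-- dropped or marked UNUSABLE first and rebuilt afterwards.
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO daily_account_situation t  -- placeholder target
SELECT /*+ PARALLEL(dm, 8) */
       data_contabile, bank_ID, account_ID,
       SUM(acc_DO_balance), SUM(acc_EU_balance), currency_code      -- placeholder logic
  FROM Daily_Movement dm
 GROUP BY data_contabile, bank_ID, account_ID, currency_code;

COMMIT;

For the weekly re-import, if your release supports it, Data Pump (expdp/impdp) with PARALLEL into a pre-created NOLOGGING table is typically much faster than a classic export/import at this volume.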

Create table Daily_Movement --(contains many records per bank/account/day)
(data_contabile date,
bank_ID varchar2(5),
account_ID number,
acc_DO_balance number(20,3),
acc_EU_balance number(20,3),
currency_code varchar2(3));
-- account 11
-- 4 january: total daily movement = 103
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 11, 50, 50, 'EUR');
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 11, 25, 25, 'EUR');
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 11, 28, 28, 'EUR');
-- 5 january: total daily movement = 33
insert into Daily_Movement values (to_date('20100105','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100105','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
-- 6 january: total daily movement = 44
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 11, -44, -44, 'EUR');
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
-- 7 january: total daily movement = 231
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
-- 8 january: total daily movement = 10
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 11, -100, -100, 'EUR');
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 11, -11, -11, 'EUR');
-- 11 january: total daily movement = 33
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 11, -33, -33, 'EUR');
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
-- 12 january: total daily movement = 88
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 11, -55, -55, 'EUR');
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
-- 13 january: total daily movement = 89
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 11, -99, -99, 'EUR');
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
-- 14 january: total daily movement = 22
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 11, -22, -22, 'EUR');
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
-- 15 january: total daily movement = -33
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 11, -44, -44, 'EUR');
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 11, -55, -55, 'EUR');
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
-- 18 january: total daily movement = -110
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 11, -88, -88, 'EUR');
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 11, -99, -99, 'EUR');
-- 19 january: total daily movement = 111
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 11, -11, -11, 'EUR');
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
-- 20 january: total daily movement = 132
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
-- 21 january: total daily movement = 77
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 11, -77, -77, 'EUR');
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
-- 22 january: total daily movement = 210
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
-- 25 january: total daily movement = 55
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 11, -22, -22, 'EUR');
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
-- 26 january: total daily movement = 66
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 11, -66, -66, 'EUR');
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
-- 27 january: total daily movement = 87
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 11, -100, -100, 'EUR');
-- 28 january: total daily movement = 44
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 11, -11, -11, 'EUR');
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
-- 29 january: total daily movement = 55
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 11, -55, -55, 'EUR');
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
-- total january 1347
-- 01 february: total daily movement = 264
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
-- 02 february: total daily movement = 111
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 11, -11, -11, 'EUR');
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
-- 03 february: total daily movement = -66
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 11, -44, -44, 'EUR');
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 11, -55, -55, 'EUR');
-- 04 february: total daily movement = 99
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 11, -66, -66, 'EUR');
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
-- 05 february: total daily movement = 10
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 11, -100, -100, 'EUR');
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
-- 08 february: total daily movement = 99
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
-- 09 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 11, -66, -66, 'EUR');
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
-- 10 february: total daily movement = 287
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
-- 11 february: total daily movement = 22
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 11, -22, -22, 'EUR');
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
-- 12 february: total daily movement = 77
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 11, -44, -44, 'EUR');
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
-- 15 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 11, -99, -99, 'EUR');
-- 16 february: total daily movement = 133
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
-- 17 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 11, -33, -33, 'EUR');
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 11, 55, 55, 'EUR');
-- 18 february: total daily movement = 77
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 11, -77, -77, 'EUR');
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
-- 19 february: total daily movement = 210
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 11, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 11, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
-- 22 february: total daily movement = 99
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 11, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
-- 23 february: total daily movement = -44
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 11, -55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 11, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 11, 77, 77, 'EUR');
-- 24 february: total daily movement = -111
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 11, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 11, -99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 11, -100, 100, 'EUR');
-- 25 february: total daily movement = 22
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 11, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 11, -22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 11, 33, 33, 'EUR');
-- 26 february: total daily movement = 55
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 11, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 11, -55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 11, 66, 66, 'EUR');
-- total february 1542
-- account 12
-- 4 january: total daily movement = 88
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 12, -88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100104','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
-- 5 january: total daily movement = 89
insert into Daily_Movement values (to_date('20100105','yyyymmdd'), 'Bank1', 12, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100105','yyyymmdd'), 'Bank1', 12, -11, 11, 'EUR');
-- 6 january: total daily movement = 99
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100106','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
-- 7 january: total daily movement = -88
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100107','yyyymmdd'), 'Bank1', 12, -77, 77, 'EUR');
-- 8 january: total daily movement = 87
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100108','yyyymmdd'), 'Bank1', 12, -100, 100, 'EUR');
-- 11 january: total daily movement = 66
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 12, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100111','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
-- 12 january: total daily movement = 55
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 12, -55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100112','yyyymmdd'), 'Bank1', 12, 66, 222, 'EUR');
-- 13 january: total daily movement = 88
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 12, 77, 222, 'EUR');
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 12, -88, 222, 'EUR');
insert into Daily_Movement values (to_date('20100113','yyyymmdd'), 'Bank1', 12, 99, 222, 'EUR');
-- 14 january: total daily movement = 67
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 12, 100, 222, 'EUR');
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 12, -11, 222, 'EUR');
insert into Daily_Movement values (to_date('20100114','yyyymmdd'), 'Bank1', 12, -22, 22, 'EUR');
-- 15 january: total daily movement = 132
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100115','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
-- 18 january: total daily movement = 99
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100118','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
-- 19 january: total daily movement = -188
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 12, -99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 12, -100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100119','yyyymmdd'), 'Bank1', 12, 11, 11, 'EUR');
-- 20 january: total daily movement = 11
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100120','yyyymmdd'), 'Bank1', 12, -44, 44, 'EUR');
-- 21 january: total daily movement = 198
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 12, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100121','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
-- 22 january: total daily movement = 111
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 12, -88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100122','yyyymmdd'), 'Bank1', 12, 100, 100, 'EUR');
-- 25 january: total daily movement = -22
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 12, -11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100125','yyyymmdd'), 'Bank1', 12, -33, 33, 'EUR');
-- 26 january: total daily movement = 33
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100126','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
-- 27 january: total daily movement = 264
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100127','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
-- 28 january: total daily movement = -111
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 12, -100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 12, -11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100128','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
-- 29 january: total daily movement = 132
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100129','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
-- total january: 1210
-- 01 february: total daily movement = 77
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 12, 66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 12, -77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100201','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
-- 02 february: total daily movement = -12
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 12, -100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100202','yyyymmdd'), 'Bank1', 12, -11, 11, 'EUR');
-- 03 february: total daily movement = 33
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 12, -33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100203','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
-- 04 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100204','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
-- 05 february: total daily movement = 111
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 12, -88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100205','yyyymmdd'), 'Bank1', 12, 100, 100, 'EUR');
-- 08 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 12, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100208','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
-- 09 february: total daily movement = -33
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 12, -44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 12, -55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100209','yyyymmdd'), 'Bank1', 12, 66, 66, 'EUR');
-- 10 february: total daily movement = 264
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100210','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
-- 11 february: total daily movement = -67
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 12, -100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 12, 11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100211','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
-- 12 february: total daily movement = -66
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 12, -44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100212','yyyymmdd'), 'Bank1', 12, -55, 55, 'EUR');
-- 15 february: total daily movement = -77
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100215','yyyymmdd'), 'Bank1', 12, -88, 88, 'EUR');
-- 16 february: total daily movement = 210
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 12, 100, 100, 'EUR');
insert into Daily_Movement values (to_date('20100216','yyyymmdd'), 'Bank1', 12, 11, 11, 'EUR');
-- 17 february: total daily movement = 99
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 12, 22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100217','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
-- 18 february: total daily movement = 66
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 12, -66, 66, 'EUR');
insert into Daily_Movement values (to_date('20100218','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
-- 19 february: total daily movement = 287
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 12, 88, 88, 'EUR');
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 12, 99, 99, 'EUR');
insert into Daily_Movement values (to_date('20100219','yyyymmdd'), 'Bank1', 12, 100, 100, 'EUR');
-- 22 february: total daily movement = -66
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 12, -11, 11, 'EUR');
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 12, -22, 22, 'EUR');
insert into Daily_Movement values (to_date('20100222','yyyymmdd'), 'Bank1', 12, -33, 33, 'EUR');
-- 23 february: total daily movement = 165
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 12, 44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
insert into Daily_Movement values (to_date('20100223','yyyymmdd'), 'Bank1', 12, 66, 66, 'EUR');
-- 24 february: total daily movement = -110
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 12, 77, 77, 'EUR');
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 12, -88, -88, 'EUR');
insert into Daily_Movement values (to_date('20100224','yyyymmdd'), 'Bank1', 12, -99, -99, 'EUR');
-- 25 february: total daily movement = -133
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 12, -100, -100, 'EUR');
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 12, -11, -11, 'EUR');
insert into Daily_Movement values (to_date('20100225','yyyymmdd'), 'Bank1', 12, -22, -22, 'EUR');
-- 26 february: total daily movement = 44
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 12, 33, 33, 'EUR');
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 12, -44, 44, 'EUR');
insert into Daily_Movement values (to_date('20100226','yyyymmdd'), 'Bank1', 12, 55, 55, 'EUR');
-- total february: 924
commit;
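The daily totals asserted in the comments above can be cross-checked with a simple aggregation (my addition, not part of the original script):

select data_contabile, bank_ID, account_ID,
       sum(acc_DO_balance) as daily_movement
  from Daily_Movement
 group by data_contabile, bank_ID, account_ID
 order by account_ID, data_contabile;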

Similar Messages

  • Very Big Table (36 Indexes, 1000000 Records)

    Hi
    I have a very big table (76 columns, 1,000,000 records). These 76 columns include 36 foreign key columns; each FK has an index on the table, and only one of these FK columns has a value at any given time while all the others are NULL. All the FK columns are of type NUMBER(20,0).
    I am facing a performance problem which I want to resolve, taking into consideration that this table is used with DML (insert, update, delete) as well as query (select) operations; all these operations and queries run daily. I want to improve this table's performance, and I am considering these scenarios:
    1- Replace all these 36 FK columns with 2 columns (ID, TABLE_NAME) (ID for master table ID value, and TABLE_NAME for master table name) and create only one index on these 2 columns.
    2- partition the table using its YEAR column, keep all FK columns but drop all indexes on these columns.
    3- partition the table using its YEAR column, and drop all FK columns, create (ID,TABLE_NAME) columns, and create index on (TABLE_NAME,YEAR) columns.
    Which way is more efficient?
    Do I have to take "master-detail" relations into account when building Forms on this table?
    Are there any other suggestions?
    I am using Oracle 8.1.7 database.
    Please Help.

    Hi everybody
    I would like to thank you for your cooperation, and I will try to answer your questions; but please note that I am a developer in the first place and I am new to Oracle database administration, so please forgive me if I make any mistakes.
    Q: Have you gathered statistics on the tables in your database?
    A: No, I did not. And if I must do it, must I do it for all database tables or only for this big table?
    Q: Actually, tracing the session with 10046 level 8 will give a clearer idea of where your query is waiting.
    A: Actually, I do not know what you mean by "10046 level 8".
    Q: What OS and what kind of server (hardware) are you using?
    A: I am using the Windows 2000 Server operating system; my server has 2 Intel XEON 500MHz CPUs + 2.5GB RAM + 4 x 36GB hard disks (on a RAID 5 controller).
    Q: How many concurrent users do you have and how many transactions per hour?
    A: I have 40 concurrent users and an average of 100 transactions per hour, but the peak can go up to 1000 transactions per hour.
    Q: How fast should your queries be executed?
    A: I want the queries to be executed in about 10 to 15 seconds, or else everybody here will complain. Please note that because this table is heavily used, there is a very good chance for 2 or more transactions to exist at the same time, one of them performing a query and the other performing a DML operation. Some of these queries are used in reports, and they can be long queries (e.g. retrieving the summary of 50,000 records).
    Q: Please show us the explain plan of these queries.
    A: If I understand your question, you are asking me to show you the explain plan of those queries. Well, first, I do not know how, and second, I think it is a big question because I cannot collect all the kinds of queries that have been written against this table (some of them exist in server packages, and the others are issued by Forms or Reports).
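    For reference, the two things asked about above look roughly like this (schema and table names are placeholders; on 8.1.7 the DBMS_STATS options are more limited than on later releases):

    exec dbms_stats.gather_table_stats(ownname => 'SCOTT', tabname => 'BIG_TABLE', cascade => TRUE);

    -- 10046 level 8 = SQL trace including wait events, for the current session
    alter session set events '10046 trace name context forever, level 8';
    -- ... run the slow statement, then:
    alter session set events '10046 trace name context off';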

  • How to UPDATE a big table in Oracle via Bulk Load

    Hi all,
    in a datastore target on Oracle 11g, I have a big table with 300 million records; the structure is one integer key + 10 attribute columns.
    In the IQ source I have the same table with the same size; the structure is one integer key + 1 attribute column.
    What I need to do is UPDATE that single field in Oracle from the values stored in IQ.
    Any idea on how to organize the dataflow and the target writing mode efficiently? Bulk load? API?
    thank you
    Maurizio

    Hi,
    You cannot do a bulk load when you need to UPDATE a field, because all a bulk load does is add records to your table.
    Since you have to UPDATE a field, I would suggest going for SCD with
    source > TC > MO > KG >target
    Arun
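    If the IQ rows can first be staged in Oracle (bulk load into a staging table), the update itself can also be done set-based with a single MERGE; a sketch with hypothetical table and column names:

    merge into target_table t
    using staging_table s        -- the 300M IQ rows, bulk-loaded into Oracle
    on (t.id = s.id)
    when matched then
      update set t.attr1 = s.attr1;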

  • Fetch records from ETL Load control tables in BODS

    Hi,
    Please, can anyone tell me how to fetch the records from the ETL load control tables in BODS?
    (E.g) ETL_BATCH, ETL_JOB, ETL_DATAFLOW, ETL_RECON, ETL_ERROR.
    These are some ETL load tables..
    Thanks,
    Regards,
    Ranjith.

    Hi Ranjith,
    You can ask your administrator for BODS repository login details.
    Once you get login details you will get all the tables.
    Please check following links :
    Data Services Metadata Query Part 1
    http://www.dwbiconcepts.com/etl/23-etl-bods/171-data-services-metadata-query-part-2.html
    http://www.dwbiconcepts.com/etl/23-etl-bods/171-data-services-metadata-query-part-3.html
    I hope this will help.
    If you want more info then please let me know.
    Thanks,
    Swapnil

  • Can't open an IDML file created in CC (10); ID CS6 shuts down almost instantly

    Hello! I can't open an IDML file. The ID file was created in CC (10). It is a 100-page (50 spreads) doc that is one big table. It was created in CC (10) and saved as an IDML file. I have CS6, and when I try to open it, it shuts down ID almost instantly. The file was created on a Mac and I am trying to open it on a Mac. Any and all advice is greatly appreciated, as I am up against a deadline with this unopened file! Many thanks in advance, Diane

    There's a good chance the file is corrupt. Ask whoever sent it to you to verify it opens on their end.

  • Snapshot too old when deleting from a "big" table

    Hello.
    I think this is a basic thing (release 8.1.7.4). I must say I don't know how rollback segments really work.
    A table, where new records are continuously inserted and the old ones can be updated in short transactions, should be purged every day by deleting old records.
    This purge has never been done, and as a result the table now has almost 4 million records; when I launch the stored procedure that deletes the old records, I get the "snapshot too old" error because of read consistency.
    If I launch the procedure after stopping the application that inserts and updates in the table, then I don't get the error. I guess the problem is that while the procedure is being executed, other transactions also need to use the rollback segments, so the rollback segment space that the snapshot needs isn't enough. Do you think this is the problem?
    If this is the case, then I suppose the only solution is increasing the size of the only datafile of the only tablespace for my 4 rollback segments. Am I wrong?
    (Three more questions:
    - Could the problem be solved by locking some rollback segments for the snapshot? How could I do that?
    - What is a discrete transaction?
    I'm a developer, not a dba, but don't tell me to ask my dba because it isn't that easy. Thanks in advance.

    "snapshot too old indicates the undo tablespace does not have enough free space for a long running query" what does this mean? why do I get the same error in two different databases, in the first the size of the datafile of the undo tablespace is 2GB whilst in the second it is only 2MB? How can I know how big the datafile has to be?
    One possible solution could be not deleting the whole table at once but only a few records? Would this work? Why when I try "select count(*) from my_table where rownum = 1" I also get "snapshot too old" when other transactions are running.
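    Besides adding rollback/undo space, a common workaround is indeed to delete in small batches so each transaction's undo stays bounded; a sketch with a placeholder purge condition:

    begin
      loop
        delete from my_table
         where created_date < sysdate - 365  -- placeholder purge condition
           and rownum <= 10000;
        exit when sql%rowcount = 0;
        commit;  -- frees the rollback space used by each batch
      end loop;
      commit;
    end;
    /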

  • Gather table stats takes time for big table

    The table has millions of records and is partitioned. When I analyze the table using the following syntax it takes more than 5 hours, and the table has one index.
    I tried with auto sample size and also by changing the estimate percentage value (20, 50, 70, etc.), but the time is almost the same.
    exec dbms_stats.gather_table_stats(ownname=>'SYSADM',tabname=>'TEST',granularity=>'ALL',ESTIMATE_PERCENT=>100,cascade=>TRUE);
    What should I do to reduce the analyze time for big tables? Can anyone help me?

    Hello,
    The behaviour of ESTIMATE_PERCENT may change from one release to another.
    In some releases, when you specify a "too high" (>25%, ...) ESTIMATE_PERCENT, you in fact collect the statistics over 100% of the rows, as in COMPUTE mode; see: Using DBMS_STATS.GATHER_TABLE_STATS With ESTIMATE_PERCENT Parameter Samples All Rows [ID 223065.1].
    For later releases, 10g or 11g, you have the possibility to use the following value:
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE
    In fact, you may use it even in 9.2, but in that release it is recommended to use a specific estimate value.
    Moreover, starting with 10.1 it's possible to schedule the statistics collection by using DBMS_SCHEDULER and to specify a window so that the job doesn't run during production hours.
    So, the answer may depend on the Oracle release and also on the application (SAP, PeopleSoft, ...).
    Best regards,
    Jean-Valentin
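    For example, on 10g/11g the call above becomes (same owner and table, letting DBMS_STATS pick the sample size):

    exec dbms_stats.gather_table_stats(ownname => 'SYSADM', tabname => 'TEST', granularity => 'ALL', estimate_percent => dbms_stats.auto_sample_size, cascade => TRUE);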

  • How to load unicode data files with fixed records lengths?

    Hi!
    To load Unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable-length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
    Problems:
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How to specify LENGTH SEMANTICS CHAR?
    * How to suppress the POSITION definition?
    * How to define a flat file with variable length and how to specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table, then one option is to use an unbound external table and capture the access parameters manually. To create an unbound external table, you can skip the selection of a base file in the external table wizard. Then, when the external table is edited, you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File-to-Oracle external table can also add this clause via an option.
    Cheers
    David
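    For reference, the access parameters David describes might look roughly like this in the external table DDL; the directory object and file name are placeholders, and STRING SIZES ARE IN CHARACTERS is the external-table counterpart of LENGTH SEMANTICS CHAR:

    create table stg_unicode_ext (
      a varchar2(2 char),
      b varchar2(6 char),
      c varchar2(2 char),
      d varchar2(1 char),
      e varchar2(4 char)
    )
    organization external (
      type oracle_loader
      default directory data_dir  -- placeholder directory object
      access parameters (
        records delimited by newline
        characterset UTF8
        string sizes are in characters
        -- fields without POSITION are read sequentially, each starting
        -- where the previous one ends
        fields (a char(2), b char(6), c char(2), d char(1), e char(4))
      )
      location ('unicode.dat')
    );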

  • SQL Loader : Loading multiple tables using same ctl file

    Hi ,
    We tried loading multiple tables using the same ctl file, but the data was not loaded and no errors were thrown.
    The ctl file content is summarised below:
    LOAD DATA
    APPEND INTO TABLE TABLE_ONE
    when record_type ='EVENT'
    TRAILING NULLCOLS
    (
    record_type char TERMINATED BY ',' ,
    EVENT_SOURCE_FIELD CHAR TERMINATED BY ',' ENCLOSED BY '"',
    EVENT_DATE DATE "YYYY-MM-DD HH24:MI:SS" TERMINATED BY ',' ENCLOSED BY '"',
    EVENT_COST INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
    EVENT_ATTRIB_1 CHAR TERMINATED BY ',' ENCLOSED BY '"',
    VAT_STATUS INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
    ACCOUNT_REFERENCE CONSTANT 'XXX',
    bill_date "to_date('02-'||to_char(sysdate,'mm-yyyy'),'dd-mm-yyyy')",
    data_date "trunc(sysdate)",
    load_date_time "sysdate"
    )
    INTO TABLE TABLE_TWO
    when record_type ='BILLSUMMARYRECORD'
    TRAILING NULLCOLS
    (
    RECORD_TYPE char TERMINATED BY ',' ,
    NET_TOTAL INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
    LOAD_DATE_TIME "sysdate"
    )
    INTO TABLE BILL_BKP_ADJUSTMENTS
    when record_type ='ADJUSTMENTS'
    TRAILING NULLCOLS
    (
    RECORD_TYPE char TERMINATED BY ',' ,
    ADJUSTMENT_NAME CHAR TERMINATED BY ',' ENCLOSED BY '"',
    LOAD_DATE_TIME "sysdate"
    )
    INTO TABLE BILL_BKP_CUSTOMERRECORD
    when record_type ='CUSTOMERRECORD'
    TRAILING NULLCOLS
    (
    RECORD_TYPE char TERMINATED BY ',' ,
    GENEVA_CUSTOMER_REF CHAR TERMINATED BY ',' ENCLOSED BY '"',
    LOAD_DATE_TIME "sysdate"
    )
    INTO TABLE TABLE_THREE
    when record_type ='PRODUCTCHARGE'
    TRAILING NULLCOLS
    (
    RECORD_TYPE char TERMINATED BY ',' ,
    PROD_ATTRIB_1_CHRG_DESC CHAR TERMINATED BY ',' ENCLOSED BY '"',
    LOAD_DATE_TIME "sysdate"
    )
    Has anyone faced similar errors, or are we going wrong somewhere?
    Regards,
    Sandipan

    This is the info on the discard in the log file :
    Record 1: Discarded - failed all WHEN clauses.
    Record 638864: Discarded - failed all WHEN clauses.
    While some of the records were loaded for one table.
    Regards,
    Sandipan
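    A classic cause of "failed all WHEN clauses" in multi-table loads of delimited data is that, from the second INTO TABLE clause onwards, SQL*Loader continues scanning the record from where the previous field list stopped, so record_type is no longer where it is expected. The usual fix is to reset the scan with POSITION(1) on the first field of every INTO TABLE clause after the first; a sketch against TABLE_TWO above:

    INTO TABLE TABLE_TWO
    when record_type ='BILLSUMMARYRECORD'
    TRAILING NULLCOLS
    (
    RECORD_TYPE POSITION(1) char TERMINATED BY ',' ,  -- re-read from column 1
    NET_TOTAL INTEGER EXTERNAL TERMINATED BY ',' ENCLOSED BY '"',
    LOAD_DATE_TIME "sysdate"
    )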

  • Loading multiple tables with SQL Loader

    Hi,
    I want to load multiple tables from a single data file using SQL Loader.
    Here's the basic idea of what I want. Let's say I have two tables, table T1 and table T2:
    SQL> desc T1;
    COL1 VARCHAR2(20)
    COL2 VARCHAR2(20)
    SQL> desc T2;
    COL1 VARCHAR2(20)
    COL2 VARCHAR2(20)
    COL3 VARCHAR2(20)
    My data file, test.dat, looks like this:
    AAA|KBA
    BBR|BBCC|CCC
    NNN|BBBN|NNA
    I want to load the first record into T1, and the second and third records into T2. How do I set up my control file to do that?
    Thanks!

    Tough Job
    LOAD DATA
    truncate
    INTO table t1
    when col3 = 'dummy'
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (col1,col2,col3 filler char nullif col3='dummy')
    INTO table t2
    when col3 != 'dummy'
    FIELDS TERMINATED BY '|'
    (col1,col2,col3 nullif col3='dummy')
    This will load table t2 but not t1.
    The T1 filler col3 is not accepting NULLIF; it's difficult to compare columns that are null using a WHEN condition. If I find something I will let you know.
    Can you separate the records into 2 files? A UNIX command could separate the 2-column and 3-column record types for you, and then you can execute 2 control files on them.
    Thanks,
    http://www.askyogesh.com

  • Table.Join/Merge in Power Query takes extremely long time to process for big tables

    Hi,
    I tried to simply merge/inner join two big tables (one has 300,000+ rows after filtering and the other has 30,000+ rows after filtering) in PQ. However, for this simple join operation, PQ took at least 10 minutes to load the preview (I killed the Query Editor after 10 minutes of processing).
    Here's how I did the join job: I first loaded the tables into the workbook, then did the filtering for each table, and finally used the merge function to do the join based on a shared field.
    Did I do anything wrong here? Or is there any way to improve the load efficiency?
    P.S. No custom SQL was used during the process. I was hoping the so-called "Query Folding" could help speed up the process, but it seems it didn't work here.
    Thanks.
    Regards,
    Qilong

    Hi!
    You should import the source tables into Access. This will speed up the work of PQ several times.

  • Extract Data from XML and Load into table using SQL*Loader

    Hi All,
    We have an XML file (sample.xml) which contains credit card transaction information. We have a standard SQL*Loader control file which loads the data from a flat file; the control file is written using the position-based method. Our requirement is to use this control file to load the data into the table from our XML file, but we need help converting the XML to a flat file, or extracting the data from the XML tags and passing the information to the control file so that it loads the table.
    Your suggestion is highly appreciated.
    Thanks in advance

    Hi,
    First of all, go to PSA maintenance (where you will see the PSA records).
    Go to List ---> Save ---> File ---> Spreadsheet (choose the radio button).
    Give the proper file name where you want to download, and then ---> Generate.
    You will get your PSA data in Excel format.
    Thanks
    Mayank
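    The reply above seems to belong to a different thread. For the actual question, one alternative that skips the flat-file conversion entirely is to read the XML directly with XMLTABLE; the directory object, element and column names below are hypothetical, since the real XML structure wasn't posted:

    insert into cc_transactions (card_no, txn_date, amount)
    select x.card_no,
           to_date(x.txn_date, 'yyyy-mm-dd'),
           x.amount
      from xmltable('/transactions/transaction'
             passing xmltype(bfilename('DATA_DIR', 'sample.xml'),
                             nls_charset_id('AL32UTF8'))
             columns
               card_no  varchar2(20) path 'cardNumber',
               txn_date varchar2(10) path 'txnDate',
               amount   number       path 'amount') x;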

  • Execution of a PL/SQL procedure with CURSOR for big tables

    I have prepared a procedure that uses a CURSOR to run a complex query on tables with a big number of records, something like 900,000. The execution failed with ORA-01652: unable to extend temp segment by 64 in tablespace TEMP.
    Any suggestion?

    This brings us to the following question: how could I calculate the bytes required by a cursor? It is a selection of certain fields of very big tables. Let's say the fields are NUMBER(4), NUMBER(8) and CHAR(2), and they are in 2 relational tables of 900,000 rows each. What size is required for a procedure like this?
    Your help is really appreciated.
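    ORA-01652 on TEMP means the sort/hash work spilled to disk and the temporary tablespace ran out of room, so the options are reducing the sort (select fewer columns, process in smaller ranges) or adding temporary space. Assuming a locally managed temporary tablespace, adding space looks like this (file path and size are placeholders):

    alter tablespace temp add tempfile '/u01/oradata/mydb/temp02.dbf' size 2048m;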

  • Delete 50 Million records from a table with 60 Million records

    Hi,
    I'm using Oracle 9.2.0.7 on Win2k3 32-bit.
    I need to delete 50M rows from a table that contains 60M records. This DB was just passed on to me. I tried to use a DELETE statement but it takes too long. From the articles and forums I have read, the best way to delete that many records is to create a temp table, transfer the needed data into it, drop the big table, then rename the temp table to the big table. But the key here is creating an exact replica of the big table. I have the create table, index and constraint scripts in the export file from my production DB, but I noticed that I don't have a create-grants script; is there a view I could use to get this? Can dbms_metadata get this?
    When I need to create an exact replica of my big table, I only need:
    create table, indexes, constraints, and grants script right? Did I miss anything?
    I just want to make sure that I haven't left anything out. Kindly help.
    Thanks and Best Regards

    Can dbms_metadata get this?
    Yes, dbms_metadata can get the grants.
    YAS@10GR2 > select dbms_metadata.GET_DEPENDENT_DDL('OBJECT_GRANT','TEST') from dual;
    DBMS_METADATA.GET_DEPENDENT_DDL('OBJECT_GRANT','TEST')
      GRANT SELECT ON "YAS"."TEST" TO "SYS"
    When I need to create an exact replica of my big table, I only need:
    create table, indexes, constraints, and grants script right? Did I miss anything?
    There are also triggers, foreign keys referencing this table (which will not permit you to drop the table unless you take care of them), snapshot logs on the table, snapshots based on the table, etc...
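    Putting the advice together, the keep-and-swap approach is roughly the following sketch (the predicate and names are placeholders; indexes, constraints, grants, triggers and any referencing foreign keys must be handled from your scripts before the swap is complete):

    create table big_table_keep nologging as
    select * from big_table
     where created_date >= sysdate - 365;  -- placeholder: the ~10M rows to keep

    -- re-create indexes, constraints, grants and triggers on big_table_keep here

    drop table big_table;  -- disable/drop referencing FKs first
    rename big_table_keep to big_table;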

  • Populating a load-status table from SQL*Loader

    Hello,
    I am using SQL*Loader to load a table, and once the load is done I am supposed to populate a status table which captures SYSDATE and the total number of rows I loaded into the other table.
    Can anybody help me?
    Thanks

    BTW, would the load-status table take the error-record count as well as the load count?
    Sorry, I missed that earlier!
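    One simple way, assuming the target table is truncated before each run: execute a small insert right after sqlldr finishes (table and column names are hypothetical; the rejected-record count can be read from the SQL*Loader log or counted from the bad file):

    insert into load_status (load_date, loaded_rows)
    select sysdate, count(*) from target_table;
    commit;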
