Archive rows from one table to another, in batches of N at a time
I am trying to set up an archiving process to move rows out of a large table into an archive table. I am using Oracle 10gR2, and I do not have the Partitioning option. The current process is a simple row-by-row loop (apologies for the layout; I can't see a way of formatting code neatly for posting):
for r in (select ... from table where ...) loop
    insert into arch_table (...) values (r.a, r.b, ...);
    -- error handling
    delete from table where rowid = r.rowid;
    -- error handling
    commit_count := commit_count + 1;
    if commit_count >= N then
        commit;
        commit_count := 0;
    end if;
end loop;
I know this is not a good approach - we're looking at fixing it because it's getting the inevitable "snapshot too old" (ORA-01555) errors, apart from anything else - and I'd like to build something more robust.
I do need to only take N rows at a time - firstly, because we don't have the space to create a big enough undo tablespace to do everything at once, and secondly, because there is no business reason to insist that the archiving is done in a single transaction - it's perfectly acceptable to do "some at a time" and having multiple commits makes the process restartable while at the same time making some progress on each run.
My first thought was to use rownum in the where clause to just do a bulk insert then a bulk delete:
insert into archive_table (...) select ... from table where ... and rownum < N;
delete from table where ... and rownum < N;
commit;
(I'd need some error logging clauses in there to be able to report errors properly, but that's OK).
However, I can't find anything that convinces me that this type of use of rownum is deterministic - that is, the delete will always delete the same rows that the insert inserted (I could imagine different plans for the insert and the delete, which meant that the rows were selected in a different order). I can't think of a way to prove that this couldn't happen.
Alternatively, I could do a bulk collect to select the rows in batches, then do a bulk insert followed by a rowid-based delete. That way there's a single select, so there's no issue of mismatches, but this would potentially use a lot of client memory to hold the row set.
Does anybody have any comments or suggestions on the best way to handle this? I'd prefer a solution along the lines of the first suggestion (use rownum in the where clause) if I could find something I could be sure was reliable. I just have a gut reaction that it "should" be possible to do something like this in a single statement. I've looked briefly at esoteric uses of the merge statement to do the insert/delete in a single statement, but couldn't find anything.
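One pattern that gives the delete and the archive insert the same row set by construction is DELETE ... RETURNING ... BULK COLLECT: the deleted rows are captured as they are deleted, so there is no chance of the two statements selecting different rows. A minimal sketch, assuming hypothetical names (big_table with columns a and b, archive table arch_table, a 60-day cutoff); memory use is bounded by the batch size:

```sql
declare
    type t_a is table of big_table.a%type;
    type t_b is table of big_table.b%type;
    l_a t_a;
    l_b t_b;
begin
    loop
        -- delete one batch and capture the deleted rows in the same statement,
        -- so the archived rows are exactly the deleted rows by construction
        delete from big_table
         where created < sysdate - 60
           and rownum <= 1000
        returning a, b bulk collect into l_a, l_b;

        exit when l_a.count = 0;

        forall i in 1 .. l_a.count
            insert into arch_table (a, b) values (l_a(i), l_b(i));

        -- each batch's delete and insert commit together,
        -- so a crash between batches loses no rows
        commit;
    end loop;
end;
```

If the insert fails, the uncommitted delete rolls back with it, so the batch is safely retryable on the next run.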
It's a problem that seems to come up a lot in discussions, but I have never yet seen a decent discussion of the various tradeoffs. (Most solutions I've seen tend to either suggest "bite the bullet and do it in one transaction and give it enough undo" or "use features of the data (for example, a record ID column) to roughly partition the data into manageable sizes", neither of which would be particularly easy in this case).
Thanks in advance for any help or suggestions.
Paul
Actually, you also have a problem in that you get a PLS-00436 error, because you can't reference individual attributes of a record in a FORALL. (I think this restriction might have been eased in 11g, but as I'm on 10g I have to live with it :-().
However, your code did give me an idea - I can just bulk-select ROWID and then do an insert ... select * where rowid = (the selected rowid). I need to consider how efficient this will be, in a bit more detail, and do some tests, but as I'm doing rowid-based selects it should be reasonably good. Here's some sample code:
-- create table pfm_test as select * from dba_objects;
-- insert into pfm_test select * from pfm_test;
-- insert into pfm_test select * from pfm_test;
-- insert into pfm_test select * from pfm_test;
-- update pfm_test set created = sysdate - (rownum/24);
-- commit;
-- create table pfm_test_archive as select * from pfm_test where 1=0;
create or replace procedure archive_data (days number, batch number) as
    cursor c is select rowid from pfm_test where created < sysdate - days;
    type t_rowid_arr is table of rowid;
    l_rowid_arr t_rowid_arr;
    batch_no number := 0;
begin
    loop
        open c;
        fetch c bulk collect into l_rowid_arr limit batch;
        close c; -- close before the exit test, so the cursor is never left open
        exit when l_rowid_arr.count = 0;
        batch_no := batch_no + 1;
        dbms_output.put_line('Batch ' || batch_no || ': ' || l_rowid_arr.count || ' records');
        forall i in l_rowid_arr.first .. l_rowid_arr.last
            insert into pfm_test_archive
            select * from pfm_test where rowid = l_rowid_arr(i);
        forall i in l_rowid_arr.first .. l_rowid_arr.last
            delete from pfm_test where rowid = l_rowid_arr(i);
        commit;
    end loop;
end;
-- exec archive_data(17000,1000);

Now to look at error handling for FORALL statements...
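For the FORALL error handling, one common pattern is SAVE EXCEPTIONS with SQL%BULK_EXCEPTIONS. A sketch of a fragment, assuming the l_rowid_arr collection and pfm_test table from the procedure above (ORA-24381 is the "error(s) in array DML" exception raised when any element fails):

```sql
declare
    bulk_errors exception;
    pragma exception_init(bulk_errors, -24381);
begin
    -- SAVE EXCEPTIONS lets the remaining array elements proceed
    -- even if some individual deletes fail
    forall i in l_rowid_arr.first .. l_rowid_arr.last save exceptions
        delete from pfm_test where rowid = l_rowid_arr(i);
exception
    when bulk_errors then
        -- each entry reports the failing array index and the Oracle error code
        for k in 1 .. sql%bulk_exceptions.count loop
            dbms_output.put_line('Element ' || sql%bulk_exceptions(k).error_index
                || ' failed: ' || sqlerrm(-sql%bulk_exceptions(k).error_code));
        end loop;
end;
```

The same pattern applies to the FORALL insert; failed elements can be logged and retried in a later run.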
Thanks for the help.
Paul.
Similar Messages
-
Copying table rows from one table to another table form
Hi
I have a problem copying table rows from one table to another table form. On my JSF pages, pressing a command button navigates to another JSF page, which should copy one row to the other table's row. But when I execute this process for the table FORM, it doesn't copy. I wrote the following code under "createRowFromResultSet - overridden for custom java data source support." The code block is:
ViewRowImpl value = super.createRowFromResultSet(qc, resultSet);
try{
AdfFacesContext fct = AdfFacesContext.getCurrentInstance();
Number abc = (Number)fct.getProcessScope().get("___");
value.setAttribute("___",abc);
}catch(Exception ex){System.out.println(ex); }
return value;

A table may be copied with the expdp and impdp utilities.
http://www.oracle.com/technology/products/database/utilities/index.html -
Copying many rows from one table to another
Could anyone tell me the best way to copy many rows (~1,000,000) from one table to another?
I have supplied a snippet of code that is currently being used in our application. I know that this is probably the slowest method to copy the data, but I am not sure of the best way to proceed. I was thinking that using BULK COLLECT would be better, but I do not know what would happen to the ROLLBACK segment if I did this. Also, should I look at disabling the indexes while the copy is taking place, and then re-enable them after it is complete?
Sample of code currently being used:
PROCEDURE Save_Data
IS
CURSOR SCursor IS
SELECT ROWID Row_ID
FROM TMP_SALES_SUMR tmp
WHERE NOT EXISTS
(SELECT 1
FROM SALES_SUMR
WHERE sales_ord_no = tmp.sales_ord_no
AND cat_no = tmp.cat_no
AND cost_method_cd = tmp.cost_method_cd);
BEGIN
FOR SaveRec IN SCursor LOOP
INSERT INTO SALES_ORD_COST_SUMR
SELECT *
FROM TMP_SALES_ORD_COST_SUMR
WHERE ROWID = SaveRec.Row_ID;
RowCountCommit(); -- Performs a Commit for every xxxx rows
END LOOP;
COMMIT;
EXCEPTION
    WHEN OTHERS THEN RAISE; -- placeholder; an empty EXCEPTION section would not compile
END Save_Data;
This type of logic is used to copy data for about 8 different tables, each containing approximately 1,000,000 rows of data.

Your best bet is
Insert into SALES_ORD_COST_SUMR
select * from TMP_SALES_ORD_COST_SUMR;
commit;
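If undo generation and speed are the concern, a direct-path insert is a common variant of the statement above. A sketch, using the same table names; the APPEND hint loads above the high-water mark, bypassing the buffer cache, and on a NOLOGGING table generates minimal redo:

```sql
INSERT /*+ APPEND */ INTO SALES_ORD_COST_SUMR
SELECT * FROM TMP_SALES_ORD_COST_SUMR;

-- a direct-path loaded table cannot be re-queried
-- in the same session until the transaction commits
COMMIT;
```

Disabling or dropping indexes before the load and rebuilding them afterwards is the usual companion step for million-row copies.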
Read this
http://asktom.oracle.com/pls/ask/f?p=4950:8:15324326393226650969::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:5918938803188
VG -
How to Transfer Rows from one table to another in a different Page
Hi Friends,
My problem is; I need to call a custom page as a popup using Java-Script. ( This is because our business users want the multi-select LOV to look and function differently ).
I have a table in the popped-up page; upon a button action I need to close the pop-up and transfer the selected rows back to the base page's table.
( Both the Base Page and the Pop-Up are Custom Pages.)
Please find below the AM code that I call before closing the window using JavaScript.
But the base-page table remains undisturbed. Can you please show me how to do the transfer of records, if possible?
OAViewObjectImpl main_vo = getBasePageTableVO1();
OAViewObjectImpl sel_vo = getPopupPageTableVO1();
int fetchedRowCount = sel_vo.getFetchedRowCount();
RowSetIterator iterator = sel_vo.createRowSetIterator("SelectedRows_Iterator");
if (fetchedRowCount > 0)
iterator.setRangeStart(0);
iterator.setRangeSize(fetchedRowCount);
for (int i = 0; i < fetchedRowCount; i++)
PopupPageTableVORowImpl row = (PopupPageTableVORowImpl)iterator.getRowAtRangeIndex(i);
BasePageTableVORowImpl main_row = (BasePageTableVORowImpl)main_vo.createRow();
if (main_vo.getFetchedRowCount() == 0)
main_vo.setMaxFetchSize(0);
main_vo.setWhereClause(" 1 = 0 ");
main_vo.executeQuery();
main_vo.setCurrentRow(main_vo.last());
main_vo.next();
main_vo.insertRow(main_row);
main_row.setNewRowState(main_row.STATUS_INITIALIZED);
main_vo.setCurrentRow(main_row);
try
main_row.setField1(row.getField1());
main_row.setField2(row.getField2());
main_row.setField3(row.getField3());
catch(JboException _ex)
iterator.closeRowSetIterator();

Thanks Ramkumar. I am able to catch the action after I used formSubmit.
The lines below in processRequest declare the function cszRefreshBase; later, on attaching the function name to the open-window JavaScript call, I get my desired functionality.
StringBuffer stringbuffer = new StringBuffer();
stringbuffer.append("function cszRefreshBase(lovwin, event) ");
stringbuffer.append("{ ");
stringbuffer.append(" if (!lovwin.PopupSL) ");
stringbuffer.append(" return false; ");
stringbuffer.append(" submitForm('DefaultFormName', 0, {'cmePopupEvent':'popupUpdate'}); ");
stringbuffer.append("}");
oapagecontext.putJavaScriptFunction("cszRefreshBaseJS", stringbuffer.toString()); -
Automatic copying row from one table into another one
Hi,
I am looking for some help on how to do the following:
I use Number to track my finances. I have two tables - one for my checking account and the other one for my cash account. When I withdraw cash from my checking account I record a transfer or debit transaction in my checking account table (in the type column of this table I enter "transfer" and in the category column of this table I enter "cash account"); Obviously I have to record a matching transaction in my cash account where the category column shall read "checking account". Both records represent one and the same transaction. In order not to enter this transaction twice I would like to "automate" this process so that once I enter the transaction in either of the two table (checking or cash) the matching entry automatically appears in the other table. Is there any way to do this.
Thank you,
Evgeny

You can use Connection#getMetaData() to retrieve information about the tables and the columns of the table.
After all, it is better to gain information about the table first and then issue a query in the form of "INSERT INTO table1 SELECT * FROM table2", including the eventual column selections and/or data conversions at SQL level. -
How to copy data from one table to another (in other database)
Hi. I would like to copy all rows from one table to another (and not use BC4J). Tables can be in various databases. I have already 2 connections and I am able to browse source table using
ResultSet rset = stmt.executeQuery("select ...");
But I would not like to create a special insert statement for every row. There will be problems with date formats etc. and it will be slow. Can I use the retrieved ResultSet somehow? Maybe with the method insertRow, but how, if the ResultSet is based on a select statement and I want to insert into the target table? Please point me in the right direction. Thanks.

No tools please, it must be a common solution. We succeeded in converting our BC4J application to PostgreSQL, and MSSQL will be next. So we want to write a simple application which could transfer data from our tables between these 3 servers.
-
Moving time-dependant data from one table to another (archiving)
Hello all
I would like to know if there's an easier solution or a "best practice" to move data from one table to another. The context of this issue can be found within "archiving".
More concretely: we have an application that uses several tables to log information to.
These tables are growing like crazy, and we would like to keep only "relevant" data in those tables, so I was thinking about moving data from these tables that have been in there for, say 2 months, to "archiving" tables.
I figured there must be some kind of "best practice" to get this done.
I have already written a procedure that loops the table that has the time indicator and inserts the records from the normal tables into the archive tables (and afterwards delete this data), but it seems to be taking ages to get it done.
Thanks in advance!
Message was edited by:
timschraepen

This has nothing to do with PL/SQL.
You can refer below links:
http://www.lc.leidenuniv.nl/awcourse/oracle/server.920/a96524/c12parti.htm
http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10739/partiti.htm#i1006727 -
Passing Multiple table row from one view to another view
Hi,
How can I pass multiple table rows from one view to another view in Web Dynpro ABAP (Table UI Element)?
Thanks.

Hi Ganesh,
Kindly search before posting; this has been discussed many times.
First create your context in the component controller, and do context mapping in the two views so that you can pass values from one view to other views.
For multiple selection, the table has a "selection mode" property; set it to "multi", and remember the context node's selection cardinality should be 0..n.
So, select any number of rows and, based on some action, call the second view and display the data (I think you know navigation between views).
Please check this for multi-selection:
Re: How to copy data from one node to another or fromone table to another table
for navigation.. check
navigation between the views
Cheers,
Kris. -
MySQL: move data from one table to another
I was wondering if there is a MySQL command that will let me move selected rows of data from one table to another. Both tables have the same columns and declared types (one table is actually an archive table of old data).
Example:
I want to move all data in Table1 where the date is greater than 30 days old to Table2
- so the result should be: copy all rows to Table2 where the date is greater than 30 days old, and delete all data from Table1 that is greater than 30 days old.
Currently I'm doing a three-step process:
1) get all rows that are greater than 30 days old
"SELECT * FROM Table1 WHERE TO_DAYS(NOW()) - TO_DAYS(dateField) > 30"
2) insert data into Table2
while (res.hasNext()) {
    TableData data = ..... // get row
    dataList.add(data);
}
for (int i = 0; i < dataList.size(); i++) {
    pstm.setString.....
    pstm.addBatch();
}
pstm.executeBatch();
3) delete data from Table1
"DELETE FROM Table1 WHERE TO_DAYS(NOW()) - TO_DAYS(dateField) > 30"

For this app, losing a few rows does not impact how we analyze the data.

That's what everyone always tells me too. But 99% of the time they come back and want to know why they cannot balance and/or validate the data between two runs taken only minutes from each other.
I've seen people puzzle over data for days, swearing they ran the exact same utility for their tests; but they were in fact using live data, and additional data had accrued. Since all they had to do was execute the script without parameters (they didn't put in a stop time), they got two different answers, and it always, and I mean always, confuses people. Be safe and put in an option for an end date/time; then, when they waste days trying to figure out why the two observations gave them different numbers, they cannot blame you (because you gave them the option)!
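The stop-time advice can be sketched in SQL: capture one fixed cutoff first, so the INSERT and the DELETE agree on exactly the same row set even while new rows keep arriving. MySQL syntax, using the Table1/Table2/dateField names from the post:

```sql
-- one fixed cutoff shared by both statements
SET @cutoff = DATE_SUB(NOW(), INTERVAL 30 DAY);

INSERT INTO Table2
SELECT * FROM Table1 WHERE dateField < @cutoff;

DELETE FROM Table1 WHERE dateField < @cutoff;
```

With a moving NOW() in each statement, rows arriving between the INSERT and the DELETE could be deleted without ever being archived; the fixed cutoff removes that window.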
My 2 cents for the day... -
Insert old missing data from one table to another(databaase trigger)
Hello,
i want to do two things
1) I want to insert old missing data from one table to another through a database trigger, but it can't be executed that way; I don't know what I should do to copy the old data from table_1 into table_2.
2) Should I use :NEW. or :OLD. instead?
3) What should I do if I have records existing between the two dates? I want to suppress the existing records.
The following code is what I have, but it has no effect.
CREATE OR REPLACE TRIGGER ATTENDANCEE_FOLLOWS
AFTER INSERT ON ACCESSLOG
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
DECLARE
    V_COUNT    NUMBER(2);
    V_TIME_OUT DATE;
    V_DATE_IN  DATE;
    V_DATE_OUT DATE;
    V_TIME_IN  DATE;
    V_ATT_FLAG VARCHAR2(3);
    V_EMP_ID   NUMBER(11);
    CURSOR EMP_FOLLOWS IS
        SELECT EMPLOYEEID, LOGDATE, LOGTIME, INOUT
          FROM ACCESSLOG
         WHERE LOGDATE BETWEEN TO_DATE('18/12/2008','dd/mm/rrrr')
                           AND TO_DATE('19/12/2008','dd/mm/rrrr');
BEGIN
    FOR EMP IN EMP_FOLLOWS LOOP
        SELECT COUNT(*)
          INTO V_COUNT
          FROM EMP_ATTENDANCEE
         WHERE EMP_ID = EMP.EMPLOYEEID
           AND DATE_IN = EMP.LOGDATE
           AND ATT_FLAG = 'I';
        IF V_COUNT = 0 THEN
            INSERT INTO EMP_ATTENDANCEE (EMP_ID, DATE_IN, DATE_OUT, TIME_IN, TIME_OUT, ATT_FLAG)
            VALUES (TO_NUMBER(TO_CHAR(:NEW.EMPLOYEEID, 99999)),
                    TO_DATE(:NEW.LOGDATE, 'dd/mm/rrrr'),  -- DATE_IN
                    NULL,
                    TO_DATE(:NEW.LOGTIME, 'HH24:MI:SS'),  -- TIME_IN
                    NULL, 'I');
        ELSIF V_COUNT > 0 THEN
            UPDATE EMP_ATTENDANCEE
               SET DATE_OUT = TO_DATE(:NEW.LOGDATE, 'dd/mm/rrrr'),  -- DATE_OUT
                   TIME_OUT = TO_DATE(:NEW.LOGTIME, 'HH24:MI:SS'),  -- TIME_OUT
                   ATT_FLAG = 'O'
             WHERE EMP_ID = TO_NUMBER(TO_CHAR(:NEW.EMPLOYEEID, 99999))
               AND DATE_IN <= (SELECT MAX(DATE_IN)
                                 FROM EMP_ATTENDANCEE
                                WHERE EMP_ID = TO_NUMBER(TO_CHAR(:NEW.EMPLOYEEID, 99999))
                                  AND DATE_OUT IS NULL
                                  AND TIME_OUT IS NULL)
               AND DATE_OUT IS NULL
               AND TIME_OUT IS NULL;
        END IF;
    END LOOP;
EXCEPTION
    WHEN OTHERS THEN RAISE;
END ATTENDANCEE_FOLLOWS;
Regards,
Abdetu..

INSERT INTO SALES_MASTER
( NO
, Name
, PINCODE )
SELECT SALESMANNO
, SALESMANNAME
, PINCODE
FROM SALESMAN_MASTER;

Regards,
Christian Balz -
Flowing data from one Table to another
I am new to Numbers and I am having a hard time figuring out how to flow a sum from one table into another table. Is this possible? Please help.
Thanks
Elassiegirl wrote:
I am new to Numbers and I am having a hard time figuring out how to flow a sum from one table into another table. Is this possible? Please help.
Hi lassiegirl,
Welcome to Apple Discussions and the Numbers '09 forum.
Apple provides two excellent resources that I recommend be downloaded by all Numbers users, the Numbers '09 User Guide and the iWork Formulas and Functions User Guide. You'll find links to both of them in the Help menu in Numbers.
The first will give you an overview of Numbers and how it works—spend some time with the preface and the first chapter, browse the rest on a need to know basis when you're doing something new. The second is a reference, useful when you're trying to write a formula.
To your question...
You can copy the sum from one table to another.
For the example formulas below, both Table 1 and Table 2 have one header row and one footer row, and a total of 21 rows each.
If the SUM on Table 1 is in cell C21, and you want to include it in the SUM of column C in Table 2, you could transfer the sum to C1 in the header row of Table 2 with:
C1: =Table 1::C21
In C21 of Table 2 (a Footer row), use any of these formulas:
=SUM(C)+C1
=SUM(Table 1::C)+SUM(C)
=SUM(Table 1 :: C,C)
The first calculates the sum of the cells in column C (of its own table—Table 2) and adds the value in C1, the total transferred from Table 1.
The second returns the same result by calculating the sums of the two columns separately, then adding those sums.
The third takes the single arguments of the two SUM statements in the second, and lists them as multiple arguments in a single SUM statement.
Regards,
Barry -
How to get the selected value in a row from one view to another view?
hi all,
I have a small issue.
I have created two views. In the table of the first view I'm selecting a row, and on pressing a button it moves to the next view.
I am adding some fields manually in the table of the second view and pressing the save button. All the values should then get updated corresponding to the fields I selected in the first view.
I want to know how to get the particular fields of the selected row from one view to another view.
Kindly help me.

Hi,
Any data sharing across views can be achieved by defining CONTEXT data in the COMPONENT CONTROLLER and mapping it to the CONTEXT of all the views. Follow the steps below.
1. Define a CONTEXT NODE in the component controller.
2. Define the same CONTEXT NODE in all the views where it has to be accessed and changed.
3. Go to the CONTEXT NODE of each view, right-click on the node and choose DEFINE MAPPING.
This is how you map a CONTEXT NODE, and the same node can then be accessed/changed from any VIEW or even from the COMPONENT CONTROLLER. Any change that happens in one VIEW will automatically be available in the others.
Check the below link for more info regarding same.
[http://help.sap.com/saphelp_nw04s/helpdata/EN/48/444941db42f423e10000000a155106/content.htm]
Regards,
Manne. -
Copy Long raw from one table to another.
Hi
I am trying to copy data from one table to another. Source has one Long Raw column. There are about million rows in source table. And I am using Oracle 8.1.5. Whenever I execute the copy command, I get the following error message: Can someone help me?
SQL> set long 64000000
SQL> set copycommit 1
SQL> set arraysize 100
SQL> COPY to test/test@testfix CREATE resume_bkup1 using select * from resume;
Array fetch/bind size is 100. (arraysize is 100)
Will commit after every array bind. (copycommit is 1)
Maximum long size is 64000000. (long is 64000000)
ERROR:
ORA-01084: invalid argument in OCI call
SQL>
Thanks
V Prakash

insert into emp_personal(emp_no, emp_pic) select emp_no, emp_pic from emp_personal_old where empno = '10059'
Read the documentation as suggested by sol.beach.
And fix your front-end to use supported datatypes. -
How to convert data when transferring from one table to another
I have two tables and these are the structure of the tables
create table E1(
ID NUMBER
,NAME VARCHAR2(30)
, DESIGNATION VARCHAR2(30)
,GENDER VARCHAR2(10));
create table E2(
ID NUMBER
,NAME VARCHAR2(30)
, DESIGNATION VARCHAR2(3)
,GENDER NUMBER);

Now I want to transfer records from one table to another using master tables where data are compared, because the datatypes in the tables are different.
The first one is a gender table to match the gender and convert
create table Gender(
E1 varchar2(10),
E2 number);

The second is for the designation:
create table Designation(
E1 varchar2(30),
E2 varchar2(3));

How can I match and convert the data so that it can be transferred?

Peeyush wrote:
Can we do it with the help of a cursor?
All SQL executed by the database are parsed as cursors and executed as cursors.
I mean I have to insert data in bulk and I want to use a cursor for it.

The read and write (select and insert) are done by the SQL engine. The read part reads data and passes it to the write part that inserts the data.
Now why would using PL/SQL and bulk processing make this faster? It will reside in-between the read part and the write part being done by the SQL engine.
So the SQL engine reads the data. This then travels all the way to the PL/SQL engine as a bulk collect. PL/SQL then issues an insert (the write part, again done by the SQL engine). And now this very same data travels all the way back from the PL/SQL engine to the SQL engine for insertion.
So just how is this approach, where you add extra travel time to data, faster?
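A sketch of the point above, using the E1/E2/Gender/Designation tables defined earlier in this thread: the single-statement form does the lookups and the conversion entirely inside the SQL engine, with no round trip through PL/SQL collections.

```sql
insert into E2 (ID, NAME, DESIGNATION, GENDER)
select e.ID,
       e.NAME,
       d.E2,  -- designation code looked up from the Designation master table
       g.E2   -- gender converted to its numeric code via the Gender master table
  from E1 e
  join Gender g      on g.E1 = e.GENDER
  join Designation d on d.E1 = e.DESIGNATION;

commit;
```

No cursor, no bulk collect, and one commit at the end.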
and I want to commit the transaction after every 50 records

Why? What makes you think this is better? What makes you think you have a problem with not committing every 50 rows? -
Trigger to copy records from one table to another; ORA-04091:
Hello,
I'm trying to create a trigger that will move data from one table to another.
I have two tables (Trial1, Trial2) Both of them contains the same attributes (code, c_index)
I want to move each new record inserted in (code in Trial1) to (code in Trial2)
This is my trigger:
Create or replace trigger trg_move_to_trial2
After insert on Trial1
for each row
begin
insert into Trial2 (code)
select :new.code from Trial1;
end;

It compiled, but when I insert a new (code) record into (Trial1) it displays this error:
Error report:
SQL Error: ORA-04091: table STU101.TRIAL1 is mutating, trigger/function may not see it
ORA-06512: at "STU101.TRG_MOVE_TO_TRIAL2", line 3
ORA-04088: error during execution of trigger 'STU101.TRG_MOVE_TO_TRIAL2'
04091. 00000 - "table %s.%s is mutating, trigger/function may not see it"
*Cause: A trigger (or a user defined plsql function that is referenced in
this statement) attempted to look at (or modify) a table that was
in the middle of being modified by the statement which fired it.
*Action: Rewrite the trigger (or function) so it does not read that table.

I know what this error means, but I don't know how to fix it.
I tried changing the (After insert on Trial1) to (Before insert on Trial1); that worked, but not in the right way. When I insert a new value into (code in Trial1) and refresh the Trial2 table, however many records I have in Trial2 get duplicated. E.g.
Trial2
code
111
222
333

When I insert in Trial1:
Trial1
code
444
Trial2 will be:
code
111
222
333
444
444
444

Can you please tell me how to solve this issue?
Regards,
Edited by: 1002059 on Apr 23, 2013 5:36 PM

You should not select from Trial1 - you have the data already. Just insert that value.
Create or replace trigger trg_move_to_trial2
After insert on Trial1
for each row
begin
insert into Trial2 (code)
values (:new.code);
end;

regards,
David