Large Table Data Update

Hi,
I have a large table, about 8 GB. I need to add a new column to this table and update the new column's values for all existing records.
Please let me know any good methods for doing this. While I am at it, I also intend to partition the table.
Thanks in advance.

Check this
SQL> CREATE TABLE tb1(c1 NUMBER);
Table created.
SQL> ALTER TABLE tb1 ADD CONSTRAINT tb1_pk PRIMARY KEY(c1);
Table altered.
SQL> INSERT INTO tb1 VALUES(1);
1 row created.
SQL> CREATE TABLE tb2(c1 NUMBER, c2 NUMBER);
Table created.
SQL> exec dbms_redefinition.can_redef_table(USER, 'TB1');
PL/SQL procedure successfully completed.
SQL>  exec dbms_redefinition.start_redef_table(USER, 'TB1', 'TB2', 'C1 C1, 34 C2');
PL/SQL procedure successfully completed.
SQL> exec dbms_redefinition.finish_redef_table(USER, 'TB1', 'TB2')
PL/SQL procedure successfully completed.
SQL> SELECT * FROM tb1;
     C1        C2
      1        34
Lukasz
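Since the original question also asks about partitioning, the same online redefinition can add the column and partition the table in one pass: make the interim table partitioned, and copy the dependents across before finishing. A minimal sketch along the lines of the example above (the partition scheme and bounds are placeholders to adapt):
CREATE TABLE tb2 (c1 NUMBER, c2 NUMBER)
PARTITION BY RANGE (c1)
   (PARTITION p1   VALUES LESS THAN (1000),
    PARTITION pmax VALUES LESS THAN (MAXVALUE));

EXEC dbms_redefinition.start_redef_table(USER, 'TB1', 'TB2', 'C1 C1, 34 C2')

-- copy indexes, triggers, constraints and grants onto the interim table
DECLARE
   l_errors PLS_INTEGER;
BEGIN
   dbms_redefinition.copy_table_dependents(
      uname            => USER,
      orig_table       => 'TB1',
      int_table        => 'TB2',
      copy_indexes     => dbms_redefinition.cons_orig_params,
      copy_triggers    => TRUE,
      copy_constraints => TRUE,
      copy_privileges  => TRUE,
      ignore_errors    => FALSE,
      num_errors       => l_errors);
END;
/

EXEC dbms_redefinition.finish_redef_table(USER, 'TB1', 'TB2')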

Similar Messages

  • External Table to Internal table Data update query

    Hi all ,
    I have the following two tables: one is an external Oracle table and the second is internal, and both tables have the same data. The following is sample data; the actual table contains millions of records.
    External table name: SE2_EXT
    GL_REF_NO   GL_CUST_ID   GL_TRAN_AMT_LCY   GL_REVERSAL_MARKER   GL_GL_ID                 GL_LOCAL
    5           513557        100                                   136340003678088.020001
    5           513557       -100               R                   136340003678088.020002
    1           26           -685.12                                136340003674772.030001
    1           26            685.12            R                   136340003674772.030002
    4           500539        100                                   136340003477900.000001
    4           500539       -100               R                   136340003477900.000002
    23          604612        182.15                                136340003578165.170001
    23          604612       -182.15            R                   136340003578165.170002
    76          232033       -230.7                                 136340003576922.100001
    76          232033        235.7             R                   136340003576957.010001
    I want to update the GL_LOCAL column to 'R' under these conditions:
    WHERE GL2.GL_GL_ID = GL2_EXT.GL_GL_ID
      AND GL2.GL_REF_NO = GL2_EXT.GL_REF_NO
      AND GL2.GL_CUST_ID = GL2_EXT.GL_CUST_ID
      AND (GL2.GL_REVERSAL_MARKER = 'R' OR GL2.GL_REVERSAL_MARKER IS NULL)  -- IN ('R', NULL) never matches a NULL marker
      AND GL2.GL_TRAN_AMT_LCY = GL2_EXT.GL_TRAN_AMT_LCY
    But the tricky thing is that GL_TRAN_AMT_LCY appears once as a minus and once as a plus. I only want to update the records that have the same TRAN_AMT_LCY and meet the other conditions mentioned above.
    I tried a MERGE statement but it didn't work. I would appreciate any help.
    MERGE INTO SE2 a
    USING (SELECT GL_REF_NO, GL_CUST_ID, GL_TRAN_AMT_LCY, GL_REVERSAL_MARKER, GL_GL_ID
           FROM   GL2_EXT) b
    ON (a.GL_GL_ID = b.GL_GL_ID AND a.GL_CUST_ID = b.GL_CUST_ID)
    WHEN MATCHED THEN UPDATE
        SET a.GL_LOCAL_A1 = b.SE_LOCAL

    Why is it tricky? You did not say.
    Making a guess as to the problem: why not use TO_NUMBER to cast the string?
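    If the intent is to flag both rows of each reversal pair only when the amounts actually cancel out, one option is to match every row against the counterpart carrying the opposite sign. This is only a sketch under that assumption (note GL_REF_NO 76 above, whose amounts -230.7 and 235.7 would correctly not match):
    MERGE INTO SE2 a
    USING (SELECT GL_REF_NO, GL_CUST_ID, GL_TRAN_AMT_LCY
           FROM   GL2_EXT) b
    ON (    a.GL_REF_NO       = b.GL_REF_NO
        AND a.GL_CUST_ID      = b.GL_CUST_ID
        AND a.GL_TRAN_AMT_LCY = -b.GL_TRAN_AMT_LCY)   -- pair up the plus and the minus row
    WHEN MATCHED THEN UPDATE
        SET a.GL_LOCAL = 'R';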

  • Asking for a transport request while updating table data

    Hi All,
    I am trying to update table TCJ04 using transaction OPS6. Whenever I add a new entry or edit an existing entry and click Save, the table asks for a transport request number.
    This table carries master data and should not be asking for a transport request; rather, its access should be controlled by user authorisation.
    Please tell me the reason it asks for a transport request.
    regards,
    Gaurav.

    Hi Gaurav,
    If you go to the table maintenance generator of table TCJ04 in SE11, you will find that the 'Standard recording routine' radio button is selected. That is why you are asked to provide a transport request while updating data.

  • Problem: Z table data updates using SM30

    Hi,
    I have a requirement for updating records in a Z table
    using the SM30 transaction.
    The Structure of the table is below:
    APMOD    Primary key    CHAR(3)
    KONST    Primary key    CHAR(20)
    ENDDA    Primary key    DATS
    BEGDA    Non-key        DATS
    and some other non-key fields...
    Problem: This table should act something like an infotype in HR; I mean delimitation of records while creating or changing an existing record.
    Say there is a record
    APMOD = OGMT
    KONST = Organization Management
    BEGDA = 01/01/2004
    ENDDA = 12/31/9999
    Whenever I insert a new record with the key
    APMOD = OGMT, KONST = 'Organization Management' and BEGDA = '01/01/2006',
    first it should update the old record to
    BEGDA = 01/01/2004 and ENDDA = 12/31/2005,
    and then the new record should be inserted with
    BEGDA = 01/01/2006 and ENDDA = 12/31/9999.
    How can I achieve this using SM30? Can we write our own code somewhere? If yes, where and how? Or are there any settings available for this requirement?
    I could write a Z program to update this table, but I need to achieve this using SM30 only.
    Let me know if you need any additional info. 
    Regards,
    Sudhakar.

    Hi Sudhakar,
    1. I tried the same at my end. It works fantastically! In SM30 it shows a 'Delimit' button and an 'Expand <--> Collapse' button, and it delimits the records accordingly.
    2. In SE11, use the menu Utilities ---> Table Maintenance Generator and build the table maintenance to use in SM30.
    3. When you use it in SM30, you will achieve what you want.
    4. Just make sure your field ENDDA has the data element ENDDA in the table definition. It should also be a key field.
    5. After this ENDDA column, there should be no other key column (not even BEGDA).
    I hope it helps.
    Regards,
    Amit M.

  • Asking for a transport request while updating table data

    Hi All,
    I am trying to update table TCJ04 using transaction OPS6. Whenever I add a new entry or edit an existing entry and click Save, the table asks for a transport request number.
    This table carries master data and should not be asking for a transport request; rather, its access should be controlled by user authorisation.
    Please tell me the reason it asks for a transport request.
    regards,
    Gaurav.

    Hi There,
    It asks for a transport request for each new entry in the table because the delivery class in the table's attributes is set to C (customizing table, maintenance only by customer, no SAP import). If it is set to 'A', it will not ask for a transport request.
    Try creating a custom table and experimenting with delivery classes A and C; you will see the difference.
    Let me know if you need further help in this regard.
    Don't forget to reward points if you found this helpful.
    Thanks-

  • Updating Large Tables

    Hi,
    I was asked the following during an interview:
    You have a large table with millions of rows and want to add a column. What is the best way to do it without affecting the performance of the DB?
    Also, you have a large table with a million rows: how do you organise the indexes?
    My answer was to coalesce the indexes.
    I was wondering what the best answers to these questions are.
    Thanks

    Adding a column to a table, even a really big one, is trivial and will have no impact on the performance of the database. Just do:
    ALTER TABLE t ADD (new_column DATATYPE);
    This is simply an update to the data dictionary. Aside from the few milliseconds during which Oracle locks some dictionary tables (no different from the locks held if you update a column in a user table), there will be no impact on the performance of the database. Now, populating that column would be a different kettle of fish, and would depend on how the column needs to be populated (i.e. a single value for all rows, or calculated based on other columns).
    I would have asked for clarification on what they meant by "organise the indexes". If they meant which tablespaces the indexes should go in, I would say in the same tablespace as other objects of similar size (you are using locally managed tablespaces, aren't you?). If they meant which indexes I would create, I would say the indexes necessary to answer the queries that you run.
    HTH
    John
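    To make the first point concrete (t and the column names are placeholders):
    -- dictionary-only change: near-instant regardless of table size
    ALTER TABLE t ADD (new_column NUMBER);
    -- on 11g and later, a NOT NULL column with a DEFAULT is also a
    -- metadata-only operation: the default is not written to every row
    ALTER TABLE t ADD (status_flag VARCHAR2(1) DEFAULT 'N' NOT NULL);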

  • Updating a large table

    Hello,
    We need to update 2 columns in a very large table (20,000,000 records). Every row in the table is to be updated, and the client wants to be able to update the records by year. Below is the procedure that has been developed:
    DECLARE
        l_year VARCHAR2 (4) := '2008';

        CURSOR c_1 (l_year1 VARCHAR2)
        IS
            SELECT ROWID l_rowid,
                   (SELECT tmp.new_code_x
                    FROM   new_mapping_code_x tmp
                    WHERE  tmp.old_code_x = l.code_x) code_x,
                   (SELECT tmp.new_code_x
                    FROM   new_mapping_code_x tmp
                    WHERE  tmp.old_code_x = l.code_x_ori) code_x_ori
            FROM   tableX l
            WHERE  TO_CHAR (created_date, 'YYYY') = l_year1;

        TYPE typec1 IS TABLE OF c_1%ROWTYPE
            INDEX BY PLS_INTEGER;

        l_c1 typec1;
    BEGIN
        DBMS_OUTPUT.put_line ('Update start - '
                              || TO_CHAR (SYSDATE, 'DD/MM/YYYY HH24:MI:SS'));  -- closing ); was missing here

        OPEN c_1 (l_year);
        LOOP
            FETCH c_1 BULK COLLECT INTO l_c1 LIMIT 100000;

            EXIT WHEN l_c1.COUNT = 0;

            FOR indx IN 1 .. l_c1.COUNT
            LOOP
                UPDATE tableX
                SET    code_x     = NVL (l_c1 (indx).code_x, code_x),
                       code_x_ori = NVL (l_c1 (indx).code_x_ori, code_x_ori)
                WHERE  ROWID = l_c1 (indx).l_rowid;
            END LOOP;

            COMMIT;
        END LOOP;
        CLOSE c_1;

        DBMS_OUTPUT.put_line ('Update end - '
                              || TO_CHAR (SYSDATE, 'DD/MM/YYYY HH24:MI:SS'));  -- closing ); was missing here
    END;
    We do not want to do it as a single UPDATE per year, as we fear the update might fail with, for example, a rollback segment error.
    It seems to me the procedure developed above is not the most efficient. Any comments on the above, or does anyone have a better solution?
    Thanks

    Everything is wrong with the sample code and approach used. This is not how one uses Oracle. This is not how one designs performant and scalable code.
    Transactions must be consistent and logical. A commit in the middle of "doing something" is wrong. Period. (And no, the reasons for committing often and frequently in something like SQL Server do not, and never have, applied to Oracle.)
    Also, as I/O is the slowest and most expensive operation one can perform in a database, it simply makes sense to reduce I/O as far as possible. This means not doing this:
    WHERE TO_CHAR (created_date, 'YYYY') = l_year1;
    Why? Because an index on created_date is now rendered utterly useless... and in this specific case will result in a full table scan.
    It means using the columns in their native data types. If the column is a date, then use it as a date! E.g.
    where created_date between :startDate and :endDate
    The proper approach to this problem is to determine the most effective logical transaction that can be done, given the available resources (redo/undo/etc.).
    This could very likely be daily: dealing with and updating a single day's data at a time. So one would write a procedure that updates a single day as a single transaction.
    One can also create a process log table - and have this procedure update this table with the day being updated, the time started, the time completed, and the number of rows updated.
    One now has a discrete business process that can be run. This allows one to run 10 or 30 or more of these processes at the same time using DBMS_JOB - thus doing the updates for a month using parallel processing.
    The process log table can be used to manage the entire update. It will also provide basic execution time details allowing one to estimate the average time for updating a day and the total time it will take for all the data in the large table to be updated.
    This is a structured approach. An approach that ensures the integrity of the data (all rows for a single day are treated as a single transaction). One that also provides management data giving a clear picture of the state of the data in the large table.
    I'm a firm believer that if something is worth doing, it is worth doing well. Using a hacked approach of blindly updating data and committing ad hoc, without any management and process controls, is simply doing something very badly. Why? It may be interesting running into a brick wall the first time around; however, subsequent encounters with the wall should be avoided.
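    A minimal sketch of such a per-day process, reusing tableX and new_mapping_code_x from the posted code (the log table, procedure name, and the single mapped column are illustrative assumptions):
    CREATE TABLE update_process_log (
        process_day  DATE PRIMARY KEY,
        started_at   DATE,
        completed_at DATE,
        rows_updated NUMBER
    );

    CREATE OR REPLACE PROCEDURE update_one_day (p_day IN DATE)
    AS
        l_rows NUMBER;
    BEGIN
        INSERT INTO update_process_log (process_day, started_at)
        VALUES (TRUNC (p_day), SYSDATE);

        UPDATE tableX t
        SET    t.code_x = NVL ((SELECT m.new_code_x
                                FROM   new_mapping_code_x m
                                WHERE  m.old_code_x = t.code_x), t.code_x)
        WHERE  t.created_date >= TRUNC (p_day)         -- native DATE range:
        AND    t.created_date <  TRUNC (p_day) + 1;    -- an index on created_date stays usable

        l_rows := SQL%ROWCOUNT;

        UPDATE update_process_log
        SET    completed_at = SYSDATE,
               rows_updated = l_rows
        WHERE  process_day = TRUNC (p_day);

        COMMIT;   -- one logical transaction: exactly one day
    END update_one_day;
    /
    Each day can then be submitted via DBMS_JOB to run several days in parallel, and the log table doubles as the progress and timing report described above.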

  • Which tables are updated while loading master data?

    Hello Experts,
    Which tables are updated while loading master data? I would also request more information about master data loading and its related settings at the beginning of creating InfoObjects.

    It depends upon the type of master data you are loading....
    In all master data loads, for every new master data value an SID is created in the SID table /BI*/S<INFOOBJECT NAME>, irrespective of the type of master data.
    But the particular tables that get updated, depending on the type of master data, are:
    If it is time-independent master data, the /BI*/P<INFOOBJECT NAME> table gets updated with the loaded data.
    If it is time-dependent master data, the /BI*/Q<INFOOBJECT NAME> table gets updated with the loaded data.
    If the master data has time-independent navigational attributes, then for every data load the SID table is updated first and then the /BI*/X<INFOOBJECT NAME> table is updated with the SIDs created in the SID table (NOT WITH THE MASTER DATA).
    If the master data has time-dependent navigational attributes, then for every data load the SID table is updated first and then the /BI*/Y<INFOOBJECT NAME> table is updated with the SIDs created in the SID table (NOT WITH THE MASTER DATA).
    NOTE: As said above, for all the data in the P, Q, T, X, Y tables, the SIDs are created in the S table /BI*/S<INFOOBJECT NAME>.
    NOTE: Irrespective of time dependency or independence, the VIEW /BI*/M<INFOOBJECT NAME>, defined on top of the /BI*/P<INFOOBJECT NAME> and /BI*/Q<INFOOBJECT NAME> tables, gives a view of the entire master data.
    NOTE: It is just a view, not a table, so it has no physical storage of data.
    All the above tables are for ATTRIBUTES.
    But when it comes to TEXTS, irrespective of time dependency or independence, the /BI*/T<INFOOBJECT NAME> table gets updated (and of course the S table as well).
    Naming convention: /BIC/*<InfoObject name> or /BI0/*<InfoObject name>
    C = customer-defined characteristic
    0 = standard (SAP-defined) characteristic
    * = P, Q, T, X, Y, S (depending on the conditions above)
    Thanks & regards
    Sasidhar

  • Retrieve data from a large table in Oracle 10g

    I am working on a Microsoft Visual Studio project that requires retrieving data from a large table in an Oracle 10g database and exporting the data to the hard drive.
    The problem is that I am not able to connect to the database directly because of a licensing issue, but I can use a third-party API to retrieve data from the database. This API has sufficient privilege/licence permission on the database to perform data retrieval. So I am not able to use DTS/SSIS or another tool that imports data by connecting to the database directly.
    My approach is to first retrieve the data using the API into a .NET DataTable and then dump the records from it onto the hard drive in a specific format (perhaps an Excel file or another SQL Server database).
    When I try to retrieve the data from a large table having over 1.3 million (13 lakh) records (3-4 GB) into a DataTable using the Visual Studio project, I get an out-of-memory exception.
    Is there a better way to retrieve the records chunk by chunk and do the export without losing the state of the data in the table?
    Any help with this problem will be highly appreciated.
    Thanks in advance...
    -Jahedur Rahman
    Edited by: Jahedur on May 16, 2010 11:42 PM

    Girish... thanks for your reply, but I am sorry for the confusion. Let me explain.
    1. "export the data into another media into the hard drive."
    What is meant by this line, i.e. another media on the hard drive?
    ANS: Sorry... I just want to write the data to a file or to a table in a SQL Server database.
    2. "I am not able to connect to the database directly because of license issue"
    Huh?? I have never heard of a user not being able to connect to the db because of a licence. What error/message are you getting?
    ANS: My company uses a 3rd-party application that runs on Oracle 10g. My company is licensed to use the 3rd-party application (app + database as a package) and did not purchase an Oracle licence for direct use. So I will not connect to the database directly.
    3. I am not sure which API you are talking about, but I run Visual Studio applications with data grids or similar controls in which I can select (SELECT query) as many rows as I need; no issue.
    ANS: This API is provided by the 3rd-party application vendor. I can pass a query to it and it returns a DataTable.
    4. "better way to retrieve the records chunk by chunk and do the export without losing the state of the data in the table?"
    ANS: As I get a system error (out of memory) when I select all rows into a DataTable at once, I want to retrieve the data in multiple phases.
    E.g.: records 1 to 20,000 in the 1st phase,
    20,001 to 40,000 in the 2nd phase,
    40,001 onwards in the 3rd phase,
    and so on...
    Please let me know if this does not clarify your confusion... :)
    Thanks...
    -Jahedur Rahman
    Edited by: user13114507 on May 12, 2010 11:28 PM
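    Since the vendor API accepts an arbitrary query, one way to do the phased retrieval is to paginate in SQL itself. A keyset-style sketch valid on 10g, assuming a placeholder table big_table with an indexed unique key id:
    -- fetch the next chunk of up to 20,000 rows after the last key already exported
    SELECT *
    FROM  (SELECT t.*
           FROM   big_table t
           WHERE  t.id > :last_seen_id
           ORDER BY t.id)
    WHERE ROWNUM <= 20000;
    After each call, record the highest id returned and pass it as :last_seen_id for the next chunk; unlike OFFSET-style paging, each chunk touches only the rows it returns.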

  • Validity Table not updating for 0IC_C03 while updating data

    Hi,
    1. The validity table is not updating for 0IC_C03 while loading data in my BW 7.4 on HANA database.
    Key fields: 0Plant, 0Calday.
    If you run the program RSDG_CUBE_VALT_MODIFY after loading data, it does update.
    2. I am not getting the 'No marker update' option in the non-cumulative InfoCube 0IC_C03 Manage tab or in the DTP tabs, as per the 7.4 changes,
    and for 2LIS_03_BX in the DTP I am getting only the option below.
    Can you please give me a solution for these issues?
    Regards
    Umashankar

    Hi Uma,
    Please go through the link below, which might be helpful.
    Not able to Edit Validity Table : RSDV
    The marker update option is available under the Collapse tab of the InfoCube.
    Thanks,
    Karan

  • How to capture user ID and date in a custom table while updating a form

    Hi,
    I have a requirement to insert the user ID, the form name, and the date on which the record is saved into a custom table when a custom form is updated.
    We are using Form Builder 6.0.
    I am new to Forms; can anyone help me with this?
    I would also like to know which trigger I should write the code in.
    Thanks in advance.

    You can use:
    usrid    := GET_APPLICATION_PROPERTY (USERNAME);
    formname := GET_APPLICATION_PROPERTY (CURRENT_FORM);
    dt       := TO_CHAR (SYSDATE, 'dd/mm/yyyy hh24:mi:ss');  -- hh24 avoids ambiguous 12-hour times
    Insert these values in an ON-UPDATE trigger at form level.
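    A sketch of what that form-level ON-UPDATE trigger body could look like; the audit table AUDIT_LOG and its columns are assumptions, not part of the original post:
    BEGIN
      -- record who saved which form, and when (audit_log is a hypothetical table)
      INSERT INTO audit_log (user_id, form_name, saved_on)
      VALUES (GET_APPLICATION_PROPERTY (USERNAME),
              GET_APPLICATION_PROPERTY (CURRENT_FORM),
              SYSDATE);

      UPDATE_RECORD;   -- ON-UPDATE replaces the default update processing, so do it explicitly
    END;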

  • Mass data update in Value mapping table

    Hi,
    I have used value mapping replication to update mass data from an external source into the value mapping table. It is updating the runtime cache, but I want the data to be visible in the GUI value mapping table as well. Is that possible? I suspect the data in the runtime cache may be removed if the system restarts. Can anyone help?
    Thanks
    Laks

    Hi NALLAM GUNA RANJAN,
    Thanks for your prompt reply, but I didn't get what you are trying to convey. My issue here is:
    Instead of manually entering key-value pairs in the value mapping table, I used value mapping replication ( http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/frameset.htm ).
    It updates the data in the runtime cache (you can see this using cache monitoring), but I am not able to view the data in the actual value mapping table (the GUI in the Directory of SAP XI). I want the data updated via replication to be visible in the GUI table; is that possible?
    I hope you understand the question better now.
    Thanks
    Laks

  • DATA_BUFFER_EXCEEDED error viewing data in a large table

    Hello,
    I am getting the error below while fetching data in DS Designer from SAP ECC tables:
    Error calling RFC function to get table data: <RFC_ABAP_EXCEPTION-(Exception_Key: DATA_BUFFER_EXCEEDED, SY-MSGTY: E, SY-MSGID:
    I followed KBA 1752954 (DATA_BUFFER_EXCEEDED error - Data Services) and found that the RFC Z_AW_RFC_READ_TABLE is available in SAP ECC.
    Please help: how do I resolve this kind of error?
    Aisurya

    Hi Aisurya,
    The cause of the exception is a combination of factors:
    The data extracted for a row in an SAP application table source is larger than 512 bytes.
    The Data Services remote function call (RFC) /BODS/RFC_STREAM_READ_TABLE is not installed on the SAP application server.
    If the function /BODS/RFC_STREAM_READ_TABLE is not loaded on the SAP application server, Data Services extracts data using the SAP-supplied function RFC_READ_TABLE, which limits extracted data to 512 bytes per row.
    You can also try using the enhanced version of RFC_READ_TABLE called /BODS/RFC_READ_TABLE2.
    Regards
    Arun Sasi

  • Dynamically Update External Table Data

    I loaded a data table from an external CSV file. But the CSV file is the output of a program that appends a new line to it every couple of days.
    I made a report with the loaded data, but when a few new lines of data appear in the CSV file in a couple of days, I want my report to update as well.
    Is there a way to create a "Refresh" button in my application that loads the data from the CSV when pressed?
    I have Oracle Database 11g Express with APEX 4.1.1.

    Hi,
    What you describe could certainly be achieved, but it would have to be designed and built; it is not going to be provided "out of the box".
    I would probably go with an external table based on your file, plus a process that you build, callable from the button, which would do some sort of merge of the external table data into your database table. Apart from being callable on demand, the process could also be made a scheduled process that loads the data on a regular basis, say once a day or once an hour, depending on your requirements.
    To use an external table it is assumed that the file will be available on the database server. If the file is available only on a client machine, then an on-demand load process would still be possible, but a little more difficult.
    Andre
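    A sketch of the merge such a button process might run; report_data, csv_ext, and the columns are placeholder names (csv_ext being the external table over the CSV):
    -- pull in any rows present in the CSV but not yet in the report table
    MERGE INTO report_data d
    USING csv_ext e
    ON (d.record_id = e.record_id)
    WHEN NOT MATCHED THEN
        INSERT (record_id, col1, col2)
        VALUES (e.record_id, e.col1, e.col2);
    The same statement can be wrapped in a DBMS_SCHEDULER job for the scheduled variant mentioned above.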

  • To update table data

    create or replace procedure SP_MIS_LEDGER_ON_DEMAND_V2
    as
        var_date1          date;
        VAR_STARTDATETIME  DATE;
        VAR_ENDDATETIME    DATE;
        -- VAR_EXECUTE_FOR_DATE DATE;
        -- VAR_STATEMENT VARCHAR2(4000);
        VAR_ELAPSEDTIME    VARCHAR2(50);
        VAR_INTRIMSTATUSID NUMBER;

        CURSOR c1 IS
            select DISTINCT
                   ta.accountid,
                   day PROCESSDATE,
                   NVL(payment,0) PAYMENT,
                   0 TOTALDUE,
                   0 CURBILL,
                   NVL(srf,0)   SRF,
                   NVL(sbpnt,0) SBPNT,
                   NVL(srv,0)   SRV,
                   NVL(sbf,0)   SBF,
                   NVL(SBV,0)   SBV,
                   NVL(EF,0)    EF,
                   NVL(EV,0)    EV,
                   NVL(TSRV,0)  TSRV,
                   NVL(tsub,0)  TSUB,
                   NVL(teqe,0)  TEQE,
                   NVL(DT,0)    DT,
                   NVL(A.dep,0) RDEP,
                   NVL(B.DEP,0) PDEP,
                   NVL(pnt,0)   PNT,
                   NVL(eqp,0)   EQP,
                   NVL(dtr,0)   DTR,
                   NVL(drf,0)   DRF,
                   NVL(unadj,0) UNADJ
            from (select DISTINCT day, accountid
                  from   syntblmaccount, tblmtime
                  where  yyyy = 2010) ta,
                 (SELECT accountid,
                         SUM(srfee)  srf,
                         SUM(srvat)  srv,
                         SUM(subfee) sbf,
                         SUM(subvat) sbv,
                         SUM(eqefee) ef,
                         SUM(eqevat) ev,
                         SUM(ttlsrv) tsrv,
                         SUM(ttlsub) tsub,
                         SUM(ttleqe) teqe,
                         SUM(dep)    dep,
                         SUM(dt)     dt,
                         trunc(FROMDATE) FROMDATE
                  FROM   VWDT_V6
                  group by accountid, trunc(FROMDATE)) a,
                 (SELECT accountid,
                         SUM(pnt)         pnt,
                         SUM(subpnt)      sbpnt,
                         SUM(eqpnt)       eqp,
                         SUM(dep)         dep,
                         SUM(DEPTRANSFER) dtr,
                         SUM(DEPREFUNDED) drf,
                         SUM(unadj)       unadj,
                         trunc(paymentdate) paymentdate
                  FROM   vwkt_v4
                  GROUP BY accountid, trunc(paymentdate)) b,
                 (SELECT ACCOUNTID accountid,
                         TRUNC(createdate) CREATEDATE,
                         SUM(totalamount)  PAYMENT
                  from   syntbltcreditdocument
                  where  CREDITDOCUMENTTYPEID IN ('CDT01','CDT04')
                  group by accountid, TRUNC(createdate)) credit
            where ta.accountid = a.accountid(+)
              and ta.accountid = b.accountid(+)
              and ta.accountid = credit.accountid(+)
              and ta.day = a.FROMDATE(+)
              and ta.day = credit.createdate(+)
              and ta.day = b.paymentdate(+)
              and ta.day = to_date('01-MAY-2010','DD-MON-YYYY');
    BEGIN
        SELECT MAX(PROCESSDATE) INTO VAR_DATE1 FROM MIS_LEDGER_DETAIL_TEST;
        SELECT SYSDATE INTO VAR_STARTDATETIME FROM DUAL;
        SELECT SEQ_PRC_STATUS.NEXTVAL INTO VAR_INTRIMSTATUSID FROM DUAL;

        FOR c1_rec IN c1
        LOOP
            EXIT WHEN c1%NOTFOUND;

            UPDATE MIS_LEDGER_DETAIL_TEST A
            SET A.PAYMENT  = c1_rec.payment,
                A.TOTALDUE = c1_rec.TOTALDUE,
                A.CURBILL  = c1_rec.CURBILL,
                A.SRF      = c1_rec.srf,
                A.SBPNT    = c1_rec.sbpnt,
                A.SRV      = c1_rec.srv,
                A.SBF      = c1_rec.sbf,
                A.SBV      = c1_rec.sbv,
                A.EF       = c1_rec.ef,
                A.EV       = c1_rec.ev,
                A.TSRV     = c1_rec.tsrv,
                A.TSUB     = c1_rec.tsub,
                A.TEQE     = c1_rec.teqe,
                A.DT       = c1_rec.dt,
                A.PDEP     = c1_rec.Pdep,
                A.RDEP     = C1_REC.RDEP,
                A.PNT      = c1_rec.pnt,
                A.EQP      = c1_rec.eqp,
                A.DTR      = c1_rec.dtr,
                A.DRF      = c1_rec.drf,
                A.UNADJ    = c1_rec.unadj
            WHERE A.accountid   = c1_rec.accountid
              AND A.processdate = C1_REC.processdate
              AND A.processdate = to_date('01-MAY-2010','DD-MON-YYYY');
        END LOOP;
        commit;

        SELECT SYSDATE INTO VAR_ENDDATETIME FROM DUAL;

        SELECT CAST(VAR_ENDDATETIME AS TIMESTAMP) - CAST(VAR_STARTDATETIME AS TIMESTAMP)
        INTO   VAR_ELAPSEDTIME
        FROM   DUAL;

        INSERT INTO LedgerStatusSummary
            (StatusID, ProcedureName, STARTDATETIME, ENDDATETIME,
             LastExecutionDate, NextExecutionDate, LastModifiedDate, TIMETAKEN, Procedurestatus)
        VALUES
            (VAR_INTRIMSTATUSID, 'SP_MIS_LEDGER_ON_DEMAND', VAR_STARTDATETIME, VAR_ENDDATETIME,
             TRUNC(VAR_DATE1), TRUNC(VAR_DATE1)+1, VAR_STARTDATETIME, VAR_ELAPSEDTIME, 'MENUAL');
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            DBMS_OUTPUT.PUT_LINE('An error was encountered - '||SQLCODE||' -ERROR- '||SQLERRM);
    END SP_MIS_LEDGER_ON_DEMAND_V2;
    I have 9,830 rows in the MIS_LEDGER_DETAIL_TEST table. I am updating the table data, but it is taking too long: for 01-MAY-2010 it does not finish within 15 minutes, so I aborted it.
    How should I write the update query? Please guide me.
    Thanks in advance
    exec SP_MIS_LEDGER_ON_DEMAND_V2

    Why do you need a cursor or a loop at all? What in your statement cannot be accomplished with a single UPDATE statement?
    But if you do require a loop (I don't see why you would), then use BULK COLLECT and FORALL statements:
    http://www.morganslibrary.org/reference/array_processing.html
    And replace this:
    SELECT SYSDATE INTO VAR_STARTDATETIME FROM DUAL;
    with this:
    VAR_STARTDATETIME := SYSDATE;
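    For illustration, a trimmed sketch of the BULK COLLECT / FORALL shape. Only one column is carried through, and the simple SELECT stands in for the big cursor above, so the values are placeholders for the real mapped data:
    DECLARE
        CURSOR c1 IS
            SELECT ROWID, payment                   -- stand-in for the full SELECT
            FROM   MIS_LEDGER_DETAIL_TEST
            WHERE  processdate = to_date('01-MAY-2010','DD-MON-YYYY');

        TYPE t_rowids   IS TABLE OF ROWID;
        TYPE t_payments IS TABLE OF MIS_LEDGER_DETAIL_TEST.payment%TYPE;
        l_rowids   t_rowids;
        l_payments t_payments;
    BEGIN
        OPEN c1;
        LOOP
            FETCH c1 BULK COLLECT INTO l_rowids, l_payments LIMIT 10000;
            EXIT WHEN l_rowids.COUNT = 0;

            FORALL i IN 1 .. l_rowids.COUNT         -- one bulk round trip per batch
                UPDATE MIS_LEDGER_DETAIL_TEST
                SET    payment = l_payments(i)
                WHERE  ROWID   = l_rowids(i);
        END LOOP;
        CLOSE c1;
        COMMIT;                                     -- commit once, at the end
    END;
    /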
