Update records in huge table

Hi,
I need to update two fields in a huge table (> 200,000,000 records). I've created two basic update scripts with a WHERE clause. The problem is that there is no index on the fields used in the WHERE clause. How can I solve this? Creating a new index is not an option.
Another solution is to update the whole table (without a WHERE clause), but I don't know whether that would take a lot of time, lock records, and so on.
Any suggestions?
Thanks.
Ken

Ken,
You may be better off reading the Metalink documents. PDML stands for Parallel DML: you can use parallel slaves to get the update done quickly. Obviously this depends on the number of parallel slaves you have and the degree of parallelism you set.
Search for PDML on Metalink.
G
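The Metalink advice above boils down to enabling parallel DML in the session and hinting the statement. A minimal sketch (the table name, column names, and degree of 8 are illustrative):

```sql
-- Parallel DML is disabled by default and must be enabled per session.
ALTER SESSION ENABLE PARALLEL DML;

-- The PARALLEL hint requests 8 slaves; choose a degree your system can sustain.
UPDATE /*+ PARALLEL(t, 8) */ big_table t
   SET t.col_a = 'X',
       t.col_b = 'Y'
 WHERE t.col_c = 'OLD';

-- The session must commit (or roll back) before it can query the table again.
COMMIT;
```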

Similar Messages

  • How to update records from an internal table to a 'Z' table?

    Hi Friends,
    How do I update records from an internal table to a 'Z' table?
    I have records in an internal table, and I want to update them into the 'ZMARA' table.
    (My internal table and the 'Z' table have the same structure.)
    Thanking you.
    Regards,
    Subash

    Hi,
    LOOP AT <internal table> INTO <workarea>.
      MODIFY <Z-table> FROM <workarea>.
      IF sy-subrc = 0.
        COMMIT WORK.
      ELSE.
        ROLLBACK WORK.
      ENDIF.
    ENDLOOP.
    or
    UPDATE <Z-table> FROM TABLE <internal table>.
    IF sy-subrc = 0.
      COMMIT WORK.
    ELSE.
      ROLLBACK WORK.
    ENDIF.
    Prabhudas

  • How to update Records of SAP table from .CSV file

    Hi,
    I have written code which takes data from a comma-delimited CSV file and adds it to an internal table.
    Now I want to update all the fields in the SAP table from the internal table.
    I want to use the UPDATE statement:
    UPDATE <table name> SET <field to update> WHERE <condition>.
    I don't want to iterate through thousands of records in the SAP table to check the WHERE condition.
    Could you please tell me how to do it?

    Hi. I think you will not get around iterating over the internal table.
    You can pre-load all the matching records into another internal table:
    CHECK lt_csv[] IS NOT INITIAL. " important, otherwise the next SELECT would read all records of the table
    SELECT ... INTO TABLE lt_dbitab FOR ALL ENTRIES IN lt_csv WHERE key_fields = lt_csv-key_fields ...
    CHECK sy-subrc EQ 0 AND lt_dbitab[] IS NOT INITIAL.
    Then do an in-memory update of lt_dbitab:
    LOOP AT lt_dbitab ASSIGNING <fs>.
      READ TABLE lt_csv ASSIGNING <fs_csv> WITH KEY ... " lt_csv should be a sorted table with a key, or use BINARY SEARCH
      IF sy-subrc EQ 0.
        ...change the required lt_dbitab fields: <fs>-comp = <fs_csv>-comp...
      ENDIF.
    ENDLOOP.
    And then you can do a mass update:
    UPDATE dbtab FROM TABLE lt_dbitab.
    From a performance point of view, this should be much faster than iterating over lt_csv directly and updating every single database record.
    Br
    Bohuslav

  • How to update records in a table based on user selection..

    Hi all,
    This doubt is about coding logic; I tried a lot but didn't find any solution, so at last I have come to the SDN site.
    please help..
    The requirement is this: I have a table with 6 fields (1 primary key and 5 non-key fields). If the user inputs values in the fields on the screen, a row is added to the table. Up to this point I have done well, but when the user wants to change a value in an existing row of the table, my program is unable to do so. I couldn't find any logic for this: since there are 5 non-key fields, if any one field is modified, the respective row must first be selected based on the user's input and then updated.
    At this point I have no idea how to do that, as I guess it may take a lot of IF conditions to reach that particular row.
    Please help..
    thanks ,
    sekhar

    Hi Sekhar,
    I am afraid the whole design of your program is wrong; let me explain.
    Let us say you have two rows (5 non-key fields each) that the user wants to update, and the data in those five non-key fields is identical; in your program you generate a number (the key) using a number range object. You will then have two entries in the table for the same data.
    On the update page, when the user enters the non-key fields, how will the program (or any of us, for that matter) know which record to pick? If you had two identical books and were asked for a book, wouldn't you ask which of the two is wanted?
    Possible solution: identify a key that maintains the integrity of the data, i.e. a combination of the non-key fields that identifies a unique row, and make those fields key fields in the table.
    A more costly solution (if you do not want to turn the non-key fields into key fields) would be to add a check (using a SELECT statement) to see whether the combination of non-key fields already exists in the Z table before inserting a record.
    If yes, throw a message to the user and just update the values in the table; otherwise insert the record.
    Another solution would be to use the non-key fields to generate a key (using some logic) and use that instead of the number range object.
    regards,
    Chen

  • Informatica failed to update records in target table

    Hi,
    Recently I converted an hourly full load into an incremental load. Everything is working well except updating records in the target: Informatica is not updating any records, and I'm trying to understand and troubleshoot the issue. Details are given below.
    I ran the mapping debugger and found that the Filter transformation is filtering the data out instead of passing it to the next transformation, because the update flag is X.
    ETL_PROC_WID and LKP_ETL_PROC_WID are the same, and the update flag = X.
    I don't understand why ETL_PROC_WID and LKP_ETL_PROC_WID are the same. As far as I know, Informatica generates a new ETL_PROC_WID for every ETL run. If you know anything about this, please let me know.
    I executed the code in SYS_COL_DIFF and UPDATE_FLG manually (on paper, for a few records), assuming ETL_PROC_WID and LKP_ETL_PROC_WID are different, and got UPDATE_FLG = U.
    If you know how the MPLT_GET_ETL_PROC_WID mapplet in the OOTB content works, please let me know.
    I appreciate your help.
    Thanks
    Jay.
    Edited by: JV123 on Dec 12, 2012 9:29 AM

    Welcome to the forum.
    You can try your hand at MERGE while performing the INSERT operation.
    Something like this:
    MERGE INTO Emp_Org eo
    USING (
              SELECT empno, deptno, empname, currenttimestamp FROM Emp
           ) x
    ON ( eo.empno = x.empno )
    WHEN NOT MATCHED THEN
         INSERT (
              empno,
              country,
              emporg,
              currenttimestamp,
              targettimestamp
         )
         VALUES (
              x.empno,
              'USA',      /* constant */
              'Emporg',   /* constant */
              x.currenttimestamp,
              SYSDATE     /* constant */
         );
    If you are not OK with the solution, then provide some sample data with the expected output.

  • How to find unmatched records in huge tables

    Hi,
    I want to find the fastest approach to finding records in two tables which do not match.
    To make it clear, lets say we have two tables.
    table_1(col1 number, col2 varchar2(20), ....other columns)
    and
    table_2(col3 number, col4 varchar2(20), ....other columns)
    col1, col2 from table_1 corresponds to col3, col4 of table_2.
    If a record in table_1 does not exist in table_2, or if one of the columns has a different value, I want to find all such records.
    Being an Oracle developer, I can easily find these using an outer join or EXISTS.
    But I want to find the fastest way, as those tables have millions of records.
    Maybe using segment or something like that...
    Any suggestions?

    If "the" fastest way existed, Oracle would not implement the others.
    You will have to test it. The easiest way is to use SET AUTOTRACE ON in SQL*Plus and compare the consistent gets of both statements (by the way, there is also the MINUS operator, which is easier to read).
    Dim
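For the two tables above, the MINUS comparison the reply mentions looks like this (comparing only the corresponding column pairs):

```sql
-- Rows of table_1 whose (col1, col2) pair has no exact match in table_2.
SELECT col1, col2 FROM table_1
MINUS
SELECT col3, col4 FROM table_2;
```

Running each candidate under SET AUTOTRACE ON and comparing the consistent gets, as suggested, is the practical way to choose between this and the outer-join or EXISTS versions.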

  • Updating records in a table

    Hi,
    ERP: 11.5.10.2, Database: 9i
    We have a requirement to update one column of a table for a customized Property Manager module.
    Basically this is old data which cannot be entered from the application side, as the field representing this column is disabled in the application. So the only way is to insert/update directly in the database (using an UPDATE statement).
    There are roughly 1000 to 2000 records; the same table has another column which can be used in the WHERE condition for updating the required column.
    Any suggestion on how to achieve this task, or any way to reduce the update time?
    Regards

    [Edit: please mark questions as answered when they are. This helps people know what posts to look at.]
    Test data:
    drop table a;
    create table a(id, NBR) as
    select level, level*1000 from DUAL
    connect by level <= 4;
    drop table B;
    create table B(id, NBR) as
    with MULT as (select level from DUAL connect by level <= 10)
    select id, 9999 from a, MULT;
    Solution:
    merge into B
    using (
      select B.rowid RID,
      a.NBR
      from a,B
      where a.id = B.id
      and a.NBR != B.NBR
    ) U
    on (B.rowid = U.RID)
    when matched then update set nbr = u.nbr;
    40 rows merged.Run the MERGE twice and the second time 0 rows will be merged. This confirms that rows are updated only when necessary.
    You could update a join, but only if table A has a primary key or unique index on ID.
    Edited by: Stew Ashton on Oct 29, 2012 10:46 AM
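The "update a join" alternative mentioned above would look roughly like this; it is legal only when table A has a primary key or unique index on ID, so that B is key-preserved in the join view:

```sql
-- Oracle updatable join view: updates the B side of the join.
UPDATE (
  SELECT b.nbr AS b_nbr, a.nbr AS a_nbr
  FROM   a, b
  WHERE  a.id = b.id
)
SET b_nbr = a_nbr;
```

Unlike the MERGE above, this version touches every joined row, not only the rows whose values actually differ.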

  • Update records from a table in correct sequence that look from 2 tables and loop

    Hi!
    My question title is kind of unclear, but here is how it goes.
    I created  2 tables for my BOM (Bill of Materials). I have BOM_Header and BOM_Detail. 
    Below are my sample data. 
    BOM_Header
    Product_ID Int
    Product_Name Varchar(50)
    Cost Numeric(18,2)
    Yield Varchar(50)
    Select * from BOM_Header
    1 Choco Cake 850.00 10
    2 Mixed Flour 700.00 30
    3 Choco Syrup 160.00 10
    4 Egg Formula 2150.00 20
    BOM_Detail
    Product_ID int
    ItemNo Int
    ItemName varchar(50)
    Quantity int
    Unit varchar(50)
    ProdCost numeric(18,2)
    Select * from BOM_Detail
    1 2 Mixed Flour 10 Grams 15.00
    1 3 Choco Syrup 20 ML 25.00
    1 4 Egg Formula 20 Grams 10.00
    2 101 Flour 5 packs 80.00
    2 4 Egg Formula 5 Grams 60.00
    3 201 Cocoa Power 2 kg 20.00
    3 202 Sugar 2 kg 60.00
    4 301 Whole Egg 10 pcs 85.00
    4 302 EP12 Formula 25 ml 52.00
    My computation is below.
    BOM_Header = a
    BOM_Detail = b
    a.Cost = b.Quantity x b.ProdCost where a.Product_ID = b.Product_ID
    My problem is how to automatically compute the food cost in sequence, from raw materials to finished products.
    Based on the data, I need to compute the Egg Formula first, because it is used as a component of Mixed Flour, then compute Mixed Flour and the other components to get the cost of the Choco Cake.
    How can I do this automatically in a query: look first at the detail to see whether there are ingredients within a sub-ingredient, and compute those before computing the final cost of the product?
    This is because ingredient costs change most of the time, and I need to recalculate to get the most up-to-date cost.
    Any suggestion is very much appreciated.
    Thank you very much,
    Regem

    >> My question title is kinda unclear but here how it goes. <<
    Then your answers will be unclear, too :(  
    You do not know data modeling, so your data types are wrong. What math are you doing with the product_id? None. This is why identifiers are CHAR(n) and not numeric. They should be industry standards if possible; I like the GTIN.
    You do not even know that rows are not records.
    Why is the product name fifty characters? Certainly not research! But if you are an Access programmer using a default value, you might do this.
    Besides violating ISO 11179 naming rules, "<vague>_field" makes no sense! It is a quantity, yet you put it in a string.
    CREATE TABLE BOM_Header
    (gtin CHAR(15) NOT NULL PRIMARY KEY,
     product_name VARCHAR(25) NOT NULL,
     unit_cost DECIMAL(18,2) NOT NULL
       CHECK (unit_cost >= 0.00),
     yield_qty INTEGER NOT NULL
       CHECK (yield_qty >= 0));
    >> Any suggestion is very much appreciated. <<
    Get a copy of my book on trees in SQL and read the chapter on BOM problems. I am not going to try to post a whole chapter and diagrams to answer this. You are doing the wrong things and have done them poorly.
    --CELKO-- Books in the Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL
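For what it is worth, the bottom-up cost roll-up the question asks for is commonly written with a recursive CTE. The sketch below is an assumption-laden illustration (not from the book): it treats the ProdCost of leaf components as the authoritative unit cost and multiplies quantities along each path, using the table and column names from the question in SQL Server syntax:

```sql
WITH parts AS (
  -- Anchor: the direct components of every product.
  SELECT d.Product_ID AS root_id, d.ItemNo, d.ProdCost,
         CAST(d.Quantity AS numeric(18,4)) AS path_qty
  FROM BOM_Detail d
  UNION ALL
  -- Recurse into components that are themselves products,
  -- multiplying the quantities along the path.
  SELECT p.root_id, d.ItemNo, d.ProdCost,
         CAST(p.path_qty * d.Quantity AS numeric(18,4))
  FROM parts p
  JOIN BOM_Detail d ON d.Product_ID = p.ItemNo
)
SELECT root_id,
       SUM(path_qty * ProdCost) AS rolled_up_cost
FROM parts
WHERE ItemNo NOT IN (SELECT Product_ID FROM BOM_Detail)  -- leaf components only
GROUP BY root_id;
```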

  • Performance updating a extra huge table

    Hi guys, just looking for advice. I'm handling tables with more than 300 million rows, sometimes even 800 million, and so far I have come up with some good solutions, but now I really need to be concerned about performance. I have a table with:
    FlyID int, FlyNumber int, SettlDate datetime2, SettlPeriod double, Consumpt dec, Ixl dec, Aunit int
    300 million rows. SettlDate is a date and SettlPeriod is a half hour (so 48 periods each day).
    The other table is:
    BMUnit int, SettlDate datetime2, SettlPeriod double, Chargefact dec
    I'm going to join the two tables on BMUnit = BMUnit, SettlDate = SettlDate, SettlPeriod = SettlPeriod and fill a new table with an INSERT.
    Fingers crossed, I hope it works within a reasonable time (3 hours... more?).
    The real concern is:
    I got another table with
    FlyID int, Company varchar, CompanyID int, FromDate datetime, ToDate datetime
    The logic should be something like this:
    UPDATE table1 SET companyid = c.companyid, company = c.company
    WHERE table1.flyid = c.flyid
    AND settlementdate >= c.fromdate AND settlementdate <= c.todate
    but just yesterday I tried something similar without the date condition, and the query ran for more than seven hours, so I had to kill it. I'm wondering if there is a better way. All this is because I'm going to build several cubes taking one big table as the source, which
    makes retrieval really fast. So far I have practically cut entire hours, but now I need this one more element, and before I start writing code I'd like to hear some of your advice.
    Thanks

    Tables that large are always a problem for major maintenance.
    I would do your update in batches:
    DECLARE @cnt int;
    SET @cnt = 1;
    WHILE @cnt > 0
    BEGIN
        UPDATE TOP (1000000) t
        SET companyid = c.companyid, company = c.company
        FROM table1 t
        JOIN company c ON t.flyid = c.flyid
        WHERE t.settlementdate >= c.fromdate AND t.settlementdate <= c.todate
          AND (t.companyid <> c.companyid OR t.company <> c.company); -- skip rows already updated, so the loop terminates
        SET @cnt = @@ROWCOUNT;
    END

  • XSU : Updating records in the table

    Hi,
    XSU seems to assume that the data from the XML document always has to be INSERTED into the database. Shouldn't it UPDATE existing records and INSERT only the new ones?
    How should this be implemented?

    A future release is planned to offer this so-called "upsert" functionality, but the current release requires you to know whether you want to insert, update, or delete.
    You can use the insert functionality, in combination with an INSTEAD OF INSERT trigger to programmatically handle the "update-if-already-exists" functionality.
    See Example 12-17 on page 465 of "Building Oracle XML Applications" for a concrete example of this hand-coded "upsert" functionality.
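A minimal sketch of that INSTEAD OF INSERT approach (the view, table, and column names here are illustrative, not from the book's example):

```sql
-- XSU inserts into this view instead of the base table.
CREATE OR REPLACE VIEW emp_upsert_v AS
  SELECT empno, ename, sal FROM emp;

CREATE OR REPLACE TRIGGER emp_upsert_trg
INSTEAD OF INSERT ON emp_upsert_v
FOR EACH ROW
BEGIN
  -- Try to update an existing row first ...
  UPDATE emp
     SET ename = :NEW.ename, sal = :NEW.sal
   WHERE empno = :NEW.empno;
  -- ... and insert only if no row matched.
  IF SQL%ROWCOUNT = 0 THEN
    INSERT INTO emp (empno, ename, sal)
    VALUES (:NEW.empno, :NEW.ename, :NEW.sal);
  END IF;
END;
/
```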

  • Update record using SQL statement

    I have VB 6.0 and Oracle 10g Express Edition on Windows 2000 Server. My VB 6.0 procedure can't update a record in the table using a SQL statement; the error message is "Missing SET keyword".
    The SQL statement in VB6.0 look like this :
    General Declaration
    Dim conn as New ADODB.Connection
    Dim rs as New ADODB.Recordset
    Private Sub Command1_Click()
    dim sql as string
    sql = " UPDATE my_table " & _
    " SET Name = ' " & Text3.Text & " ' " & _
    " AND Unit = ' " & Text2.Text & " ' " & _
    " WHERE ID = ' " & Text1.Text & " ' "
    conn.Execute (sql)
    Private Sub Form Load()
    Set conn = New ADODB.Connection
    conn.Open "Provider=MSDASQL;" & "Data Source=my_table;"& "User ID =marketing;" & "Password=pass123;"
    I'm sorry about my language.
    What's wrong with my SQL statement? I need help, ASAP.
    Best Regards,
    /Harso Adjie

    The syntax should be:
    UPDATE xx
    SET fld_1 = 'xxxx',
        fld_2 = 'YYYY'
    WHERE ...
    'AND' does not belong in the SET clause; separate the assignments with commas.
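Expanded into plain SQL, the statement the VB code should be building reads like this (literal values stand in for the TextBox contents):

```sql
UPDATE my_table
SET    Name = 'some name',
       Unit = 'some unit'
WHERE  ID = 'some id';
```

Note also that the extra spaces inside the quotes in the VB string (e.g. ' " & Text3.Text & " ') would store leading and trailing blanks in the columns; bind parameters would avoid both problems.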

  • Inserting records across multiple tables

    I'm still pretty new to working with databases, but have been
    fine using DW to use forms to add, edit and delete records from a
    flat table.
    I'm less sure about updating records across multiple tables,
    for example in a one-to-many relationship with a look-up table, e.g.
    If I have three tables
    Companies :
    CompanyID (INT, auto increment)
    Company
    Address
    etc
    Contacts :
    ContactID (INT, auto increment)
    FirstName
    LastName
    etc
    CompanyContacts :
    CompanyID (INT)
    ContactID (INT)
    It's straightforward enough to create pages to insert new
    records into the Companies or Contacts tables, but how do I go
    about populating the CompanyContacts table when I add a new record
    in the Contacts table, so that it becomes 'attached' to a
    particular 'Company'?
    If that makes sense.
    Iain
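One common pattern (a sketch assuming a MySQL-style AUTO_INCREMENT key and the three tables above):

```sql
-- Insert the new contact first ...
INSERT INTO Contacts (FirstName, LastName)
VALUES ('Jane', 'Smith');

-- ... then link it to the chosen company in the junction table,
-- reusing the ContactID that the insert above generated.
INSERT INTO CompanyContacts (CompanyID, ContactID)
VALUES (42, LAST_INSERT_ID());  -- 42 is the selected company's ID (illustrative)
```

LAST_INSERT_ID() is MySQL-specific; other databases use sequences or a RETURNING/OUTPUT clause to get the new key.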


  • Detect updated records

    Hi all,
    Is there a way to find all the updated records of a table?
    (I don't have any field on the table that can help me, such as an update_date or similar.)
    Thanks.

    I don't think we're disagreeing, just want to make sure I'm following
    rp0428 wrote:
    >
    If there was a timestamp on the row, your refresh process would need to know the timestamp that the destination was current as of and would need to pull the changes since that timestamp.
    >
    Unfortunately that approach has a big, gaping hole which can render it unreliable.
    >
    True. But using a timestamp to do the replication was never a suggestion here. Dan correctly pointed out that if you wanted to convert the SCN to a timestamp, there are limits on the accuracy of the conversion and the time frame during which that conversion is possible. I was simply emphasizing that there was no need to do that conversion in the first place unless you needed to know when a row changed rather than merely that it had changed, and I compared how you would identify changed data using ORA_ROWSCN to how you might do it if there were a timestamp that could be used.
    The value of the timestamp is typically set using SYSDATE or SYSTIMESTAMP, but the record containing that value isn't 'current' until it is committed.
    So if a query/process begins before midnight tonight but is not committed until after midnight (e.g. 1 a.m. tomorrow), the timestamp will have today's value but the row will NOT get pulled by a query that runs after midnight but before 1 a.m. and so will be missed by tonight's batch process.
    That data will also not get processed in tomorrow's batch process, because the timestamp makes it appear as if it has already been processed.
    Timestamps only work reliably when it is known that the above use case cannot happen. Or you can define a maximum transaction length and pull data since last_extract_time - maximum_transaction_length. Obviously not perfect, which is why there are plenty of technologies that let you replicate changes (CDC, Streams, materialized views, etc.) rather than rolling your own solution. Using the SCN would, of course, be preferable from a correctness standpoint in the extremely rare case that there is a need to roll your own change data capture process.
    Justin
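A small sketch of the ORA_ROWSCN comparison described above (the table name and bind variable are illustrative):

```sql
-- Each row exposes the SCN of its last change through the ORA_ROWSCN pseudocolumn,
-- so rows changed since the SCN recorded at the last extract can be found with:
SELECT t.*, ORA_ROWSCN
FROM   my_table t
WHERE  ORA_ROWSCN > :last_extract_scn;
```

By default ORA_ROWSCN is tracked per block; the table must be created with ROWDEPENDENCIES for row-level accuracy.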

  • How to update multiple records in a table created in view (web dynpro)

    Here is my coding......
    *coding to get the district value
    DATA lo_nd_district TYPE REF TO if_wd_context_node.
        DATA lo_el_district TYPE REF TO if_wd_context_element.
        DATA ls_district TYPE wd_this->element_district.
        DATA lv_district_txt LIKE ls_district-district_txt.
    * navigate from <CONTEXT> to <DISTRICT> via lead selection
        lo_nd_district = wd_context->get_child_node( name = wd_this->wdctx_district ).
    * get element via lead selection
        lo_el_district = lo_nd_district->get_element(  ).
    * get single attribute
        lo_el_district->get_attribute(
          EXPORTING
            name =  `DISTRICT_TXT`
          IMPORTING
            value = lv_district_txt ).
    *coding to display records when clicking a button (Submit)
    DATA lo_nd_table TYPE REF TO if_wd_context_node.
    DATA lo_el_table TYPE REF TO if_wd_context_element.
    DATA ls_table TYPE wd_this->element_table.
      DATA lv_district LIKE ls_table-district.
    * navigate from <CONTEXT> to <TABLE> via lead selection
      lo_nd_table = wd_context->get_child_node( name = wd_this->wdctx_table ).
    * get element via lead selection
      lo_el_table = lo_nd_table->get_element(  ).
    * get single attribute
      lo_el_table->set_attribute(
        EXPORTING
          name =  `DISTRICT`
       " IMPORTING
          value = lv_district_txt ).
    The above coding updates only one record in the table created in the view.
    If I enter a 2nd district value, the first record in the table is overwritten.
    My need is that the record should not be overwritten:
    the 2nd record should be displayed after the 1st record.
    Can anyone help me and send the coding, please?

    Instead of using set_attribute, you should use the bind_table method to display/update the records in the table view.
    Step 1) Collect all the data in a local table.
    Step 2) Then bind that local table to your node:
    search1 = wd_context->get_child_node( name = `TABLE1` ).
    search1->bind_table( lt_detail ).
    Here lt_detail is your local table and TABLE1 is the node which is bound to the table UI element.

  • Unable to update records in table

    Hi all,
    While updating a particular set of records in a table, the system hangs; however, it allows updating the rest of the records. I checked for locks using
    v$locked_object and v$session, but I couldn't find any. Is there any way to find locks at all levels?
    thanks

    Grab the SID of the session that you are running, via sys_context('USERENV','SID') or something else, and then check out the wait events and locks:
    select * from gv$lock where sid = <sid>;
    select * from gv$session_wait where sid = <sid>;
    P.S. Watch out for triggers... YUCK!
    Message was edited by:
    JoeC
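On 10g and later, a quick way to see who is blocking whom, complementing the queries above:

```sql
-- Sessions currently stuck behind another session's lock.
SELECT sid, blocking_session, event, seconds_in_wait
FROM   v$session
WHERE  blocking_session IS NOT NULL;
```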
