Best way to refresh 5 million row table

Hello,
I have a table with 5 million rows that needs to be refreshed every 2 weeks.
Currently I am dropping and recreating the table, which takes a very long time and finishes with a tablespace-related warning. The table does end up with the correct number of rows, but I am not sure why I get the tablespace warning at the end.
Any help is greatly appreciated.
Thanks.

Can you please post your query?
1. What is the size of the temporary tablespace?
2. Is your query performing any sorts?
Monitor the TEMP tablespace usage with the query below after executing your SQL:
SELECT TABLESPACE_NAME, BYTES_USED, BYTES_FREE
FROM V$TEMP_SPACE_HEADER;
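If you need to see which session is consuming the temporary space, a similar query against V$TEMPSEG_USAGE can help (a sketch only; it assumes you have access to the V$ views):
SELECT s.sid, s.username, u.tablespace, u.segtype, u.blocks
FROM   v$session s, v$tempseg_usage u
WHERE  s.saddr = u.session_addr
ORDER  BY u.blocks DESC;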

Similar Messages

  • Best way to update individual rows of a Table?

    I've taken a look at some examples, though I haven't gotten any clarification on this.  I am looking to have something close to a listbox or table where I can update just a single column of row values at a once-per-second pace.  I am looking to display our data-acquisition values in a table or listbox.  The single listbox seemed to work well for this, but I was unable to use row headers to list the channel names next to the channel values.  I was thinking about connecting the cursor values of two list-boxes to do this, but didn't find any info on this for the single list-box.
    I have a few questions:
    1) I have a 1D array to where I want to use that array of data to constantly update the first column (with a multitude of rows) of a table.  I am looking for the best route so as not to take up too much processing time in doing this.
    What is the best way to update individual rows of a table?   Invoke Node "Set Cell Value" ... or is there another method?
    2) Why is it that after every other iteration the row values are erased? 
    Also, for adding additional strings to the original array ... is it best to use the "Array Subset" and then the "Build Array" function, or the "Array Subset" and "Insert Into Array" function?
    See the attached example.
    Thanks.
    Solved!
    Go to Solution.
    Attachments:
    Table Example.vi ‏19 KB

    Jeff·Þ·Bohrer wrote:
    2) Why is it that after every other iteration the row values are erased?
    Classic race condition.  Dump the for loop and p-node and just wire the 2D array to the table terminal!
    I'm not seeing the race condition.  What I am seeing is the table emptying after the last element was written to it on every other run.  I watched this with highlight execution on.
    But I'm in full agreement with just writing to the terminal.  It is a 1D array, so you will need to use a build array and transpose 2D array in order for it to write properly.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Best way to insert millions of records into the table

    Hi,
    From a performance point of view, I am looking for suggestions on the best way to insert millions of records into a table.
    Please also guide me on how to implement this in the easiest way for the best performance.
    Thanks,
    Orahar.

    Orahar wrote:
    It's distributed data. N number of clients fetch transaction data from the database based on different conditions and insert it into another transaction table, like a batch process.
    Sounds contradictory.
    If the source data is already in the database, it is centralised.
    In that case you ideally do not want the overhead of shipping that data to a client, the client processing it, and the client shipping the results back to the database to be stored (inserted).
    It is much faster and more scalable for the client to instruct the database (via a stored proc or package) what to do, and for that code (running on the database) to process the data.
    For a stored proc, the same principle applies. It is faster for it to instruct the SQL engine what to do (via an INSERT..SELECT statement) than to pull the data from the SQL engine using a cursor fetch loop and then push that data back to the SQL engine using an INSERT statement.
    An INSERT..SELECT can also be done as a direct path insert. This introduces some limitations, but is faster than a normal insert.
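    For illustration only, a minimal sketch of such a direct-path insert (the table and column names here are invented, not from the thread):
    INSERT /*+ APPEND */ INTO target_txn (txn_id, cust_id, amount, status)
    SELECT s.txn_id, s.cust_id, s.amount, 'N'
    FROM   source_txn s
    WHERE  s.txn_date >= TRUNC(SYSDATE) - 14;
    COMMIT;
    -- With APPEND the rows are loaded above the high-water mark and the table
    -- cannot be queried in the same transaction until the commit.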
    If the data processing is too complex for an INSERT..SELECT, then pulling the data into PL/SQL, processing it there, and pushing it back into the database is the next best option. This should be done using bulk processing though in order to optimise the data transfer process between the PL/SQL and SQL engines.
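    A hedged sketch of that bulk pattern (again with invented names), fetching in batches and writing back with FORALL:
    DECLARE
      TYPE t_ids     IS TABLE OF source_txn.txn_id%TYPE;
      TYPE t_amounts IS TABLE OF source_txn.amount%TYPE;
      l_ids     t_ids;
      l_amounts t_amounts;
      CURSOR c_src IS SELECT txn_id, amount FROM source_txn;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_ids, l_amounts LIMIT 1000;
        EXIT WHEN l_ids.COUNT = 0;
        -- the complex row-by-row processing would happen here, in PL/SQL
        FORALL i IN 1 .. l_ids.COUNT
          INSERT INTO target_txn (txn_id, amount)
          VALUES (l_ids(i), l_amounts(i));
      END LOOP;
      CLOSE c_src;
      COMMIT;
    END;
    /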
    Other performance considerations are the constraints on the insert table, the triggers, the indexes and so on. Make sure that data integrity is guaranteed (e.g. via PKs and FKs) and optimal (e.g. FK columns should be indexed). Using triggers - well, that may not be the best approach (for example, using a trigger to assign a sequence value when it can be done faster in the insert SQL itself). Personally, I avoid using triggers - I would rather have that code residing in a PL/SQL API for manipulating data in that table.
    The type of table also plays a role. Make sure that the decision about the table structure, hashed, indexed, partitioned, etc, is the optimal one for the data structure that is to reside in that table.

  • Best way to deal with Mutating table exception with Row Level Triggers

    Hello,
    It seems that the best way to deal with mutating table exceptions is to put all the trigger code in a package and use it in conjunction with a statement-level trigger.
    This sounds quite cumbersome to me. I wonder, is there any alternative way of dealing with mutating table exceptions?
    With Regards

    AskTom has a good article about this,
    http://asktom.oracle.com/tkyte/Mutate/index.html
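    For what it is worth, here is a minimal sketch of the package + statement-level trigger pattern mentioned above, against a hypothetical EMP table (every name here is invented):
    CREATE OR REPLACE PACKAGE emp_trg_pkg AS
      TYPE t_ids IS TABLE OF emp.empno%TYPE INDEX BY PLS_INTEGER;
      g_ids t_ids;
    END emp_trg_pkg;
    /
    CREATE OR REPLACE TRIGGER emp_bs_trg
    BEFORE UPDATE ON emp
    BEGIN
      emp_trg_pkg.g_ids.DELETE;  -- reset the collection for this statement
    END;
    /
    CREATE OR REPLACE TRIGGER emp_ar_trg
    AFTER UPDATE ON emp
    FOR EACH ROW
    BEGIN
      -- only remember the affected row here; querying EMP now would hit the mutating table error
      emp_trg_pkg.g_ids(emp_trg_pkg.g_ids.COUNT + 1) := :NEW.empno;
    END;
    /
    CREATE OR REPLACE TRIGGER emp_as_trg
    AFTER UPDATE ON emp
    BEGIN
      -- at statement level it is safe to query EMP using the ids collected by the row trigger
      FOR i IN 1 .. emp_trg_pkg.g_ids.COUNT LOOP
        NULL;  -- cross-row validation against EMP would go here
      END LOOP;
    END;
    /
    On 11g and later a compound trigger can package these pieces into a single object, but the idea is the same.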

  • JPA -- Best way to refresh a List association?

    Hi,
    I need to refresh a OneToMany association.
    For example, I have two entities: Header & Detail.
    @Entity
    @Table(name="HEADERS")
    public class Header implements Serializable {
        @OneToMany(mappedBy="header")
        private List<Detail> details;
    }
    @Entity
    @Table(name="DETAILS")
    public class Detail implements Serializable {
        @ManyToOne(fetch=FetchType.LAZY)
        @JoinColumn(name="HDR_ID", referencedColumnName="HDR_ID")
        private Header header;
    }
    So, I fetch the Header along with all its Details.
    At a later point of time, I know that some Detail rows in the database have been changed behind my back. I need to re-fetch the list of Details. What should I do?
    1. I could add a cascade parameter to the @OneToMany association. I could specify:
    @OneToMany(mappedBy="header", cascade={CascadeType.REFRESH})
    Then I could run:
    entityManager.refresh(header);
    The trouble is that, since all the Details are already in the cache, the cached entities will be returned, not the ones fetched from the database. So, I won't refresh a thing. A query will indeed be sent to the database, but I will get the cached (i.e. stale) entities. I don't know of a way to specify something like
    setHint(TopLinkQueryHints.REFRESH, HintValues.TRUE)
    dynamically for associations, so that the values in the cache would be replaced with the ones fetched from the database.
    2. I could try to turn off the caching for the whole Entity class. The trouble is that for some reason this doesn't work (see my other question here: JPA -- How can I turn off the caching for an entity?). Besides, even if it worked, I don't want to turn off the caching in general. I simply want to refresh the list sometimes.
    Could anyone tell me what's the best way to refresh the association?
    Best regards,
    Bisser

    Hi Chris,
    First, let me thank you that you take the time to answer my questions. I really appreciate that. I wish to apologize for my late reply but I wasn't around the PC for a while.
    TopLink doesn't refresh an entity based on a view. I will try to explain in more detail. I hope you'll have patience with me because this might be a bit longer even than my previous post. I will oversimplify my actual business case.
    Let's assume we have two tables and a view:
    create table MASTERS
      (id number(18) not null primary key,
       master_name varchar2(50));
    create table DETAILS
      (id number(18) not null primary key,
       master_id number(18) not null,   -- FK to MASTER.ID
       price number(7,2));
    create view DETAILS_VW as
      select id, master_id, price
      from details;
    Of course, in real life the view is useful and actually performs complex aggregate calculations on the details. But at the moment I wish to keep things as simple as possible.
    So, I create Entities for the tables and the view. Here are the entities for MASTERS and DETAILS_VW, only the essential stuff (w/o getters, setters, sequence info, etc.):
    @Entity
    @Table(name="MASTERS")
    public class Master {
         @Id
         @Column(name="ID", nullable=false)
         private Long id;
         @Column(name="MASTER_NAME")
         private String masterName;
         @OneToMany(mappedBy="master", fetch=FetchType.LAZY, cascade=CascadeType.REFRESH)
         private List<DetailVw> detailsVw;
    }
    @Entity
    @Table(name="DETAILS_VW")
    public class DetailVw {
         @Id
         @Column(name="ID")
         private Long id;
         @ManyToOne(fetch=FetchType.LAZY)
         @JoinColumn(name="MASTER_ID", referencedColumnName="ID")
         private Master master;
         @Column(name="PRICE")
         private Double price;
    }
    So, now we have the tables and the entities. Let's assume one master row and two detail rows exist:
    MASTER:  ID=1, MASTER_NAME='Master #1'
    DETAIL:  ID=1, MASTER_ID=1, PRICE=3
     DETAIL:  ID=2, MASTER_ID=1, PRICE=8
    And now let's run the following code:
    // List the initial state
    Master master = em.find(Master.class, 1L);
    List<DetailVw> detailsVw = master.getDetailsVw();
     for (DetailVw dv : detailsVw) {
          System.out.println(dv);
     }
     // Modify a detail
     EntityTransaction tx = em.getTransaction();
     tx.begin();
     Detail d = em.find(Detail.class, 2L);
     d.setPrice(1.0);
    tx.commit();
    // Refresh
    System.out.println("----------------------------------------");
    em.refresh(master);
    // List the state AFTER the update
    detailsVw = master.getDetailsVw();
    for (DetailVw dv : detailsVw) {
         System.out.println(dv);
     }
    And here are some excerpts from the console (only the essentials):
    DetailVw: id=1, price=3
    DetailVw: id=2, price=8
    UPDATE DETAILS SET PRICE = ? WHERE (ID = ?)
         bind => [1, 2]
    SELECT ID, MASTER_NAME FROM MASTERS WHERE (ID = ?)
         bind => [1]
    SELECT ID, PRICE, MASTER_ID FROM DETAILS_VW WHERE (MASTER_ID = ?)
         bind => [1]
    DetailVw: id=1, price=3
     DetailVw: id=2, price=8
    You see, the UPDATE statement changes the DETAILS row. The price was 8, but was changed to 1. I checked the database. It was indeed changed to 1.
    Furthermore, due to the refresh operation, a query was run on the view. But as you can see from the console output, the results of the query were completely ignored. The price was 8, and continued to be 8 even after the refresh. I assume it was because of the cache. If I run an explicit query on DETAILS_VW with the hint:
     q.setHint(TopLinkQueryHints.REFRESH, HintValues.TRUE);
    then I see the real updated values. But if I only refresh with em.refresh(master), then the DetailVw entities do not get refreshed, even though a query against the database is run. I have tested this both in JavaSE and in OC4J. The results are the same.
    An explicit refresh on a particular DetailVw entity works, though:
    DetailVw dvw = em.find(DetailVw.class, 2L);
    em.refresh(dvw);
     System.out.println(dvw);
    Then the console says:
     DetailVw: id=2, price=1
    So, the price is indeed 1, not 8.
    If you can explain that to me, I will be really thankful!
    Best regards,
    Bisser

  • Best way to refresh page after returning from task flow?

    Hello -
    (Using jdev 11g release 1)
    What is the best way to refresh data in a page after navigating to and returning from a task flow with an isolated data control scope where that data is changed and commited to the database?
    I have 2 bounded task flows: list-records-tf and edit-record-tf
    Both use page fragments
    list-records-tf has a list.jsff fragment and a task flow call to edit-record-tf
    The list.jsff page has a table of records that a user can click on and a button which, when pressed, will pass control to the edit-record-tf call. (There are also set property listeners on the button to set values in the request that are used as parameters to edit-record-tf.)
    The edit-record-tf always begins a new transaction and does not share data controls with the calling task flow. It consists of an application module call to set up the model according to the parameters passed in (edit record X or create new record Y or...etc.), a page fragment with a form to allow users to edit the record, and 2 different task flow returns for saving/cancelling the transaction.
    Back to the question - when I change a record in the edit page, the changes do not show up on the list page until I requery the data set. What is the best way to get the list page to refresh itself automatically upon return from the edit-record-tf?
    (If I ran the edit task flow in a popup dialog I could just use the return listener on the command component that launched the popup. But I don't want to run this in a dialog.)
    Thank you for reading my question.

    What if you have a bean with a refresh method, passed as a task flow parameter? Call that method after you save the data, or use a contextual event.

  • Best way to outer join a table that is doing a sub query

    RDBMS : 11.1.0.7.0
    Hello,
    What is the best way to outer join a table that is doing a sub query? This is a common scenario in EBS for the date tracked tables.
    SELECT papf.full_name, fu.description
      FROM fnd_user fu
          ,per_all_people_f papf
    WHERE fu.user_id = 1772
       AND fu.employee_id = papf.person_id(+)
       AND papf.effective_start_date = (SELECT MAX( per1.effective_start_date )
                                          FROM per_all_people_f per1
                                         WHERE per1.person_id = papf.person_id)
    Output:
    No output produced because the outer join cannot be done on the sub query.
    In this case I did a query in the FROM clause. Is this my best option?
    SELECT papf.full_name, fu.description
      FROM fnd_user fu
          ,(SELECT full_name, person_id
              FROM per_all_people_f papf
             WHERE papf.effective_start_date = (SELECT MAX( per1.effective_start_date )
                                                  FROM per_all_people_f per1
                                                 WHERE per1.person_id = papf.person_id)) papf
    WHERE fu.user_id = 1772
       AND fu.employee_id = papf.person_id(+)
    Output:
    FULL_NAME     DESCRIPTION
    {null}        John Doe
    Thanks,
    --Johnnie

    Hi,
    BrendanP wrote:
    ... See the adjacent thread for the other with Row_Number().
    Do you mean {message:id=10564772}? Which threads are adjacent is always changing. Post a link.
    I think RANK suits the requirements better than ROW_NUMBER:
    WITH all_matches AS
    (
         SELECT  papf.full_name
         ,       fu.description
         ,       RANK () OVER ( PARTITION BY papf.person_id
                                ORDER BY     papf.effective_start_date DESC
                              )  AS r_num
         FROM              fnd_user          fu
         LEFT OUTER JOIN   per_all_people_f  papf  ON  fu.employee_id = papf.person_id
         WHERE   fu.user_id  = 1772
    )
    SELECT  full_name
    ,       description
    FROM    all_matches
    WHERE   r_num = 1
    ;
    Johnnie: I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables involved, and also post the results you want from that data.
    See the forum FAQ {message:id=9360002}

  • Access-SQL Server (Client Server Configuration) Best Way To Refresh SQL Server Records ?

    We are using Access 2013 as the front end and SQL Server 2014 as the back end to a client server configuration.
    Access controls are bound to the SQL fields with the same names. When using Access to create a new record in a Form, the data are not transferred to SQL if the form is exited to display a different Form or Access is closed. If the right or left arrow navigation
    buttons at the bottom of the form are first used to display either the previous or next record, then the data in the new record are correctly transferred to SQL.
    What is the best way to refresh the new SQL record prior to the closing of the new record in the bound Access form ? We have tried Requery of the entire form and with all of the individual controls without success. We are looking for a method of refreshing
    SQL that functions in a manner similar to that of what happens with the navigation buttons.
    Thank you very much for your assistance.
    Robert Robinson
    RERThird

    Hi Stefan,
    I had added the code to set Me.Dirty = False in response to the On Dirty event and didn't realize that it was working properly. I had tried several other approaches and must have become confused somewhere along the line.
    I retested the program. On Dirty is working and the problem is solved.
    Thank you very much for your assistance.
    Robert Robinson
    RERThird

  • Delete from 95 million rows table ...

    Hi folks, I need to delete from a regular table with 95 million rows. What are my best options? I have tried CTAS using parallel, but it failed after 1+ hours due to a bad query. Is there any other way to achieve this?
    Thanks in advance.
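    For reference, the CTAS approach mentioned here usually looks roughly like this (the table name and keep-condition below are invented; indexes, grants and constraints would still have to be recreated before the swap):
    CREATE TABLE big_table_keep
      NOLOGGING PARALLEL 8
    AS
      SELECT *
      FROM   big_table
      WHERE  created_date >= ADD_MONTHS(SYSDATE, -12);  -- rows you want to KEEP
    ALTER TABLE big_table      RENAME TO big_table_old;
    ALTER TABLE big_table_keep RENAME TO big_table;
    -- DROP TABLE big_table_old PURGE;  -- once everything has been verified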

    user8604530 wrote:
    Hi folks, I need to delete from a regular table with 95 million rows. What are my best options? I have tried CTAS using parallel, but it failed after 1+ hours due to a bad query. Is there any other way to achieve this?
    Thanks in advance.
    How many rows are in the table BEFORE the DELETE?
    How many rows are in the table AFTER the DELETE?
    How do I ask a question on the forums?
    SQL and PL/SQL FAQ
    Handle:     user8604530
    Status Level:     Newbie
    Registered:     Mar 10, 2010
    Total Posts:     64
    Total Questions:     26 (22 unresolved)
    I extend to you my condolences since you rarely get your questions answered.

  • BEST WAY TO REFRESH A MATERIALIZED VIEW

    Hi, I have a materialized view built over two base tables: Table A is a dynamic table (it has inserts, updates and deletes), and Table B is a fixed table that does not change over time (it is a dates table). The table behind the materialized view is very big (120 million rows), so the refresh has to be very efficient in order not to affect database performance.
    I was thinking of a fast refresh, but it does not work because the log created on Table B is not usable. The thing is that I created the two materialized view logs (Tables A and B), and when I execute the dbms_mview.refresh(list => 'MV', method => 'F') statement I get the error
    cannot use rowid column from materialized view log on "Table B" ... remember that this table does not change over time ...
    I need to maintain the materialized view up to date, but do not know how ... Please help!

    The Materialized View name is test_sitedate2
    The Table B name is GCO01.FV_DATES .... it is a fixed table ... it does not change over time ...
    The Code of the MV is
    CREATE MATERIALIZED VIEW TEST_SITEDATE2
    REFRESH FORCE ON DEMAND
    AS
    SELECT site_id, date_stamp
    FROM gco01.fv_site, gco01.fv_dates
    where fv_dates.date_stamp >= fv_site.start_date
    and fv_dates.date_stamp >= to_date('01/03/2010', 'dd/mm/yyyy')
    and fv_dates.date_stamp < to_date('01/04/2010', 'dd/mm/yyyy');
    Each of the tables gco01.fv_site and gco01.fv_dates has its materialized view log created.
    The error is ....
    SQL> execute dbms_mview.refresh(list =>'test_sitedate2', method => 'F');
    begin dbms_mview.refresh(list =>'test_sitedate2', method => 'F'); end;
    ORA-12032: cannot use rowid column from materialized view log on "GCO01"."FV_DATES"
    ORA-06512: at "SYS.DBMS_SNAPSHOT", line 2254
    ORA-06512: at "SYS.DBMS_SNAPSHOT", line 2460
    ORA-06512: at "SYS.DBMS_SNAPSHOT", line 2429
    ORA-06512: at line 1
    Thanks.
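    For reference, the usual prerequisites for fast refresh of a join-only materialized view are MV logs created WITH ROWID on every base table and the base-table rowids included in the MV's SELECT list, roughly as sketched below. Whether this particular view can be fast refreshed at all (given the inequality join conditions) should be checked with DBMS_MVIEW.EXPLAIN_MVIEW; the sketch only shows the general pattern.
    CREATE MATERIALIZED VIEW LOG ON gco01.fv_site  WITH ROWID;
    CREATE MATERIALIZED VIEW LOG ON gco01.fv_dates WITH ROWID;
    CREATE MATERIALIZED VIEW test_sitedate2
    REFRESH FAST ON DEMAND
    AS
    SELECT fv_site.ROWID  AS site_rid,
           fv_dates.ROWID AS date_rid,
           fv_site.site_id,
           fv_dates.date_stamp
    FROM  gco01.fv_site, gco01.fv_dates
    WHERE fv_dates.date_stamp >= fv_site.start_date
    AND   fv_dates.date_stamp >= TO_DATE('01/03/2010', 'dd/mm/yyyy')
    AND   fv_dates.date_stamp <  TO_DATE('01/04/2010', 'dd/mm/yyyy');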

  • Best way to update an OLTP table ?

    Hi,
    We have an OLTP table with huge data.
    We need to update a status column from 'N' to 'Y' for almost 70% of rows based on some condition.
    This table may be accessed by hundreds of sessions at a time.
    So, what is the best way to do this?
    Rgds,
    Rup

    If someone is using the table, DDL cannot be done (or at least you might have to wait a long time).
    quick test...
    SQL> create table bank
      2  (id number primary key
      3  ,acc number
      4  ,ind varchar2(1)
      5  )
      6  /
    Table created.
    SQL> insert into bank
      2  select rownum
      3       , rownum * 10
      4       , 'N'
      5    from all_objects
      6   where rownum <= 10
      7  /
    10 rows created.
    SQL> commit;
    Commit complete.
    SQL> update bank
      2     set acc = -10
      3   where id = 10
      4  /
    1 row updated.
    In a new session:
    SQL> alter table bank
      2  add new_ind varchar2(1)
      3  /
    alter table bank
    ERROR at line 1:
    ORA-00054: resource busy and acquire with NOWAIT specified
    Well, not a long time... but anyway you can't do DDL while someone is working on the table.

  • Best way to load data in table from combination of Table and flat file?

    Hi All,
    Could you please share your thoughts on best way of achieving this objective -
    Flat File - 15 Million records (Field A,B,C)
    Table A - 15 Million records ( Field A,D)
    Objective -
    Load Field A,B,C,D in Table B from Flat file and Table A.
    Data can be loaded from the flat file into Table B and then updated from Table A, but this update operation is taking a lot of time (intermediate commits and bulk operations already tried).
    Regards,
    Dark Knight

    Environment -
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Tables are analyzed.
    Indexes are there.
    The update statement is using the index.
    Data is close to 200 MB.
    I am interested in knowing if there are alternate ways of doing this, other than conventional way of loading data then updating it.
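    One alternative worth considering (not mentioned in this thread, so purely a sketch with invented directory, file and column names) is to expose the flat file as an external table and load Table B in a single set-based INSERT..SELECT that joins the file to Table A, instead of loading first and updating afterwards:
    CREATE OR REPLACE DIRECTORY load_dir AS '/data/incoming';
    CREATE TABLE flat_file_ext (
      field_a VARCHAR2(30),
      field_b VARCHAR2(30),
      field_c VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY load_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('data.csv')
    );
    INSERT /*+ APPEND */ INTO table_b (field_a, field_b, field_c, field_d)
    SELECT f.field_a, f.field_b, f.field_c, a.field_d
    FROM   flat_file_ext f, table_a a
    WHERE  a.field_a = f.field_a;
    COMMIT;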

  • Best way to load many rows on datagrid with MVVM?

    Hi,
    I have a table with more than 2 million rows and 20 columns. Of course this takes a long time to load into the datagrid and often causes an out-of-memory exception. What approach can I follow to load this dynamically and efficiently?
    In the attached project, for example, in ordersview.xaml I am binding Orders as follows. I guess that using async helps, but not completely in my case. I need some kind of paging and background loading. The sample
    project doesn't have that much data, but please assume that there are many Orders and I want to load all of them.
    <DataGrid Name="grdOrders" CopyingRowClipboardContent="DataGrid_CopyingRowClipboardContent" AutoGenerateColumns="False"
    ItemsSource="{Binding Orders, IsAsync=True}"
    protected async override void GetData( )
    "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

    I'll mention some complications first.
    If you keep a context alive, by default it will have change tracking for anything you read using it.  This is irrelevant if you read like 20 records.  Once you start talking many thousands, this is an overhead you want to avoid.
    A common dodge is to open a context, grab data, transform it elsewhere and then close the context.
    Work with your data disconnected.
    That means you lose change tracking but you can add such properties as IsDirty to a wrapping viewmodel so they each track their own changes.
    This isn't just about highlighting changed fields:
    https://gallery.technet.microsoft.com/WPF-Highlight-Changed-a77976d4
    Another dodge is to start off knowing what the user is probably interested in.
    I want todays newspaper, this months sales, yesterdays stock levels....
    People rarely actually want totally flexible filtering.
    Don't re-implement excel unless you really have to.
    You can filter huge collections, but nothing is free, and if you put enough data into a collection things will slow down. Eventually.
    But.  You can do date ranges.  Collectionview filtering can be persuaded to do pretty much anything you like by some variation of the filtering examples in those tips.
    For example, each of those predicates in the list to check can be quite complicated and you could go with .Any rather than trueforall.
    Hope that helps.
    Recent Technet articles: Property List Editing;
    Dynamic XAML

  • Google style autosuggest with millions rows table

    Hi All,
    I'm exploring ways of implementing a "Google style autosuggest" on a table with no less than 30 million rows. It has a field with an address (varchar) and I'd like to create an Ajax call, fired while the user is typing, that would suggest a few addresses to the user.
    I was thinking about using contains+fuzzy... but not sure if it will be fast enough and if it will return the right results.
    Any suggestions ?
    thanks
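    For what it's worth, the CONTAINS + fuzzy idea usually looks roughly like this (the table, column and fuzzy parameters below are invented, and the index would need to be kept in sync for near-real-time suggestions):
    CREATE INDEX addresses_txt_idx ON addresses (address)
      INDEXTYPE IS CTXSYS.CONTEXT
      PARAMETERS ('SYNC (ON COMMIT)');
    SELECT *
    FROM  (SELECT address, SCORE(1) AS score
           FROM   addresses
           WHERE  CONTAINS(address, 'fuzzy(stret, 60, 10, weight)', 1) > 0
           ORDER  BY SCORE(1) DESC)
    WHERE ROWNUM <= 10;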

    2 million rows with XML type data
    The link may be of interest to you.
    HTH
    Girish Sharma

  • How is the best way to manage the stats table?

    Hello!
    I have the Integration 2.1 working with an Oracle 8.1.7 db. I noticed that the table
    STATS is growing pretty fast.
    What is the best way to manage this table? I haven't found anything related to
    this issue in the documentation, but at least I want to know how to safely delete
    records from this table.
    For example, if I know the minimum time I have to keep records in the table, it is quite simple
    to create a shell script and/or an Oracle PL/SQL job to trim the table.
    I hope somebody can help me!!!!
    Thank you!
    Ulises Sandoval
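    For reference, the kind of PL/SQL job described above could look roughly like this (it assumes the STATS table has a date column, here called CREATED, and a 30-day retention; both are assumptions, and DBMS_JOB is used because the post mentions an 8.1.7 database):
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT(
        job       => l_job,
        what      => 'DELETE FROM stats WHERE created < SYSDATE - 30; COMMIT;',
        next_date => TRUNC(SYSDATE) + 1 + 2/24,     -- tomorrow at 02:00
        interval  => 'TRUNC(SYSDATE) + 1 + 2/24');  -- keep running daily at 02:00
      COMMIT;
    END;
    /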

    Write an app people want to buy and rate highly.
