Undo huge data

Oracle version: Oracle 10g
Operating System: Windows XP - 2
We have a Java application which inserts huge data into a temp table; some business logic then selects data from the temp table and modifies (inserts + updates) another table. The application performs these steps several times in a loop.
We are now facing the following issues:
1) A lot of undo data is created, hence performance is very slow.
2) The data inserted into the temp table is large, but we can't commit it, as we are reusing the table for frequent deletes and inserts.
3) Is there a different approach we should follow?

user609923 wrote:
We have a Java application which inserts huge data into a temp table
From where?
some business logic then selects data from the temp table and modifies (inserts + updates) another table. The application performs these steps several times in a loop.
Why? Do you really need that? Cannot these loops be avoided? Cannot you share the data? Cannot you just leave the data as is and next time just merge the changes?
We are now facing the following issues:
1) A lot of undo data is created, hence performance is very slow.
2) The data inserted into the temp table is large, but we can't commit it, as we are reusing the table for frequent deletes and inserts.
3) Is there a different approach we should follow?
Assuming you have already done enough to tune the existing SQL statements (or the existing algorithm per se is the problem), rethink your algorithm and minimize the amount of changes.
Gints Plivna
http://www.gplivna.eu
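To make the last point concrete, a minimal sketch (the names temp_stage, target_tab, id, col1 and col2 are made up): replacing a per-row insert/update loop with one set-based MERGE per batch usually cuts elapsed time dramatically, and committing after each batch lets the undo be reused instead of piling up across iterations.
-- Hypothetical names; adjust to your schema.
-- Stage the batch once, then apply all changes to the target table in one statement.
MERGE INTO target_tab t
USING (SELECT id, col1, col2 FROM temp_stage) s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.col1 = s.col1,
             t.col2 = s.col2
   WHERE t.col1 <> s.col1 OR t.col2 <> s.col2  -- only touch rows that really change (assumes NOT NULL columns)
WHEN NOT MATCHED THEN
  INSERT (id, col1, col2)
  VALUES (s.id, s.col1, s.col2);
COMMIT;  -- commit once per batch so undo is released instead of accumulating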

Similar Messages

  • TS4605 Hi, I was working in WORD on a file containing huge data. My machine just hung up one day while working and now I seem to have lost the file. How do I get it back? Please HELP me.


    Well, iCloud has nothing to do with this.
    Do you have the built-in backup function Time Machine running on your Mac?
    See: http://support.apple.com/kb/ht1427

  • Strange error in requests returning huge data

    Hi
    Whenever a request returns huge data, we get the following error, which is annoying because there is no online source or documentation about this error:
    Odbc driver returned an error (SQLFetchScroll).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 46073] Operation 'stat()' on file '/orabi/OracleBIData/tmp/nQS_3898_715_34357490.TMP' failed with error: (75) ¾§ . (HY000)
    Any idea what might be wrong?
    Thanks

    The TMP folder is also the "working" directory for the cache management (not exclusively). OBIEE tries to check whether the report can be run against a cache entry using the info in this file.
    Check if MAX_ROWS_PER_CACHE_ENTRY and MAX_CACHE_ENTRY_SIZE are set correctly.
    Regards
    John
    http://obiee101.blogspot.com
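    For reference, these parameters live in the [CACHE] section of NQSConfig.INI; a sketch with placeholder values (not recommendations):
    [CACHE]
    ENABLE = YES;
    MAX_ROWS_PER_CACHE_ENTRY = 100000;
    MAX_CACHE_ENTRY_SIZE = 1 MB;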

  • Huge data usage with Verizon 4G LTE hotspot (Verizon-890L-50DB)

    Recently upgraded my internet service to a Verizon 4G LTE hotspot (Verizon-890L-50DB).  First month - no problems - second month - I have 6GB of overage charges.  Every time I log in to check email or any other minimal data usage task, I get hit with another GB or more of download usage.  My computer is an HP Pavilion Laptop, Windows 7 with NETWORX installed.  Networx tells me it is indeed MY computer that is downloading this data and my hard drive is also showing the increases in data.  
    I've run AVG, Glary Utilities, Speedy PC PRO and Spybot Search and Destroy but found no virus, rootkit or anything on my computer that would explain this problem. 
    When I use a different wireless internet service, I do NOT see the extraneous data usage.
    I contacted Verizon's support team and got no help diagnosing the problem but they did agree to send a replacement for my hotspot device.  Hopefully this fixes the issue; then I have to battle with them about the unwarranted $60 or more in overage charges.
    Has anyone else experienced a problem similar to this?

    I would recommend getting out of the contract before the 14-day trial period ends. Verizon will charge you an activation fee, a restocking fee and one month's service, but that is better than being stuck with this mess. I have HomeFusion and am afraid to use it since Verizon seems to fabricate data usage. Unfortunately I did not realize this until after the 14-day trial. Now it will cost me $350 to terminate my contract.
    Lyda1 wrote in Verizon Jetpack 4G LTE Mobile Hotspot 890L:
    Exactly the same thing has happened to me.  I purchased the Verizon Jetpack™ 4G LTE Mobile Hotspot 890L two days ago.  After one day, I had supposedly used half the 5GB monthly allowance.  After two days, I am at 4.25 GB usage.  I don't stream movies, I have the hotspot password protected, I live alone, and no one else uses my computer.  I have not downloaded any large files.  At this rate, I'll go broke soon.

  • Method for Downloading Huge Data from SAP system

    Hi All,
    we need to download huge data from one SAP system and then migrate it into another SAP system.
    Is there any better method than downloading the data through SE11/SE16? Please advise.
    Thanks
    pabi

    I have already done several system mergers, and we usually do not have the need to download data.
    SAP can talk to SAP, with RFC and by using ALE/IDoc communication.
    So it is possible to send e.g. the material master with BD10 per IDoc from system A to system B.
    You can define that the IDoc is collected, which means it is saved to a file on the application server of the receiving system.
    You can then use LSMW and read this file, with several hundred thousand IDocs, as the source.
    Field mapping is easy if you use the IDoc method for import, because then you have a 1:1 field mapping.
    So you only need to focus on the few fields where the values change from the old to the new system.

  • Posting huge data on to JSP page using JSTL tags

    Hi,
    I have an application where I have to display huge data (approximately 2000 rows) on a JSP page. I am using JSTL tags and running on Tomcat 5.
    It is taking almost 20 to 25 seconds to load the entire page.
    Is that a reasonable load time, or can it be improved?
    Please let me know.
    Thanks,
    --Subbu.

    Hi Evnafets,
    Thank you for the response.
    Here are the tasks I am doing to display the data on JSP.
    0. We are running on Tomcat 5 and the memory size is 1024MB (1GB).
    1. Getting the data - I am not performing any database queries. The data is stored in a static in-memory cache, so the server-side response is very quick - less than a millisecond.
    2. Java beans are used to pass data to the presentation layer (JSP).
    3. 10 'if' conditions, 2 'for' loops and 2 'choose' statements are used in the JSP page while displaying the data.
    4. Along with the above, 4 JavaScript files are used.
    5. The JSP page size after rendering the data is approx. 160 KB.
    Hope this information helps you to understand the problem.
    Thanks,
    --Subbu.

  • TSV_TNEW_PAGE_ALLOC_FAILED: Huge data into internal table

    Hello Experts,
    I am selecting data from a database table into an internal table. The internal table was defined using OCCURS 0. I am getting the short dump TSV_TNEW_PAGE_ALLOC_FAILED when the data selected from the database table is huge; the internal table is unable to hold that amount of data.
    Can you please suggest how to overcome this in ABAP?
    regards.

    There are two ways:
    1. Ask the Basis team to change the memory parameters so that more data fits into memory, which avoids this dump.
    2. Split your internal table into two parts and do your calculations or manipulations on each part.
    Hope this helps.

  • SQL Server 2005 huge data migration. Taking more than 18 hours ...

    Hi,
    We are migrating a SQL Server 2005 DB to Oracle 10g 64-bit (10.2.0.5). The source SQL Server DB has only a few tables (<10) and a few stored procedures (<5). However, one source table is very large: it has 70 million records. I started the migration process; it was still running after 18 hours, so I cancelled it. I am using SQL Developer 3.1.0.7 with online migration.
    Kindly let me know whether choosing offline migration will reduce the data move time. Is there some other Oracle tool that can migrate huge data (millions of records) quickly? Is there some configuration in SQL Developer that can speed up the process?
    Kindly help.
    Best Regards!
    Irfan

    I tried using SQL*Loader to move the data. The file exported by SQL Server for the transaction table is around 17 GB in size. SQL*Loader stopped with an error about the maximum physical record size. The error is given below:
    SQL*Loader: Release 10.2.0.5.0 - Production on Thu Mar 22 15:15:28 2012
    Copyright (c) 1982, 2007, Oracle. All rights reserved.
    SQL*Loader-510: Physical record in data file (E:/mssqldata/transaction.txt) is longer than the maximum(1048576)
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    C:\Users\Administrator\Desktop\wuimig\datamove\wuimig\2012-03-22_14-20-29\WUI\dbo\control>
    Please help me out: how do I move such huge data to Oracle from SQL Server?
    Best Regards!

  • Huge data upload to custom table

    I created a custom table and the requirement is to load huge data into it, around 20 million records. I have input files of 1 million records each. After loading a few files, I got an error that max extents (300) was reached. We talked to Basis and they increased it to 1000, but I'm still getting the error.
    The way my program is written:
    read all input data into internal table i_zmstdcost_sum_final, then
    INSERT zmstdcost_sum FROM TABLE i_zmstdcost_sum_final.
    Can anyone suggest the best solution, or how to alter my program?
    Thanks,
    PV

    Hi
    If you need to load a very large number of records you shouldn't load them all into an internal table; insert the records as you read them, issue a COMMIT WORK after every batch, and close each file once all of its records are loaded. For example:
    LOOP AT T_FILE.
      " the T_FILE work area holds the name of the file on the application server
      OPEN DATASET T_FILE FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF SY-SUBRC = 0.
        DO.
          READ DATASET T_FILE INTO WA.
          IF SY-SUBRC <> 0.
            EXIT.
          ENDIF.
          INSERT ZTABLE FROM WA.
          IF SY-SUBRC = 0.
            COUNT = COUNT + 1.
          ENDIF.
          " commit every 1000 inserted rows to keep the transaction small
          IF COUNT = 1000.
            COUNT = 0.
            COMMIT WORK.
          ENDIF.
        ENDDO.
        " commit whatever is left over from this file
        IF COUNT > 0.
          COMMIT WORK.
          COUNT = 0.
        ENDIF.
        CLOSE DATASET T_FILE.
      ENDIF.
    ENDLOOP.
    Max

  • Load Huge data into oracle table

    Hi,
    I am using Oracle 11g Express Edition. I have a .csv file of about 500 MB which needs to be uploaded into an Oracle table.
    Please suggest the best method to upload the data into the table. The data is employee ticket history, which is huge.
    How do I do a mass upload of data into an Oracle table? I need the experts' suggestions on this requirement.
    Thanks
    Sudhir

    Sudhir_Meru wrote:
    I am using Oracle 11g Express Edition. I have a .csv file of about 500 MB which needs to be uploaded into an Oracle table.
    Please suggest the best method to upload the data into the table. The data is employee ticket history, which is huge.
    How do I do a mass upload of data into an Oracle table? I need the experts' suggestions on this requirement.
    One method is to use SQL*Loader (sqlldr).
    Another method is to define an external table in Oracle, which allows you to view your big file as a table in the database.
    You may want to have a look at these guides: Choosing the Right Export/Import Utility and Managing External Tables.
    Regards.
    Al
    Edited by: Alberto Faenza on Nov 6, 2012 10:24 AM
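    A minimal sketch of the external-table route (the directory path, table names and columns below are made up for illustration); the CSV stays on disk and is loaded with a single direct-path insert:
    CREATE OR REPLACE DIRECTORY load_dir AS 'C:\load';

    CREATE TABLE emp_ticket_ext (
      ticket_id    NUMBER,
      emp_id       NUMBER,
      ticket_date  VARCHAR2(20),
      status       VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY load_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('tickets.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- one direct-path insert into the real table, converting the date text on the way in
    INSERT /*+ APPEND */ INTO emp_ticket
    SELECT ticket_id, emp_id, TO_DATE(ticket_date, 'YYYY-MM-DD'), status
    FROM emp_ticket_ext;
    COMMIT;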

  • How to update a table with huge data

    Hi,
    I have a scenario where I need to update tables that hold huge data (more than 1,000,000 rows each).
    I am writing this update in a PL/SQL block.
    Is there any way to improve the performance of this update? Please suggest...
    Thanks.

    user10403630 wrote:
    I am storing all the tables and columns that need to be updated in tab_list and forming a dynamic query:
    for i in (select * from tab_list)
    loop
      v_qry := 'update '||i.table_name||' set '||i.column_name||' = '''||new_value||''''||
               ' where '||i.column_name||' = '''||old_value||'''';
      execute immediate v_qry;
    end loop;
    Sorry to say so, but this code is awful!
    Well, the only thing that would make this even slower would be to add a commit inside the loop.
    Some advice, though I'm not sure which point applies in your case:
    The fastest way to update a million rows is to write a single update statement. On typical systems this should only run for a couple of minutes.
    If you need to update several tables, then write a single update for each table.
    If you have different values that need to be updated, then find a way to handle those different values in a single UPDATE or MERGE statement, either by joining another table or by using some in-lists, e.g.:
    update myTable
    set mycolumn = decode(mycolumn
                         ,'oldValue1','newvalue1'
                         ,'oldValue2','newvalue2'
                         ,'oldValue3','newvalue3')
    where mycolumn in ('oldValue1','oldValue2','oldValue3', ...);
    If you need to do this in PL/SQL then:
    1) use bind variables to avoid hard parsing the same statement again and again
    2) use bulk binding to avoid PL/SQL context switches
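    A minimal sketch of points 1) and 2), reusing the hypothetical myTable/mycolumn names with made-up value lists: the UPDATE is parsed once, the old/new values are passed as binds, and FORALL executes all pairs in a single bulk operation.
    DECLARE
      TYPE t_vals IS TABLE OF VARCHAR2(100);
      l_old t_vals := t_vals('oldValue1', 'oldValue2', 'oldValue3');
      l_new t_vals := t_vals('newValue1', 'newValue2', 'newValue3');
    BEGIN
      -- one parse, bulk-bound execution: no per-row PL/SQL context switches
      FORALL i IN 1 .. l_old.COUNT
        UPDATE myTable
           SET mycolumn = l_new(i)
         WHERE mycolumn = l_old(i);
      COMMIT;
    END;
    /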

  • Table Maintenance Generator for a huge data table

    Hi Experts,
    I have created a table maintenance dialog for a custom table which has some 80,000,000 records approx.
    Now when I run it, it terminates with a short dump saying "STORAGE_PARAMETERS_WRONG_SET".
    Basis have reported that the tcode is running a sequential read on the table, and they say that is the reason for this dump.
    Are there any limitations, i.e. can table maintenance not be generated for a table with huge data?
    Or should the program be modified to restrict the "READ" from the table in the case of a large number of entries?
    Please advise.
    regards,
    Kevin.

    Hi,
    I think this is because of a memory overflow.
    You can create two screens for this: in one (overview) screen, restrict the data selection;
    in the detail screen, display the data.
    With regards,
    Vamsi

  • Transferring huge data via Java sockets - problematic!

    Hello!
    I tried to write a server app in Java to receive huge data: first some metadata in XML format describing the file, and then the binary data of the file...
    I read the data from the socket with a DataInputStream like this:
    byte[] res = new byte[4096];
    int got;
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    while ((got = in.read(res)) != -1)
        out.write(res, 0, got);
    byte[] received = out.toByteArray();
    I saw that if I store all the received data in an array it takes a lot of memory, and I cannot even transfer files which have 650MB of data.
    So I decided to write what I get from the socket directly to a file...
    I used a FileOutputStream to do it...
    But the problem is -> THIS SERVER IS SO SLOW!
    I wrote similar code in VB6... and it receives data from the client SO much faster than Java...
    What's wrong with my code that makes it so slow?
    How can I solve it?
    Do you experts have any server code example for receiving a huge amount of data?
    Please advise...
    Thanks a lot

    How to use buffered streams: go to the Java tutorials http://java.sun.com/docs/books/tutorial/index.html - look for "basic input/output", and "buffered streams" under it. You'll want BufferedInputStream and BufferedOutputStream.
    Alternatively:
            byte[] buf = new byte[8192];
            while (true) {
                int count = in.read(buf);
                if (count == -1)
                    break;
                out.write(buf, 0, count);
            }
    If at all possible, don't read the whole file into memory at once. Read with a loop like the above and do whatever you are supposed to do with the data a chunk at a time. But if you must have it in memory, then I guess you've got to do what you've got to do.

  • How do I reclaim the unused space after a huge data delete- very urgent

    Hello all,
    How do I reclaim the unused space after a huge data delete?
    alter table "ODB"."BLOB_TABLE" shrink space;
    This fails with an ORA-10662 error. Could you please help?

    'Shrink space' has requirements:
    shrink_clause
    The shrink clause lets you manually shrink space in a table, index-organized table or its overflow segment, index, partition, subpartition, LOB segment, materialized view, or materialized view log. This clause is valid only for segments in tablespaces with automatic segment management. By default, Oracle Database compacts the segment, adjusts the high water mark, and releases the recuperated space immediately.
    Compacting the segment requires row movement. Therefore, you must enable row movement for the object you want to shrink before specifying this clause. Further, if your application has any rowid-based triggers, you should disable them before issuing this clause.
    Werner
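    Putting those requirements into commands for the table from the question (a sketch; row movement must be enabled first, and CASCADE also shrinks dependent index and LOB segments, assuming the tablespace uses automatic segment space management - other restrictions may still apply):
    ALTER TABLE "ODB"."BLOB_TABLE" ENABLE ROW MOVEMENT;
    ALTER TABLE "ODB"."BLOB_TABLE" SHRINK SPACE CASCADE;
    ALTER TABLE "ODB"."BLOB_TABLE" DISABLE ROW MOVEMENT;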

  • Range partition the table ( containing huge data ) by month

    There is a table with huge data, around 9 GB. This needs to be range partitioned by month
    to improve performance.
    Can anyone suggest the best option to implement partitioning for this?

    I have a lot of tables like this. My main tip is to never assign 'MAXVALUE' for your last partition, because it will give you major headaches when you need to add a partition for a future month.
    Here is an example of one of my tables. Lots of columns are omitted, but this is enough to illustrate the partitioning.
    CREATE TABLE "TSER"."TERM_DEPOSITS"
    ( "IDENTITY_CODE" NUMBER(10), "ID_NUMBER" NUMBER(25),
    "GL_ACCOUNT_ID" NUMBER(14) NOT NULL ,
    "ORG_UNIT_ID" NUMBER(14) NOT NULL ,
    "COMMON_COA_ID" NUMBER(14) NOT NULL ,
    "AS_OF_DATE" DATE,
    "ISO_CURRENCY_CD" VARCHAR2(15) DEFAULT 'USD' NOT NULL ,
    "IDENTITY_CODE_CHG" NUMBER(10)
    CONSTRAINT "TERM_DEPOSITS"
    PRIMARY KEY ("IDENTITY_CODE", "ID_NUMBER", "AS_OF_DATE") VALIDATE )
    TABLESPACE "DATA_TS" PCTFREE 10 INITRANS 1 MAXTRANS 255
    STORAGE ( INITIAL 0K BUFFER_POOL DEFAULT)
    LOGGING PARTITION BY RANGE ("AS_OF_DATE")
    (PARTITION "P2008_06" VALUES LESS THAN (TO_DATE(' 2008-07-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN'))
    TABLESPACE "DATA_TS_PART1" PCTFREE 10 INITRANS 1
    MAXTRANS 255 STORAGE ( INITIAL 1024K BUFFER_POOL DEFAULT)
    LOGGING NOCOMPRESS , PARTITION "P2008_07" VALUES LESS THAN (TO_DATE(' 2008-08-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS',
    'NLS_CALENDAR=GREGORIAN'))
    TABLESPACE "DATA_TS_PART2" PCTFREE 10 INITRANS 1 MAXTRANS 255
    STORAGE ( INITIAL 1024K BUFFER_POOL DEFAULT) LOGGING NOCOMPRESS )
    PARALLEL 3
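    Because the last partition above is not capped with MAXVALUE, adding a future month later is a simple ALTER (a sketch continuing the example; the tablespace choice is yours):
    ALTER TABLE "TSER"."TERM_DEPOSITS"
      ADD PARTITION "P2008_08"
      VALUES LESS THAN (TO_DATE('2008-09-01', 'YYYY-MM-DD'))
      TABLESPACE "DATA_TS_PART1";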
