Large number of rows in ResultSet
Hi,
I retrieve a huge number of rows from the database using a ResultSet, but reading the records seems slow. Can we start reading data from the ResultSet while it is still being retrieved from the database?
Thanks in advance,
Daiesh
No, he can't, but who's to say that's what's slowing him down? It could be bad JDBC code, old JDBC drivers, or bad queries.
Honestly, why does it matter to you? It matters a great deal.
If he's talking about a web page (and lots of people who ask this question are bringing that data down to the browser), my answer would be that no user wants to deal with 500K records all at once. A well-placed WHERE clause or filter, or paging through 25 at a time a la Google, would be my recommendation in that case.
Did you have an answer all prepared to go, so long as he gave you a reason that was up to your rigorous standards? Yes, see above.
You could have said "make sure you really need all that data, and if you do, then try X, Y, and Z to speed it up". But to just throw that out with no evidence you're prepared to help regardless? Why bother? See above. There was a reason for the question.
Oh wait... I get it... did I mistake your submission to "The Most Useless, Unhelpful, and Arrogant Post Contest" for a real post? My bad... Nope, that would be yours.
Similar Messages
-
Oracle Error 01034 After attempting to delete a large number of rows
I ran a command to delete a large number of rows from a table in an Oracle database (Oracle 10g / Solaris). The database files are located on the /dbo partition. Before the command, disk space utilization was at 84%; now it is at 100%.
SQL Command I ran:
delete from oss_cell_main where time < '30 jul 2009'
If I try to connect to the database now I get the following error:
ORA-01034: ORACLE not available
df -h returns the following:
Filesystem size used avail capacity Mounted on
/dev/md/dsk/d6 4.9G 5.0M 4.9G 1% /db_arch
/dev/md/dsk/d7 20G 11G 8.1G 59% /db_dump
/dev/md/dsk/d8 42G 42G 0K 100% /dbo
I tried to get the space back by deleting all the data in the table oss_cell_main :
drop table oss_cell_main purge
But no change in df output.
I have tried solving it myself but could not find sufficiently directed information. Even pointing me to the right documentation would be highly appreciated. I have already looked at the following:
du -h:
8K ./lost+found
1008M ./system/69333
1008M ./system
10G ./rollback/69333
10G ./rollback
27G ./data/69333
27G ./data
1K ./inx/69333
2K ./inx
3.8G ./tmp/69333
3.8G ./tmp
150M ./redo/69333
150M ./redo
42G .
I think it's the rollback folder that has increased in size immensely.
SQL> show parameter undo
NAME TYPE VALUE
undo_management string AUTO
undo_retention integer 10800
undo_tablespace string UNDOTBS1
select * from dba_tablespaces where tablespace_name = 'UNDOTBS1'
TABLESPACE_NAME BLOCK_SIZE INITIAL_EXTENT NEXT_EXTENT MIN_EXTENTS
MAX_EXTENTS PCT_INCREASE MIN_EXTLEN STATUS CONTENTS LOGGING FOR EXTENT_MAN
ALLOCATIO PLU SEGMEN DEF_TAB_ RETENTION BIG
UNDOTBS1 8192 65536 1
2147483645 65536 ONLINE UNDO LOGGING NO LOCAL
SYSTEM NO MANUAL DISABLED NOGUARANTEE NO
Note: I can reconnect to the database for short periods of time by restarting it. After some restarts it does connect, but only for a few minutes, not long enough to run exp.
Check the alert log for errors.
Select file_name, bytes from dba_data_files order by bytes;
Try to shrink some datafiles to get space back. -
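If the delete has to happen at all, a common way to keep undo growth bounded is to remove rows in limited chunks and commit between batches. Here is a hedged JDBC sketch against the table from this thread; the batch size is arbitrary and this is illustrative, not a tuned solution:

```java
import java.sql.*;

public class BatchedDelete {
    // Number of batches needed to delete totalRows in chunks of batchSize.
    public static long batchCount(long totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    // Delete in chunks, committing after each one so undo usage stays bounded.
    public static long deleteInBatches(Connection conn, int batchSize)
            throws SQLException {
        conn.setAutoCommit(false);
        long total = 0;
        try (Statement stmt = conn.createStatement()) {
            int deleted;
            do {
                // ROWNUM caps how many rows one statement touches (Oracle syntax).
                deleted = stmt.executeUpdate(
                    "DELETE FROM oss_cell_main " +
                    "WHERE time < DATE '2009-07-30' AND ROWNUM <= " + batchSize);
                conn.commit(); // release undo held by this batch
                total += deleted;
            } while (deleted > 0);
        }
        return total;
    }
}
```

Each commit lets Oracle reuse the undo from the previous batch instead of holding it all for one giant transaction.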
JDev: af:table with a large number of rows
Hi
We are developing with JDeveloper 11.1.2.1. We have a VO that returns more than 2,000,000 rows, which we display in an af:table with access mode 'Scrollable' (the default) and 'In Batches of' 101. The user can select one row and perform CRUD operations on the VO through popups. The application works fine, but I have read that scrolling through a very large number of rows is not a good idea because it can cause an OutOfMemory exception if the user uses the scroll bar many times. I have tried access mode 'Range Paging', but the application behaves strangely. Sometimes when I select a row to edit, if the selected row is number 430, the popup shows number 512 instead, and when I try to insert a new row it throws this exception:
oracle.jbo.InvalidOperException: JBO-25053: Cannot navigate with unposted rows in a RangePaging RowSet.
at oracle.jbo.server.QueryCollection.get(QueryCollection.java:2132)
at oracle.jbo.server.QueryCollection.fetchRangeAt(QueryCollection.java:5430)
at oracle.jbo.server.ViewRowSetIteratorImpl.scrollRange(ViewRowSetIteratorImpl.java:1329)
at oracle.jbo.server.ViewRowSetIteratorImpl.setRangeStartWithRefresh(ViewRowSetIteratorImpl.java:2730)
at oracle.jbo.server.ViewRowSetIteratorImpl.setRangeStart(ViewRowSetIteratorImpl.java:2715)
at oracle.jbo.server.ViewRowSetImpl.setRangeStart(ViewRowSetImpl.java:3015)
at oracle.jbo.server.ViewObjectImpl.setRangeStart(ViewObjectImpl.java:10678)
at oracle.adf.model.binding.DCIteratorBinding.setRangeStart(DCIteratorBinding.java:3552)
at oracle.adfinternal.view.faces.model.binding.RowDataManager._bringInToRange(RowDataManager.java:101)
at oracle.adfinternal.view.faces.model.binding.RowDataManager.setRowIndex(RowDataManager.java:55)
at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding$FacesModel.setRowIndex(FacesCtrlHierBinding.java:800)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
<LoopDiagnostic> <dump> [8261] variableIterator variables passivated >>> TrackQueryPerformed def
<LifecycleImpl> <_handleException> ADF_FACES-60098: The Faces lifecycle receives unhandled exceptions in phase RENDER_RESPONSE 6
What is the best way to display this amount of data in a af:table and do CRUD operations?
Thanks
Edited by: 972255 on 05/12/2012 09:51
Hi,
honestly, the best way is to provide users with an option to filter the result set displayed in the table and so reduce its size. No one will page through 2,000,000 rows using the table scroll bar.
So one hint for optimization would be a query form (e.g. af:query)
To answer your question, "Scrollable" vs. "Range Paging", see
http://docs.oracle.com/cd/E21043_01/web.1111/b31974/bcadvvo.htm#ADFFD1179
Pay attention to what is written in the context of "The range paging access mode is typically used for paging through read-only row sets, and often is used with read-only view objects."
Frank -
How to Capture a Table with large number of Rows in Web UI Test?
HI,
Is there any way to capture a DOM table with a large number of rows (say, more than 100) in a Web UI test? Or is this a bug?
Hi,
You can try following code to capture the table values.
To store the table values in CSV :
web.table( xpath_of_table ).exportToCSVFile("D:\\exporttable.csv", true);
TO store the table values in a string:
String tblValues = web.table( xpath_of_table ).exportToCSVString();
info(tblValues);
Thanks
-POPS -
Af:table Scroll bars not displayed in IE11 for large number of rows
Hi. I'm using JDeveloper 11.1.2.4.0.
The requirements of our application are to display a table potentially containing very large numbers of rows (sometimes in excess of 3 million). While the user does not need to scroll through this many rows, the QBE facility allows drill-down into specific information in the rowset. We moved up to JDeveloper 11.1.2.4.0 primarily so IE11 could be used instead of IE8, to overcome input latency in ADF forms.
However, it seems that IE11 does not enable the vertical or horizontal scroll bars for the af:table component when the table contains greater than (approx) 650,000 rows. This is not the case when the Chrome browser is used. Nor was this the case on IE8 previously (using JDev 11.1.2.1.0).
When the table is filtered using the QBE (to a subset < 650,000 rows), the scroll bars are displayed correctly.
In the code the af:table component is surrounded by an af:panelCollection component which is itself surrounded by an af:panelStretchLayout component.
Does anyone have any suggestions as to how this behaviour can be corrected? Is it purely a browser problem, or might there be a programmatic workaround in ADF?
Thanks for your help.
Thanks for your response. That's no longer an option for us, though...
Some further investigation into the generated HTML has yielded the following information...
The missing scroll bars appear to be as a consequence of the setting of a style for the horizontal and vertical scroll bars (referenced as vscroller and hscroller in the HTML). The height of the scrollbar appears to be computed by multiplying the estimated number of rows in the iterator on which the table is based by 16 to give a scrollbar size proportional to the amount of data in the table, although it is not obvious why that should be done for the horizontal scroller. If this number is greater than or equal to 10737424 pixels then the scroll bars do not display in IE11.
It would seem better to cap this height at a sensible limiting number of pixels when there is a large number of rows.
Alternatively, is it possible to find where this calculation is taking place and override its behaviour?
Thanks. -
Result Set - How to retrieve the number of rows in resultset
Hello, everyone.
I am new to this language and would like to ask how I can store the result into an array, or whether you have other ideas for doing it. Hoping to get an answer ASAP, thanks.
Below is the source code I wrote. Please comment on it.
ResultSet rs = stmt.executeQuery("Select CCourse_Code from RegisteredCourse, Student where Student.NStud_ID = '123' and Student.CStud_Pwd = '123' and Student.NStud_ID = RegisteredCourse.NStud_ID");
if (rs != null) {
    // can I get the number of records in the ResultSet, rs, to define my array size?
    String[] course = new String[2];
    int i = 0;
    while (rs.next()) {
        course[i++] = rs.getString("CCourse_Code");
    }
    return course;
}
Or...
ResultSet rs = stmt.executeQuery("Select count(CCourse_Code) from RegisteredCourse, Student where Student.NStud_ID ='123'and Student.CStud_Pwd='123' and Student.NStud_ID = RegisteredCourse.NStud_ID ");
This should return a table with one row and one column. This will hold the number originally requested.
However I would still go with the ArrayList suggestion given above. -
Get number of rows in ResultSet object
Can anybody guide me on how to get the total number of rows in a result set object without iterating through all rows? Thanks.
Hi Soon,
You mention that you want to get the row count without iterating through the "ResultSet". I don't know what database you are using, but I want to inform you that with my Oracle 8i (8.1.7.4) database, using JDK 1.3.1 and the matching JDBC (thin) driver, the implementation of the "last()" method (that others have recommended that you use), actually iterates through the whole "ResultSet".
What I do is first execute a "SELECT COUNT(1)" query (using a literal "1") to get the number of rows -- this is much faster than executing the "last()" method.
Then I execute my actual query.
Hope this helps you.
Good Luck,
Avi. -
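Avi's two-step approach (count first, then run the real query) might look like the sketch below in JDBC. The table name is borrowed from a neighboring thread and is illustrative; whether COUNT(1) actually beats last() depends on your driver and database:

```java
import java.sql.*;

public class RowCounter {
    // Build the count query for a table; a literal 1 avoids fetching column data.
    public static String countQueryFor(String table) {
        return "SELECT COUNT(1) FROM " + table;
    }

    // Run the count query first; the caller can then size arrays before the real query.
    public static int countRows(Connection conn, String table) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(countQueryFor(table))) {
            rs.next(); // a COUNT query always returns exactly one row
            return rs.getInt(1);
        }
    }
}
```

Note the count can drift between the two queries if other sessions commit in between, so treat it as a sizing hint rather than a guarantee.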
Hello!
I have an application which queries my Oracle 8i database using OCI. My problem is that the number of rows returned is very large (about 1 million in total). How can I have the rows returned in batches (the same way a search tool pages its results)?
Use bulk binds (see http://technet.oracle.com/sample_code/tech/pl_sql/htdocs/bulkdemo.txt)
Deleing large number of rows from table
Hi,
Consider tables A, B, C, D, E, F, all having 100,000+ records. Tables B, C, D depend on table A (with foreign key constraints). When I delete records from all tables, tables B, C, D take at most 30-40 seconds each, while table A takes 30-40 minutes. All tables have indexes.
Method I have used:
1. Created Temp table
2. then deleted all records from B,C,D,E,F for all records in temp table for limit of 500.
delete from B where exists (select 1 from temp where b.col1=temp.col1);
3. please suggest options for me why it is taking too much time for deleting records in table A.
Is there anything that, while deleting data from such a master table, makes it refer to all dependent tables even if no dependent data is present? If yes, could you please suggest options to avoid this? I hope it does not evaluate CHECK constraints while deleting data.
Thanks,
Avinash
Edited by: user12952025 on Apr 30, 2013 2:55 AM
Edited by: user12952025 on Apr 30, 2013 2:56 AM
Edited by: user12952025 on Apr 30, 2013 2:57 AM
user12952025 wrote:
Hi,
Consider tables A, B, C, D, E, F, all having 100,000+ records. Tables B, C, D depend on table A (with foreign key constraints). When I delete records from all tables, tables B, C, D take at most 30-40 seconds each, while table A takes 30-40 minutes. All tables have indexes.
What attribute of the foreign key is specified? Is it ON DELETE CASCADE? If yes, then deleting data from the child tables is, in a way, unnecessary; a delete from the parent alone would suffice.
>
Method I have used:
1. Created Temp table
2. then deleted all records from B,C,D,E,F for all records in temp table for limit of 500.
delete from B where exists (select 1 from temp where b.col1=temp.col1);
3. please suggest options for me why it is taking too much time for deleting records in table A.
Is there anything that, while deleting data from such a master table, makes it refer to all dependent tables even if no dependent data is present? If yes, could you please suggest options to avoid this? I hope it does not evaluate CHECK constraints while deleting data.
Another way is to "switch off" the relationship while deleting the data:
ALTER TABLE table_name DISABLE CONSTRAINT constraint_name;
Then delete the data from each of the tables.
You specified the number of rows in each table, but it would have been better to mention the number of rows to be deleted.
It is not a hard-and-fast rule, but it would generally perform better to copy the data to be retained from the parent table into a temporary table, drop the parent table, and rename the temporary table to the parent table's name. The same can be done for the child tables.
You may then Enable the Foreign key constraints. -
Java executed SQL statement to XML - large number of rows
Hello,
I have to write some code to generate XML as the result of a query. AFAIK I am using the latest versions of the relevant Java libraries etc.
I have found that using a maximum row setting above 2000 results in 'Exception in thread "main" java.lang.OutOfMemoryError' errors.
I thought I could overcome this by reading the data in batches using the skip_rows functionality.
I have included the code I am using below and would appreciate any help.
Best Regards,
Mark Robbins
import java.sql.*;
import java.math.*;
import oracle.xml.sql.query.*;
import oracle.xml.sql.docgen.*;
import oracle.jdbc.*;
import oracle.jdbc.driver.*;
public class XMLRetr
{
  public static void main(String args[]) throws SQLException
  {
    String tabName = "becc";
    String user = "test/test";
    DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
    // initiate a JDBC connection
    Connection conn =
        DriverManager.getConnection("jdbc:oracle:oci8:" + user + "@test");
    // initialize the OracleXMLQuery
    OracleXMLQuery qry = new OracleXMLQuery(conn,
        "select * from " + tabName + " where TRUNC(SAMPLED) < '14-NOV-99'");
    // structure the generated XML document
    System.out.println("Before setting parameters");
    qry.setMaxRows(8000);        // set the maximum number of rows to be returned
    qry.setRowsetTag("ROOTDOC"); // set the root document tag
    qry.setRowTag("DBROW");      // set the row separator tag
    qry.setRaiseException(true);
    qry.keepCursorState(true);
    // create the document generation factory. Note: there are methods in OracleXMLQuery
    // which don't require that an OracleXMLDocGen object be passed in; rather, they
    // create an OracleXMLDocGen object of their own.
    OracleXMLDocGen doc = (OracleXMLDocGen) new OracleXMLDocGenString();
    for (int rowCnt = 1; rowCnt < 8000; rowCnt = rowCnt + 1000)
    {
      // get the xml
      System.out.println("Before skip rowCnt:" + rowCnt);
      qry.setSkipRows(rowCnt); // process next batch
      System.out.println("Before getXML");
      // this is where I get the exception on the second iteration of the loop
      qry.getXML(doc);
      System.out.println("Displaying document");
      // System.out.println(doc.getXMLDocumentString());
      System.out.println("Row number:" + rowCnt);
      System.out.flush();
      System.out.println("End of document");
    }
    qry.close();
  }
}
I used qry.getXMLString(), but called it for every row, i.e., set qry.setMaxRows(1) and qry.setSkipRows(rowCount - 1).
The down side is I had to postprocess the String to remove the processing instruction and doc level tags. -
Export to Excel - Large number of Rows
Hi All,
I have a report in Answers which returns around 150,000 to 200,000 records. I tried to export this report in Excel as well Excel 2000 format.
I got only 65,004 records, but the page navigator shows 'Records 1-65000'. When I click Update Row Count in the physical layer, it shows 170,560 records. When I click the 'All Pages' button in the page navigator, the page hangs. How do I fix this issue?
Also, Excel has a limit of 65,536 rows per sheet. If the number of records exceeds the Excel limit, another sheet should be added automatically to hold the remaining records. How do I configure this?
Excel has the limit of accommodating around 65K rows; I don't think we can change that limit. When it comes to the OBIEE report, you can change the instanceconfig file (OracleBI Data/web/config) to accommodate any number of rows as per your convenience.
Thanks
How do i get the number of rows in Resultset?
Hi,
Is there a way to size the string array according to the number of records in the ResultSet? The other problem is that the values in saRow are all null. Why is that? Thanks for helping me in advance.
String sql = "SELECT Name FROM UserType"; // << returns "Admin" "User" "CIO"
ResultSet rs = db.ExecuteSQL(sql);
while (rs.next()) {
    int i = 0, rowCount;
    String[] saRow = new String[2]; // << supposed to use rowCount
    saRow[i++] = rs.getString("Name");
    System.out.println(saRow);
}
Output:
Database Query Success
null << saRow value
null
null
null
null
null
Essentially, you can't. A ResultSet is not a set of rows in memory: it is a connection to a back-end database. Potentially, you start getting results out of the result set while the database is still searching for more.
I see this all the time on these forums: people doing work in java that is better done on the database. If you want to know the number of records in the database, ask the database: "select count(*) from UserType". Especially once you start needing to search big tables.
However, in this case, where you are just loading up the types: why bother? You know that there will only be a few of them. Load the results into an ArrayList. Go to the JavaDocs that you downloaded with the JDK, go to the "guides" section, and see the guide on the Collections API. -
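The ArrayList suggestion above can be sketched as follows, reusing the query from the question. No row count is needed up front because the list grows as rows arrive:

```java
import java.sql.*;
import java.util.*;

public class UserTypeDao {
    // Convert the collected list to the array the caller wanted; the size comes for free.
    public static String[] toArray(List<String> values) {
        return values.toArray(new String[0]);
    }

    // Read one string column into a growable list, no row count required.
    public static String[] readNames(Connection conn) throws SQLException {
        List<String> names = new ArrayList<>();
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT Name FROM UserType")) {
            while (rs.next()) {
                names.add(rs.getString("Name"));
            }
        }
        return toArray(names);
    }
}
```

With three rows in UserType, readNames would return an array of length 3; the array is sized after the fact, so the ResultSet never needs to report its row count.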
Issue in updating large number of rows which is taking a long time
Hi all,
Am new to oracle forums. First I will explain my problems as below:
1) I have a table of 350 columns for which i have two indexes. One is for primary key's id
and the other is the composite id (combination of two functional ids)
2) Through my application, the user can calculate some functional conditions and the result
is updated in the same table.
3) The table consists of all input, intermediate and output columns.
4. The only calculation mechanism is UPDATE statements. The problem is that, for one complete process, around 1,000 UPDATE statements hit the database.
5. Of the two indexes, one indexed column is mandatory in every UPDATE's WHERE clause; the other is optional.
6. Updating the table takes a long time if the row count exceeds 100,000 (1 lakh).
7) I will now explain the scenario:
a. Say there are 500,100 records in the table, in which mandatory indexed column id 1 has 100 records and id 2 has 500,000 records.
b. If I process id 1, it is very fast and executes within 10 seconds. But if I process id 2, it takes more than 4 minutes to update.
Is there any way to increase the speed of the update statement. Am using oracle 10g.
Please help me with this, since I am a developer and don't have much knowledge of Oracle.
Thanks in advance.
Regards,
Sethu
Refer to the link:
http://hoopercharles.wordpress.com/2010/03/09/vsession_longops-wheres-my-sql-statement/ -
HOW TO - Insert large number of rows fast
I have tables A, B, and C, and I select some data from A, B, and C using some complex criteria.
Tables A, B, and C each have 10 million rows.
The final number of rows to be inserted into table D is about 3 million.
Currently the rows are inserted one at a time, so there are 3 million inserts in the PL/SQL.
What is the best way to create these rows?
Pseudocode:
begin
for loop ..... loop
--compled selection criteria.
insert into D ..... ;
end loop ;
end ;
SS
Is there a way to optimize the inserts?
The inserts take very little time.
Re: Insert Statement Performance Degradation
In this example the same number of inserts into the same table takes 0.03 seconds to insert 100,000 rows, and 3.06 seconds when looped. If the entire insert operation was optimized away to take zero time, the loop would still take 3.03 seconds, which represents a performance increase of a little under 1%.
As I said, it is not a single query by which I build a row for insert into table D; it is a complex operation which is not necessary to explain here.
This is the slow part that needs optimizing. -
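The set-based alternative to the row-at-a-time loop is a single INSERT ... SELECT statement, which lets the database apply the criteria and insert all rows in one operation. A sketch, with the poster's "complex selection criteria" standing in as a placeholder SELECT passed by the caller:

```java
import java.sql.*;

public class BulkInsert {
    // Build a single set-based INSERT ... SELECT instead of a row-by-row loop.
    public static String insertSelectSql(String target, String selectSql) {
        return "INSERT INTO " + target + " " + selectSql;
    }

    // One round trip: the database evaluates the criteria and inserts in bulk.
    public static int copyRows(Connection conn, String selectSql) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            return stmt.executeUpdate(insertSelectSql("D", selectSql));
        }
    }
}
```

If the criteria genuinely cannot be expressed in a single SELECT, PL/SQL bulk operations (FORALL with BULK COLLECT) are the usual middle ground between one statement and 3 million individual inserts.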
How do I process a large number of rows using ADO?
I need to iterate through a table with about 12 million records and process each row individually. The project I'm doing cannot be resolved with a simple UPDATE statement.
My concern is that when I perform the initial query that so much data will be returned that I'll simply crash.
Ideally I would get to a row, perform my operation, then go to the next row ... and so on and so on.
I am using ADO / C++.
I suggest you simply use the default fast-forward read-only (firehose) cursor to read the data. This will stream data from SQL Server to your application, and client memory usage will be limited to the internal API buffers without resorting to paging.
I ran a quick test of this technique using ADO classic and the C# code below, and it ran in under 3 minutes (35 seconds without the file processing) on my Surface Pro against a remote SQL Server. I would expect C++ to be significantly faster since it won't incur the COM interop penalty. The same test with SqlClient ran in under 10 seconds.
static void test()
{
    var sw = Stopwatch.StartNew();
    Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff"));
    object recordCount;
    var adoConnection = new ADODB.Connection();
    adoConnection.Open(@"Provider=SQLNCLI11.1;Server=serverName;Database=MarketData;Integrated Security=SSPI");
    var outfile = new StreamWriter(@"C:\temp\MarketData.txt");
    var adoRs = adoConnection.Execute("SELECT TOP(1200000) Symbol, TradeTimestamp, HighPrice, LowPrice, OpenPrice, ClosePrice, Volume FROM dbo.OneMinuteQuote;", out recordCount);
    while (!adoRs.EOF)
    {
        outfile.WriteLine("{0},{1},{2},{3},{4},{5},",
            (string)adoRs.Fields[0].Value.ToString(),
            ((DateTime)adoRs.Fields[1].Value).ToString(),
            ((Decimal)adoRs.Fields[2].Value).ToString(),
            ((Decimal)adoRs.Fields[3].Value).ToString(),
            ((Decimal)adoRs.Fields[4].Value).ToString(),
            ((Decimal)adoRs.Fields[5].Value).ToString());
        adoRs.MoveNext();
    }
    adoRs.Close();
    adoConnection.Close();
    outfile.Close();
    sw.Stop();
    Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff"));
    Console.WriteLine(sw.Elapsed.ToString());
}
Dan Guzman, SQL Server MVP, http://www.dbdelta.com