Estimate 1 Row data size in a table
Hello, I have a table like the one below and I want to know the size of one record in KB.
Create table test(
Id int,
Name nvarchar(5),
File nvarchar(max),
Createddate datetime,
User varchar(20))
Can you tell me in this way:
Id =
Name =
File =
Createddate =
User =
So it will help me to calculate it myself in the future.
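For the fixed-size types the answer is a constant: int is 4 bytes and datetime is 8 bytes. For the variable-length types it depends on the data: nvarchar stores 2 bytes per character (so Name is up to 10 bytes), varchar stores 1 byte per character (User is up to 20 bytes), and nvarchar(max) is 2 bytes per stored character, possibly pushed off-row. A minimal sketch that measures the actual bytes per row with DATALENGTH (column names taken from the question; this counts column data only, not the fixed per-row header and null bitmap):

```sql
-- Approximate data size of each row, in KB.
SELECT Id,
       ( DATALENGTH(Id)                  -- int: always 4 bytes
       + ISNULL(DATALENGTH(Name), 0)     -- nvarchar(5): 2 bytes per character
       + ISNULL(DATALENGTH([File]), 0)   -- nvarchar(max): 2 bytes per character
       + DATALENGTH(Createddate)         -- datetime: 8 bytes
       + ISNULL(DATALENGTH([User]), 0)   -- varchar(20): 1 byte per character
       ) / 1024.0 AS row_size_kb
FROM test;
```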
Hi,
try this:
DECLARE @sql nvarchar(max), @clmns nvarchar(max), @table nvarchar(100), @ln nvarchar(max)
SET @table = 'REGIONS'
SELECT @clmns =
    STUFF((
        SELECT ', [' + name + ']' AS 'data()'
        FROM (SELECT name FROM sys.columns WHERE object_id = OBJECT_ID(@table)) a
        FOR XML PATH(''), TYPE
    ).value('.', 'nvarchar(max)'), 1, 2, ''),
    @ln =
    STUFF((
        SELECT '+ ' + ln AS 'data()'
        FROM (SELECT CASE WHEN collation_name IS NULL
                          THEN CAST(max_length AS varchar(10))
                          ELSE 'DATALENGTH([' + name + '])' -- DATALENGTH gives bytes; LEN would count characters
                     END AS ln
              FROM sys.columns WHERE object_id = OBJECT_ID(@table)) a
        FOR XML PATH(''), TYPE
    ).value('.', 'nvarchar(max)'), 1, 2, '')
PRINT @ln
SET @sql = 'SELECT ' + @clmns + ', ' + @ln + ' AS ln FROM ' + @table + ' r'
EXEC (@sql)
Similar Messages
-
How to get selected row data of an ADF table in HashMap?
Hi,
Can anyone please tell me how to get the selected row data of an ADF table into a HashMap, like:
Object obj = pageTable.getSelectedRowData();
JUCtrlHierNodeBinding rowData = (JUCtrlHierNodeBinding)obj;
Now, in the above code, I want to convert rowData into a HashMap.
Can anyone please tell me how to do that? It's urgent.
Thanks,
Vik
Vik,
No need to ask the same question 3 times...
In [url http://forums.oracle.com/forums/message.jspa?messageID=4590586]this post, Nick showed you how to get the Row.
If it were so urgent, you could have done a little reading of the javadocs to come up with code like this (not tested, up to you to do that):
HashMap m = new HashMap();
Row r = /* get it like Nick showed you to */;
Object values[] = r.getAttributeValues();
String names[] = r.getAttributeNames();
for (int i = 0; i < r.getAttributeCount(); i++)
    m.put(names[i], values[i]); // note: m.put(names, values[i]) would use the whole array as the key
-
How to send multiple row data into an internal table??
I have a view with a table control. I want to select multiple rows and send all of the row data into an internal table. I am able to select multiple rows, but not all of the selected row data reaches the internal table; only the data of the one row that is lead-selected does.
Do anyone can help me regarding this issue?
Thanks in advance,
Subhasis.
Hey,
Some code example:
Declare an internal table and work areas to read all the elements from the node (FLIGHTS is my node, and node_flights is the reference to the node to which your table is bound; use the Code Wizard to read the node):
DATA: lt_elements TYPE wdr_context_element_set,
      ls_element  TYPE REF TO if_wd_context_element,
      l_boolean   TYPE wdy_boolean,
      ls_data     TYPE sflight,
      lt_data     TYPE STANDARD TABLE OF sflight.

lt_elements = node_flights->get_elements( ).
LOOP AT lt_elements INTO ls_element.
  l_boolean = ls_element->is_selected( ). " returns abap_true/abap_false
  IF l_boolean = abap_true. " keep only the selected rows (the original IS INITIAL check kept the unselected ones)
    ls_element->get_static_attributes( IMPORTING static_attributes = ls_data ).
    APPEND ls_data TO lt_data.
  ENDIF.
ENDLOOP.
Hope this would help.
Cheers,
Ashish -
Comparing data size in one table to column widths in another table
I have data in a table that has a large number of columns, many of them nvarchar of varying widths. I'm trying to take that data and insert it into another table, but I'm getting the warning message about string or binary data being truncated. I suspect there is a field somewhere that is not large enough for the data. However, I run across this often enough that I would like to come up with a better solution than just eyeballing the data.
I found this example
http://www.sqlservercentral.com/Forums/Topic1115499-338-2.aspx
(credit goes to poster in the linked thread above)
Select columns
into #T
from MyDataSource;

select *
from tempdb.sys.columns as TempCols
full outer join MyDb.sys.columns as RealCols
  on TempCols.name = RealCols.name
  and TempCols.object_id = Object_ID(N'tempdb..#T')
  and RealCols.object_id = Object_ID(N'MyDb.dbo.MyTable') -- closing quote was missing in the original
where TempCols.name is null -- no match for real target name
   or RealCols.name is null -- no match for temp target name
   or RealCols.system_type_id != TempCols.system_type_id
   or RealCols.max_length < TempCols.max_length;
Why a full outer join? Why not just a left join, since I really only want to see the matches on my source table?
When I'm running this against the table I'm interested in, it doesn't seem to find matches between my target table and my temp table.

As an outer join of any type, that query won't work well. For example, suppose you do a left join. The query begins by getting every row from tempdb.sys.columns (whether it is in #T or not). Consider a row for a column which is not in
#T, you look for matches for rows in Mydb.sys.columns ON
on TempCols.name = RealCols.name
and TempCols.object_id = Object_ID(N'tempdb..#T')
and RealCols.object_id = Object_ID(N'MyDb.dbo.MyTable')
Notice that since the row you are considering is NOT a column in #T, the second part of the ON condition is not true, so the whole ON condition will not be true. But this is a left join, so the join keeps this row, with NULLs in the columns coming from the RealCols table. Then you apply the where condition; the conditions are all connected with OR, and one of them is RealCols.name is null (which it is, because there was no match), so your output will include a row for this column in tempdb even though the column is not in #T. So if you use a left join, the output of this query will include a row for every column of every table in tempdb other than #T.
Similarly, if you do a right join, you get a row for every column of every table in MyDb which is not a column in dbo.MyTable.
And a full join (which you are doing above) will return a row for every column in every table in both tempdb and MyDb.
This query will sort of work if you make it an inner join. But even then it won't find every possible cause of string or binary truncation. For example, you are comparing RealCols.max_length < TempCols.max_length. But in sys.columns, a varchar(max) column has max_length stored as -1. So if a column in RealCols is varchar(50) and the same column in TempCols is varchar(max), that column will not show up as an exception, but of course you can get truncation if you attempt to store a varchar(max) value in a varchar(50).
I would run a query more like
Create Table FooX(a int, b varchar(20), c int, d varchar(20), e int, f varchar(20), g decimal(4,0));
Select a, b, 1 as x, 'abc' as y, Cast('' as varchar(max)) As f, Cast(25.1 as decimal(3,1)) as g
into #T
from FooX;
Select 'In Real, not in or different in Temp' As Description, name, column_id, system_type_id, max_length, precision, scale, collation_name From sys.columns Where object_id = Object_ID(N'FooX')
Except Select 'In Real, not in or different in Temp' As Description, name, column_id, system_type_id, max_length, precision, scale, collation_name From tempdb.sys.columns Where object_id = Object_ID(N'tempdb..#T')
Union All
Select 'In Temp, not in or different in Real' As Description, name, column_id, system_type_id, max_length, precision, scale, collation_name From tempdb.sys.columns Where object_id = Object_ID(N'tempdb..#T')
Except Select 'In Temp, not in or different in Real' As Description, name, column_id, system_type_id, max_length, precision, scale, collation_name From sys.columns Where object_id = Object_ID(N'FooX')
Order By name, Description;
go
Drop Table #T
go
Drop Table FooX
The output of that is:
Description                           name  column_id  system_type_id  max_length  precision  scale  collation_name
In Real, not in or different in Temp  c     3          56              4           10         0      NULL
In Real, not in or different in Temp  d     4          167             20          0          0      SQL_Latin1_General_CP1_CI_AS
In Real, not in or different in Temp  e     5          56              4           10         0      NULL
In Real, not in or different in Temp  f     6          167             20          0          0      SQL_Latin1_General_CP1_CI_AS
In Temp, not in or different in Real  f     5          167             -1          0          0      SQL_Latin1_General_CP1_CI_AS
In Real, not in or different in Temp  g     7          106             5           4          0      NULL
In Temp, not in or different in Real  g     6          106             5           3          1      NULL
In Temp, not in or different in Real  x     3          56              4           10         0      NULL
In Temp, not in or different in Real  y     4          167             3           0          0      SQL_Latin1_General_CP1_CI_AS
From which you can quickly see the differences: c, d, and e are in the real table but not the temp table; f is in both tables but the max_length is different; g is in both tables but the precision and scale are different; and x and y are in the temp table but not the real table.
Tom -
Multiple Row Data Selection in Popup Table
In my Popup window there is a table control with ten rows.
All the rows are editable. After I enter values in all the rows and click the submit button, all the values need
to be shown in the main window.
But the problem is that in the main window there is only one input field, and I can't increase the number to ten either.
So I need to show all the selected values in this one input field, separated by commas.
Please suggest me how to do it.
I have done it for single-row selection but am stuck with multiple rows.
Thanks a lot.
Hi,
try this code:
I<Table>Node tableNode = wdContext.node<Table>();
I<Table>Element tableEl;
String input = "";
for (int i = 0; i < tableNode.size(); i++) {
    tableEl = tableNode.get<Table>ElementAt(i);
    if (tableEl.get<InputFieldAttribute>() != null)
        input += tableEl.get<InputFieldAttribute>() + ",";
}
wdContext.current<MainWinInputFieldElement>().set<InputFieldAttribute>(input);
Regards,
Matteo
Edited by: Matteo Fusi on Apr 2, 2009 10:04 AM -
Row/data size limit when saving DESKI report to Excel?
Users are getting an "(Error: INF )" message when attempting to save a report to Excel. When the report has been generated using a smaller date range it generates 12,104 rows and saves successfully to Excel. When a larger date range is specified the report generates 18,697 rows and generates the above error message and does not produce an Excel file. The Excel file size for the 12,104 rows is 18.5 MB; the 18,697 row file would be in the neighborhood of 25+ MB. The row limits on Excel are well above what we're generating, so we're thinking there's a brick wall in the process that creates the Excel file from the BObj report?
We are running XIr2 patched to SP4 on Windows servers/XP desktops using MS SQL Server database software.
Hi Sarbhjeet,
Thank you for all your help.
Further to our investigation, we found that this is a known issue and a limitation of the product; it won't be fixed.
The ADAPT for the same is ADAPT00743734
It is reproducible with XI 3.1 as well.
FYI...........
1. The PDF engine gets the rendering information from busobj. If there is a character-size difference between PDF and busobj, you may see bigger cells or reduced data in the PDF.
2. When the report uses Fit to 1 page by 1 wide in Pagesetup->FitToPrint, the page size is set to fit the complete report in busobj.
3. On save as PDF, this information (the increased page size, due to the settings in 2) is sent to the PDF engine. The engine tries to fit the report into the available standard PDF size (based on print settings and other combinations). If it tries to fit the report (with the enlarged page size) into the available size, we lose a lot of information, depending on the number of columns and rows.
4. Shrinking the report in busobj may not be feasible.
When we change the page size in Fit to Print (either Adjust to % or Fit to), the reporter changes the page size instead of changing the rendering to fit the page.
When the report is saved to PDF, it gets the information with the extended page size, so we see the enlarged page in the PDF.
Also, in order to implement shrinking (instead of changing the page size), each cell would have to be shrunk to a percentage of the reduced page (compared to the original page size). It becomes more complicated when charts come into the picture.
I hope the above information helps.
Thanks & Regards,
Anisa -
Hi All,
How do I find the data size of a particular table?
Ex: I need to find out how much data is loaded in the sh.sales table.
Thanks,
USER_SEGMENTS (or DBA_SEGMENTS)
And if it's a partitioned table, don't forget to sum the sizes.
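For example, a minimal sketch against DBA_SEGMENTS (the SH owner and SALES table name are taken from the question):

```sql
-- Size of the SH.SALES table segment(s), in MB.
-- For a partitioned table this sums all partition segments.
SELECT SUM(bytes) / 1024 / 1024 AS size_mb
FROM dba_segments
WHERE owner = 'SH'
  AND segment_name = 'SALES';
```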
You may also want to include any indexes on the table. -
SQL query to find top 5 users having more rows/data in table
Dear experts,
OS = HP-UX
Database = Oracle 9.2.0.8
AC users = 600
Ex:-
select * from all_users where username like 'AC%';
AC_1
AC_2
AC_3
AC_4
AC_5
AC_6
AC_.
AC_.
AC_.
AC_600
Each AC user having same tables INCOMING, OUTGOING
Now I need to find the top 5 users having the most rows/data in the INCOMING and OUTGOING tables. I tried this:
SQL>conn AC_1/pwd
select 'select count(*) from '||table_name||';' from user_tables;
But I get max counts info only for this AC_1 user; however, I need a query for the top 5 users with the most rows.
Thank you,
source: Oracle forums
Maybe this one... not tested, though.
Before doing this you need to have select_catalog_role:
WITH tmp AS (
  SELECT owner,
         table_name,
         TO_NUMBER(
           EXTRACTVALUE(
             xmltype(
               DBMS_XMLGEN.getxml(
                 'select /*+ PARALLEL */ count(*) c from "'
                 || owner || '"."' || table_name || '"')), -- owner prefix added; without it every count runs against the current schema
             '/ROWSET/ROW/C')) cnt
  FROM dba_tables
  WHERE table_name IN ('INCOMING', 'OUTGOING')),
tmp1 AS (
  SELECT a.*,
         MAX(cnt) OVER (PARTITION BY a.table_name) maxcnt
  FROM tmp a)
SELECT DISTINCT a.*
FROM tmp a, tmp1 b
WHERE a.cnt = b.maxcnt
  AND a.table_name = b.table_name;
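Note that a max-per-table query only returns the single largest owner for each table. To get the top 5 owners as asked, one option is to rank instead of taking the max; a sketch along the same lines (not tested; assumes select_catalog_role, and the AC% owner filter comes from the question):

```sql
WITH tmp AS (
  SELECT owner, table_name,
         TO_NUMBER(EXTRACTVALUE(
           xmltype(DBMS_XMLGEN.getxml(
             'select /*+ PARALLEL */ count(*) c from "'
             || owner || '"."' || table_name || '"')),
           '/ROWSET/ROW/C')) cnt
  FROM dba_tables
  WHERE owner LIKE 'AC%'
    AND table_name IN ('INCOMING', 'OUTGOING'))
SELECT *
FROM (SELECT t.*,
             RANK() OVER (PARTITION BY table_name ORDER BY cnt DESC) rnk
      FROM tmp t)
WHERE rnk <= 5;  -- top 5 owners per table by row count
```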
Hi All,
How can we find the average row length of a table without inserting data into it? This is needed to roughly estimate table size. We might need to do this for at least 100+ tables.
regards,
vara
The average row length depends on the data you are inserting into the table.
You can easily compute an approximation of the max row length from Oracle's data dictionary:
select table_name,
       sum(data_length)
from user_tab_columns
group by table_name;
Let's say we have a table with a number(10) and a varchar2(1000) column. For a data set where the varchar2 column contains at least 900 characters we would get an average row size of 900+. For the same table, but with another data set where the varchar2 column has fewer than 50 characters, we would get an average row size of less than 100.
Iordan Iotzov
http://iiotzov.wordpress.com/ -
How to pass the data of the dynamic table into internal table
Hi all,
I have designed a dynamic table in my online interactive form.
I am able to pass only the first row's data into the internal table which I had created.
Now how can I pass the data of all the rows into the internal table?
Needed some coding help
Thanks
Ajay
Hello Ajay,
If you have a dynamic table in Adobe and you want to capture all the added rows, then you need to add a corresponding element to the node bound to that table UI.
When you add a row using JavaScript in the Adobe form, no corresponding element is created in the backend, that is, in your WD node bound to the table UI, so you only get one row of data back.
Try this:
Create a button in the view holding your Adobe form, say "add rows", and on click of this button write the code to add one more element to the node bound to your table UI of the Adobe form. When server-side rendering occurs, it will recreate the table in Adobe with the added row in the node, and you will be able to get the data entered by the user.
Thanks,
Abhishek -
How to estimate average row size without populating data
hi all
I have the task of estimating the average row size of some tables. Because I don't want to populate these tables with data (it's a lot of work), I am not able to get the average row size from dba_tables by analyzing these tables.
Is there other way around to do my job easily?
Thanks a lot.
Hi,
I am not a coder, I am a DBA. Do you think it's a coder's responsibility to do this kind of work?
Still, I think it is part of the DBA role to support the developers or to provide guidance. The easiest way would be using the package DBMS_SPACE. Alternatively, with the help of a developer, populate a single record at full (or max) length in each table and generate a 10053 trace with basic select clauses; you can get the information from the trace too...
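A sketch of the DBMS_SPACE route, which estimates table size from an assumed average row size and row count without loading any data (the tablespace name, row size, and row count below are placeholder assumptions):

```sql
DECLARE
  l_used_bytes  NUMBER;
  l_alloc_bytes NUMBER;
BEGIN
  DBMS_SPACE.CREATE_TABLE_COST(
    tablespace_name => 'USERS',
    avg_row_size    => 120,      -- estimated bytes per row
    row_count       => 1000000,  -- expected number of rows
    pct_free        => 10,
    used_bytes      => l_used_bytes,
    alloc_bytes     => l_alloc_bytes);
  DBMS_OUTPUT.PUT_LINE('Estimated allocation: ' || l_alloc_bytes || ' bytes');
END;
/
```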
- Pavan Kumar N -
Hi All,
I have two tables with similar structure and data; one is on disk and the other is in memory. I calculated the difference between the row sizes of the on-disk and in-memory tables and found that a row of the in-memory table is 700 bytes larger than in the disk-based table.
aa
As others mentioned, memory-optimized tables and disk-based tables have different structures in SQL Server 2014.
For memory-optimized tables, the number of indexes on the table also contributes to the size. You can calculate the exact size of rows, and thus the table size, using the formulas given in the articles below:
Table and Row Size in Memory-Optimized Tables
Estimate the Size of a Table
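For a rough sense of where a large per-row difference can come from, the documented formula breaks a memory-optimized row down roughly like this (a sketch; the index count below is an assumption for illustration):

```sql
-- Memory-optimized row size (approximate, per the documentation):
--   row header     = 24 bytes
--     (begin/end timestamps 8 + 8, statement id 4, index link count 2, padding 2)
--   index pointers = 8 bytes * number of indexes on the table
--   row body       = sum of the column sizes, plus padding
-- Example: a table with 3 indexes carries 24 + 3 * 8 = 48 bytes of per-row
-- overhead before the row body is counted; row versioning for concurrent
-- transactions can add more.
```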
Krishnakumar S -
Size difference for expdp estimate and from data dictionaries
Hello
We have one large table with around 70 million rows, and when we check the size of this table using the query below
select sum(bytes)/1024/1024/1024 "size in GB" from user_segments where segment_name='MYTABLE';
we got expected size 17GB.
However, when we exported the table using EXPDP it showed estimated size as 34GB.
Can anyone help me identify why there is a difference from the expected table size in GB?
Note: the table contains a BLOB datatype column
Oracle10G
Sun Solaris10
Edited by: aps on Feb 20, 2009 7:28 AM
The estimate is for table row data only; it does not include metadata.
BLOCKS - The estimate is calculated by multiplying the number of database blocks used by the target objects times the appropriate block sizes.
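Since the table has a BLOB column, the LOB data lives in its own segment, which a user_segments query filtered on the table name misses. A sketch that adds the LOB segment in (the table name comes from the question):

```sql
-- Table segment plus its LOB segment(s), in GB.
SELECT SUM(s.bytes) / 1024 / 1024 / 1024 AS size_gb
FROM user_segments s
WHERE s.segment_name = 'MYTABLE'
   OR s.segment_name IN (SELECT l.segment_name
                         FROM user_lobs l
                         WHERE l.table_name = 'MYTABLE');
```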
dba_lob_partition will give you more accurate info. -
Manage max table storage space in case of excess data (size in GB)
My scenario is that I am using SQL Server 2008 R2. I have created a database named testDB. I have a lot of tables, including some log tables, and some of the log tables contain a very large number of records.
My purpose is to limit the size of those log tables and to move records to a table in another database placed at another location, so my database has no problem.
Please tell me, is there any way to accomplish the above for my database?
Is such functionality already built into SQL Server?
Maybe this question is repeated, but I still have no solution for my issue.
Feel free to ask any query.
Thanks
Well, there is no direct option to restrict a table's size. One way you can do it: put that table on a separate filegroup and files and restrict the growth of the file. BUT this will not give an accurate limitation on the number of rows, and it is not good practice; in fact you should never do this. And if you have more tables, each one would require its own filegroup/files - this is a bad idea.
The more common solution is to archive the information into another table in a different database.
A simple script such as this would work; it moves all the log data older than 30 days to the archive database:
use ArchiveDatabase
GO
insert into archivetable
select * from testdb..Oldtablelog where logdate < dateadd(day, -30, getdate())
Is there any particular reason you want to archive the data? If this is for database manageability - for backups/maintenance - you can partition and mark the old filegroups as read-only and the new data as read-write.
Hope it Helps!! -
Get the size of the table row-wise (rows in a group by clause)
Hello,
I am using ORACLE 11g Standard edition and RHEL 4.
I have a situation in which I want to know the size of a limited set of rows of a table.
I am moving rows from one table (in one tablespace) to another table (in another tablespace). While moving the rows I want to be sure that the size of the rows will fit into the free space of the other table's tablespace. So before inserting rows into the other table I will check the size of the rows and the free space in the tablespace, and act accordingly.
Here is the senario with example :-
I have a table called MAIN_TAB which has a column DATE_TIME that stores the systimestamp when the data was inserted. See the code below...
select * from main_tab;
ID VALUE DATE_TIME
1 DATA 18-MAY-11 12.00.00.000000000 AM
2 DATA 18-MAY-11 12.00.00.000000000 AM
3 DATA 17-MAY-11 12.00.00.000000000 AM
4 DATA 17-MAY-11 12.00.00.000000000 AM
Now I will fire a group-by date_time query to find out how many rows there are for each systimestamp:
select trunc(date_time),count(id)
from MAIN_TAB
group by trunc(date_time)
DATE_TIME COUNT(ID)
17-MAY-11 2
18-MAY-11 2
So now you can see I have 2 rows each for 17th and 18th May. I want to know the size of the data for 17th and 18th May in MB. I know how to get the size of the whole table, but I want only the size of the limited rows per date.
So the question is how can i get the size of a table's 2 rows data ???
Provide me some guidance.
If the question is not clear to you , let me know ....
Thanks in advance ...
Thanks Pavan for your reply, but it's still not so useful for me. Can you please give a clearer idea of what you want to say?
I queried the DBA_TABLES view for my table, i.e. 'MAIN_TAB'. The AVG_ROW_LEN column showed 0, but that does not mean my table has no data in it; it certainly has 4 rows and contains data.
Please clarify what I can do to get the size of the rows related to that particular date.
thanks.
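One way to approximate the size of just those rows is to sum VSIZE (the internal representation size of a value, in bytes) over the columns, grouped by date. A sketch using the columns from the example table (this counts column data only, not per-row overhead):

```sql
SELECT TRUNC(date_time) AS day,
       SUM( NVL(VSIZE(id), 0)
          + NVL(VSIZE(value), 0)
          + NVL(VSIZE(date_time), 0) ) / 1024 / 1024 AS approx_mb
FROM main_tab
GROUP BY TRUNC(date_time);
```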