TO DRAW A TABLE WITH MULTIPLE ROWS AND MULTIPLE COLUMNS IN FORM
Hi,
How do I draw a table with multiple rows and columns separated by lines in form printing?
check this
http://sap-img.com/ts003.htm
Regards
Prabhu
Similar Messages
-
How to set all af:table with banding="row" and bandingInterval="1"
I need every af:table in my project to have the two attributes banding="row" and bandingInterval="1". How can I implement this
via a skin or CSS? Please give me a clue. Thanks.
Hi,
skinning is for the look and feel (e.g. the color of the banding). The banding and banding interval are component properties that need to be set in the page source.
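As an illustration, the two attributes go directly on each af:table tag in the page source. The value binding and column placeholder below are hypothetical; only banding and bandingInterval are taken from the question:

```xml
<af:table value="#{bindings.Employees.collectionModel}" var="row"
          banding="row" bandingInterval="1">
  <!-- af:column definitions go here -->
</af:table>
```

Since these are per-component properties rather than skin selectors, each table tag needs them set individually (or via a shared page template).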
Frank -
I also found some code on a website related to my problem, but there are errors in it that I am not able to understand. Can you correct those errors?
using System;
using System.IO;
using System.Collections;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Web;
using System.Web.SessionState;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.HtmlControls;
using Microsoft.Web.UI.WebControls;

namespace shark.TreeView
{
    /// <summary>
    /// Summary description for DocTree.
    /// </summary>
    public class DocTree : System.Web.UI.Page
    {
        protected Microsoft.Web.UI.WebControls.TreeView TreeCtrl;

        public DocTree()
        {
            Page.Init += new System.EventHandler(Page_Init);
        }

        private void Page_Load(object sender, System.EventArgs e)
        {
            if ( ! this.IsPostBack )
            {
                // add tree node "type" for folders and files
                string imgurl = "/shark/webctrl_client/1_0/Images/";
                TreeNodeType type;
                type = new TreeNodeType();
                type.Type = "folder";
                type.ImageUrl = imgurl + "folder.gif";
                type.ExpandedImageUrl = imgurl + "folderopen.gif";
                TreeCtrl.TreeNodeTypes.Add( type );
                type = new TreeNodeType();
                type.Type = "file";
                type.ImageUrl = imgurl + "html.gif";
                TreeCtrl.TreeNodeTypes.Add( type );
                // start the recursive load from our application root path
                // (we add the trailing "/" for a little substring trimming below)
                GetFolders( MapPath( "~/./" ), TreeCtrl.Nodes );
                // expand 3 levels of the tree
                TreeCtrl.ExpandLevel = 3;
            }
        }

        private void Page_Init(object sender, EventArgs e)
        {
            InitializeComponent();
        }

        // recursive method to load all folders and files into tree
        private void GetFolders( string path, TreeNodeCollection nodes )
        {
            // add nodes for all directories (folders)
            string[] dirs = Directory.GetDirectories( path );
            foreach( string p in dirs )
            {
                string dp = p.Substring( path.Length );
                if ( dp.StartsWith( "_v" ) )
                    continue; // ignore FrontPage (Vermeer Technology) folders
                nodes.Add( Node( "", p.Substring( path.Length ), "folder" ) );
            }
            // add nodes for all files in this directory (folder)
            string[] files = Directory.GetFiles( path, "*.aspx" );
            foreach( string p in files )
            {
                nodes.Add( Node( p, p.Substring( path.Length ), "file" ) );
            }
            // add all subdirectories for each directory (recursive)
            // NOTE: this loop assumes folder nodes line up with dirs[] by
            // index, which breaks if any "_v" folder was skipped above
            for( int i = 0; i < nodes.Count; i++ )
            {
                if ( nodes[ i ].Type == "folder" )
                    GetFolders( dirs[ i ] + "\\", nodes[ i ].Nodes );
            }
        }

        // create a TreeNode from the specified path, text and type
        private TreeNode Node( string path, string text, string type )
        {
            TreeNode n = new TreeNode();
            n.Type = type;
            n.Text = text;
            if ( type == "file" )
            {
                // strip off the physical application root portion of path
                string nav = "/" + path.Substring( MapPath( "/" ).Length );
                // String.Replace returns a new string; the original code
                // discarded the result, so the URL kept its backslashes
                nav = nav.Replace( '\\', '/' );
                n.NavigateUrl = nav;
                // set target if using FRAME/IFRAME
                n.Target = "doc";
            }
            return n;
        }

        #region Web Form Designer generated code
        /// <summary>
        /// Required method for Designer support - do not modify
        /// the contents of this method with the code editor.
        /// </summary>
        private void InitializeComponent()
        {
            this.Load += new System.EventHandler(this.Page_Load);
        }
        #endregion
    }
}
and the design that I got on the website for the code displayed just above is
<%@ Register TagPrefix="iewc" Namespace="Microsoft.Web.UI.WebControls" Assembly="Microsoft.Web.UI.WebControls" %>
<%@ Page language="c#" Codebehind="DocTree.aspx.cs" AutoEventWireup="false" Inherits="shark.TreeView.DocTree" %>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" >
<HTML>
<HEAD>
<meta content="Microsoft Visual Studio 7.0" name="GENERATOR">
<meta content="C#" name="CODE_LANGUAGE">
<meta content="JavaScript (ECMAScript)" name="vs_defaultClientScript">
<meta content="http://schemas.microsoft.com/intellisense/ie5" name="vs_targetSchema">
</HEAD>
<body>
<form id="DocTree" method="post" runat="server">
<table height="100%" cellSpacing="0" cellPadding="8" border="0">
<tr height="100%">
<td vAlign="top">
<iewc:treeview id="TreeCtrl" runat="server" SystemImagesPath="/shark/webctrl_client/1_0/treeimages/">
</iewc:treeview>
</td>
<td vAlign="top" width="100%" height="100%">
Click on any *.aspx page in the tree and it should load here <iframe id="doc" name="doc" frameBorder="yes" width="100%" scrolling="auto" height="100%">
</iframe>
</td>
</tr>
</table>
</form>
</body>
</HTML>
This is my code for viewing the TreeView, but it is not expanding. Please help me with this as well.
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.IO;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Media;
using System.Drawing;
using System.Drawing.Imaging;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Page.IsPostBack == false)
        {
            System.IO.DirectoryInfo RootDir = new System.IO.DirectoryInfo(Server.MapPath("~/Files"));
            // output the directory into a node
            TreeNode RootNode = OutputDirectory(RootDir, null);
            // add the output to the tree
            MyTree.Nodes.Add(RootNode);
        }
    }

    TreeNode OutputDirectory(System.IO.DirectoryInfo directory, TreeNode parentNode)
    {
        // validate param
        if (directory == null) return null;
        // create a node for this directory
        TreeNode DirNode = new TreeNode(directory.Name);
        // get subdirectories of the current directory
        System.IO.DirectoryInfo[] SubDirectories = directory.GetDirectories();
        // OutputDirectory(SubDirectories[0], "Directories");
        // output each subdirectory
        for (int DirectoryCount = 0; DirectoryCount < SubDirectories.Length; DirectoryCount++)
        {
            OutputDirectory(SubDirectories[DirectoryCount], DirNode);
        }
        // output the current directory's files
        System.IO.FileInfo[] Files = directory.GetFiles();
        for (int FileCount = 0; FileCount < Files.Length; FileCount++)
        {
            DirNode.ChildNodes.Add(new TreeNode(Files[FileCount].Name));
        }
        // if the parent node is null, return this node;
        // otherwise add this node to the parent and return the parent
        if (parentNode == null)
        {
            return DirNode;
        }
        else
        {
            parentNode.ChildNodes.Add(DirNode);
            return parentNode;
        }
    }
}
This is my design
<%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" %>
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
<title></title>
<style type="text/css">
.auto-style2 { width: 412px; }
.auto-style3 { width: 174px; }
.auto-style4 { width: 743px; }
.auto-style5 { width: 506px; height: 226px; }
</style>
</head>
<body>
<form id="form1" method="post" runat="server">
<table align="center" class="auto-style4" border="1" style="table-layout: fixed; border-spacing: 1px">
<tr>
<td class="auto-style3">
<br />
<br />
<br />
<br />
</td>
<td class="auto-style2">
<br />
<br />
<br />
<br />
</td>
</tr>
<tr>
<td class="auto-style3" valign="top">
<asp:TreeView Id="MyTree" PathSeparator = "|" ExpandDepth="0" runat="server" ImageSet="Arrows" AutoGenerateDataBindings="False">
<SelectedNodeStyle Font-Underline="True" HorizontalPadding="0px" VerticalPadding="0px" ForeColor="#5555DD"></SelectedNodeStyle>
<NodeStyle VerticalPadding="0px" Font-Names="Tahoma" Font-Size="10pt" HorizontalPadding="5px" ForeColor="#000000" NodeSpacing="0px"></NodeStyle>
<ParentNodeStyle Font-Bold="False" />
<HoverNodeStyle Font-Underline="True" ForeColor="#5555DD"></HoverNodeStyle>
</asp:TreeView>
</td>
<td class="auto-style2">
<base target="_blank" /> <iframe frameborder="0" scrolling="yes" marginheight="0" marginwidth="0"
src="" class="auto-style5"></iframe>
</td>
</tr>
</table>
</form>
</body>
</html>
Hi meghage,
From your code, it is a WebForm project.
This forum is to discuss problems of Windows Forms. Your question is not related to the topic of this forum.
You can consider posting it in the ASP.NET forums for support. Thanks.
ASP.NET: http://forums.asp.net
Regards,
Youjun Tang
-
String into table with separate row and column bytes
Hello, I have a string (coming from an I2C EEPROM and then through RS232 via an MCU) with this format --> byte1 byte2 byte3 byte4 byte5 \r byte1 byte2 byte3 byte4 byte5 \r and so on. The length of the frame isn't known; it depends on how many sensors have passed over their limit. I am having problems plotting this information into a table: I don't know how to separate each byte into a column and each packet of five bytes into a row. I attached my VI with an example string containing two sensors' information; I am only able to plot the first sensor. Any help will be welcome.
Thanks in advance.
Regards.
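LabVIEW aside, the parsing logic itself is simple. Here is a sketch in Python, following the frame layout described above (five space-separated byte values per packet, packets separated by \r); the function name and sample data are made up for illustration:

```python
def frames_to_table(data: str) -> list[list[str]]:
    """Split a sensor frame string into rows (packets) and columns (bytes).

    Packets are separated by carriage returns; each packet holds five
    space-separated byte values.
    """
    rows = []
    for packet in data.split("\r"):
        cells = packet.split()
        if cells:  # skip the empty trailing packet after the last \r
            rows.append(cells)
    return rows

table = frames_to_table("01 02 03 04 05\r0A 0B 0C 0D 0E\r")
```

In LabVIEW, a similar one-shot split can presumably be done with the Spreadsheet String To Array function, using \r as the row delimiter and space as the column delimiter.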
Attachments:
string_to_table.vi 31 KB
This is a working modification of your code:
Message Edited by EVS on 08-26-2005 09:16 PM
Jack
Win XP
LabVIEW 6.1, 7.0, 7.1, LabWindows/ CVI 7.1
Let us speak Russian
Attachments:
Clip_5.jpg 53 KB -
Editable table with multiple rows
Hi All!
We're trying to develop an application in VC 7.0. That application should read data from some R/3 tables (via standard and custom function modules), then display the data to the user and allow him/her to modify it (or add data into table rows), and then save it back to R/3.
What is the best way to do that?
There's no problem with displaying data.
But when I try to add something to the table (on the portal interface), I'm able to use only the first row of the table. Even if I fill all fields of that row, I'm not able to add data to the second row, etc.
Second question: is it possible to display in one table the output of several BAPIs? For example, we have three BAPIs: one displays a user, the second displays that user's subordinates, and the third that user's manager. And we want one resulting table...
And last: what is the best way to submit data from a table view in VC (portal) to an R/3 table? I understand that we should write some function module that puts the data into R/3. I'm asking about what should be done in VC itself: some button, or action...
Any help will be appreciated and rewarded :o)
Regards,
DK
Here are some former postings:
Editable table with multiple rows
and
Editable table with multiple rows
Are you on the right SP-level?
Can you also split up your posting: one question, one posting? This way you get faster answers, as people might just browse the headers. -
How to create a table with multiple select on???
Hi all,
I am new to Web Dynpro, and my requirement is to create a table with multiple selection on. I have to add about 10 rows to the table, but only 5 rows should be visible, and a vertical scrollbar should be available to view the other rows. Can anybody explain in detail how to do that? Please explain as if to a newcomer, and reply ASAP, as I have to do it today.
Thanks.
Hi,
1. Create a value node in your context named Table and set its cardinality to 0:n
2. Create 2 value attributes within the Table node, named value1 and value2
3. Go to the Outline view > right-click on TransparentUIContainer > Apply Template > select Table > mark the node Table and its attributes.
You have now created a table and bound its values to the context.
Table UI properties:
4. Set Selection Mode to Multi
5. Set Visible Row Count to 5
6. Set ScrollableColCount to 5
In your implementation, you can add values to the table as follows:
IPrivate<viewname>.ITableElement ele = wdContext.nodeTable().createTableElement();
ele.setValue1(<value>);
ele.setValue2(<value>);
wdContext.nodeTable().addElement(ele);
The above code will allow you to add elements to your table node.
Regards,
Murtuza -
Select on table with 1800 rows is slow
I have a table with 1800 rows. Each entry has a geometry position and a geometry polygon around that position. I am using the polygon to detect which (other) entries are near the current entry.
In the following test data and the subsequent query, I am filtering on 625 (of 1865) rows and then using the .STContains method to find other rows (the test data is fully matched by this query; in the live database the values are not as regular as in the test data).
The query takes 6500 ms. In the live database, only 800 records are in the table (yet), and it takes 2200 ms.
select SlowQueryTable.id
from SlowQueryTable
inner join dbo.SlowQueryTable as SlowQueryTableSeen
on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
where SlowQueryTable.userId = 2
(The query in the live system is even more complex, but this is the main part of it, and even simplified as it is, it just takes too long.)
This script generates test data and runs the query:
-- The number table is just needed to generate test data
CREATE TABLE [dbo].[numbers](
[number] [int] NOT NULL
)
go
declare @t table (number int)
insert into @t select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9
insert into numbers
select * from (
select
t1.number + t2.number*10 + t3.number*100 + t4.number*1000 as x
from
@t as t1,
@t as t2,
@t as t3,
@t as t4
) as t1
order by x
go
-- this is the table which has the slow query. The Columns [userId], [position] and [box] are the relevant ones
CREATE TABLE [dbo].SlowQueryTable(
[id] [int] IDENTITY(1,1) NOT NULL,
[userId] [int] NOT NULL,
[position] [geometry] NOT NULL,
[box] [geometry] NULL,
constraint SlowQueryTable_primary primary key clustered (id)
)
create nonclustered index SlowQueryTable_UserIdKey on [dbo].SlowQueryTable(userId);
-- insert test data: three users, each with 625 entries. Each entry per user has its unique position, and a rectangle (box) around it.
-- In the database in question, the positions are a bit more random; often tens of entries share the same position. The slow query is nevertheless visible with this test data.
declare @range int;
set @range = 5;
INSERT INTO [dbo].SlowQueryTable (userId,position,box)
select
users.number,
geometry::STGeomFromText('POINT (' + convert(varchar(15), X) + ' ' + convert(varchar(15), Y) + ')',0),
geometry::STPolyFromText('POLYGON ((' + convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y - @range) + ', '
+ convert(varchar(15), X + @range) + ' ' + convert(varchar(15), Y - @range) + ', '
+ convert(varchar(15), X + @range) + ' ' + convert(varchar(15), Y + @range) + ', '
+ convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y + @range) + ','
+ convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y - @range) + '))', 0)
from (
select
(numberX.number * 40) + 4520 as X
,(numberY.number * 40) + 4520 as Y
from numbers as numberX
cross apply numbers as numberY
where numberX.number < (1000 / 40)
and numberY.number < (1000 / 40)) as positions
cross apply numbers as users
where users.number < 3
CREATE SPATIAL INDEX [SlowQueryTable_position]
ON [dbo].SlowQueryTable([position])
USING GEOMETRY_GRID
WITH (
BOUNDING_BOX = ( 4500, 4500, 5500, 5500 ),
GRIDS =(LEVEL_1 = HIGH,LEVEL_2 = HIGH,LEVEL_3 = HIGH,LEVEL_4 = HIGH),
CELLS_PER_OBJECT = 64, PAD_INDEX = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
go
ALTER INDEX [SlowQueryTable_position] ON [dbo].SlowQueryTable
REBUILD;
go
CREATE SPATIAL INDEX [SlowQueryTable_box]
ON [dbo].SlowQueryTable(box)
USING GEOMETRY_GRID
WITH ( BOUNDING_BOX = ( 4500, 4500, 5500, 5500 ) ,
GRIDS =(LEVEL_1 = HIGH,LEVEL_2 = HIGH,LEVEL_3 = HIGH,LEVEL_4 = HIGH),
CELLS_PER_OBJECT = 64, PAD_INDEX = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
go
ALTER INDEX [SlowQueryTable_box] ON [dbo].SlowQueryTable
REBUILD;
go
SET STATISTICS IO ON
SET STATISTICS TIME ON
-- this is finally the query. it takes about 6500 ms
select SlowQueryTable.id
into #t1
from SlowQueryTable
inner join dbo.SlowQueryTable as SlowQueryTableSeen
on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
--on SlowQueryTable.position.STDistance(SlowQueryTableSeen.position) < 5
where SlowQueryTable.userId = 2
drop table #t1
drop table SlowQueryTable
drop table numbers
Using an explicit index hint does the job, but then the query gets slow if I change the where clause:
select SlowQueryTable.id
into #t1
from SlowQueryTable
with (index([SlowQueryTable_box]))
inner join dbo.SlowQueryTable as SlowQueryTableSeen
on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
where SlowQueryTable.userId = 2
leads to 600ms, and changing the where clause
where SlowQueryTable.id = 100
slows it down again to 1200 ms. Filtering on id gets massively slowed down when using an index hint on the spatial index.
Since the table in the live system will grow to 10000+ rows, and the query is called often by users, I badly need a more efficient query.
Do I have to create different queries for each use case, some with index hints and some without?
I've run your example and can confirm your results. There are a couple of things that I noticed though.
After looking at the query plans, it's not a matter of "with spatial index" vs. "without spatial index". You have two spatial indexes, one on each column (position and box). When you don't hint the "box" spatial index, the query uses the "position" spatial index. Because of what they are indexing (points vs. polygons), the "box" spatial index requires a lot more IO. With some (non-spatial) predicates, the "box" spatial index gives better performance; with others, the "position" one does. I've yet to figure out exactly why (short on time, I might get back to it in the future), but you can examine the query plans and use the spatial index diagnostic procs (e.g. sp_help_spatial_geometry_index_xml) in addition to the diagnostics you're running, to see why and whether you can find a better-performing plan/index.
Bear this in mind: given a choice of multiple spatial indexes, the SQL Server query optimizer is not able to choose (for the most part, IO etc. aside) which one is best. Also, there is usually only one choice of spatial query plan shape in general. If your query is more complex than the one in your example, you might benefit from breaking it in two: one query to filter out all the rows and predicates that don't use a spatial index, and one query that uses the spatial index on the subset. I've had good luck with this in other situations with complex queries involving spatial predicates. This method may not be applicable to a spatial query as simple as the one in your example, however.
Hope this helps, Bob -
Web Analysis : populate the same table with multiple data sources
Hi folks,
I would like to know if it is possible to populate a table with multiple data sources.
For instance, I'd like to create a table with 3 columns : Entity, Customer and AvgCostPerCust.
Entity and Customer come from one Essbase, AvgCostPerCust comes from HFM.
The objective is to get a calculated member which is Customer * AvgCostPerCust.
Any ideas ?
Once again, thanks for your help.
I would like to have the following output:
File 1 - Store 2 - Query A + Store 2 - Query B
File 2 - Store 4 - Query A + Store 4 - Query B
File 3 - Store 5 - Query A + Store 5 - Query B
the bursting level should be given at
File 1 - Store 2 - Query A + Store 2 - Query B
so the tag in the XML has to be common to these three rows.
Since the data is coming from different queries, the data is not going to be under a single tag,
so you cannot burst it using a concatenated data source.
But you can do it using a data template: link the queries and get the data for each file under a single query,
select distinct store_name from all_stores
select * from query1 where store_name = :store_name -- 1st query
select * from query2 where store_name = :store_name -- 2nd query
define the data structure the way you want;
the XML will contain something like this:
<stores>
<store> </store> - for store 2
<store> </store> - for store 3
<store> </store> - for store 4
<store> </store> - for store 5
</stores>
now you can burst it at store level. -
Hi
I have a problem importing a dump that contains tables with 0 rows.
When I exported from Oracle 11.2 64-bit on Server 2008, I noticed that the log didn't confirm the tables with 0 rows.
When I import into Oracle 11.2 64-bit on another Server 2008, I get a lot of errors on these tables with 0 rows.
In the log I get the same tables with at least 1 row, but none with 0 rows.
I opened my dump in TextPad and I can see it contains "CREATE ....." statements for these tables.
I don't understand why this happens. I used a full dump by SYS; it didn't help.
This is not the first time I have exported and imported dumps, with no errors.
I'm using the "EXP" and "IMP" commands and every time it's OK (if that's relevant).
Why does this happen? Any solutions for this issue?
Thanks
I've found (I guess) the solution to this issue.
Here are two links about this new feature, which is called deferred segment creation.
The reason for this behavior is the 11.2 new feature 'Deferred Segment Creation': the creation of a table segment is deferred until the first row is inserted.
As a result, empty tables are not listed in dba_segments and are not exported by the exp utility.
http://www.nativeread.com/2010/04/09/11gr2-empty-tables-skipped-by-export-deferred-segment-creation/
http://antognini.ch/2010/10/deferred-segment-creation-as-of-11-2-0-2/
And this is i've found in official documentation from oracle
Beginning in Oracle Database 11g Release 2, when creating a non-partitioned heap-organized table in a locally managed tablespace, table segment creation is deferred until the first row is inserted. In addition, creation of segments is deferred for any LOB columns of the table, any indexes created implicitly as part of table creation, and any indexes subsequently explicitly created on the table. The advantages of this space allocation method are the following: a significant amount of disk space can be saved for applications that create hundreds or thousands of tables upon installation, many of which might never be populated; and application installation time is reduced. There is a small performance penalty when the first row is inserted, because the new segment must be created at that time.
To enable deferred segment creation, compatibility must be set to '11.2.0' or higher. You can disable deferred segment creation by setting the initialization parameter DEFERRED_SEGMENT_CREATION to FALSE. The new clauses SEGMENT CREATION DEFERRED and SEGMENT CREATION IMMEDIATE are available for the CREATE TABLE statement. These clauses override the setting of the DEFERRED_SEGMENT_CREATION initialization parameter.
Note that when you create a table with deferred segment creation (the default), the new table appears in the *_TABLES views, but no entry for it appears in the *_SEGMENTS views until you insert the first row. There is a new SEGMENT_CREATED column in *_TABLES, *_INDEXES, and *_LOBS that can be used to verify deferred segment creation.
Note:
The original Export utility does not export any table that was created with deferred segment creation and has not had a segment created for it. The most common way for a segment to be created is to store a row into the table, though other operations such as ALTER TABLE ALLOCATE EXTENTS will also create a segment. If a segment does exist for the table and the table is exported, the SEGMENT CREATION DEFERRED clause will not be included in the CREATE TABLE statement that is executed by the original Import utility. -
How to create table with 1 row 1MB in size?
Hello,
I am doing some R&D and want to create a Table with 1 row, which is 1 MB in size.
i.e. I want to create a row which is 1 MB in size.
I am using a 11g DB.
I do this in SQL*Plus:
(1.) CREATE TABLE onembrow (pk NUMBER PRIMARY KEY, onembcolumn CLOB);
(2.) Since 1MB is 1024*1024 bytes (i.e. 1048576 bytes) and since in English 1 letter = 1 byte, I do this
SQL> INSERT INTO onembrow VALUES (1, RPAD('A', 1048576, 'B'));
1 row created.
(3.) Now, after committing, I do an analyze table.
SQL> ANALYZE TABLE onembrow COMPUTE STATISTICS;
Table analyzed.
(4.) Now, I check the actual size of the table using this query.
select segment_name,segment_type,bytes/1024/1024 MB
from user_segments where segment_type='TABLE' and segment_name='ONEMBROW';
SEGMENT_NAME  SEGMENT_TYPE  MB
ONEMBROW      TABLE         .0625
Why is the size only .0625 MB, when it should be 1 MB?
Here is the DB Block related parameters:
SELECT * FROM v$parameter WHERE upper(name) LIKE '%BLOCK%';
NUM NAME TYPE VALUE
478 db_block_buffers 3 0
482 db_block_checksum 2 TYPICAL
484 db_block_size 3 8192
682 db_file_multiblock_read_count 3 128
942 db_block_checking 2 FALSE
What am I doing wrong here???
When testing, it is necessary to do something that is a reasonably realistic model of a problem you might anticipate appearing in a production system. A row of 1 MB doesn't seem likely to be a useful source of information for "R&D on performance tuning".
What's wrong with creating millions of rows ?
Here's a cut and paste from a windows system running 11.2.0.3
SQL> set timing on
SQL>
SQL> drop table t1 purge;
Table dropped.
Elapsed: 00:00:00.04
SQL>
SQL> create table t1
2 nologging
3 as
4 with generator as (
5 select
6 rownum id
7 from dual
8 connect by
9 level <= 50
10 ),
11 ao as (
12 select
13 *
14 from
15 all_objects
16 where rownum <= 50000
17 )
18 select
19 rownum id,
20 ao.*
21 from
22 generator v1,
23 ao
24 ;
Table created.
Elapsed: 00:00:07.09
7 seconds to generate 2.5M rows doesn't seem like a problem. For a modelling example I have one script that generates 6.5M (carefully engineered) rows, with a couple of indexes and a foreign key or two, then collects stats (no histograms) in 3.5 minutes.
Regards
Jonathan Lewis
http://jonathanlewis.wordpress.com
Now on Twitter: @jloracle -
Can we bind a single external table with multiple files in OWB 11g?
Hi,
I wanted to ask if it is possible to bind an external table to multiple source files at the same or different locations, or whether an external table has to be bound to a single source file and a single location.
Thanks in advance,
Ann.
Edited by: Ann on Oct 8, 2010 9:38 AM
Hi Ann,
Can you please help me out by telling me the steps to accomplish this.
Right-click on the external table in the project tree and from the menu choose Configure; then, in the Configuration Properties dialog window that opens, right-click on the Data Files node and choose Create from the menu -
you will get a new record for the file; specify the Data File Name property.
Also link from OWB user guide
http://download.oracle.com/docs/cd/B28359_01/owb.111/b31278/ref_def_flatfiles.htm#i1126304
Regards,
Oleg -
Will there be a performance improvement with separate tables vs. a single table with multiple partitions? Is it advisable to have separate tables rather than a single big table with partitions? Can we expect the same performance from a single big table with partitions? What is the recommended approach in HANA?
Suren,
first off a friendly reminder: SCN is a public forum and for you as an SAP employee there are multiple internal forums/communities/JAM groups available. You may want to consider this.
Concerning your question:
You didn't tell us what you want to do with your table or your set of tables.
As tables are not only storage units but usually bear semantics (read: if data is stored in one table, it means something different than the same data in a different table), partitioned tables cannot simply be substituted by multiple tables.
Looked at on a storage technology level, table partitions are practically the same as tables. Each partition has got its own delta store and can be loaded and displaced to/from memory independently of the others.
Generally speaking, there shouldn't be too many performance differences between a partitioned table and multiple tables.
However, when dealing with partitioned tables, the additional step of determining the partition to work on is always required. If computing the result of the partitioning function takes a major share of your total runtime (which is unlikely), then partitioned tables could have a negative performance impact.
Having said this: as with all performance related questions, to get a conclusive answer you need to measure the times required for both alternatives.
- Lars -
Filtering a table with 19699 rows
Hello
I am new to Xcelsius and I am trying to apply a filter component to a table with 19699 rows. It is supposed to show 45 different classifications in the filter, but it seems to be using only the first 500 rows for the filter. I changed the maximum number of rows in the preferences. What should I do?
A component cannot handle 20,000 rows; you should filter these in the database by using queries when getting the data into the dashboard. If there is no database connection available, you can use pivot tables to create lists for the different dimensions, and then use lookup formulas like MATCH and INDEX to display the data for your chart.
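The MATCH/INDEX idea can be pictured in code. The sketch below, in Python with entirely made-up data and column names, builds the distinct classification list once (the pivot-table step) and then looks up the rows for one selected classification (the lookup step):

```python
rows = [
    {"classification": "A", "value": 10},
    {"classification": "B", "value": 20},
    {"classification": "A", "value": 30},
]

# distinct classifications, like a pivot-table dimension list
classifications = sorted({r["classification"] for r in rows})

def lookup(selected: str) -> list[int]:
    """Return the values for one classification (the MATCH/INDEX step)."""
    return [r["value"] for r in rows if r["classification"] == selected]

values_for_a = lookup("A")
```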
-
I have a source table with 10 records and a target table with 15 records. My question: using the Table Comparison transform, how can I delete the unmatched records from the target table?
Hi Kishore,
First, identify the deleted records by selecting the "Detect deleted rows from comparison table" feature in the Table Comparison transform.
Then use a Map Operation with input row type "delete" and output row type "delete" to delete the records from the target table. -
Select max date from a table with multiple records
I need help writing SQL to select the max date from a table with multiple records.
Here's the scenario: there are multiple SA_IDs repeated with various EFFDT (dates). I want to retrieve the most recent effective date so that each SA_ID is unique. It looks simple, but I can't figure it out. Please help.
SA_ID CHAR_TYPE_CD EFFDT CHAR_VAL
0000651005 BASE 15-AUG-07 YES
0000651005 BASE 13-NOV-09 NO
0010973671 BASE 20-MAR-08 YES
0010973671 BASE 18-JUN-10 NO
Hi,
Welcome to the forum!
Whenever you have a question, post a little sample data in a form that people can use to re-create the problem and test their ideas.
For example:
CREATE TABLE table_x
( sa_id NUMBER (10)
, char_type VARCHAR2 (10)
, effdt DATE
, char_val VARCHAR2 (10)
);
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0000651005, 'BASE', TO_DATE ('15-AUG-2007', 'DD-MON-YYYY'), 'YES');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0000651005, 'BASE', TO_DATE ('13-NOV-2009', 'DD-MON-YYYY'), 'NO');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0010973671, 'BASE', TO_DATE ('20-MAR-2008', 'DD-MON-YYYY'), 'YES');
INSERT INTO table_x (sa_id, char_type, effdt, char_val)
VALUES (0010973671, 'BASE', TO_DATE ('18-JUN-2010', 'DD-MON-YYYY'), 'NO');
COMMIT;
Also, post the results that you want from that data. I'm not certain, but I think you want these results:
     SA_ID LAST_EFFD
    651005 13-NOV-09
  10973671 18-JUN-10
That is, the latest effdt for each distinct sa_id.
Here's how to get those results:
SELECT sa_id
, MAX (effdt) AS last_effdt
FROM table_x
GROUP BY sa_id
;
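For comparison, the same latest-row-per-key logic can be sketched outside the database in Python. Dates are shown as ISO strings so plain comparison works; the sample rows mirror the data above:

```python
rows = [
    ("0000651005", "2007-08-15"),
    ("0000651005", "2009-11-13"),
    ("0010973671", "2008-03-20"),
    ("0010973671", "2010-06-18"),
]

# GROUP BY sa_id / MAX(effdt), done with a dictionary:
# keep only the largest effdt seen for each sa_id
last_effdt: dict[str, str] = {}
for sa_id, effdt in rows:
    if sa_id not in last_effdt or effdt > last_effdt[sa_id]:
        last_effdt[sa_id] = effdt
```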