Number of tables in schema
Hi all,
I used the following queries to find the number of tables in a schema:
select * from tab;
and
select count(table_name) from user_tables;
These two queries return two different answers.
Can you please tell me what the difference between these two queries is?
What is the exact query to find the total number of tables in a schema?
Thanks
These are both views, with different base objects and conditions. Better to use USER_TABLES.
USER_TABLES:
select columns...
from sys.ts$ ts, sys.seg$ s, sys.obj$ co, sys.tab$ t, sys.obj$ o,
sys.obj$ cx, sys.user$ cu, x$ksppcv ksppcv, x$ksppi ksppi
where o.owner# = userenv('SCHEMAID')
and o.obj# = t.obj#
and bitand(t.property, 1) = 0
and bitand(o.flags, 128) = 0
and t.bobj# = co.obj# (+)
and t.ts# = ts.ts#
and t.file# = s.file# (+)
and t.block# = s.block# (+)
and t.ts# = s.ts# (+)
and t.dataobj# = cx.obj# (+)
and cx.owner# = cu.user# (+)
and ksppi.indx = ksppcv.indx
and ksppi.ksppinm = '_dml_monitoring_enabled'
TAB:
select columns...
from sys.tab$ t, sys.obj$ o
where o.owner# = userenv('SCHEMAID')
and o.type# >=2
and o.type# <=5
and o.linkname is null
and o.obj# = t.obj# (+)
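The difference follows from the two definitions above: TAB selects objects with o.type# between 2 and 5, so it also lists clusters, views and synonyms, while USER_TABLES restricts itself to real tables. To count only the tables in the current schema, something like this (a quick sketch; run as the schema owner):

```sql
-- Tables only: this is the count you usually want
SELECT COUNT(*) FROM user_tables;

-- What TAB counts: tables, clusters, views and synonyms together
SELECT COUNT(*) FROM tab;

-- Break TAB down by object type to see where the difference comes from
SELECT tabtype, COUNT(*) FROM tab GROUP BY tabtype;
```

If the counts still differ after accounting for views and synonyms, note that USER_TABLES filters out dropped tables (the bitand(o.flags, 128) = 0 condition above), so on 10g and later, tables sitting in the recycle bin show up in TAB under their BIN$ names but not in USER_TABLES.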
Similar Messages
-
Limitation on the number of tables in a Database Schema!
Hi All,
Is there a limitation on the number of tables in a schema that can be fetched in Designer while creating a universe?
Customer is using Oracle schema which contains 22000 tables and while trying to insert tables in Designer XIR2 (or trying to get the table browser) Designer hangs.
In BO 6.5 there are no issues while retrieving the tables from the schema in Designer.
Is there a way to retrieve only a certain amount of tables in Designer XIR2?
Thanks for your help!
Hi Ganesh,
Here are the answers regarding your queries:
Query : 1
There is no limitation on the number of components (objects, classes, tables, joins, hierarchies, lov's, etc) in a universe. But of course as the number of components increases, you could run into problems related to performance.
This depends on available RAM and the processing speed.
Query 2:
There is no such option to select the number of tables to be automatically inserted into the universe. Suppose you have 22000 tables in the DB and want only 1000 of them: if you entered 1000 as the value, how would Designer know which tables you want from the schema to build the universe?
It all depends on the DBA and the universe designer which tables are important for the organization's reporting needs.
When you create a connection to the DB, the connection will fetch all tables from the database, and we can't limit that retrieval.
I hope this Helps...
Thanks...
Pratik -
SAP R/3 instance does not match the number of columns in schema error
Hi Gurus,
Badly need your help; I am at a dead end here. I can't think of any resolution for this error message and have tried almost everything. Kindly advise.
error msg: 1306806 3342 R3S-140113 3/23/2010 9:54:47 AM The number of table columns <0> in an R/3 instance does not match the number of columns <8> in schema. Operation
source: R/3 database
target: Oracle 10G DBase
Tool: Business Objects Data Integrator 11.7
Many Thanks!
Randy Reyes
Philippines
Hello,
Maybe the following is applicable in your case:
I don't know exactly what you are doing in the BOBJ system, but are you using an RFC read-table function?
If so, this may be the problem. RFC_READ_TABLE is OK for some tests but is not scalable. There is also a limitation with the RFC_READ_TABLE function in terms of the number of characters returned (up to 512), so you should use an R/3 dataflow instead:
Please find below the link which explains how to use the R/3 Dataflow:
https://www.sdn.sap.com/irj/scn/wiki?path=/display/bobj/readingviaABAP
I hope this helps,
Best Regards,
Des -
What is the maximum number of table that can be created in a schema?
Hi Scott,
Thanks for your quick reply. I've increased the size of H Grid: Catalog to 100 and there's no change (even after bouncing the server).
I've tested it in another instance and got a "Next X - Y of Y" link at the bottom and a "Previous" link at the top when there has been an overflow. This instance doesn't have that (the overflow objects can only be seen when you do a direct search for them).
Anne
Message was edited by:
anne -
When we query DBA_TABLES or USER_TABLES to find the number of rows in a particular table, or in all tables, why do we need to analyze that particular table or schema first to get the NUM_ROWS column values?
"RTFM" is always the best teacher.
When you analyze the table:
Oracle collects the following statistics for a table. Statistics marked with an
asterisk are always computed exactly. Table statistics, including the status of
domain indexes, appear in the data dictionary views USER_TABLES,
ALL_TABLES, and DBA_TABLES in the columns shown in parentheses.
->"Number of rows (NUM_ROWS) "
-> * Number of data blocks below the high water mark (that is, the number of data
blocks that have been formatted to receive data, regardless of whether they currently
contain data or are empty) (BLOCKS)
-> * Number of data blocks allocated to the table that have never been used (EMPTY_BLOCKS)
-> Average available free space in each data block in bytes (AVG_SPACE)
-> Number of chained rows (CHAIN_COUNT)
-> Average row length, including the row's overhead, in bytes (AVG_ROW_LEN)
Jameel -
Crystal Reports - Connecting to databases - Limit to Number of Tables?
I am using CR 2008 and attempting to build a report. The database I connect to contains over 5000 tables in its schema. The list of tables presented in the GUI when building the report is only a portion of the total ('A' to 'O', displayed alphabetically). Is there a limit to the number of tables that can be displayed when building the contents of a report? If so, what is the limit? Is there a workaround?
Hello,
This is by design and a limit of how CR works, due to limitations in your PC resources: CR loads all of that info into memory, and there can be too many tables to list. It's typically more common when using Oracle, given its ability to hold thousands of tables. In the Database connection, right-click on the connection and then select Options. You can add filtering to limit what you see.
That is the only option you have to see all the tables required for your report. Don't add tables if they are not required or if you can't link them. A workaround is to create a collection of stored procedures or views so you see just what you need.
Thank you
Don -
Populating item number in table control
I have created a custom table control. It is very similar to the VA01 or Vl01N table controls to add items. When the user enters the material name, the item number column needs to be populated.
Currently I have something like this
LOOP AT i_items.
  MODULE tblcntrl_modify.
ENDLOOP.

MODULE tblcntrl_modify INPUT. "PAI
  posnr = sy-stepl * 10.
  " more logic for input validation
ENDMODULE.
The issue I have is that when I scroll in the table control, the POSNR for each item gets recalculated and the values get whacked, so I get a scenario like this:
10 mat1
20 mat2
30 mat3
10 mat4
40 mat5
What is the best way to populate the item number? Thank you.
Hi,
you have to put one module in PAI between <b>LOOP-ENDLOOP</b>, inside <b>CHAIN-ENDCHAIN</b>:
LOOP AT itab_det.
CHAIN.
FIELD itab_det-comp_code.
FIELD itab_det-bill_no.
FIELD itab_det-bill_date.
FIELD itab_det-vend_cust_code.
FIELD itab_det-bill_amt.
MODULE <b>tab1_modify</b> ON CHAIN-REQUEST.
ENDCHAIN.
FIELD itab_det-mark
MODULE tab1_mark ON REQUEST.
ENDLOOP.
<b>MODULE tab1_modify INPUT.</b>
  IF itab_det-bill_no <> ' '.
    CLEAR: net_pr, tax, bil_amt, bil_dt.
    SELECT SINGLE fkdat netwr mwsbk FROM vbrk INTO (bil_dt, net_pr, tax)
      WHERE vbeln = itab_det-bill_no.
    bil_amt = net_pr + tax.
    itab_det-bill_date = bil_dt.
    itab_det-bill_amt = bil_amt.
  ENDIF.
  " Update the current row in place; an APPEND here would duplicate it.
  MODIFY itab_det
    FROM itab_det
    INDEX tab1-current_line.
<b>ENDMODULE. "TAB1_MODIFY INPUT</b>
Here I am fetching the bill amount and bill date according to the bill number entered in the table control.
I check the bill number in the IF condition, and when I press Enter those two values are populated automatically.
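On the original scrolling problem: SY-STEPL is relative to the visible page, so it restarts at 1 every time you scroll, which is exactly why the item numbers get recalculated. To keep them stable, offset SY-STEPL with the table control's TOP_LINE. A small sketch (the table control name TC_ITEMS is an assumption; substitute your own control):

```abap
MODULE tblcntrl_modify INPUT.
  DATA lv_line TYPE sy-tabix.
  " Absolute row number = first visible row + position on the page - 1.
  lv_line = tc_items-top_line + sy-stepl - 1.
  " Assign the item number only once, when the row is first filled,
  " so scrolling never renumbers existing rows.
  IF i_items-posnr IS INITIAL.
    i_items-posnr = lv_line * 10.
  ENDIF.
  MODIFY i_items INDEX lv_line.
ENDMODULE.
```

Guarding the assignment with IS INITIAL is what prevents the 10/20/30/10/40 scenario above: rows that already carry a POSNR are left untouched.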
reward if useful..... -
How do I count the number of tables used in a view
Hi All,
I would be really thankful if anyone could tell me how to count the number of tables used in a view. I am using Oracle 10g Release 2 on HP-UX (11.31).
In fact, I have to find the views in my database with more than 5 tables in the join.
Thank you
Gursimran
Try:
select count(*) from dba_dependencies
where name ='<view name>'
and owner = '<view owner>'
and referenced_type = 'TABLE';
Example:
SQL> select * from v$version;
BANNER
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
SQL> show user;
USER is "HR"
SQL>
SQL> drop table t1 purge;
Table dropped.
SQL> drop table t2 purge;
Table dropped.
SQL> drop table t3 purge;
Table dropped.
SQL> drop view v;
View dropped.
SQL>
SQL> create table t1(x int);
Table created.
SQL> create table t2(y int);
Table created.
SQL> create table t3(z int);
Table created.
SQL> create view v as select x,y,z from t1,t2,t3;
View created.
SQL>
SQL> connect / as sysdba
Connected.
SQL> alter session set nls_language=english;
Session altered.
SQL>
SQL> select count(*) from dba_dependencies
2 where name ='V'
3 and owner = 'HR'
4 and referenced_type= 'TABLE';
COUNT(*)
3
SQL>
Edited by: P. Forstmann on Jul 26, 2010 17:45 -
Maximum number of tables that can be outer joined with one table in a query
Hi All,
I am new to SQL, so can you please let me know: what is the maximum number of tables that can be outer joined with one table in a query?
Thanks,
Srini
srinu2 wrote:
I am new to SQL, so can you please let me know: what is the maximum number of tables that can be outer joined with one table in a query?
There is no limit to the number of tables you can outer join as long as you join them correctly.
SQL> with a as
2 (
3 select 1 id, 2 b_key, 3 c_key from dual union all
4 select 2 id, 1 b_key, 4 c_key from dual union all
5 select 3 id, 3 b_key, 1 c_key from dual union all
6 select 4 id, 4 b_key, 2 c_key from dual
7 ),
8 b as
9 (
10 select 1 id, 1 c_key2 from dual union all
11 select 2 id, 5 c_key2 from dual union all
12 select 3 id, 3 c_key2 from dual union all
13 select 4 id, 2 c_key2 from dual
14 ),
15 c as
16 (
17 select 1 key1, 1 key2, '1-1' dta from dual union all
18 select 1 key1, 2 key2, '1-2' dta from dual union all
19 select 1 key1, 3 key2, '1-3' dta from dual union all
20 select 1 key1, 4 key2, '1-4' dta from dual union all
21 select 2 key1, 1 key2, '2-1' dta from dual union all
22 select 2 key1, 2 key2, '2-2' dta from dual union all
23 select 2 key1, 3 key2, '2-3' dta from dual union all
24 select 2 key1, 4 key2, '2-4' dta from dual union all
25 select 3 key1, 1 key2, '3-1' dta from dual union all
26 select 3 key1, 2 key2, '3-2' dta from dual union all
27 select 3 key1, 3 key2, '3-3' dta from dual union all
28 select 3 key1, 4 key2, '3-4' dta from dual union all
29 select 4 key1, 1 key2, '4-1' dta from dual union all
30 select 4 key1, 2 key2, '4-2' dta from dual union all
31 select 4 key1, 3 key2, '4-3' dta from dual union all
32 select 4 key1, 4 key2, '4-4' dta from dual
33 )
34 select d.a_id, d.b_id, c.key1 as c_key1, c.key2 as c_key3, c.dta
35 from
36 c,
37 (
38 select
39 a.id as a_id, b.id as b_id, a.c_key, b.c_key2
40 from a, b
41 where a.b_key = b.id
42 ) d
43 where d.c_key = c.key1 (+)
44 and d.c_key2 = c.key2 (+);
A_ID B_ID C_KEY1 C_KEY3 DTA
3 3 1 3 1-3
4 4 2 2 2-2
2 1 4 1 4-1
1 2
SQL> -
Training an SVM on table with schema flexibility fails
Dear colleagues,
I'm trying to train a Support Vector Machine on a table with schema flexibility.
On a small test table with only a couple of columns both the training and the prediction using the PAL libraries work fine. However, on my large sparse table with more than 1000 columns and "schema flexibility" set at creation time, I constantly run into the following error:
Could not execute 'CALL SYSTEM.AFL_WRAPPER_GENERATOR ('PAL_SV', 'AFLPAL', 'SVMTRAIN', PAL_SV_SIGNATURE)' in 30 ms 529 µs .
SAP DBTech JDBC: [423]: AFL error: [423] "SYSTEM"."AFL_WRAPPER_GENERATOR": line 32 col 1 (at pos 1346): [423] (range 3) AFL error exception: AFL error: registration finished with errors, see indexserver trace
The indexserver trace gives me something strange like:
AFLPM_SQLDriverObj.cpp(02439) : aflpm_creator : direction must be in or out
As far as I see it, all parameters are fine, though.
Is there a limitation that does not allow PAL functions to be executed on tables with schema flexibility? I suspect so, because I'm running into similar problems with the SUBSTITUTE_MISSING_VALUES function.
Thanks for any help,
Daniel
Hey there,
hasn't anyone come across this issue? I'd love to know whether this is a known technical limitation of tables that use "schema flexibility" or rather a bug. In the former case, can anyone suggest a workaround?
I'd be grateful for any help or any pointer to further documentation.
Best,
Daniel -
Function module to find number of table entries in Data base table
Hi All
I have an urgent requirement to find the number of entries in a database table using a function module.
If you know the function module name, please let me know.
Thanks & Regards
Rajmohan.G
You can calculate the total number of records like this:
DATA cnt TYPE i.

SELECT COUNT( * ) INTO cnt FROM ztable.
Regards,
Santosh -
Is there a maximum number of tables for a SELECT?
I know that technically there is nothing preventing me from joining many tables, but I know there are other "limits", like memory or allowable processing time. I've also read Tom Kyte's advice that, as much as possible, if it can be done in one SQL statement, do it in SQL. But is there a limit to this?
I have 10 tables that I need to work with. 2 of these tables contain millions of records. At the minimum, I need to at least join 4 of these tables then I could loop for each record and perform processing with the other tables. I could actually join more than 4 tables but I'm wary if it could still cause the system to crash. I'm using BULK COLLECT with LIMIT 1000 to control the fetching. The columns I'm joining are mostly PKs of each table. There are some transactional processing needed so I can't totally avoid a loop.
This is the logic:
<Cursor joining 4 tables>
LOOP
  FETCH cursor BULK COLLECT INTO nested_table LIMIT 1000
  FOR each record LOOP
    Per-record processing. I'll use the record elements to query the other 6 tables.
  END LOOP
  EXIT WHEN cursor NOTFOUND
END LOOP
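One caution about the pattern above: with BULK COLLECT and LIMIT, the cursor reports %NOTFOUND on the final, partially filled fetch, so an exit check on the cursor after the inner loop can silently drop the last batch. The safest check is the collection count, right after the fetch. A minimal PL/SQL sketch of the pattern (all table and column names here are hypothetical placeholders, not from the original post):

```sql
DECLARE
  TYPE t_rec IS RECORD (
    order_id NUMBER,
    item_id  NUMBER
  );
  TYPE t_tab IS TABLE OF t_rec;
  l_rows t_tab;

  -- The join of the 4 "driving" tables
  CURSOR c_main IS
    SELECT o.order_id, i.item_id
    FROM   orders o, order_items i, customers c, products p
    WHERE  o.order_id = i.order_id
    AND    o.cust_id  = c.cust_id
    AND    i.prod_id  = p.prod_id;
BEGIN
  OPEN c_main;
  LOOP
    FETCH c_main BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;  -- not c_main%NOTFOUND: that would drop the last partial batch
    FOR i IN 1 .. l_rows.COUNT LOOP
      NULL;  -- per-record processing against the remaining 6 tables goes here
    END LOOP;
  END LOOP;
  CLOSE c_main;
END;
/
```

Even so, folding the per-record lookups into the main join is usually the faster option when the logic allows it, for the reasons given in the reply.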
Is it advisable to transfer as many tables as I can in the outside loop so there would be less processing in the inner loop? Or can a query be too big that it is better to manage the number of tables in a SELECT?
Thanks!
Edited by: user12090980 on Jun 3, 2011 6:23 AM
user12090980 wrote:
I have 10 tables that I need to work with. 2 of these tables contain millions of records. At the minimum, I need to at least join 4 of these tables then I could loop for each record and perform processing with the other tables. I could actually join more than 4 tables but I'm wary if it could still cause the system to crash. I'm using BULK COLLECT with LIMIT 1000 to control the fetching. The columns I'm joining are mostly PKs of each table. There are some transactional processing needed so I can't totally avoid a loop.
Well, the very same loop structure you code in PL/SQL to "manually join" the required data sets of the 10+ tables is also implemented in the SQL engine. It is called a nested loop.
So why think that your code can handle joining 10+ tables better than the SQL engine? Also consider the fact that the SQL engine has a number of other (more sophisticated) join algorithms it can use, like hash joins, merge joins and so on.
The biggest problem with the nested loop algorithm is scalability. If you nest the loop 10 deep, then a single extra iteration in the main loop can cause thousands or potentially millions more iterations in total (an "exponential" impact on the loops nested within it).
So the nested loop is at times the worst type of join algorithm to use. And IMO, always the wrong thing to code in PL/SQL, as PL/SQL code (irrespective of bulk processing) will always be inferior to the SQL engine when it comes to joining data. -
How to exclude tables from Schema level replication
Hi All,
I am currently trying to setup Oracle Streams (Oracle 11.1.0.6 on RHEL5) to replicate a schema from one instance to another.
The basic schema level replication is working well, and copying DDL and DML changes without any problems. However there are a couple of tables that I need to exclude from the stream, due to incompatible datatypes.
Does anybody have any ideas or notes on how I could achieve this? I have been reading the Oracle documentation and find it difficult to follow and confusing, and I have not found any examples on the internet.
Thanks heaps.
Gavin
When you use schema-level rules for capture and need to skip the replication of a few tables, you create rules in the negative rule set for those tables.
Here is an example of creating table rules in the negative rule set for the capture process.
begin
  dbms_streams_adm.add_table_rules(
    table_name     => 'schema.table_to_be_skipped',
    streams_type   => 'CAPTURE',
    streams_name   => 'your_capture_name',
    queue_name     => 'strmadmin.capture_queue_name',
    include_dml    => true,
    include_ddl    => true,
    inclusion_rule => false);
end;
/
The table_name parameter identifies the fully qualified table name (schema.table).
streams_name identifies the capture process for which the rules are to be added.
queue_name specifies the name of the queue associated with the capture process.
inclusion_rule => false indicates that the created rules are to be placed in the negative rule set (i.e., skip this table).
include_dml => true means the rule covers DML changes for the table (i.e., DML changes for this table are skipped).
include_ddl => true means the rule covers DDL changes for the table (i.e., DDL changes for this table are skipped). -
Missing PRCTR from Number Range Table
Hi There,
I need to find all the available profit centre numbers that have not been created yet, i.e. numbers from the number range table NRIV for which no profit centre exists in CEPC.
I tried the FM NUMBER_GET_NEXT, but that did not help.
Details given below:
In the NRIV table I have got say following details :
Object From Number To Number etc etc
Y_PC 0000000001 0000000100
Y_PC 0000000101 0000000150
Y_PC 0000000160 0000000200
Profit Centre Created Available in CEPC say
In the no range 1 to 100
0000000001
0000000002
0000000003 (Not created)
0000000004
5 ,6 & 7 missing
0000000008
In the no range 101 to 150
0000000101
0000000102
0000000103
0000000104 (Not created)
0000000105
6,7....upto 149...(Not created)
0000000150
Similarly, a few numbers are missing in the other ranges.
My requirement is to read the NRIV table, take all the from-numbers and to-numbers into an internal table, and for those ranges find the numbers missing from the CEPC table.
Appreciate your help.
Regards.
Sunanda
Resolved it by myself!!
Cheers. -
DB02 shows a lower total number of tables after UC conversion + OS migration
Hi,
We have performed a combined UC conversion and OS migration (BW system on Oracle). When comparing the original and migrated systems, I find in DB02 that the migrated system's total number of tables is lower than the original's. The migration itself went OK and the system looks fine. I recall once hearing that tables can be defined to be created only when they are called/used for the first time. Is this correct? Could these also be old/unused tables (e.g. leftovers from an upgrade) which were still present in the database dictionary but not in the SAP dictionary and were therefore not migrated?
Thanks,
Regards,
Bart
DB02 counts the total number of tables/indices/views etc. in the database, rather than just the SAP tables/indices/views. The migration process recreates a fresh database and loads tables/indices/views from the source SAP dictionary. This means tables/indices/views not in the database catalog or in the source SAP dictionary don't get created. This includes objects not owned by the DB catalog user or the SAP connect user. It also includes objects owned by the SAP connect user but not known to the SAP dictionary. Under normal operation of some databases, some tables get created automatically; an example for Oracle is PLAN_TABLE, created as a result of the EXPLAIN PLAN SQL command. Did your source DDIC-to-DB comparison tool show any objects in the database but not in the dictionary? If so, these are connect-user-owned objects that were not migrated. Does your source system have objects owned by non-DB-catalog, non-SAP-connect users (e.g. DBAs)? These weren't migrated.
You've already indicated that the SAP connect user counts in source and target are identical. This is what you moved.