Which SQL queries to count on OFO 9.0.4.2 & Content Services 10.1.2.0
Dear experts,
In a customer context we urgently need to know which SQL queries to use:
- to count on OFO 9.0.4.2 the number of workspaces
- to count on OFO 9.0.4.2 the number of metadata elements (categories & attributes) attached to documents
- to count on Content Services 10.1.2.0 the number of containers
- to count on Content Services 10.1.2.0 the number of libraries
- to count on Content Services 10.1.2.0 the number of folders
- to count on Content Services 10.1.2.0 the number of groups
- to count on Content Services 10.1.2.0 the number of categories
- to count on Content Services 10.1.2.0 the number of attributes
- to count on Content Services 10.1.2.0 the number of workspaces
Thanks a lot for your kind answer.
Best regards.
Jean-Francois
Similar Messages
-
How to use database control to execute sql queries which change at run time
Hi all,
I need to execute SQL queries using database controls, where the SQL changes
at run time based on some condition. E.g., based on the condition, I can add some WHERE
condition.
Eg. sql = select id,name from emp where id = ?.
Based on some condition, I can add the following condition:
and location = ?.
Has anybody had this kind of situation?
thanks,
sathish
From the perspective of the database control, you've got two options:
1) use the sql: keyword to do parameter substitution. Your observation
about {foo} style substitution is correct -- this is like using a
PreparedStatement. To do substitution into the rest of the SQL
statement, you can use the {sql: foo} substitution syntax, which was
undocumented in GA but is documented in SP2. Then, you can build up
the filter clause String yourself in a JPF / JWS / etc and pass it into
the DB control.
For example:
* @jc:sql statement="select * from product {sql: filter}"
public Product[] getProducts(String filter) throws SQLException;
This will substitute the String filter directly into the statement that
is executed. The filter string could be null, "", "WHERE ID=12345", etc.
2) you can use the DatabaseFilter object to build up a set of custom
sorts and filters and pass that object into the DB control method.
There have been other posts here about doing this, look for the subject
"DatabaseFilter example".
Hope that helps...
Eddie
Dan Hayes wrote:
"Sathish Venkatesan" <[email protected]> wrote:
Hi Maruthi,
The parameter substitution, I guess, is used like setting the values for
prepared statements.
What I'm trying to do is change the SQL at run time based on some condition.
For example ,
consider the following query :
select col1,col2 from table t where t.col3 > 1
At run time , based on some condition , I need to add one more and condition.
i.e. select col1,col2 from table t where t.col3 > 1 and t.col4 < 10.
This MAY not address your issue but if you are trying to add "optional" parameters
you may try including ALL the possible parameters in the SQL but send in null
for those params that you don't want to filter on in any particular case. Then,
if you word your query
as follows:
select col1, col2 from table t where t.col3 > 1 and (t.col4 = {col4param} or
{col4param} is null) and (t.col5 = {col5param} or {col5param} is null) ...
you will get "dynamic" filters. In other words, col4 and col5 will only be
filtered if you send in non-null parameters for those arguments.
I have not tried this in a WL Workshop database control but I've used
this strategy dozens of times in stored procedures or jdbc prepared statements.
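Dan's optional-parameter trick can be sketched stand-alone as well. Below is a minimal, hedged sketch in Python, with SQLite standing in for the database; the table and column names are invented for illustration. Passing NULL for a parameter disables that filter, a concrete value enables it:

```python
import sqlite3

# Stand-in table for the thread's example; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (col1 TEXT, col3 INTEGER, col4 INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?)",
                 [("a", 2, 5), ("b", 3, 20), ("c", 0, 5)])

# The "(col = :p OR :p IS NULL)" pattern: a NULL parameter switches
# the corresponding filter off.
SQL = """
SELECT col1 FROM t
WHERE col3 > 1
  AND (col4 = :col4param OR :col4param IS NULL)
"""

all_rows = conn.execute(SQL, {"col4param": None}).fetchall()
filtered = conn.execute(SQL, {"col4param": 5}).fetchall()
print(all_rows)   # both rows with col3 > 1
print(filtered)   # only the col3 > 1 row that also has col4 = 5
```

The same shape drops straight into a prepared statement or stored procedure, as Dan describes.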
Good luck,
Dan -
JDeveloper and PL/SQL queries which return XML
Hi,
I need to deploy WS with JDeveloper. I have an Oracle9i database which I query with PL/SQL. How can I return XML from my PL/SQL queries? I know I can use SQLX to return XML datatypes, but JDeveloper doesn't seem to be compatible. There is also XSQL, but I don't think JDeveloper allows you to publish this as a web service.
Any help would be greatly appreciated!!! Thanks!!!!
The following function should work for what you are trying to do.
htp.p(htf.escape_sc(:P1_ITEM));
To escape columns from the database, just include it in your select statement:
SELECT htf.escape_sc(COLUMN_NAME) alias
FROM TABLE
To escape items on your APEX page:
INSERT INTO table
(COLUMN_NAME)
VALUES(htf.escape_sc(:P1_ITEM))
Good Luck,
Tyson
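As a side note, the htf.escape_sc calls above exist to HTML-escape user-supplied text before it is rendered, so markup in the input is displayed rather than executed. A tiny stand-alone sketch of the same idea, using Python's stdlib html.escape rather than the APEX API:

```python
import html

# Escape a user-supplied value before embedding it in a page,
# analogous to htf.escape_sc(:P1_ITEM) in the answer above.
item = '<script>alert("x")</script>'
escaped = html.escape(item)
print(escaped)  # angle brackets and quotes become entities
```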
Edited by: Tyson Jouglet on Nov 25, 2008 4:12 PM -
Stored procedure which executes the sql queries
Hi This is Swathi,
I have 200 queries which get the row counts of source and target tables, so I want to develop a procedure which takes the SQL queries one by one, executes them, and gives a report with the results of all 200 queries.
This is to test ETL mappings.
Please give me some idea to write it.
Thanks.
SELECT TABLE_NAME, NUM_ROWS FROM USER_TABLES;
Keeping in mind that this is only as accurate as the last time stats were gathered.
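If exact counts are needed rather than the possibly-stale NUM_ROWS statistic, the loop-and-COUNT(*) approach can be sketched as follows. This is Python with SQLite standing in for Oracle: sqlite_master plays the role of USER_TABLES, and the table names are invented:

```python
import sqlite3

# Two stand-in tables with known row counts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER)")
conn.execute("CREATE TABLE tgt (id INTEGER)")
conn.executemany("INSERT INTO src VALUES (?)", [(i,) for i in range(3)])
conn.executemany("INSERT INTO tgt VALUES (?)", [(i,) for i in range(2)])

# Enumerate tables from the catalog, then COUNT(*) each one.
# (Table names cannot be bound as parameters, hence the f-string;
# here they come from the catalog itself, not from user input.)
counts = {}
for (name,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"):
    counts[name] = conn.execute(f"SELECT COUNT(*) FROM {name}").fetchone()[0]
print(counts)
```

A PL/SQL version would do the same with EXECUTE IMMEDIATE over USER_TABLES, at the cost of one full scan per table.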
-
Which SQL query performance is better
Putting this in a new thread so that I don't confuse and mix the query with my similar threads.
Based on all of your suggestions for the below query, I finally came up with 2 possible RESOLUTIONS for it.
So, which of the 2 resolutions I pasted below would be best in PERFORMANCE? I mean, which will PERFORM FASTER and is more efficient?
***The original QUERY is at the bottom.
Resolution 1:- /*** Divided into 2 sep. queries and using UNION ALL ***/ Is UNION ALL costly?
SELECT null, null, null, null, null,null, null, null, null,
null,null, null, null, null, null,null,null, count(*) as total_results
FROM
test_person p,
test_contact c1,
test_org_person porg
WHERE p.CLM_ID ='11' and
p.person_id = c1.ref_id(+)
AND p.person_id = porg.o_person_id
and porg.O_ORG_ID ='11'
UNION ALL
SELECT lastname, firstname,person_id, middlename,socsecnumber,
birthday, U_NAME,
U_ID,
PERSON_XML_DATA,
BUSPHONE,
EMLNAME,
ORG_NAME,
EMPID,
EMPSTATUS,
DEPARTMENT,
org_relationship,
enterprise_name,
null
FROM
(
SELECT
beta.*, rownum as alpha
FROM
(
SELECT
p.lastname, p.firstname, p.person_id, p.middlename, p.socsecnumber,
to_char(p.birthday,'mm-dd-yyyy') as birthday, p.username as U_NAME,
p.clm_id as U_ID,
p.PERSON_XML_DATA.extract('/').getStringVal() AS PERSON_XML_DATA,
c1.CONTACT_DATA.extract('//phone[1]/number/text()').getStringVal() AS BUSPHONE,
c1.CONTACT_DATA.extract('//email[2]/address/text()').getStringVal() AS EMLNAME,
c1.CONTACT_DATA.extract('//company/text()').getStringVal() AS ORG_NAME,
porg.emplid as EMPID, porg.empl_status as EMPSTATUS, porg.DEPARTMENT,
porg.org_relationship,
porg.enterprise_name
FROM
test_person p,
test_contact c1,
test_org_person porg
WHERE p.CLM_ID ='11' and
p.person_id = c1.ref_id(+)
AND p.person_id = porg.o_person_id
and porg.O_ORG_ID ='11'
ORDER BY
upper(p.lastname), upper(p.firstname)
) beta
)
WHERE
alpha BETWEEN 1 AND 100
Resolution 2:-
/****here,the INNER most count query is removed ****/
select *
FROM
(
SELECT
beta.*, rownum as alpha
FROM
(
SELECT
p.lastname, p.firstname, p.person_id, p.middlename, p.socsecnumber,
to_char(p.birthday,'mm-dd-yyyy') as birthday, p.username as U_NAME,
p.clm_id as U_ID,
p.PERSON_XML_DATA.extract('/').getStringVal() AS PERSON_XML_DATA,
c1.CONTACT_DATA.extract('//phone[1]/number/text()').getStringVal() AS BUSPHONE,
c1.CONTACT_DATA.extract('//email[2]/address/text()').getStringVal() AS EMLNAME,
c1.CONTACT_DATA.extract('//company/text()').getStringVal() AS ORG_NAME,
porg.emplid as EMPID, porg.empl_status as EMPSTATUS, porg.DEPARTMENT,
porg.org_relationship,
porg.enterprise_name,
COUNT(*) OVER () cnt -- this is the analytic function
FROM
test_person p,
test_contact c1,
test_org_person porg
WHERE p.CLM_ID ='11' and
p.person_id = c1.ref_id(+)
AND p.person_id = porg.o_person_id
and porg.O_ORG_ID ='11'
ORDER BY upper(p.lastname), upper(p.firstname)
) beta
)
WHERE
alpha BETWEEN 1 AND 100
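A small runnable sketch of Resolution 2's idea, with SQLite (3.25+, for window-function support) standing in for Oracle: COUNT(*) OVER () attaches the total result count to every row, so one query returns both the page and the total. ROW_NUMBER() stands in for Oracle's rownum trick here, and all names and data are illustrative:

```python
import sqlite3

# A stand-in for the thread's person table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (lastname TEXT)")
conn.executemany("INSERT INTO person VALUES (?)",
                 [("smith",), ("jones",), ("brown",), ("davis",)])

# Inner query carries the full count on every row; the outer layers
# number the rows and slice out one page.
rows = conn.execute("""
    SELECT * FROM (
        SELECT beta.*, ROW_NUMBER() OVER (ORDER BY lastname) AS alpha
        FROM (
            SELECT lastname, COUNT(*) OVER () AS cnt
            FROM person
        ) beta
    )
    WHERE alpha BETWEEN 1 AND 2
""").fetchall()
print(rows)  # first page of 2 rows, each carrying the total of 4
```

This is why Resolution 2 avoids the extra UNION ALL branch: the count rides along with the paged rows instead of being computed in a second scan.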
ORIGINAL QUERY
SELECT *
FROM
(
SELECT
beta.*, rownum as alpha
FROM
(
SELECT
p.lastname, p.firstname, porg.DEPARTMENT,
porg.org_relationship,
porg.enterprise_name,
(
SELECT
count(*)
FROM
test_person p, test_contact c1, test_org_person porg
WHERE
p.p_id = c1.ref_id(+)
AND p.p_id = porg.o_p_id
$where_clause$
) AS results
FROM
test_person p, test_contact c1, test_org_person porg
WHERE
p.p_id = c1.ref_id(+)
AND p.p_id = porg.o_p_id
$where_clause$
ORDER BY
upper(p.lastname), upper(p.firstname)
) beta
)
WHERE
alpha BETWEEN #startRec# AND #endRec#
I have now run the explain plans and put them below separately for each SQL. The SQL queries for each of the items are posted in the 1st post of this thread.
Resolution 1:-
EXPLAIN PLANS SECTION
1- Original
Plan hash value: 1981931315
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 22859 | 187M| | 26722 (81)| 00:05:21 |
| 1 | UNION-ALL | | | | | | |
| 2 | SORT AGGREGATE | | 1 | 68 | | | |
| 3 | MERGE JOIN OUTER | | 22858 | 1517K| | 5290 (1)| 00:01:04 |
| 4 | MERGE JOIN | | 22858 | 982K| | 4304 (1)| 00:00:52 |
|* 5 | INDEX RANGE SCAN | test_org_person_I3 | 24155 | 542K| | 363 (1)| 00:00:05 |
|* 6 | SORT JOIN | | 22858 | 468K| 1448K| 3941 (1)| 00:00:48 |
|* 7 | TABLE ACCESS FULL | test_PERSON | 22858 | 468K| | 3716 (1)| 00:00:45 |
|* 8 | SORT JOIN | | 68472 | 1604K| 4312K| 985 (2)| 00:00:12 |
| 9 | INDEX FAST FULL SCAN | test_CONTACT_FK1 | 68472 | 1604K| | 113 (1)| 00:00:02 |
|* 10 | VIEW | | 22858 | 187M| | 21433 (1)| 00:04:18 |
| 11 | COUNT | | | | | | |
| 12 | VIEW | | 22858 | 187M| | 21433 (1)| 00:04:18 |
| 13 | SORT ORDER BY | | 22858 | 6875K| 14M| 21433 (1)| 00:04:18 |
| 14 | MERGE JOIN OUTER | | 22858 | 6875K| | 18304 (1)| 00:03:40 |
| 15 | MERGE JOIN | | 22858 | 4397K| | 11337 (1)| 00:02:17 |
| 16 | SORT JOIN | | 22858 | 3013K| 7192K| 5148 (1)| 00:01:02 |
|* 17 | TABLE ACCESS FULL | test_PERSON | 22858 | 3013K| | 3716 (1)| 00:00:45 |
|* 18 | SORT JOIN | | 24155 | 1462K| 3800K| 6189 (1)| 00:01:15 |
| 19 | TABLE ACCESS BY INDEX ROWID| test_ORG_PERSON | 24155 | 1462K| | 5535 (1)| 00:01:07 |
|* 20 | INDEX RANGE SCAN | test_ORG_PERSON_FK1| 24155 | | | 102 (1)| 00:00:02 |
|* 21 | SORT JOIN | | 68472 | 7422K| 15M| 6968 (1)| 00:01:24 |
| 22 | TABLE ACCESS FULL | test_CONTACT | 68472 | 7422K| | 2895 (1)| 00:00:35 |
Predicate Information (identified by operation id):
5 - access("PORG"."O_ORG_ID"='11')
6 - access("P"."PERSON_ID"="PORG"."O_PERSON_ID")
filter("P"."PERSON_ID"="PORG"."O_PERSON_ID")
7 - filter("P"."CLM_ID"='11')
8 - access("P"."PERSON_ID"="C1"."REF_ID"(+))
filter("P"."PERSON_ID"="C1"."REF_ID"(+))
10 - filter("ALPHA"<=25 AND "ALPHA">=1)
17 - filter("P"."CLM_ID"='11')
18 - access("P"."PERSON_ID"="PORG"."O_PERSON_ID")
filter("P"."PERSON_ID"="PORG"."O_PERSON_ID")
20 - access("PORG"."O_ORG_ID"='11')
21 - access("P"."PERSON_ID"="C1"."REF_ID"(+))
filter("P"."PERSON_ID"="C1"."REF_ID"(+))
-------------------------------------------------------------------------------
Resolution 2:-
EXPLAIN PLANS SECTION
1- Original
Plan hash value: 1720299348
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 23518 | 13M| | 11545 (1)| 00:02:19 |
|* 1 | VIEW | | 23518 | 13M| | 11545 (1)| 00:02:19 |
| 2 | COUNT | | | | | | |
| 3 | VIEW | | 23518 | 13M| | 11545 (1)| 00:02:19 |
| 4 | WINDOW SORT | | 23518 | 3536K| | 11545 (1)| 00:02:19 |
| 5 | MERGE JOIN OUTER | | 23518 | 3536K| | 11545 (1)| 00:02:19 |
| 6 | MERGE JOIN | | 23518 | 2985K| | 10587 (1)| 00:02:08 |
| 7 | SORT JOIN | | 23518 | 1561K| 4104K| 4397 (1)| 00:00:53 |
|* 8 | TABLE ACCESS FULL | test_PERSON | 23518 | 1561K| | 3716 (1)| 00:00:45 |
|* 9 | SORT JOIN | | 24155 | 1462K| 3800K| 6189 (1)| 00:01:15 |
| 10 | TABLE ACCESS BY INDEX ROWID| test_ORG_PERSON | 24155 | 1462K| | 5535 (1)| 00:01:07 |
|* 11 | INDEX RANGE SCAN | test_ORG_PERSON_FK1| 24155 | | | 102 (1)| 00:00:02 |
|* 12 | SORT JOIN | | 66873 | 1567K| 4216K| 958 (2)| 00:00:12 |
| 13 | INDEX FAST FULL SCAN | test_CONTACT_FK1 | 66873 | 1567K| | 110 (1)| 00:00:02 |
Predicate Information (identified by operation id):
1 - filter("ALPHA"<=25 AND "ALPHA">=1)
8 - filter("P"."CLM_ID"='11')
9 - access("P"."PERSON_ID"="PORG"."O_PERSON_ID")
filter("P"."PERSON_ID"="PORG"."O_PERSON_ID")
11 - access("PORG"."O_ORG_ID"='11')
12 - access("P"."PERSON_ID"="C1"."REF_ID"(+))
filter("P"."PERSON_ID"="C1"."REF_ID"(+))
ORIGINAL QUERY
EXPLAIN PLANS SECTION
1- Original
Plan hash value: 319284042
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 22858 | 187M| | 21433 (1)| 00:04:18 |
|* 1 | VIEW | | 22858 | 187M| | 21433 (1)| 00:04:18 |
| 2 | COUNT | | | | | | |
| 3 | VIEW | | 22858 | 187M| | 21433 (1)| 00:04:18 |
| 4 | SORT ORDER BY | | 22858 | 6875K| 14M| 21433 (1)| 00:04:18 |
| 5 | MERGE JOIN OUTER | | 22858 | 6875K| | 18304 (1)| 00:03:40 |
| 6 | MERGE JOIN | | 22858 | 4397K| | 11337 (1)| 00:02:17 |
| 7 | SORT JOIN | | 22858 | 3013K| 7192K| 5148 (1)| 00:01:02 |
|* 8 | TABLE ACCESS FULL | test_PERSON | 22858 | 3013K| | 3716 (1)| 00:00:45 |
|* 9 | SORT JOIN | | 24155 | 1462K| 3800K| 6189 (1)| 00:01:15 |
| 10 | TABLE ACCESS BY INDEX ROWID| test_ORG_PERSON | 24155 | 1462K| | 5535 (1)| 00:01:07 |
|* 11 | INDEX RANGE SCAN | test_ORG_PERSON_FK1| 24155 | | | 102 (1)| 00:00:02 |
|* 12 | SORT JOIN | | 68472 | 7422K| 15M| 6968 (1)| 00:01:24 |
| 13 | TABLE ACCESS FULL | test_CONTACT | 68472 | 7422K| | 2895 (1)| 00:00:35 |
Predicate Information (identified by operation id):
1 - filter("ALPHA"<=25 AND "ALPHA">=1)
8 - filter("P"."CLM_ID"='1862')
9 - access("P"."PERSON_ID"="PORG"."O_PERSON_ID")
filter("P"."PERSON_ID"="PORG"."O_PERSON_ID")
11 - access("PORG"."O_ORG_ID"='1862')
12 - access("P"."PERSON_ID"="C1"."REF_ID"(+))
filter("P"."PERSON_ID"="C1"."REF_ID"(+))
-------------------------------------------------------------------------------
Edited by: user10817659 on Feb 19, 2009 11:47 PM
Edited by: user10817659 on Feb 21, 2009 12:23 AM -
Replace square brackets in SQL queries against ODBC
We use PowerPivot to access an ODBC provider. PowerPivot generates queries with square brackets, which is not ANSI compliant and incompatible with this ODBC provider. In Office 2010, we had a workaround by modifying this file:
C:\Program Files\Microsoft Analysis Services\AS OLEDB\110\sql2000.xsl
and replace the '[' and ']' with double quotes.
<!-- SQL Server 2000 pluggable cartridge -->
<!-- Area of STANDARD parametrizations: these are externally passed -->
<xsl:param name="in_CanUseParams">yes</xsl:param>
<xsl:param name="in_IdentStartQuotingCharacter">[</xsl:param>
<xsl:param name="in_IdentEndQuotingCharacter">]</xsl:param>
However, this workaround does not seem to work after we upgraded to Office 2013. PowerPivot doesn't appear to read this cartridge file anymore. We are familiar with the option to manually write queries to bypass this issue, but we still like to use the generated
queries. Is there another way to replace the square brackets?
I think you could do this:
SELECT
OUTSIDE.OUTSIDE_SALES,
INSIDE.INSIDE_SALES,
CUST.CUST_NAME,
SOE.CUSTOMER_NO,
SOE.SALE_ORDER_NO,
SOE.INVOICED_DATE,
PICKUP.PICKUP,
(SOE.SALES_AMT/100) AS SALES,
(SOE.COST_GOODS_SOLD/100) AS COGS,
(SOE.SALES_AMT/100) - (SOE.COST_GOODS_SOLD/100) AS MARGIN,
COUNT(TRACKING.PALLET_ID)
FROM
SOE_HEADER SOE
INNER JOIN CUST_NAME CUST
ON SOE.CUSTOMER_NO = CUST.CUSTOMER_NO
INNER JOIN INSIDE_SALES INSIDE
ON SOE.INSIDE_ROUTE = INSIDE.INSIDE_ROUTE
INNER JOIN OUTSIDE_SALES OUTSIDE
ON SOE.SALESMAN_NO = OUTSIDE.SALESMAN_NO
INNER JOIN PICKUP PICKUP
ON SOE.TYPE_OF_FRGHT = PICKUP.TYPE_OF_FRGHT
INNER JOIN PALLET_SALES_ORD SALES
ON SOE.SALE_ORDER_NO = SALES.SALE_ORDER_NO
INNER JOIN PALLET_TRACKING TRACKING
ON SALES.PALLET_ID = TRACKING.PALLET_ID
WHERE
SOE.TYPE_OF_ORDER = '00'
AND SOE.ORDERS_STATUS = '06'
AND SOE.DATE_OF_ORDER > 20080101
AND SOE.TYPE_OF_FRGHT IN ('06','08','10','13','14')
AND SOE.CUSTOMER_NO <> '1000027000'
AND SOE.CUSTOMER_NO <> '1000039000'
AND SOE.CUSTOMER_NO <> '1014869000'
AND SOE.CUSTOMER_NO <> '1014869001'
AND SOE.CUSTOMER_NO <> '1014869002'
AND SOE.CUSTOMER_NO <> '1014869003'
AND SOE.CUSTOMER_NO <> '1014889000'
AND SOE.CUSTOMER_NO <> '1014890000'
AND SOE.CUSTOMER_NO <> '4400000000'
AND SOE.CUSTOMER_NO <> '4499998000'
AND SOE.CUSTOMER_NO <> '4500012000'
AND SOE.CUSTOMER_NO <> '4500018000'
AND SOE.CUSTOMER_NO <> '6900002000'
AND CUST.CUST_NAME NOT LIKE '%PETCO%'
GROUP BY
OUTSIDE.OUTSIDE_SALES,
INSIDE.INSIDE_SALES,
CUST.CUST_NAME,
SOE.CUSTOMER_NO,
SOE.SALE_ORDER_NO,
SOE.INVOICED_DATE,
PICKUP.PICKUP,
(SOE.SALES_AMT/100),
(SOE.COST_GOODS_SOLD/100),
(SOE.SALES_AMT/100) - (SOE.COST_GOODS_SOLD/100)
ORDER BY
OUTSIDE.OUTSIDE_SALES,
CUST.CUST_NAME,
SOE.INVOICED_DATE,
SOE.TYPE_OF_FRGHT
Another option would be to encapsulate the second query in a scalar UDF (if using SQL2k or greater) which would get the count for each sales order and then just return this in your select statement.
HTH! -
Hi,
How can I find the top 10 SQL queries which consume high IO and CPU in the Oracle DB?
One way I am doing it: by using the TOP command to get PIDs, then getting the SQL query by applying the hash value in v$sqlarea.
Is there any way to directly get the high IO and CPU consumers without looking at PIDs in TOP?
Thanks.
hi,
try something along the lines of
select c.* from
(select disk_reads,
buffer_gets,
rows_processed,
executions,
first_load_time,
sql_text
from v$sqlarea
where parsing_user_id !=0
order by
buffer_gets/decode(executions,null,1,0,1,executions) desc ) c
where rownum < 11;
select c.* from
(select disk_reads,
buffer_gets,
rows_processed,
executions,
first_load_time,
sql_text
from v$sqlarea
order by
disk_reads/decode(rows_processed,null,1,0,1,rows_processed) desc ) c
where rownum <11;
or even
--Top 10 by Buffer Gets:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
buffer_gets, executions, buffer_gets/executions "Gets/Exec",
hash_value,address
FROM V$SQLAREA
WHERE buffer_gets > 10000
ORDER BY buffer_gets DESC)
WHERE rownum <= 10
--Top 10 by Physical Reads:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
disk_reads, executions, disk_reads/executions "Reads/Exec",
hash_value,address
FROM V$SQLAREA
WHERE disk_reads > 1000
ORDER BY disk_reads DESC)
WHERE rownum <= 10
--Top 10 by Executions:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
executions, rows_processed, rows_processed/executions "Rows/Exec",
hash_value,address
FROM V$SQLAREA
WHERE executions > 100
ORDER BY executions DESC)
WHERE rownum <= 10
--Top 10 by Parse Calls:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
parse_calls, executions, hash_value,address
FROM V$SQLAREA
WHERE parse_calls > 1000
ORDER BY parse_calls DESC)
WHERE rownum <= 10
--Top 10 by Sharable Memory:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
sharable_mem, executions, hash_value,address
FROM V$SQLAREA
WHERE sharable_mem > 1048576
ORDER BY sharable_mem DESC)
WHERE rownum <= 10
--Top 10 by Version Count:
set linesize 100
set pagesize 100
SELECT * FROM
(SELECT substr(sql_text,1,40) sql,
version_count, executions, hash_value,address
FROM V$SQLAREA
WHERE version_count > 20
ORDER BY version_count DESC)
WHERE rownum <= 10
;
You may have to play around with the column formatting a little to show the best results.
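The top-N shape of all these queries can be sketched on any database. Below is a Python/SQLite stand-in for V$SQLAREA: LIMIT plays the role of the ROWNUM filter, and MAX(executions, 1) mirrors the DECODE guard against dividing by zero executions. All data is invented:

```python
import sqlite3

# A tiny stand-in for V$SQLAREA with one never-executed statement.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sqlarea (sql_text TEXT, buffer_gets INT, executions INT)")
conn.executemany("INSERT INTO sqlarea VALUES (?, ?, ?)", [
    ("q1", 1000, 10),   # 100 gets/exec
    ("q2", 5000, 1),    # 5000 gets/exec -> worst offender
    ("q3", 300, 0),     # never executed; the MAX() guard avoids / 0
])

# Top 2 by buffer gets per execution, guarding the divisor.
top = conn.execute("""
    SELECT sql_text, buffer_gets * 1.0 / MAX(executions, 1) AS gets_per_exec
    FROM sqlarea
    ORDER BY gets_per_exec DESC
    LIMIT 2
""").fetchall()
print(top)
```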
regards
Alan
Edited by: alanm on Dec 22, 2008 4:01 PM -
Sql queries for this database schema
1) Patient—PatientID, Name, DOB
2) Doctor—DoctorID, Name, MedLicenseNumber, Phone
3) Medication—MedicationID, BrandName, GenericName
4) Prescription—PrescriptionID, Date, PatientID, PrescriberID, MedicationID
Specify the SQL queries to retrieve:
a) The prescription information along with patient name, DOB, medication brand name, and prescribing doctor name sorted by most recent date.
b) The most prescribed generic medication name
Looks like an assignment question to me.
a, seems to be a case of a straightforward join among all the involved tables on common fields. Suggest you learn about joins and try it out yourself.
see below link
http://www.w3schools.com/sql/sql_join.asp
b, Have a look at GROUP BY. you need to apply group by and take medicine name which repeats maximum (ie largest count value)
see
http://www.w3schools.com/sql/sql_groupby.asp
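A hedged sketch of part (b) along the lines Visakh suggests, in Python with SQLite and invented sample data: group by generic name, then keep the name with the largest count.

```python
import sqlite3

# Minimal slice of the thread's schema, just enough for part (b).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Medication "
             "(MedicationID INT, BrandName TEXT, GenericName TEXT)")
conn.execute("CREATE TABLE Prescription "
             "(PrescriptionID INT, MedicationID INT)")
conn.executemany("INSERT INTO Medication VALUES (?, ?, ?)", [
    (1, "Tylenol", "acetaminophen"),
    (2, "Advil", "ibuprofen"),
])
conn.executemany("INSERT INTO Prescription VALUES (?, ?)",
                 [(1, 1), (2, 2), (3, 2)])

# GROUP BY generic name, order by frequency, take the top row.
row = conn.execute("""
    SELECT m.GenericName, COUNT(*) AS n
    FROM Prescription p
    JOIN Medication m ON p.MedicationID = m.MedicationID
    GROUP BY m.GenericName
    ORDER BY n DESC
    LIMIT 1
""").fetchone()
print(row)  # the most prescribed generic name and its count
```

Note that ties would need extra care (e.g. a HAVING against the max count); LIMIT 1 simply picks one winner.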
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
SQL queries inside html region
Hi,
I have an html page in my application and I need to display the result of a SQL query inside this page. I would like to know if I can use SQL queries in html and, if yes, how I can do that.
My query is select count(*) from Quote Letter master where service = 'Premier Support'
And i have similar queries like this which i would like to use in the html region.
Thank you,
rakesh
rakesh119 wrote:
Hi,
I have an html page in my application and I need to display the result of a SQL query inside this page. I would like to know if I can use SQL queries in html and, if yes, how I can do that.
You cannot/should not write SQL queries in your html, but in APEX there are different ways to achieve this, such as Page Items, PL/SQL Dynamic Content regions etc.
My query is select count(*) from Quote Letter master where service = 'Premier Support'.
For this you had better use a page item and compute the item's value using a page computation at the on-load before header point. Then you can place the page item in an APEX region and also add any styling -
Developing portlets for dummies (sql queries)
Hello, I've been trying to build a dynamic menu. First I went with just plain old plsql: i created a function in the portal schema that returns an unordered, nested list of the pages in my pagegroup and called that function in a regular pl/sql item on my portal page. I did this by querying the wwpob_page$ table and that went just great in my test&development setup (of which I am the admin of course :))
Then I realized that since I'm not an administrator of the server hosting our portal and I only have very limited privileges (I am only a page group administrator) I will probably not be allowed to utilize this function nor will they agree to install it in the portal schema, and I decided I should build a portlet that does the same thing (so it can be registered and so on, and so it can use the synonyms and tools that are available to registered providers). There already is such a portlet (and provider) registered for use in the target portal, but I don't like it because it uses tables and hard-coded styles, so I will cook my own, better version. :)
So I downloaded an example portlet and am getting the hang of it, but now I just can't for the life of me figure out how to enable any sql-queries. I have run the provsyns-.sql script, logged in as test_provider/portal and installed my portlet-package as test_provider user. I can see the available pl/sql packages in Toad, but there are no tables or views for me to to see. That means I can't query the portal tables that I need to.
edit: ok, stupid user error; I suck at using Toad, so I was looking at the wrong schema altogether :D So now i see the public views and packages, so forget that bit of the question.
But still, I cannot see even wwsbr_all_folders -view, much less the wwpob_page$ -table. I cannot see any way to find the pages that are in my page group. Somehow it must be doable, right?
Have I done something wrong, missed a step in enabling my test-provider / schema perhaps? I don't really know what I'm doing, but I followed instructions here: http://home.c2i.net/toreingolf/oracle/portal/my_first_plsql_portlet.htm (excellent instructions, thanks!)
So should my portlet be able to access those tables or not? How the heck has the third-party portlet maker done it?
i'm on OracleAS Portal 10g Release 2 (10.1.4)
Edited by: Baguette on 23-Apr-2009 05:13
Edited by: Baguette on 23-Apr-2009 05:32
i see your perspective now. and let me give a perspective to my first reply too.
what i proposed to you was the answer to what i quoted in the message. that is, why you didn't see those views in the new schema you created! and it is still ok, but it is done in the portal schema, for which i assumed you had privileges too. my mistake!
now, i can relate your privacy concerns with your earlier message:
Hello, I've been trying to build a dynamic menu. First I went with just plain old plsql: i created a function in the portal schema that returns an unordered, nested list of the pages in my pagegroup and called that function in a regular pl/sql item on my portal page. I did this by querying the wwpob_page$ table and that went just great in my test&development setup (of which I am the admin of course :))
+Then I realized that since I'm not an administrator of the server hosting our portal and I only have very limited privileges (I am only a page group administrator) I will probably not be allowed to utilize this function nor will they agree to install it in the portal schema, and I decided I should build a portlet that does the same thing (so it can be registered and so on, and so it can use the synonyms and tools that are available to registered providers). There already is such a portlet (and provider) registered for use in the target portal, but I don't like it because i uses tables and hard-coded styles so I will cook my own, better version. :)+
+So I downloaded an example portlet and am getting the hang of it, but now I just can't for the life of me figure out how to enable any sql-queries. I have run the provsyns-.sql script, logged in as test_provider/portal and installed my portlet-package as test_provider user. I can see the available pl/sql packages in Toad, but there are no tables or views for me to to see. That means I can't query the portal tables that I need to.+
- by downloading an example portlet, you probably mean you created a new schema. because provsyns work on a schema.
- if you are not the administrator of the portal, and may not be able to access some portions of the portal, it means that you do not use the portal user (the user which serves as the owner of the portal schema).
- now, your plan to create a new schema and give those privileges would still not work. because, by creating a new schema you cannot sneak into the original portal schema if you do not have privileges to do it. obvious, right? otherwise, it would be a vulnerability of the software that you can see what you are not allowed to see by creating a new schema.
- however, there is a bright side here. the views give records based on your privileges.
- so if your administrators have generated them already or if they generate on the original portal schema, then you may see the pages and items that you have privileges to see and no more.
so now, you may ask the administrators if they have already done it, and if not, then if they would be willing to do it.
hope that helps!
AMN -
SQL Monitoring using SQL Queries
Hi
I am using an Oracle 10g DB. I have an application which has Oracle as its DB. It's a multiuser application. During the testing of one of the pages, we found that saving a couple of records takes a huge time (50-60 seconds). I don't have access to the logic written in the application code to know which queries get executed when the user clicks the save button.
I know that there is SQL Monitoring available in OEM, by which you can find out the queries which got executed and also the execution time for those queries.
However, I don't have OEM access.
Can you please let me know the set of queries, probably using a few of the V$ data dictionary views, that will provide me the same details as generated in the monitoring report in OEM?
I want to see all the queries which were executed in a defined time window (say 10 mins, 1 hr etc.), the same way as it is done in OEM, and not just the last SQL.
TIA
Regards
ArunHi,
I recommend you to install STATSPACK - it's a free tool that can be used instead of the Automatic Workload Repository (that is used in OEM).
This tool creates snapshots of V$ views over a defined time period, so it enables you to see long running queries in history.
If I were you I would run SQL trace just before the problematic part and then stop it just as it has finished. It would be better to run trace only for the problematic session or module - not for the whole database. SQL trace generates a file with all the executed queries, explain plans, stats, ...
In extreme case just create copy of v$sql before and after the problematic part and compare them (new sql_id, elapsed_time delta, ...) -
Hello,
I have quite strange question related to my research.
Probably within the Oracle DB there is some dynamic library which parses SQL queries,
for example:
SELECT * FROM schema.table
and which is able to extract from the SQL listed above the information that we want to display all columns from the selected table.
When this construction is valid (SQL standard), the next step is verifying the object's existence in the DB. It means
that there are two separate actions which analyze queries.
I need only the first part of that process, which will give me information about all objects used in the SQL query.
Is it possible to accommodate that internal parser within a stand-alone middleware like WebLogic Event Server?
I can do it with regular expression patterns, but it's not going to be fast and effective for subqueries and all the advanced
constructions for DWH like MODEL, etc.
Thank You,
R.>
http://search.cpan.org/~rehsack/SQL-Statement-1.23/lib/SQL/Statement.pm
BUGS AND LIMITATIONS
* currently we treat NULL and '' as the same - eventually fix
* No nested C-style comments allowed as SQL99 says
>
That's the reason why I want to use dedicated for 11g EE library ;-)
Thanks for quick reply -
Does anyone know how to create a JSP which allows the user to input data into a simple database and then allows the user to input SQL queries? I can do this in a servlet but must hard-code the query. It would be great to be able to type in different queries to see the results.
Thanks
Michael
Hi,
I am giving you a small example which allows the user to enter data in the html page, and the same is submitted by the JSP page into the database.
<%@ page language="java" import="java.sql.*" %>
<html>
<body bgcolor="#FFd08d" >
<%! Connection conn;
Statement st,st1;
ResultSet rs,rs1;
int i,j;
%>
<%
try {
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    // placeholders: fill in your own DSN name and database credentials
    conn = DriverManager.getConnection("jdbc:odbc:--DSNName--", "--DataBaseUserName--", "--DataBasePassword--");
} catch (Exception e) {
%>
Error in Saving : <%=e%>
<%
}
try {
    String email_address = request.getParameter("email_address");
    String query_text = request.getParameter("query_text");
    PreparedStatement st = null;
    String inst = "insert into databasename..DataBaseTableName(email_address,query_text) values (?,?)";
    st = conn.prepareStatement(inst);
    st.setString(1, email_address);
    st.setString(2, query_text);
    st.executeUpdate();
    conn.close();
%>
Your query is Submitted.<br>
Thanks.
<%
} catch (SQLException e) {
%>
Error in Saving : <%=e%>
<%
}
%>
</body>
</html>
I think this will solve your query.
Bye,
Sameer -
Multiple SQL Queries in SAP BPC 5.1 EVMODIFY
Hi All,
We have multiple SQL queries in Outlooksoft 4.2 which extract data from an Oracle source system with different sets of selections, conditions, and parameters. In Outlooksoft 4.2 we were able to run these multiple SQL queries using the Transform Data Flow Task and pass parameters with the evModify dynamic script task wherever an extract from the source system was needed.
In SAP BPC 5.1, all these multiple SQL queries and parameterized extracts are supposed to be coded in the EVMODIFY dynamic script editor. But the EVMODIFY dynamic script editor does not work with sets of multiple SQL queries: it recognizes and executes the first SQL query, but fails from the second query onwards.
Has anybody done multiple extracts against a source system by passing parameters with SAP BPC 5.1 Data Manager and SSIS packages? If so, please let me know how you achieved this functionality.
Regards,
Sreekanth.

Hi Sorin,
Thanks for your update. I tried declaring the variable between %%....GLOBAL(%GTIMEID%,%SELECTION%) and the package runs now, but the problem is that the package is executed using the default date value of the GTIMEID variable declared in the DTSX package; it is not taking the date that I'm trying to pass from BPC. As shown below, could you please take a look at the ModifyScript? See the last line for the global variable and line 13, PROMPT(SELECTINPUT,%SELECTION%,,,%TIME_DIM%), where I am selecting the TIMEID:
DEBUG(ON)
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(RADIOBUTTON,%CLEARDATA%,"Select the method for importing the data from the source file to the destination database",0,{"Merge data values (Imports all records, leaving all remaining records in the destination intact)","Replace && clear data values (Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records)"},{"0","1"})
PROMPT(RADIOBUTTON,%RUNLOGIC%,"Select whether to run default logic for stored values after importing",1,{"Yes","No"},{"1","0"})
PROMPT(RADIOBUTTON,%CHECKLCK%,"Select whether to check work status settings when importing data.",1,{"Yes, check for work status settings before importing","No, do not check work status settings"},{"1","0"})
INFO(%TEMPFILE%,%TEMPPATH%%RANDOMFILE%)
PROMPT(SELECTINPUT,%SELECTION%,,,%TIME_DIM%)
TASK(CONVERT Task,INPUTFILE,%FILE%)
TASK(CONVERT Task,OUTPUTFILE,%TEMPFILE%)
TASK(CONVERT Task,CONVERSIONFILE,%TRANSFORMATION%)
TASK(CONVERT Task,STRAPPSET,%APPSET%)
TASK(CONVERT Task,STRAPP,%APP%)
TASK(CONVERT Task,STRUSERNAME,%USER%)
TASK(Dumpload Task,APPSET,%APPSET%)
TASK(Dumpload Task,APP,%APP%)
TASK(Dumpload Task,USER,%USER%)
TASK(Dumpload Task,DATATRANSFERMODE,1)
TASK(Dumpload Task,CLEARDATA,1)
TASK(Dumpload Task,FILE,%TEMPFILE%)
TASK(Dumpload Task,RUNTHELOGIC,1)
TASK(Dumpload Task,CHECKLCK,1)
GLOBAL(%GTIMEID%,%SELECTION%)
Do you think I am missing something?
Thanks in advance
Regards -
PL/SQL block to count the number of vowels and consonants in a given string
hi,
I need a PL/SQL block to count the number of vowels and consonants in a given string.
The program should prompt the user to enter a string and should print the number of vowels and consonants in it.
Regards
Krishna

Edit: correction to my above post, which was wrong
SQL> ed
Wrote file afiedt.buf
1 with t as (select '&required_string' as txt from dual)
2 --
3 select length(regexp_replace(txt,'([bcdfghjklmnpqrstvwxyzBCDFGHJKLMNPQRSTVWXYZ])|.','\1')) as cons_count
4 ,length(regexp_replace(txt,'([aeiouAEIOU])|.','\1')) as vowel_count
5* from t
SQL> /
Enter value for required_string: This is my text string with vowels and consonants
old 1: with t as (select '&required_string' as txt from dual)
new 1: with t as (select 'This is my text string with vowels and consonants' as txt from dual)
CONS_COUNT VOWEL_COUNT
---------- -----------
        30          11
SQL>
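For comparison, the counting logic of the REGEXP_REPLACE solution (every letter outside `aeiou` counts as a consonant, so `y` is a consonant) can be sketched outside the database. This Java translation, with class and method names of my own choosing, reproduces the 30/11 result above:

```java
// Count vowels and consonants the same way the REGEXP_REPLACE solution
// does: keep only the characters of interest, then take the length.
public class VowelConsonantCount {

    public static int vowels(String s) {
        // strip everything that is not a vowel
        return s.replaceAll("[^aeiouAEIOU]", "").length();
    }

    public static int consonants(String s) {
        // strip non-letters first, then strip the vowels
        return s.replaceAll("[^A-Za-z]", "").replaceAll("[aeiouAEIOU]", "").length();
    }

    public static void main(String[] args) {
        String txt = "This is my text string with vowels and consonants";
        // prints: 30 consonants, 11 vowels
        System.out.println(consonants(txt) + " consonants, " + vowels(txt) + " vowels");
    }
}
```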