Using Reference Cursor Performance Issue in Report
Hi,
Are reference cursors supposed to be faster than a normal query? I am asking because I am using a reference cursor query in the data model and it has a performance issue in the report: it takes much longer to run than if I run the same reference cursor query in SQL*Plus. The difference is significant. Any input is very much appreciated!
Thanks,
Marilyn
This is Metalink bug 4372868 on 9.0.4.x. It was fixed in 10.1.2.0.2 and does not have a backport for any 9.0.4 version.
Also, the 9.0.4 version is already desupported. Please see the note:
Note 307042.1
Topic: Desupport Notices - Oracle Products
Title: Oracle Reports 10g 9.0.4 & 9.0.4.x
Action plan:
If you are still on a 9.0.4.x version of Oracle Reports and have no plan yet to migrate to the 10.1.2.0.2 version, take the same query you are using in your reference cursor and use it as a plain SQL query in your report's data model.
Similar Messages
-
Hello
I am making a change to an existing custom report. I have to pull all the orders except CANCELLED orders for parameters passed by the user.
I made the change as FLOW_STATUS_CODE <> 'CANCELLED' in the RDF. The not-equal is causing performance issues, and the report is taking a lot of time to complete.
Can anyone suggest what would be best to use in place of not-equal?
Thanks
Is there an index on column FLOW_STATUS_CODE?
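As an illustrative sketch, a positive IN-list can often use an index on FLOW_STATUS_CODE where a not-equal predicate cannot. The table name below is the standard Order Management header table, but the list of status values is an assumption; adapt it to the statuses actually present in your data:

```sql
-- Instead of a not-equal predicate, which typically prevents index use:
--   WHERE flow_status_code <> 'CANCELLED'
-- enumerate the statuses you actually want (values below are assumed):
SELECT header_id, order_number, flow_status_code
  FROM oe_order_headers_all
 WHERE flow_status_code IN ('ENTERED', 'BOOKED', 'CLOSED');
```

Whether this helps depends on the selectivity of the statuses; if nearly all rows are non-cancelled, a full scan may still be the cheaper plan.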
Run your query in sqlplus through explain plan, and check the execution plan if a query has performance issues.
set pages 999
set lines 400
set trimspool on
spool explain.lst
explain plan for
<your statement>;
select * from table(dbms_xplan.display);
spool off
For optimization questions you'd better go to the SQL forum. -
Performance Issue Crystal Report and Oracle
Hello,
We have one procedure that takes 3 input parameters and returns a cursor that is used to design the report. There is no calculation involved here, and the cursor is opened dynamically. We are using an Oracle native connection.
When we click on the preview button, it takes a long time (>10 minutes) to show the complete data, while when we call the same procedure in an application and generate an HTML report using the returned cursor, it is done in <15 seconds. Can someone point me to where to look to improve the performance of the Crystal report?
DB: Oracle 10G
CR: Version XI
Hi Vadiraja
The performance of a report is related to:
External factors:
1. The amount of time the database server takes to process the SQL query.
( Crystal Reports send the SQL query to the database, the database process it, and returns the data set to Crystal Reports. )
2. Network traffic.
3. Local computer processor speed.
( When Crystal Reports receives the data set, it generates a temp file to further filter the data when necessary, as well as to group, sort, process formulas, ... )
4. The number of records returned.
( If a SQL query returns a large number of records, it will take longer to format and display than if it were returning a smaller data set. )
Report design:
1. Where is the Record Selection evaluated.
Ensure your Record Selection Formula can be translated into SQL, so the data can be filtered down on the server; otherwise the filtering will be done in a temp file on the local machine, which will be much slower.
Many functions cannot be translated into SQL because there may be no standard SQL equivalent for them.
For example, a control structure like IF THEN ELSE cannot be translated into SQL; it will always be evaluated in Crystal Reports. If you use an IF THEN ELSE on a parameter, the result of the condition will be converted to SQL, but as soon as the condition uses database fields it will not be translated into SQL.
2. How many subreports the report contains and in which sections they are located.
Minimise the number of subreports used, or avoid using subreports if possible because
subreports are reports within a report, and if you have a subreport in a details section, and the report returns 100 records, the subreport will be evaluated 100 times, so it will query the database 100 times. It is often the biggest factor why a report takes a long time to preview.
3. How many records will be returned to the report.
A large number of records will slow down the preview of the report.
Ensure you return only the necessary data to the report, by creating a Record Selection Formula, basing your report on a stored procedure, or using a Command Object that returns only the desired data set.
4. Do you use the special field "Page N of M", or "TotalPageCount"
When the special field "Page N of M" or "TotalPageCount" is used on a report, the report has to generate every page before it displays the first page, therefore it will take more time to display the first page.
If you want to improve the speed of a report, remove the special field "Page N of M" or "Total Page Count", or any formula that uses the function "TotalPageCount". If those aren't used, viewing a report only formats the page requested.
It won't format the whole report.
5. Link tables on indexed fields whenever possible.
6. Remove unused tables, unused formulas, unused running totals from the report.
7. Suppress unnecessary sections.
8. For summaries, use conditional formulas instead of running totals when possible.
9. Whenever possible, limit records through selection, not suppression.
10. Use SQL expressions to convert fields to be used in record selection instead of using formula functions.
For example, if you need to concatenate 2 fields together, instead of doing it in a formula you can create a SQL Expression Field. It will concatenate the fields on the database server instead of in Crystal Reports. SQL Expression Fields are added to the SELECT clause of the SQL query sent to the database.
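As a concrete illustration against an Oracle database, the body of such a SQL Expression Field might look like the following. The table and column names are assumptions for the example, not from the post:

```sql
-- Crystal SQL Expression Field body (Oracle syntax).
-- This expression is evaluated on the database server as part of the
-- SELECT clause, rather than in a Crystal formula on the client:
"CUSTOMER"."FIRST_NAME" || ' ' || "CUSTOMER"."LAST_NAME"
```

The same expression can then be referenced in the record selection formula, so the comparison also runs server-side.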
11. Using one Command object as the data source can be faster if the SQL query you write returns only the desired data set.
12. Perform grouping on server
This is only relevant if you need to return only the summary to your report, not the details. It will be faster, as less data will be returned to the report.
Regards
Girish Bhosale -
Performance issue webi report-BOXI3.1
Hi,
We have a requirement for a report where we will give the user a set of objects (26–31) to do analysis using the interactive viewing feature. Here we are facing severe performance and memory issues, as the data that we are calling is huge (around 6 million records). At the report level we will be summarizing the data.
The number of rows in the report depends on the number of objects.
Mode of view : Interactive view.
Note:
1. Objects used at the condition level have indexes.
2. There are two report-level variables.
3. Version of Business objects: BOXI3.1
4. OS: Sun Solaris
Please let me know if there are any means by which the memory requirements for the report can be minimized/ performance of the report can be improved.
Thanks,
Subash
Subash,
At the report level we will be summarizing the data ... any means by which the memory requirements for the report can be minimized/ performance of the report can be improved
Is there any way that you can summarize this on the database side versus the report level? The database should be sized with memory and disk space properly to handle these types of summarizations versus expecting the application to perform it.
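As a sketch of what John is suggesting (table and column names here are assumptions, not from the post), pushing the summarization into the SQL keeps the millions of detail rows on the database server and returns only the aggregates to the report:

```sql
-- Aggregate on the database instead of in the report engine:
-- only one row per (region, product) travels over the network.
SELECT region,
       product,
       SUM(sales_amount) AS total_sales,
       COUNT(*)          AS order_count
  FROM sales_fact
 GROUP BY region, product;
```

In a BusinessObjects universe this corresponds to defining the measures with aggregate functions so the generated SQL contains the GROUP BY, rather than summarizing 6 million rows in the Webi report.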
Thanks,
John -
Performance issue in Report (getting time out error)
Hi experts,
I am doing performance tuning for a report (getting a time-out error).
Please see the code below.
While looping over internal table IVBAP, after 25 minutes it shows a time-out error at this point ->
SELECT MAX( ERDAT ) .
Please send alternate code for this.
Advance thanks
from
Nagendra
Get Sales Order Details
CLEAR IVBAP.
REFRESH IVBAP.
SELECT VBELN POSNR MATNR NETWR KWMENG WERKS FROM VBAP
INTO CORRESPONDING FIELDS OF TABLE IVBAP
FOR ALL ENTRIES IN IVBAK
WHERE VBELN = IVBAK-VBELN
AND MATNR IN Z_MATNR
AND WERKS IN Z_WERKS
AND ABGRU = ' '.
Check for Obsolete Materials - Get Product Hierarhy/Mat'l Description
SORT IVBAP BY MATNR WERKS.
CLEAR: WK_MATNR, WK_WERKS, WK_PRDHA, WK_MAKTX,
WK_BLOCK, WK_MMSTA, WK_MSTAE.
LOOP AT IVBAP.
CLEAR WK_INVDATE. "I6677.sn
SELECT MAX( ERDAT ) FROM VBRP INTO WK_INVDATE WHERE
AUBEL EQ IVBAP-VBELN AND
AUPOS EQ IVBAP-POSNR.
IF SY-SUBRC = 0.
MOVE WK_INVDATE TO IVBAP-INVDT.
MODIFY IVBAP.
ENDIF. "I6677.e n
SELECT SINGLE * FROM MBEW WHERE "I6759.sn
MATNR EQ IVBAP-MATNR AND
BWKEY EQ IVBAP-WERKS AND
BWTAR EQ SPACE.
IF SY-SUBRC = 0.
MOVE MBEW-STPRS TO IVBAP-STPRS.
IVBAP-TOT = MBEW-STPRS * IVBAP-KWMENG.
MODIFY IVBAP.
ENDIF. "I6759.en
IF IVBAP-MATNR NE WK_MATNR OR IVBAP-WERKS NE WK_WERKS.
CLEAR: WK_BLOCK, WK_MMSTA, WK_MSTAE, WK_PRDHA, WK_MAKTX.
MOVE IVBAP-MATNR TO WK_MATNR.
MOVE IVBAP-WERKS TO WK_WERKS.
SELECT SINGLE MMSTA FROM MARC INTO MARC-MMSTA
WHERE MATNR = WK_MATNR
AND WERKS = WK_WERKS.
IF NOT MARC-MMSTA IS INITIAL.
MOVE '*' TO WK_MMSTA.
ENDIF.
SELECT SINGLE LVORM PRDHA MSTAE MSTAV FROM MARA
INTO (MARA-LVORM, MARA-PRDHA, MARA-MSTAE, MARA-MSTAV)
WHERE MATNR = WK_MATNR.
IF ( NOT MARA-MSTAE IS INITIAL ) OR
( NOT MARA-MSTAV IS INITIAL ) OR
( NOT MARA-LVORM IS INITIAL ).
MOVE '*' TO WK_MSTAE.
ENDIF.
MOVE MARA-PRDHA TO WK_PRDHA.
SELECT SINGLE MAKTX FROM MAKT INTO WK_MAKTX
WHERE MATNR = WK_MATNR
AND SPRAS = SY-LANGU.
ENDIF.
IF Z_BLOCK EQ 'B'.
IF WK_MMSTA EQ ' ' AND WK_MSTAE EQ ' '.
DELETE IVBAP.
CONTINUE.
ENDIF.
ELSEIF Z_BLOCK EQ 'U'.
IF WK_MMSTA EQ '' OR WK_MSTAE EQ ''.
DELETE IVBAP.
CONTINUE.
ENDIF.
ELSE.
IF WK_MMSTA EQ '' OR WK_MSTAE EQ ''.
MOVE '*' TO WK_BLOCK.
ENDIF.
ENDIF.
IF WK_PRDHA IN Z_PRDHA. "I4792
MOVE WK_BLOCK TO IVBAP-BLOCK.
MOVE WK_PRDHA TO IVBAP-PRDHA.
MOVE WK_MAKTX TO IVBAP-MAKTX.
MODIFY IVBAP.
ELSE. "I4792
DELETE IVBAP. "I4792
ENDIF. "I4792
IF NOT Z_ALNUM[] IS INITIAL. "I9076
SELECT SINGLE * FROM MAEX "I9076
WHERE MATNR = IVBAP-MATNR "I9076
AND ALNUM IN Z_ALNUM. "I9076
IF SY-SUBRC <> 0. "I9076
DELETE IVBAP. "I9076
ENDIF. "I9076
ENDIF. "I9076
ENDLOOP.
Hi Nagendra!
Get Sales Order Details
CLEAR IVBAP.
REFRESH IVBAP.
Check that IVBAK is not initial before the FOR ALL ENTRIES; with an empty driver table the WHERE clause is ignored and all rows are selected.
SELECT VBELN POSNR MATNR NETWR KWMENG WERKS FROM VBAP
INTO CORRESPONDING FIELDS OF TABLE IVBAP
FOR ALL ENTRIES IN IVBAK
WHERE VBELN = IVBAK-VBELN
AND MATNR IN Z_MATNR
AND WERKS IN Z_WERKS
AND ABGRU = ' '.
Check for Obsolete Materials - Get Product Hierarhy/Mat'l Description
SORT IVBAP BY MATNR WERKS.
CLEAR: WK_MATNR, WK_WERKS, WK_PRDHA, WK_MAKTX,
WK_BLOCK, WK_MMSTA, WK_MSTAE.
Avoid SELECTs within a loop; instead, do the selection outside the loop. You can use a READ statement, and then loop if required.
LOOP AT IVBAP.
CLEAR WK_INVDATE. "I6677.sn
SELECT MAX( ERDAT ) FROM VBRP INTO WK_INVDATE WHERE
AUBEL EQ IVBAP-VBELN AND
AUPOS EQ IVBAP-POSNR.
IF SY-SUBRC = 0.
MOVE WK_INVDATE TO IVBAP-INVDT.
MODIFY IVBAP.
ENDIF. "I6677.e n
SELECT SINGLE * FROM MBEW WHERE "I6759.sn
MATNR EQ IVBAP-MATNR AND
BWKEY EQ IVBAP-WERKS AND
BWTAR EQ SPACE.
IF SY-SUBRC = 0.
MOVE MBEW-STPRS TO IVBAP-STPRS.
IVBAP-TOT = MBEW-STPRS * IVBAP-KWMENG.
MODIFY IVBAP.
ENDIF. "I6759.en
IF IVBAP-MATNR NE WK_MATNR OR IVBAP-WERKS NE WK_WERKS.
CLEAR: WK_BLOCK, WK_MMSTA, WK_MSTAE, WK_PRDHA, WK_MAKTX.
MOVE IVBAP-MATNR TO WK_MATNR.
MOVE IVBAP-WERKS TO WK_WERKS.
SELECT SINGLE MMSTA FROM MARC INTO MARC-MMSTA
WHERE MATNR = WK_MATNR
AND WERKS = WK_WERKS.
IF NOT MARC-MMSTA IS INITIAL.
MOVE '*' TO WK_MMSTA.
ENDIF.
SELECT SINGLE LVORM PRDHA MSTAE MSTAV FROM MARA
INTO (MARA-LVORM, MARA-PRDHA, MARA-MSTAE, MARA-MSTAV)
WHERE MATNR = WK_MATNR.
IF ( NOT MARA-MSTAE IS INITIAL ) OR
( NOT MARA-MSTAV IS INITIAL ) OR
( NOT MARA-LVORM IS INITIAL ).
MOVE '*' TO WK_MSTAE.
ENDIF.
MOVE MARA-PRDHA TO WK_PRDHA.
SELECT SINGLE MAKTX FROM MAKT INTO WK_MAKTX
WHERE MATNR = WK_MATNR
AND SPRAS = SY-LANGU.
ENDIF.
IF Z_BLOCK EQ 'B'.
IF WK_MMSTA EQ ' ' AND WK_MSTAE EQ ' '.
DELETE IVBAP.
CONTINUE.
ENDIF.
ELSEIF Z_BLOCK EQ 'U'.
IF WK_MMSTA EQ '' OR WK_MSTAE EQ ''.
DELETE IVBAP.
CONTINUE.
ENDIF.
ELSE.
IF WK_MMSTA EQ '' OR WK_MSTAE EQ ''.
MOVE '*' TO WK_BLOCK.
ENDIF.
ENDIF.
IF WK_PRDHA IN Z_PRDHA. "I4792
MOVE WK_BLOCK TO IVBAP-BLOCK.
MOVE WK_PRDHA TO IVBAP-PRDHA.
MOVE WK_MAKTX TO IVBAP-MAKTX.
MODIFY IVBAP.
ELSE. "I4792
DELETE IVBAP. "I4792
ENDIF. "I4792
IF NOT Z_ALNUM[] IS INITIAL. "I9076
SELECT SINGLE * FROM MAEX "I9076
WHERE MATNR = IVBAP-MATNR "I9076
AND ALNUM IN Z_ALNUM. "I9076
IF SY-SUBRC <> 0. "I9076
DELETE IVBAP. "I9076
ENDIF. "I9076
ENDIF. "I9076
endloop.
You have used many SELECT queries within LOOP...ENDLOOP, which is a big hindrance as far as performance is concerned. Avoid such a practice.
Thanks
Deepika -
Performance issue in report programming..
Hi,
I am using one customized function module within a loop over an internal table containing fields of the PROJ table, for about 200 records. In the source code of the function module there is a set of SELECT queries on different tables like COSS, COSP, AUFK, PRPS, BPJA, PRHI, AFPO, AFKO, etc. Due to that, the performance of the report is very low. How can I improve it?
Is there any other way to change the code?
regards
Chetan
Hi John,
I am using SAP ECC 6.0 .
The report is used to update a Z-table that was created for Project System plan data.
I am calling a function module which returns an internal table, appending this to another internal table, and refreshing it; I do this for each project within a loop over the PROJ internal table. Finally, using the final itab, I modify the Z-table fields.
The code is as below:
select pspid from proj client specified into corresponding fields of
table t_itab1 where mandt = sy-mandt
and pspnr in s_pspnr
and vbukr = p_vbukr
and prctr in s_prctr.
loop at t_itab1.
l_pspid = t_itab1-pspid.
CALL FUNCTION 'ZPS_FUN_BUDGETS'
EXPORTING
L_PSPID = l_pspid
L_VBUKR = p_vbukr
TABLES
T_DATA = t_itab2 .
loop at t_itab2.
append t_itab2 to t_itab.
endloop.
clear : t_itab2.
refresh : t_itab2.
endloop.
LOOP AT t_itab.
***MODIFY ZTABLE.*****
ENDLOOP.
Regards
Chetan -
Hi Friends,
Can you please give me an exact answer for the question below.
Suppose my report program is taking a long time to execute; what are the reasons that may affect the report,
and what remedies should I follow? Please provide an appropriate answer.
Advance thanks,
Chandra.
Hi,
Go to transaction SE30; there you will get Tips and Tricks for performance tuning.
Check it carefully and see whether you can do something to improve the performance of your code.
It will help you a lot and will reduce your execution time.
If you are using the READ TABLE command in your program, check whether you can use a HASHED TABLE; it will improve the performance of your code a lot, but it has some restrictions.
I will explain some points to improve the performance of the report; try to write code accordingly.
SOME STEPS TO IMPROVE YOUR PERFORMANCE:
1. Avoid using SELECT...ENDSELECT... construct and use SELECT ... INTO TABLE.
2. Use WHERE clause in your SELECT statement to restrict the volume of data retrieved.
3. Design your query to use as many index fields as possible, from left to right, in your WHERE statement.
4. Use FOR ALL ENTRIES in your SELECT statement to retrieve the matching records at one shot.
5. Avoid using nested SELECT statement, SELECT within LOOPs.
6. Avoid using INTO CORRESPONDING FIELDS OF TABLE. Instead use INTO TABLE.
7. Avoid using SELECT * and Select only the required fields from the table.
8. Avoid nested loops when working with large internal tables.
9. Use ASSIGNING instead of INTO in LOOPs for table types with large work areas.
10. When in doubt call transaction SE30 and use the examples and check your code
11. Whenever using READ TABLE, use the BINARY SEARCH addition to speed up the search, and be sure to sort the internal table before the binary search. As a general rule of thumb, if you are sure that the internal table has fewer than about 200 entries, you need not SORT and use BINARY SEARCH, since the sort is itself a performance overhead.
12. Use "CHECK" instead of IF/ENDIF whenever possible.
13. Use "CASE" instead of IF/ENDIF whenever possible.
14. Use "MOVE" with individual variable/field moves instead of "MOVE-CORRESPONDING"; it creates more coding but is more efficient.
Reward points if helpful.
regards,
rekha -
Cursors are not closed when using Ref Cursor Query in a report ORA-01000
Dear Experts
Oracle Database 11g,
developer suite 10.1.2.0.2,
application server 10.1.2.0.2,
Windows xp platform
For a long time, I have been hitting ORA-01000.
I have a 2-group report (master and detail) using a Ref Cursor query. When this report is run, I found that it opens several cursors for the detail query (there should be only one), and the number of these cursors is equal to the number of master records.
Moreover, after the report is finished, these cursors are not closed; they accumulate each time I run the report, until finally the maximum number of open cursors is exceeded and I get ORA-01000.
I increased the open_cursors parameter for the database to an unbelievable value, 30000, but of course it will still be exceeded during the session because the cursors keep accumulating.
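To confirm that cursors really are leaking (rather than merely numerous), the open-cursor count per session can be watched between report runs; a count that grows after every run of the report points to a leak. This is a generic diagnostic sketch using standard dynamic performance views:

```sql
-- Count open cursors per session; rerun after each report execution
-- and watch whether the Reports session's count keeps climbing.
SELECT s.sid, s.username, COUNT(*) AS open_cursors
  FROM v$open_cursor oc
  JOIN v$session     s ON s.sid = oc.sid
 GROUP BY s.sid, s.username
 ORDER BY open_cursors DESC;
```

Querying these views requires SELECT privilege on the V$ views, so it is usually run from a DBA session while the report session is active.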
I found that this problem is solved when using only one master Ref Cursor query and creating a break group. The problem is also solved if we use a SQL query instead of a Ref Cursor query for the master and detail queries, but for some considerations I can use neither a break group nor a SQL query; I have to use Ref Cursor queries.
Is this an Oracle bug, and how can I overcome it?
Thanks
Edited by: Mostafa Abolaynain on May 6, 2012 9:58 AM
Thank you Inol for your answer. However,
a Ref Cursor gives me flexibility to control the query. For example, see the following query:
function QR_1RefCurDS return DEF_CURSORS.JOURHEAD_REFCUR is
temp_JOURHEAD DEF_CURSORS.JOURHEAD_refcur;
v_from_date DATE;
v_to_date DATE;
V_SERIAL_TYPE number;
begin
SELECT SERIAL_TYPE INTO V_SERIAL_TYPE
FROM ACC_VOUCHER_TYPES
where voucher_type='J'
and IDENT_NO=:IDENT
AND COMP_NO=TO_NUMBER(:COMPANY_NO);
IF :no_date=1 then
IF V_SERIAL_TYPE =1 THEN
open temp_JOURHEAD for select VOCH_NO, VOCH_DATE
FROM JOURHEAD
WHERE COMP_NO=TO_NUMBER(:COMPANY_NO)
AND IDENT=:IDENT
AND ((TO_NUMBER(VOCH_NO)=:FROM_NO and :FROM_NO IS NOT NULL AND :TO_NO IS NULL)
OR (TO_NUMBER(VOCH_NO) BETWEEN :FROM_NO AND :TO_NO and :FROM_NO IS NOT NULL AND :TO_NO IS NOT NULL )
OR (TO_NUMBER(VOCH_NO)<=:TO_NO and :FROM_NO IS NULL AND :TO_NO IS NOT NULL )
OR (:FROM_NO IS NULL AND :TO_NO IS NULL ))
ORDER BY TO_NUMBER(VOCH_NO);
ELSE
open temp_JOURHEAD for select VOCH_NO, VOCH_DATE
FROM JOURHEAD
WHERE COMP_NO=TO_NUMBER(:COMPANY_NO)
AND IDENT=:IDENT
AND ((VOCH_NO=:FROM_NO and :FROM_NO IS NOT NULL AND :TO_NO IS NULL)
OR (VOCH_NO BETWEEN :FROM_NO AND :TO_NO and :FROM_NO IS NOT NULL AND :TO_NO IS NOT NULL )
OR (VOCH_NO<=:TO_NO and :FROM_NO IS NULL AND :TO_NO IS NOT NULL )
OR (:FROM_NO IS NULL AND :TO_NO IS NULL ))
ORDER BY VOCH_NO;
END IF;
ELSE
v_from_date:=to_DATE(:from_date);
v_to_date:=to_DATE(:to_date);
IF V_SERIAL_TYPE =1 THEN
open temp_JOURHEAD for select VOCH_NO, VOCH_DATE
FROM JOURHEAD
WHERE COMP_NO=TO_NUMBER(:COMPANY_NO)
AND IDENT=:IDENT
AND ((voch_date between v_from_date and v_to_date and :from_date is not null and :to_date is not null)
OR (voch_date <= v_to_date and :from_date is null and :to_date is not null)
OR (voch_date = v_from_date and :from_date is not null and :to_date is null)
OR (:from_date is null and :to_date is null ))
ORDER BY VOCH_DATE,TO_NUMBER(VOCH_NO);
ELSE
open temp_JOURHEAD for select VOCH_NO, VOCH_DATE
FROM JOURHEAD
WHERE COMP_NO=TO_NUMBER(:COMPANY_NO)
AND IDENT=:IDENT
AND ((voch_date between v_from_date and v_to_date and :from_date is not null and :to_date is not null)
OR (voch_date <= v_to_date and :from_date is null and :to_date is not null)
OR (voch_date = v_from_date and :from_date is not null and :to_date is null)
OR (:from_date is null and :to_date is null ))
ORDER BY VOCH_DATE,VOCH_NO;
END IF;
END IF;
return temp_JOURHEAD;
end; -
Performance issue - Pricing Report
Hi Experts,
I have developed an ALV report and I need to improve the performance in production.
I have a relatively complex issue here. I have to fetch data for pricing from a pricing condition from all the respective underlying condition tables (AXXX). Then for every material, I have to display the material class classification characteristics and their values maintained in the material master as well as batch classification characteristics and their corresponding values maintained for every batch.
For example:
In this way, if the condition has 10 materials, and every material has, say, 10 material class characteristics and 12 batch class characteristics, then the total number of rows in the output should be 130 (one row for the material and 12 rows for its classifications, per material).
How can I structure the output so that performance is optimized?
I am also fetching other data for the output like stock and sales order quantity against every material.
I have tried minimizing loops and used SELECT FOR ALL ENTRIES (I have read a few threads which suggest otherwise). READ statements use BINARY SEARCH. There are no SELECT * queries.
Warm Regards,
Abdullah
I would not endorse a lot of the recommendations going around here. This seems like a complex report you are doing, so here are my personal generic recommendations:
- always use index when selecting from database tables
- avoid redundant database accesses to unbuffered tables, only read fields you really need for processing
- use sorted or hashed tables for reads or loops and access them by key
- use ASSIGNING rather than INTO when reading or looping (small gain)
- use SE30 and if necessary ST05 to fine tune if not yet satisfied with the runtime
Thomas -
Performance issue for Report Regions
We have an app which has as its "Home Page" a portal-style page. The "portlets" are separate regions. Some of these regions are HTML forms, but most are report regions, invariably with PL/SQL returning the SQL query.
On our dev box this portal renders fine, in no more than 5 seconds (acceptable). On the production box a couple of the portlets go crazy, usually taking between 1-2 minutes each. I know this because I added the #TIMING# string to each region footer. The portal page has a META tag to refresh every 120 seconds.
The first thing I did to debug was to take the PL/SQL function body returning the SQL query, replace all the variables (which are just local vars, app items and page items) with hardcoded values, strip all quotes and run the "cleaned up" SQL query in SQLPlus. It executes very fast.
So my conclusion is that the problem lies with the application server (which is Oracle iAS). Is this a logical conclusion? What should my next move be to get to the bottom of this?
Even without the problem portlets the portal refresh speed varies widely, between <1 sec and >10 sec.
The application also has other pages displaying the portal regions individually. These pages (including those with the "problem portlet" queries) render ok (ie, order of 5 seconds or less). This tells me that there is an issue with having a bunch of query-based regions on the same page.
My guess is that I might need to examine the logs and/or httpd/modplsql settings of the app server. Number of simultaneous connections?
What would happen if hundreds of users used this app concurrently, navigated to the portal page, and left it to auto-refresh every 2 minutes? Are such pages a no-no in HTML DB?
Some more information:
When I run the page in debug mode, I see the following timings for one of the problem portlets:
11.53: show report
11.54: determine column headings
11.54: parse query
60.95: binding: ":F118_BUNDLE"="F118_BUNDLE" value="1"
60.95: binding: ":P52_STMETRICS_PRODBRANCH"="P52_STMETRICS_PRODBRANCH" value="RDBMS_MAIN"
60.95: binding: ":P52_STMETRICS_PLATFORM"="P52_STMETRICS_PLATFORM" value="LINUX"
61.35: print column headings
61.35: rows loop: 15 row(s)
[report is printed here]
61.49: FORMITEM: P52_STMETRICS_PLATFORM HIDDEN
61.54: FORMITEM: P52_STMETRICS_PRODBRANCH HIDDEN
This tells me that the bulk of the time (50 seconds in this case) is eaten up in parsing the query. What is this symptomatic of?
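Parse time dominating like this is often worth cross-checking in the shared pool. One rough diagnostic (not a definitive recipe; V$SQL column names as of Oracle 10g) is to look for statements that are parsed nearly as often as they are executed, which can indicate bind-variable or cursor-sharing problems:

```sql
-- Statements parsed almost every execution are candidates for
-- bind-variable / shared-pool investigation:
SELECT sql_id, parse_calls, executions,
       SUBSTR(sql_text, 1, 60) AS sql_text_head
  FROM v$sql
 WHERE executions  > 0
   AND parse_calls >= executions
 ORDER BY parse_calls DESC;
```

If the portal's region queries show up near the top, the next step would be tracing one page render with SQL trace to see where the parse time goes.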
Another problem I've just noticed. It takes about 5 seconds to open up a region for editing (Region Definition) in HTMLDB. When I click "Apply Changes" (even without editing anything) it is taking on the order of minutes for the changes to be applied to the page. Our app server was bounced just before I noticed this.
These are the params in our DAD:
<Location /pls/toolsdb>
SetHandler pls_handler
Order deny,allow
PlsqlDatabaseConnectString ...
PlsqlDatabasePassword ...
PlsqlDatabaseUsername ...
PlsqlDefaultPage ...
PlsqlDocumentPath docs
</Location> -
Performance issue with report server
We recently migrated our application from client/server 6i to 10g web (Oracle AS release 10.1.2). When we run one balance sheet report, it takes about 16-17 minutes to complete in the web environment. We are using rwservlet to run this report. To my annoyance, the same report, run in the client/server environment with the same parameters, takes hardly 1 minute to complete. The report has 10 pages in all. What we have observed from the EM console is that it formats the first 5-6 pages of the report in just 150-200 seconds, but then takes a lot of time for the remaining pages. We have even tried jvmoptions, but that did not help either. I request any one of you to suggest what can be done at the application server level so that I can achieve the performance of the client/server version of the report. I am a novice at Oracle Reports, and any help is highly appreciated.
Regards
Murali
We seem to have fixed the problem of hanging/slow reports by configuring the init.ora file and changing the optimizer_mode to FIRST_ROWS_10. We set workarea_size_policy to AUTO too, and reports sped up significantly. I don't know the exact changes because our DBAs made them, but I think the optimizer was the key change.
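For reference, the same two parameter changes can be applied without hand-editing init.ora when the instance runs on an spfile (parameter names as in the post; verify the syntax against your database version):

```sql
-- Favor returning the first rows quickly over total throughput:
ALTER SYSTEM SET optimizer_mode = FIRST_ROWS_10 SCOPE = BOTH;

-- Let Oracle size sort/hash work areas automatically
-- (AUTO requires pga_aggregate_target to be set):
ALTER SYSTEM SET workarea_size_policy = AUTO SCOPE = BOTH;
```

SCOPE = BOTH changes the running instance and persists the setting across restarts; use SCOPE = SPFILE alone if you prefer to take effect only after the next bounce.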
-
Performance issues in reporting side
hi gurus,
how can we improve performance on the reporting side?
Thank you
Hi Kumar,
Query Performance Techniques:
1. Check query properties (use the RSRT tcode)
2. Check whether cube is compressed
3. Optimize query definition
4. Analyze query execution
5. Check for additional indexes
6. Archive unwanted data
7. Check for partitioning options
8. Check for additional aggregates ( Consider DB ratio and KPI ratio)
9. Check for parallelization options
10. Use Nav attributes instead of hierarchies, use free char and filters.
Possible causes for the performance :
A) High Database Runtime
B) High OLAP Runtime
C) High Frontend Runtime
Depending upon your analysis
A)Strategy - High Database Runtime
Check if an aggregate is suitable (use All data to get values "selected records to transferred records", a high number here would be an indicator for query performance improvement using an aggregate)
Check if database statistics are up to date for the cube/aggregate; use the Tcode RSRV output (use the database check for statistics and indexes)
Check if the read mode of the query is unfavourable - Recommended (H)
B)Strategy - High OLAP Runtime
Check if a high number of cells is transferred to the OLAP (use "All data" to get the value "No. of Cells")
a) Use RSRT technical Information to check if any extra OLAP-processing is necessary (Stock Query, Exception Aggregation, Calc. before Aggregation, Virtual Char. Key Figures, Attributes in Calculated Key Figs, Time-dependent Currency Translation) together with a high number of records transferred.
b) Check if a user exit Usage is involved in the OLAP runtime?
c) Check if large hierarchies are used and the entry hierarchy level is as deep as possible. This limits the levels of the hierarchy that must be processed.
C)Strategy - High Frontend Runtime
1) Check if the frontend PCs are within the recommendation (RAM, CPU MHz)
2) Check if the bandwidth for WAN connection is sufficient.
Hope this helps you..
Regards
Mallikarjun -
Post Author: adhiann
CA Forum: General
Hi,
I am trying to create a report using CR11 based on one of my ClearQuest records (RecA). This record has 2 reference fields, each pointing to 2 other record types (RecB and RecC). At any given time, only one or the other field will be populated.
To create a report on RecA, I select the fields I want from the field list of RecA. I can also select RecB's and RecC's fields, as the record types are associated and their fields show up as drill-downs on the association fields of RecA.
Since RecA can only be associated with either RecB or RecC at any given time, I wish to add field labels to the report format and then, for the data, use a formula: if field 1 is not null, then use RecB's headline value, else use RecC's headline value, etc.
However, I am encountering blank reports when I use both RecB's and RecC's association fields. I ran several tests with different combinations of the reference fields and determined that if I add fields from both references (RecB.headline, RecB.description, RecC.Headline and RecC.Description), the report is blank. However, if I have only RecB's fields or only RecC's fields, the report returns data.
The query I am associating with this report doesn't have any reference field values as filters; it is pretty simple and returns a decent number of results, so I know the problem is with Crystal Reports' handling of reference fields.
Is it not possible to use more than one reference (or reference_list) field on a report? If it is possible, what am I doing wrong?
Thanks in advance,
Nandita
...and therefore in this forum with minor chances to get an answer.
regards
Peter -
Reg : Performance Issue in Report
Hi All,
I need some of your ideas to reduce my performance in a program for the scenario below.
i have two internal tables say for example i1 and i2.
i1 has 30,000 records.
i2 has 200,000 records.
Now I want to find out whether those 30,000 records exist among the 200,000 records. For that I have written a piece of code as follows:
loop at i1.
loop at i2 where matnr = i1-matnr.
if yes
appending.
endloop.
endloop.
I am getting the output, but it's very time-consuming; it takes more than 20 minutes.
Is there any option to compare internal tables without looping? Even if I use READ instead of the second loop, it behaves the same.
Can anyone suggest how to make the performance faster?
Regards,
Suresh.
Suresh wrote: Hi All,
I need some ideas to reduce my performance in a program for the below scenario,
Sorry, I am not aware of how we can reduce the performance of a program!! There are lots of ways to reduce performance;
I only know about increasing the performance of a program.
sort :
i1 by matnr,
i2 by matnr.
loop at i2.
read table i1 with key matnr = i2-matnr binary search.
if sy-subrc = 0.
appending.
endif.
endloop.
Amit. -
Using Composite Index Performance Issue
Hi,
Need help:
I have a table 'TABLEA' which has a composite index on CODE and DEPT_CODE.
While fetching data from the table 'TABLEA', in the WHERE condition I am using only CODE and not DEPT_CODE.
Does using only the CODE column, and not the DEPT_CODE column, in the WHERE condition affect performance?
Any help will be helpful for me.
See the test case below:
SQL> create table test_emp
2 (
3 emp_ssn number,
4 emp_name varchar2(50),
5 emp_state varchar2(15),
6 emp_city varchar2(20)
7 );
Table created
SQL> create index test_emp_idx on test_emp(emp_ssn,emp_name);
Index created
SQL> insert into test_emp values (123456789,'Robben','New York','Buffalo');
1 row inserted
SQL> insert into test_emp values (223456789,'Jack','Florida','Miami');
1 row inserted
SQL> insert into test_emp values (323456789,'Peter','Texas','Dallas');
1 row inserted
SQL> insert into test_emp values (423456789,'Johny','Georgia','Atlanta');
1 row inserted
SQL> insert into test_emp values (523456789,'Carmella','California','San Diego');
1 row inserted
SQL> commit;
Commit complete
SQL> explain plan for select /*+ index(test_emp test_emp_idx) */ * from test_emp where emp_ssn = 323456789;
Explained
SQL> select * from table(dbms_xplan.display);
PLAN_TABLE_OUTPUT
Plan hash value: 2345760695
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)
| 0 | SELECT STATEMENT | | 1 | 61 | 163 (1)
| 1 | TABLE ACCESS BY INDEX ROWID| TEST_EMP | 1 | 61 | 163 (1)
|* 2 | INDEX RANGE SCAN | TEST_EMP_IDX | 1 | | 2 (0)
Predicate Information (identified by operation id):
2 - access("EMP_SSN"=323456789)
Note
- dynamic sampling used for this statement
18 rows selected
SQL> explain plan for select /*+ INDEX_SS(test_emp test_emp_idx) */ * from test_emp where emp_name = 'Robben';
Explained
SQL> select * from table(dbms_xplan.display);
PLAN_TABLE_OUTPUT
Plan hash value: 85087452
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)
| 0 | SELECT STATEMENT | | 1 | 61 | 15 (0)
| 1 | TABLE ACCESS BY INDEX ROWID| TEST_EMP | 1 | 61 | 15 (0)
|* 2 | INDEX SKIP SCAN | TEST_EMP_IDX | 1 | | 11 (0)
Predicate Information (identified by operation id):
2 - access("EMP_NAME"='Robben')
filter("EMP_NAME"='Robben')
Note
- dynamic sampling used for this statement
19 rows selected
Thanks,
Andy