Slow report

Hi
I have the following problem:
I have a report where two of its columns are generated by a function, and the report is very slow. I ran the following tests:
1. When I run the report's query directly in the database, it is fast.
2. When I remove these two columns, the report is fast in APEX as well.

function GET_ANALYSIS (RUN_NUM number, A_TYPE varchar2) return varchar2 is
  result varchar2(32767);
begin
  result := '<table>';
  for i in (select to_char(a.start_date, 'dd.mm.yyyy hh24:mi:ss') as start_date, a.id
              from tilo_analysis a
             where a.run_id = RUN_NUM
               and a.type = A_TYPE)
  loop
    result := result || '<tr><td><span class="class2"><a href="'
           || apex_util.prepare_url('f?p=' || v('APP_ID') || ':23100:' || v('SESSION')
              || ':DO_NOT_CLEAR:' || v('DEBUG') || ':23100:TILO_ANALYSE_ID,TILO_LOG_RUN_ID:'
              || i.id || ',' || RUN_NUM)
           || '" title="' || i.start_date || '">' || i.start_date || '</a></span></td>'; -- DB
    result := result || '<td><span class="class2"><a href="'
           || apex_util.prepare_url('f?p=' || v('APP_ID') || ':27100:' || v('SESSION')
              || ':DO_NOT_CLEAR:' || v('DEBUG') || ':27100:TILO_ANALYSE_ID,TILO_LOG_RUN_ID:'
              || i.id || ',' || RUN_NUM)
           || '" title="' || i.start_date || '">' || i.start_date || '</a></span></td></tr>'; -- SESSION
  end loop;
  result := result || '</table>';
  if A_TYPE = 'TILO_ONLINE_ANALYSIS' then
    result := result || '<input type="button" value="Generate" onclick="doSubmit(''GENERATE_TILO_ANALYSIS!'
           || RUN_NUM || ''');" id="GENERATE_TILO_ANALYSIS_BUTTON" tabindex="4" class="htmldbButton" />';
  end if;
  return result;
end;
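One thing worth checking in a function like this: v('APP_ID'), v('SESSION') and v('DEBUG') are evaluated twice per row inside the loop. A minimal sketch of the same idea with the session-state lookups hoisted into local variables (the table and page numbers follow the function above; the function name and variable names are illustrative, and the actual speed-up would need to be measured):

```sql
create or replace function get_analysis_fast (run_num number, a_type varchar2)
  return varchar2
is
  result  varchar2(32767);
  -- hoisted out of the loop: evaluated once per call instead of once per row
  l_app   varchar2(40) := v('APP_ID');
  l_sess  varchar2(40) := v('SESSION');
  l_debug varchar2(40) := v('DEBUG');
begin
  result := '<table>';
  for i in (select to_char(a.start_date, 'dd.mm.yyyy hh24:mi:ss') as start_date, a.id
              from tilo_analysis a
             where a.run_id = run_num
               and a.type   = a_type)
  loop
    result := result || '<tr><td><a href="'
           || apex_util.prepare_url('f?p=' || l_app || ':23100:' || l_sess
              || ':DO_NOT_CLEAR:' || l_debug
              || ':23100:TILO_ANALYSE_ID,TILO_LOG_RUN_ID:' || i.id || ',' || run_num)
           || '" title="' || i.start_date || '">' || i.start_date || '</a></td></tr>';
  end loop;
  return result || '</table>';
end get_analysis_fast;
```

If the report is still slow after that, apex_util.prepare_url itself (which computes a session-state checksum on each call) is the next candidate; comparing the report's timing with and without it would narrow this down.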
Edited by: slyy on Aug 27, 2009 10:52 PM

Similar Messages

  • How to make slow reports fast?

    Hi all,
    How to make slow reports fast? What could be the various reasons and optimizations/solutions?
    Thanks,
    Charles.
    ++++++++++++++++++++++++++++++

    It's an ongoing process. There are a couple of steps you will have to follow: Did you perform an SQL trace? Did you perform an ABAP trace? How much business data are these reports processing, and is there a possibility you can make some changes?
    An SQL trace (an SAP built-in tool) will give you a fair idea about each and every SQL statement in your report, so you can analyze the bad ones and propose optimized SQL or an index if needed.
    With an ABAP trace (also an SAP built-in tool), you can find out how much ABAP runtime is elapsed by each individual ABAP module.
    The amount of business data is also very important; it is a major factor when designing the report algorithm.

  • Slow report viewer/rdlc performance in local mode with Single Sign On

    Hi Team,
    We have recently enabled Single Sign-On in our application, and since then our rdlc reports have become extremely slow to load.
    Please find the below configuration that we are using.
    1. Report Viewer 11.0.0.0
    2. running rdlc file in local mode (not using Report Server)
    3. System.IdentityModel.Services 4.0.0.0
    The query behind the reports returns results in 5-10 seconds, but the report takes 1-4 minutes to load (sometimes timing out), depending on the complexity of the report.
    We have tried a lot of workarounds, but nothing worked.
    I saw a performance improvement in the reports by adding <trust legacyCasModel="True" level="Full" /> to the config file, but with this we get a "Dynamic operations can only be performed in homogenous AppDomain" error on many pages of our application.
    Without SSO, the reports run completely fine.
    We are stuck here and unable to proceed. Is there a known issue with SSO and rdlc in local mode? Is there a hotfix available for this?
    Please help !!!
    Regards,
    Pranav Sharma

    This problem is probably related to :
    [http://blogs.oracle.com/stevenChan/2010/03/ebs_jre_issues_16018.html]
    Oracle problem ID : 1054293.1
    Loginpage / Error in Browser for Export and Attachments after upgrading to Sun JRE 1.6.0_18 [ID 1054293.1]
    Sun bug : 6927268
    ShowDocument calls results in new iexplorer process

  • SLOW report performance with bind variable

    Environment: 11.1.0.7.2, Apex 4.01.
    I've got a simplified report page where the report runs slowly compared to running the same query in sqldeveloper. The report region is based on a pl/sql function returning a query. If I use a bind variable in the query inside apex it takes 13 seconds to run, and if I hard code a string it takes only a few hundredths of a second. The query returns one row from a table which has 1.6 million rows. Statistics are up-to-date and the columns in the joins and where clause are indexed.
    I've run traces using p_trace=YES from Apex for both the bind variable and hard coded strings. They are below.
    The sqldeveloper explain plan is identical to the bind variable plan from the trace, yet the query runs in 0.0x seconds in sqldeveloper.
    What is it about bind variable syntax in Apex that is causing the bad execution plan? Apex Bug? 11g bug? Ideas?
    tkprof output from Apex trace with bind variable is below...
    select p.master_id link, p.first_name||' '||p.middle_name||' '||p.last_name||' '||p.suffix personname,
    p.gender||' '||p.date_of_birth g_dob, p.master_id||'*****'||substr(p.ssn,-4) ssn, p.status status
    from persons p
    where
       p.person_id in (select ps.person_id from person_systems ps where ps.source_key  like  LTRIM(RTRIM(:P71_SEARCH_SOURCE1)))
    order by 1
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.01          0          1         27           0
    Fetch        2     13.15      13.22      67694      72865          0           1
    total        4     13.15      13.23      67694      72866         27           1
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: 62  (ODPS_PRIVACYVAULT)   (recursive depth: 1)
    Rows     Row Source Operation
          1  SORT ORDER BY (cr=72869 pr=67694 pw=0 time=0 us cost=29615 size=14255040 card=178188)
          1   FILTER  (cr=72869 pr=67694 pw=0 time=0 us)
          1    HASH JOIN RIGHT SEMI (cr=72865 pr=67694 pw=0 time=0 us cost=26308 size=14255040 card=178188)
          1     INDEX FAST FULL SCAN IDX$$_0A300001 (cr=18545 pr=13379 pw=0 time=0 us cost=4993 size=2937776 card=183611)(object id 68485)
    1696485     TABLE ACCESS FULL PERSONS (cr=54320 pr=54315 pw=0 time=21965 us cost=14958 size=108575040 card=1696485)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (ORDER BY)
          1    FILTER
          1     HASH JOIN (RIGHT SEMI)
          1      INDEX   MODE: ANALYZED (FAST FULL SCAN) OF
                     'IDX$$_0A300001' (INDEX)
    1696485      TABLE ACCESS   MODE: ANALYZED (FULL) OF 'PERSONS' (TABLE)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      db file scattered read                       1276        0.00          0.16
      db file sequential read                       812        0.00          0.02
      direct path read                             1552        0.00          0.61
    ********************************************************************************
    Here's the tkprof output with a hard coded string:
    select p.master_id link, p.first_name||' '||p.middle_name||' '||p.last_name||' '||p.suffix personname,
    p.gender||' '||p.date_of_birth g_dob, p.master_id||'*****'||substr(p.ssn,-4) ssn, p.status status
    from persons p
    where
       p.person_id in (select ps.person_id from person_systems ps where ps.source_key  like  LTRIM(RTRIM('0b')))
    order by 1
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.02       0.04          0          0          0           0
    Execute      1      0.00       0.00          0          0         13           0
    Fetch        2      0.00       0.00          0          8          0           1
    total        4      0.02       0.04          0          8         13           1
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 62  (ODPS_PRIVACYVAULT)   (recursive depth: 1)
    Rows     Row Source Operation
          1  SORT ORDER BY (cr=10 pr=0 pw=0 time=0 us cost=9 size=80 card=1)
          1   FILTER  (cr=10 pr=0 pw=0 time=0 us)
          1    NESTED LOOPS  (cr=8 pr=0 pw=0 time=0 us)
          1     NESTED LOOPS  (cr=7 pr=0 pw=0 time=0 us cost=8 size=80 card=1)
          1      SORT UNIQUE (cr=4 pr=0 pw=0 time=0 us cost=5 size=16 card=1)
          1       TABLE ACCESS BY INDEX ROWID PERSON_SYSTEMS (cr=4 pr=0 pw=0 time=0 us cost=5 size=16 card=1)
          1        INDEX RANGE SCAN IDX_PERSON_SYSTEMS_SOURCE_KEY (cr=3 pr=0 pw=0 time=0 us cost=3 size=0 card=1)(object id 68561)
          1      INDEX UNIQUE SCAN PK_PERSONS (cr=3 pr=0 pw=0 time=0 us cost=1 size=0 card=1)(object id 68506)
          1     TABLE ACCESS BY INDEX ROWID PERSONS (cr=1 pr=0 pw=0 time=0 us cost=2 size=64 card=1)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (ORDER BY)
          1    FILTER
          1     NESTED LOOPS
          1      NESTED LOOPS
          1       SORT (UNIQUE)
          1        TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                       'PERSON_SYSTEMS' (TABLE)
          1         INDEX   MODE: ANALYZED (RANGE SCAN) OF
                        'IDX_PERSON_SYSTEMS_SOURCE_KEY' (INDEX)
          1       INDEX   MODE: ANALYZED (UNIQUE SCAN) OF 'PK_PERSONS'
                      (INDEX (UNIQUE))
          1      TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                     'PERSONS' (TABLE)

    Patrick, interesting insight. Thank you.
    The optimizer must be peeking at my bind variables with its eyes closed. I'm the only one testing, and I've never passed %anything as a bind value. :)
    Here's what I've learned since my last post:
    I don't think that sqldeveloper is actually using the explain plan it says it is. When I run explain plan in sqldeveloper (with a bind variable) it shows me the exact same plan as Apex with a bind variable. However, when I run autotrace in sqldeveloper, it takes a path that matches the hard coded values, and returns results in half a second. That autotrace run is consistent with actually running the query outside of autotrace. So, I think either sqldeveloper isn't really using bind variables, OR it is using them in some other way that Apex does not, or maybe optimizer peeking works in sqldeveloper?
    Using optimizer hints to tweak the plan helps. I've tried both /*+ FIRST_ROWS */ and /*+ index(ps pk_persons) */ and both drop the query to about a second. However, I'm loath to use hints because of the very dynamic nature of the query (and Tom Kyte doesn't like them either). The hints may end up hurting other variations on the query.
    I also tested the query by wrapping it in a select count(1) from ([long query]) and testing the performance in sqldeveloper and in Apex. The performance in that case is identical with both bind variables and hard coded variables for both Apex and SqlDeveloper. That to me was very interesting and I went so far as to set up two bind variable report regions on the same page. One region wrapped the long query with select count(1) from (...) and the other didn't. The wrapped query ran in 0.01 seconds, the unwrapped took 15ish seconds with no other optimizations. Very strange.
    To get performance up to acceptable levels I have changed my function returning query to:
    1) Set the equality operator to "=" for values without wildcards and "like" for user input with wildcards. This makes a HUGE difference IF no wildcard is used.
    2) Insert a /*+ FIRST_ROWS */ hint when users chose the column that requires the sub-query. This obviously changes the optimizer's plan and improves query speed from 15 seconds to 1.5 seconds even with wildcards.
    I will NOT be hard coding any user supplied values in the query string. As you can probably tell by the query, this is an application where sql injection would be very bad.
    Jeff, regarding your question about "like '%' || :P71_SEARCH_SOURCE1 || '%'". I've found that putting wildcards around values, particularly at the beginning will negate any indexing on the column in question and slows performance even more.
    I'm still left wondering if there isn't something in Apex that is breaking the optimizer "peeking" that Patrick describes. Perhaps something in the way it switches contexts from apex_public_user to the workspace schema?
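    For reference, the two changes described above (switching between "=" and LIKE, and applying the hint when the sub-query column is searched) might look roughly like this in a function returning the query. The table and item names follow the posted query; the function name and structure are illustrative, not the poster's actual code:

```sql
create or replace function get_person_query (p_search in varchar2)
  return varchar2
is
  -- use "=" when the user supplied no wildcard, LIKE otherwise
  l_op varchar2(4) := case
                        when instr(p_search, '%') > 0 or instr(p_search, '_') > 0
                        then 'like'
                        else '='
                      end;
begin
  return 'select /*+ FIRST_ROWS */ p.master_id link, p.status status '
      || '  from persons p '
      || ' where p.person_id in (select ps.person_id '
      || '                         from person_systems ps '
      || '                        where ps.source_key ' || l_op
      || ' ltrim(rtrim(:P71_SEARCH_SOURCE1))) '
      || ' order by 1';
end get_person_query;
```

    Here p_search would be passed the current item value (e.g. v('P71_SEARCH_SOURCE1')) only to decide the operator; the bind variable itself stays in the generated query, so no user input is concatenated into the SQL.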

  • Slow report caused by 85K record dataset

    I'm developing a report that cross-validates fields in the Item table based on criteria (FILTERS). I have 30 such FILTERS.
    The Item table is all about Item SKU properties and has fields like: group, category, colour, price, cost, brand, class, etc.
    The filters should cross-validate fields that have not been set properly. For example: for category field = ABC, group field <> class field.
    The output of each filter is expected to be 5-50 records (possible mismatches).
    My initial plan was:
    Use one big dataset - 85K records x 20 fields from the Item table, NOT FILTERED.
    Create 30 tablixes in the report and APPLY FILTERS at the tablix level.
    After creating 3 tablixes with filters, my report has become very slow.
    QUESTION: For better performance, is it better to use 1 big dataset that all 30 tablixes reuse while applying their own filters, each filter expected to return 5-50 records (my initial plan), OR to create 30 datasets where each dataset returns 5-50 records?

    Hi BrBa,
    As per my understanding, the second plan (creating 30 datasets, each returning 5-50 records) should improve the report performance. The reasons are as follows:
    There are 85K records in your database, while only about 1,000 records are needed. With the first plan we would add a lot of filters on the tablix, which takes much more time than filtering the values directly in the SQL Server database engine.
    Additionally, if you need all 85K records, the first plan might be better: the SQL Server database engine can quickly retrieve the data using a table scan, although the report still needs a lot of filters on the tablix. The second plan, by contrast, needs a good enough quick-search method, which depends on your data.
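    Concretely, each of the 30 datasets would push its mismatch condition into the WHERE clause instead of using a tablix filter. A sketch for one filter, with column names taken loosely from the description (Category, Group, Class) and otherwise illustrative:

```sql
-- Dataset for one cross-validation filter: category ABC rows
-- where the group and class fields disagree (expected 5-50 rows)
SELECT ItemSKU, Category, [Group], Class, Colour, Price, Cost, Brand
FROM dbo.Item
WHERE Category = 'ABC'
  AND [Group] <> Class;
```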
    The following document about Troubleshooting Reports: Report Performance is for your reference:
    http://technet.microsoft.com/en-us/library/bb522806(v=sql.105).aspx
    If there are any other questions, please feel free to ask.
    Regards,
    Katherine Xiong
    TechNet Community Support

  • Slow reports giving "An error has occurred: Request timed out" on first run

    I have BO XI R1 SP4 installed on Windows 2003 R2 SP3 with .Net 1.1 installed.
    Most of our reports are quite large (>60,000 rows returned) and will time out in InfoView when first run, giving the message in the subject. When you close the window and re-run them, they come up almost straight away.
    This tells me that the report runs and gets stored in the cache server, which is fine, but I want to eliminate the timeout for the users.
    I found a solution if .Net 2.0 is installed (HOW TO: Resolve 'An error has occurred: Request timed out' issue), but my server only has .Net 1.1.
    Is there a timeout change I can make to the server as it is, or should I install .Net 2.0, change IIS to use that version of .Net, and then make the solution change already suggested?
    TIA
    I should add that the timeout occurs after around 5 minutes, so the normal changes recommended in note 1198539 don't seem to apply, as the timeout happens well after the normal 90 seconds expires.
    Edited by: Geoff Croxson on Sep 3, 2009 2:02 PM

    Will something like this work on an iPhone, and where would I locate it? My App Store was working just fine, and this last week it's been slower and slower, plus I've had difficulties with downloading and updates. Both my downloads and updates start and then pop up a 'can't install at this time' message. I thought it might be my storage space, so I started deleting. Well, I have plenty of space now and still get the same problem.
    Now, over the last two days, when I type in a search to reinstall the apps that I removed, the App Store tells me the request timed out.
    Help please

  • Performance is very slow in a BI 7 report

    Hi All,
    Very good morning.
    Performance is very slow in the BEx report. How can we rectify the problem? Please provide solutions, step by step.
    Also, if possible, suggest enhancements to the report.
    Thanks in advance.
    Bye,
    Vijaay.

    Hi Vijay,
    There is a lot you can do about performance in the background (InfoCube level), but very little you can do on the front end (query level).
    To increase performance:
    1) Create indexes on the cube - creating indexes increases the read speed of the data.
    InfoCube manage -> Performance tab -> Create DB indexes
    2) Roll up data to aggregates - if aggregates are already built, you can roll up the data; otherwise you can check the option to create aggregates.
    InfoCube manage -> Rollup tab -> Execute
    3) Compression - compression is a technique where you compress all the data into a single request, so the overhead of maintaining multiple requests is reduced, subsequently increasing query performance.
    InfoCube manage -> Collapse -> Execute
    Apart from the above, you also have the option to partition the cube; a little search on cube partitioning in SDN will give you the details.
    Hope this helps,
    Sri...

  • Better user experience for slow reports (loading in background, progress bar)

    Hi
    In our application, some reports take a long time until they are displayed. Therefore it would be nice if I could provide the user with some feedback (for example, a progress bar). Is it possible to have this? On print, when a PDF file is being constructed (on the web), we already have such functionality in place; is it possible to have it generally?
    Maybe it would also be possible to have the report load the first page quickly, so the user already has something to look at, and load all the other needed data in a background thread. (I don't think this is possible; however, somebody may have an idea.)
    Does anybody have an idea how we could achieve a better user experience for our reports? Any help is appreciated. Thanks.
    Greetings
    I am using:
    Crystal Reports 2008 (in Asp.net, Windows Forms)
    Version .NET 3.5 (SP1)
    PS: of course, a very fast report would be the finest solution. However, I guess this will not be possible with our reports and the volume of data they access.

    Hi Daniel,
    Crystal Reports 2008 has built-in progress bar functionality. If you are not getting a progress bar for every report, it means those reports are not taking too much time.
    For more info regarding the progress bar, check this link:
    [https://boc.sdn.sap.com/node/8035]
    Look for processingIndicatorDelay and processingIndicatorText.
    If you want to improve the performance of a report, there are a few things to keep in mind while creating the report in the Crystal Reports designer.
    Here are a few points related to performance issues.
    The performance of a report is related to:
    External factors:
    1. The amount of time the database server takes to process the SQL query.
    ( Crystal Reports sends the SQL query to the database; the database processes it and returns the data set to Crystal Reports. )
    2. Network traffic.
    3. Local computer processor speed.
    ( When Crystal Reports receives the data set, it generates a temp file to further filter the data when necessary, as well as to group, sort, process formulas, ... )
    4. The number of records returned.
    ( If a SQL query returns a large number of records, it will take longer to format and display than if it were returning a smaller data set. )
    Report design:
    1. Where is the Record Selection evaluated?
    Ensure your Record Selection Formula can be translated into SQL, so the data can be filtered down on the server; otherwise the filtering will be done in a temp file on the local machine, which is much slower.
    Many functions cannot be translated into SQL because there may be no standard SQL equivalent for them.
    For example, a control structure like IF THEN ELSE cannot be translated into SQL; it will always be evaluated in Crystal Reports. If you use an IF THEN ELSE on a parameter, the result of the condition will be converted to SQL, but as soon as the conditions use database fields, it will not be translated into SQL.
    2. How many subreports the report contains and in which sections they are located.
    Minimise the number of subreports used, or avoid using subreports if possible, because subreports are reports within a report. If you have a subreport in a details section and the report returns 100 records, the subreport will be evaluated 100 times, so it will query the database 100 times. This is often the biggest factor in why a report takes a long time to preview.
    3. How many records will be returned to the report.
    A large number of records will slow down the preview of the report. Ensure you only return the necessary data, by creating a Record Selection Formula, or by basing your report on a Stored Procedure or a Command Object that returns only the desired data set.
    4. Do you use the special field "Page N of M" or "TotalPageCount"?
    When the special field "Page N of M" or "TotalPageCount" is used on a report, it has to generate every page of the report before it displays the first page, therefore it will take more time to display the first page.
    If you want to improve the speed of a report, remove the special field "Page N of M" or "Total Page Count", or any formula that uses the function "TotalPageCount". If those aren't used, when you view a report it only formats the page requested; it won't format the whole report.
    5. Link tables on indexed fields whenever possible.
    6. Remove unused tables, unused formulas, unused running totals from the report.
    7. Suppress unnecessary sections.
    8. For summaries, use conditional formulas instead of running totals when possible.
    9. Whenever possible, limit records through selection, not suppression.
    10. Use SQL expressions to convert fields to be used in record selection instead of using formula functions.
    For example, if you need to concatenate two fields, instead of doing it in a formula you can create a SQL Expression Field. It will concatenate the fields on the database server instead of in Crystal Reports.
    SQL Expression Fields are added to the SELECT clause of the SQL query sent to the database.
    11. Using one Command as the data source can be faster if the SQL query is written to return only the desired data set.
    12. Perform grouping on the server.
    This is only relevant if you need to return only the summary to your report, not the details. It will be faster, as less data is returned to the report.
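    As an illustration of the concatenation example in point 10, a SQL Expression Field might contain something like the following (column names are made up, and the exact operator, || or +, depends on the database):

```sql
-- SQL Expression Field: pushed into the SELECT clause and
-- evaluated on the database server, not in Crystal Reports
"Customer"."FirstName" || ' ' || "Customer"."LastName"
```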
    Hope this helps!!
    Regards,
    Shweta

  • Slow report generation

    Hi,
    I have a report whose results depend on what option the user chooses in a select list. The select list option alters the WHERE clause in the query and displays new information. The problem is that it takes a ridiculously long time to display larger result sets (~100 rows); even with a table of 12 results there is a noticeable delay. The query includes a subquery as well as an ORDER BY, and uses database links, which are expected to make it slower, but I don't think that alone would affect it so badly with 100 results. By slow display I mean around 5 minutes of loading, while a query (not in the report) without the WHERE clause doesn't take very long. The code is as follows:
    select rpad(svc_app_name, 8) project,
           rpad(loc_name, 12) location,
           rpad(svc_name, 10) service,
           svs_date size_date,
           round(svs_used_sz / 1024) used_size_mb
      from hkadm.srvc@HKADM2_DBAAPEX.TNTEWW.COM,
           hkadm.svcsize@HKADM2_DBAAPEX.TNTEWW.COM,
           hkadm.host@HKADM2_DBAAPEX.TNTEWW.COM,
           hkadm.loc@HKADM2_DBAAPEX.TNTEWW.COM
     where svc_name = svs_svc_name
       and svs_date = (select max(svs_date)
                         from hkadm.svcsize@HKADM2_DBAAPEX.TNTEWW.COM a
                        where a.svs_hst_ip = svcsize.svs_hst_ip
                          and a.svs_svc_name = svcsize.svs_svc_name)
       and svc_app_name = :P5_Link
       and svc_hst_ip = hst_ip
       and hst_loc_code = loc_code
     order by 1, 2, 3, 4
    My select list has "submit on click", and there's a branch to the same page on submit. Any help appreciated.
    Mike
    Mike

    You should create a view at your remote server like this:
    CREATE OR REPLACE VIEW test_view
    AS
       SELECT   RPAD (svc_app_name, 8) project, RPAD (loc_name, 12) LOCATION,
                RPAD (svc_name, 10) service, svs_date size_date,
                ROUND ((svs_used_sz / 1024)) used_size_mb,
                svc_app_name
           FROM hkadm.srvc, hkadm.svcsize, hkadm.HOST, hkadm.loc
          WHERE svc_name = svs_svc_name
            AND svs_date =
                   (SELECT MAX (svs_date)
                      FROM hkadm.svcsize a
                     WHERE a.svs_hst_ip = svcsize.svs_hst_ip
                       AND a.svs_svc_name = svcsize.svs_svc_name)
            AND svc_hst_ip = hst_ip
            AND hst_loc_code = loc_code
       ORDER BY 1, 2, 3, 4
    and query it like this:
    SELECT project, LOCATION, service, size_date, used_size_mb
      FROM test_view@hkadm2_dbaapex.tnteww.com
    WHERE svc_app_name = :p5_list
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.opal-consulting.de/training
    http://apex.oracle.com/pls/otn/f?p=31517:1
    ------------------------------------------------------------------------------

  • Slow report performance when filter by 1 date

    Post Author: poh_michelle
    CA Forum: Formula
    Record selection formula : {PartTran.TransDate} in {?FromDate} to {?ToDate}
    Report performance was okay when I selected #08/10/2007# for {?FromDate} and #10/10/2007# for {?ToDate}; the report displayed within 5 seconds. However, the report slows down when I select the same date (#10/10/2007#) for both {?FromDate} and {?ToDate}; the report displayed after 20 minutes.
    What is the reason, and what can be done to improve it?
    Thanks for any advice.

    Post Author: foghat
    CA Forum: Formula
    Not sure why selecting a single date would slow down your report. Are you sure something else wasn't/isn't going on? You could give this a try: {PartTran.TransDate} >= {?FromDate} and {PartTran.TransDate} <= {?ToDate}

  • Slow Reports after adding calculated item

    Hi All,
    I am facing a slow response after adding a few calculated items in a report. Is this normal behavior for OBIEE when adding calculated items?
    Anyone faced similar issue please comment.
    Thanks,
    Muhammad Waqas

    Hi Muhammad Waqas,
    Have you run the physical query that is generated by the BI Server against the database? Once it has executed on the database, compare the execution time with the OBIEE execution time; that will give you the answer.
    Depending on the amount of data and the type of calculation, it is normal behaviour for report performance to slow down after adding calculated measures.
    Please mark it Helpful/Correct, if it is.
    Best Regards,
    Kalyan Chukkapalli
    http://123obi.com

  • VERY slow reporting performance

    We've created reports which, when run directly on Oracle through sqlplus (copying the exact SQL from the report), take around 3 minutes, but when run from BO (using Web Intelligence) take hours.
    The SQL outputs around 300,000 rows, which equates to a few hundred pages in the report.  Is there a reason why the report would take so long (the session to Oracle is open the whole time)?  I can understand it not taking just the time for Oracle to execute the query, as WebI then needs to take the results and produce the report, but I wouldn't expect it to take hours.
    So far on our servers we've concentrated on making Oracle perform faster, and now we've got that working better our attention has been drawn to the performance issues of the reporting, and we don't have any experience with tuning it, so any advice would be very much appreciated.
    We're running BO Enterprise XI R2 SP3 on Linux with Web Intelligence as the only reporting interface.  The database is Oracle RAC (3 node cluster) 10.2.0.4.

    Correction: I incorrectly said jobserver in the earlier post (I read your post wrong).
    You need to start a trace on the CMS (for querytiming), the Webi Report Server and the Connection Server (which accesses the db)
    For the CMS trace, stop the CMS, in ccm.config, add a '-trace -querytiming' to the end of the cmsLaunch line. Similarly, add a -trace to the command line for the Webi Report Server (I don't remember the exact line; I don't have a XI R2 deployment on Linux available right now)
    To enable tracing on the connection server, do the following:
    (SAP 1197745 - How to enable connection server tracing on a UNIX platform)
    Open cs.cfg in a text editor. These are located by default in
    Linux: bobje/enterprise115/linux_x86/dataAccess/RDBMS/connectionServer/cs.cfg
    Locate the following entry:
    Traces ClassID="csTRACELOG" Active="No"
    Change the "No" to "Yes".
    Traces ClassID="csTRACELOG" Active="Yes"
    Save the file.
    Restart the Web Intelligence Report Server.
    ====================
    NOTE:
    Be aware that connection server traces slow down the server performance. Detailed traces get generated when this level of tracing is enabled. Disable tracing when not needed.
    ====================
    PS: The Webi Report Server contains the Connection Server libraries. The Connection Server runs in-proc within the Webi Report Server. Hence the need to restart the Webi Report Server.
    Also, I encourage you to open a ticket with BO on this. Performance issues are never easy to troubleshoot, even less so on a forum such as this.
    Finally, I don't believe there is any document that lists all command line options. Or rather, if such a document exists, I don't believe it's available for public consumption.

  • Slow Report Load Speed with SQL Command

    I have the following SQL Command in my Crystal Report. When I add SalesOrderDeliveries.omdDeliveryDate to the report, it examines 240,000 records and takes 3 minutes to completely load. However, if I do not include this field, it examines only 137,000 records and takes only 10 seconds to run.
    Why is this slowing down the report and running through more records than when it is omitted?
    SELECT
      SalesOrderDeliveries.omdDeliveryDate,
      SalesOrderDeliveries.omdSalesOrderID,
      Jobs.jmpJobID
    FROM dbo.Jobs
    INNER JOIN dbo.SalesOrderJobLinks
      ON Jobs.jmpJobID = SalesOrderJobLinks.omjJobID
    INNER JOIN dbo.SalesOrders
      ON SalesOrderJobLinks.omjSalesOrderID = SalesOrders.ompSalesOrderID
    INNER JOIN dbo.SalesOrderLines
      ON SalesOrders.ompSalesOrderID = SalesOrderLines.omlSalesOrderID
    INNER JOIN dbo.SalesOrderDeliveries
      ON SalesOrderLines.omlSalesOrderID = SalesOrderDeliveries.omdSalesOrderID
    WHERE SalesOrderDeliveries.omdShippedComplete = 0
    AND Jobs.UJMPPMTRACK = -1
    GROUP BY Jobs.jmpJobID,
             SalesOrderDeliveries.omdDeliveryDate,
             SalesOrderDeliveries.omdSalesOrderID
    ORDER BY SalesOrderDeliveries.omdSalesOrderID

    Try doing something like this for your query:
    SELECT
      qryNextDate.nextDate,
      qryNextDate.omdSalesOrderID,
      Jobs.jmpJobID
    FROM dbo.Jobs
    INNER JOIN dbo.SalesOrderJobLinks
      ON Jobs.jmpJobID = SalesOrderJobLinks.omjJobID
    INNER JOIN dbo.SalesOrders
      ON SalesOrderJobLinks.omjSalesOrderID = SalesOrders.ompSalesOrderID
    INNER JOIN dbo.SalesOrderLines
      ON SalesOrders.ompSalesOrderID = SalesOrderLines.omlSalesOrderID
    INNER JOIN (
      select omdSalesOrderID, min(omdDeliveryDate) as nextDate
      from dbo.SalesOrderDeliveries
      where omdShippedComplete = 0
      group by omdSalesOrderID) as qryNextDate
      ON SalesOrderLines.omlSalesOrderID = qryNextDate.omdSalesOrderID
    where Jobs.UJMPPMTRACK = -1
    ORDER BY qryNextDate.omdSalesOrderID
    This will get you just the minimum delivery date for each order, which should be the "next" due date.
    -Dell
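    Dell's min()-per-group rewrite can be sketched with a tiny in-memory database. The table and column names below mirror the thread, but the rows are invented for illustration:

```python
import sqlite3

# In-memory sketch of the "next delivery date per order" pattern from the
# answer above. Table/column names follow the thread; the rows are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE SalesOrderDeliveries (
        omdSalesOrderID INTEGER,
        omdDeliveryDate TEXT,
        omdShippedComplete INTEGER
    );
    INSERT INTO SalesOrderDeliveries VALUES
        (1, '2009-09-01', 0),
        (1, '2009-09-15', 0),  -- later delivery for the same order
        (1, '2009-08-01', 1),  -- already shipped: excluded by the filter
        (2, '2009-09-05', 0);
""")

# Collapse deliveries to one row per order *before* joining, instead of
# joining the full deliveries table and grouping afterwards.
rows = conn.execute("""
    SELECT omdSalesOrderID, MIN(omdDeliveryDate) AS nextDate
    FROM SalesOrderDeliveries
    WHERE omdShippedComplete = 0
    GROUP BY omdSalesOrderID
    ORDER BY omdSalesOrderID
""").fetchall()

print(rows)  # [(1, '2009-09-01'), (2, '2009-09-05')]
```

    Because the derived table has exactly one row per order, joining it does not fan out across every delivery row, which is why the rewritten query reads far fewer records.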

  • Slow Report Processing, sometimes timeout, but no CPU load

    Hi,
    after searching various forums, the help, and this site, I finally have to post our problem.
    Running a report from our web application is very slow (around 5 minutes), and for some reports the timeout message even appears:
    "Error: The request timed out because there has been no reply from the server for 600,000 milliseconds."
    Monitoring the database server shows a CPU spike while the SQL is processed; after that I see some load on the server where BOE is installed, but only for a short time (under a minute).
    After that no load can be seen on any server, but the viewing page in the browser keeps loading and loading until the timeout appears.
    Running the same reports in the CR Designer works; it takes some time, but it works.
    I'm a bit unsure where to look next. I would rule out a network problem, since we have a 1 Gigabit network here.
    Does anyone have hints on where to look next?
    Thanks in advance for your help
    Regards
    Thomas

    Hi
    You need to adjust the Windows Power Plan settings in order to improve the cooling system.
    <Setting Method>
    Changes in Operating System:
    Control Panel --> Power Options --> Change plan settings --> Change advanced power settings --> TOSHIBA Power Saver Settings --> Cooling Method
    Changes in Bios (Power Management):
    Intel Turbo boost Technology
    Changes in Operating System:
    TOSHIBA HW Setup --> CPU Frequency mode
    Try these settings:
    Power plan (Toshiba Power Saver): Battery optimized
    Intel Turbo Boost Technology (BIOS): Disabled
    CPU frequency mode: Always Low
    In my experience this should reduce the noise.

  • Very slow report output in browser

    Hi,
    I have Forms & Reports 10gR2 with APS 10gR2. When I run a report normally (with little data), it takes at least 2-3 minutes to display the first time, although later runs take only seconds. I don't know what is wrong: when I check the details on the report server, it shows that the report took only a few seconds to process (start and finish time), yet it displays in the browser much later. If someone has an idea, please help.
    Thanks and Regards.
    Khawar
    Message was edited by:
    S. Khawar

    Dear Hamdy,
    Hope you are fine.
    I have some strange problems running Oracle 10gR2 Forms and Reports Services (without a patch) on Windows 2000 (P4, 1.5 GB RAM) with Oracle Database 10gR2 on RHEL 4.0 (P3 Xeon, 2 GB RAM).
    Users are on Windows XP 2002 or higher.
    Other info about the setup:
    1). One more OC4J instance was created, but this did not solve the problem.
    2). Report servers: min 3, max 10.
    There is a frequently accessed form for voucher entry and its report.
    1). Sometimes vouchers with more than 20 detail entries cannot be saved, and only after refreshing the web browser (Internet Explorer 6 or higher) is the user able to insert a large voucher. This is very troublesome for the data entry operators.
    2). Sometimes a user is working in the application but loses control of it (the ability to work with the application), even though other applications and Windows services still respond. This is also a serious problem when the user has already entered 10 detail records against a voucher.
    3). The application is also accessed from other cities via a public IP.
    These problems are very strange to me.
    Waiting for your response...
