Regarding Reports performance

Hi All,
I am currently working in a production system.
How can I check the performance of reports?
How can I tell whether a report's performance is low or high?
Please let me know the details.
Thanks
Vasu.

Hi Vasu,
Use the search on the word "performance"; you will get a lot of answers related to queries.
There are a lot of factors involved in query performance:
1. Install BW statistics and analyze the queries
2. Using aggregates is a good option to consider
3. You can use MultiProviders for reporting to get better query performance
4. Query read mode - "Read when navigating and expanding hierarchies"
5. Check whether the indexes under Oracle are degenerated (OSS Note 323090); see the SQL sketch below
6. OLAP cache - OSS Note 456068
Also take a look at OSS Note 567746 'Composite note BW 3.x performance: Query & Web', in which you can find all the info required.
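For point 5, a quick way to check whether an index is degenerated is something like this (the index name below is only a placeholder, and ANALYZE ... VALIDATE STRUCTURE briefly locks the index, so run it in a quiet window):

ANALYZE INDEX "SAPR3"."/BIC/F0SALES_C01~0" VALIDATE STRUCTURE;

SELECT name,
       height,
       lf_rows,
       del_lf_rows,
       ROUND(del_lf_rows / NULLIF(lf_rows, 0) * 100, 1) AS pct_deleted_rows
FROM   index_stats;
-- INDEX_STATS is filled by the ANALYZE above; a high percentage of deleted
-- leaf rows (or a large HEIGHT) suggests the index should be rebuilt, as
-- described in note 323090.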
Assign points if helpful
Bye
Dinesh

Similar Messages

  • Improving Report Performance

    Hi HTMLDB Team,
    I need some undocumented HTML DB information to improve the performance of reports, which will be useful to everyone.
    I have a Customer table with 1 million records in it. It has the following columns:
    First Name, Last Name, Customer ID, State.
    I have a report region (PL/SQL block returning the query) which picks up records directly from this table.
    The user can filter customer records by any of the above-mentioned columns.
    Without any sorting I get the response in 2 seconds.
    But if I put a sort on any of these columns, my response time goes up to 1 minute.
    I have indexes on all the columns too.
    If the user applies a filter like "State = NY" or "First Name = Balaji", I get the response in 2-3 seconds.
    Now I want to keep the report response time within 2-3 seconds even when there is no filter condition and the sort is on some column, say "First Name".
    To achieve this I want to dynamically add some filter conditions to the original query when the user has not specified any.
    My approach is based on the assumption that the user will be shown only the first 500 records of the query results (this is the normal HTML DB report behavior and I don't want to increase the report region max row count beyond 500).
    My original report region query is
    SELECT * FROM CUSTOMER ORDER BY FIRST_NAME;
    -- It takes 1 minute to return all 1 million records, of which we show only the first 500.
    If I rewrite this query as
    SELECT * FROM CUSTOMER WHERE FIRST_NAME < 'B' ORDER BY FIRST_NAME;
    -- Takes 2 seconds and returns more than 1,000 records.
    I can use the second query when the user has not specified any filters, and he will still be shown only the first 500 records.
    Now my issue is that if the user changes the sort order to "Last Name", the above query will not work and I need to change my query to
    SELECT * FROM CUSTOMER WHERE LAST_NAME < 'B' ORDER BY LAST_NAME;
    Similarly, if the user selects State as the sort order:
    SELECT * FROM CUSTOMER WHERE STATE = 'AK' ORDER BY STATE;
    -- Will definitely give me more than 500 records.
    I also need to consider whether the user is sorting in ascending or descending order.
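    To make the idea concrete, here is a rough sketch of the kind of function body I have in mind for the report region (the page item names P1_SORT_COL and P1_SORT_DIR are only placeholders, and this is untested):

    DECLARE
      l_sql VARCHAR2(4000);
    BEGIN
      -- :P1_SORT_COL / :P1_SORT_DIR would hold the user's sort choice and must
      -- be validated against a fixed list of column names before being used.
      IF :P1_SORT_COL = 'LAST_NAME' THEN
        l_sql := 'SELECT * FROM CUSTOMER WHERE LAST_NAME < ''B'' ORDER BY LAST_NAME';
      ELSIF :P1_SORT_COL = 'STATE' THEN
        l_sql := 'SELECT * FROM CUSTOMER WHERE STATE = ''AK'' ORDER BY STATE';
      ELSE
        l_sql := 'SELECT * FROM CUSTOMER WHERE FIRST_NAME < ''B'' ORDER BY FIRST_NAME';
      END IF;

      IF :P1_SORT_DIR = 'DESC' THEN
        -- For a descending sort the boundary filter would also have to flip
        -- (e.g. FIRST_NAME > 'Y'); this sketch only appends the direction.
        l_sql := l_sql || ' DESC';
      END IF;

      RETURN l_sql;
    END;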
    From where can I get this Report Sort Order information in HTML DB?
    If you can provide this information, it will be of great use to all HTML DB folks.
    Regards
    Balaji. C

    Hi!
    What exactly did you find at the mentioned link regarding report performance? It is not about performance issues for reports; it is about navigation from a report row to a page.
    Can you tell us more exactly what solved your problem?
    I have the same situation: a query running in about 1 second directly on the database from SQL Developer, but from the APEX page it takes about 55 seconds.
    Thank you.
    Edited by: bustiuci on Nov 15, 2008 9:46 PM

  • Regarding Database performance for report creation

    Hi,
    Currently I have one database with two schemas: one for the table data and a second for report creation. But it is taking more time to display the data because there is more load on the first schema.
    I want to create a second database for the report schema, but I still have to access the table data from the first database.
    1) One option is to fetch the data from the first database through a DB link, but I think that also takes more time.
    Is there any way I can access the data from the first database and still get better performance by creating the new database?
    Kindly give me a suggestion. What should I do to improve report performance?

    user647572 wrote:
    What should I do to improve report performance?

    You have two more options:
    1. Use Oracle Streams and replicate the tables between the databases; for reporting you then refer to the second database.
    2. Create a standby database: a clone of your primary database that is kept up to date by applying the archived redo log files shipped from the primary.
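    If you stay with the DB link from option 1, a materialized view in the reporting database that is refreshed over the link can also take load off the first database. A minimal, untested sketch (the link name SRC_DB and the table SALES_DATA are only placeholders):

    CREATE MATERIALIZED VIEW sales_data_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE
      START WITH SYSDATE
      NEXT SYSDATE + 1/24   -- refresh hourly; adjust to the latency the reports can accept
    AS
      SELECT * FROM sales_data@src_db;
    -- Reports in the second database then read SALES_DATA_MV locally instead of
    -- pulling rows across the link on every execution.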

  • Report performance while creating report on BEx

    Hi all!
    I am creating a report on BOE 4.0 on top of a BEx connection as the source. I have developed reports on top of universes in the past, and I know that keeping calculations on the reporting end hampers report performance. Is this the same with BEx? If we follow best practices, is it fair to say that we should keep all heavy calculations/aggregations in BEx or the back end for better report performance?
    Can you please give your opinion based on your experience and knowledge? Any feedback will help! Thanks.

    Hi,
    It is definitely best practice to delegate as many CKFs as possible to the cube, and to put RKFs and filters in the BEx query.
    Also, add default values to your variables (this speeds up generation of the BICS transient universe).
    Also, since Patch 2.10 we are seeing some significant performance improvements, reducing 'document initialization' and 'time to prompts' by up to 50% (steps such as these often took 1.5 minutes, even on properly sized systems).
    Also, make sure you have BW corrections like this one implemented: 1593802 - Performance optimization when loading query views.
    In the BusinessObjects landscape - especially with BI 4.0 - it's all about sizing and tuning. Here is your bible, the 'sizing companion' guide: http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000738725&_OBJECT=011000358700000307202011E
    Pay particular attention to the BICSChunkSize registry setting.
    Also check the -Xmx JVM heap size for the Adaptive Processing Server that is running the DSL_Bridge service.
    Regards,
    H

  • Report Performance for GL item level report.

    Hi All,
    I have a requirement for a GL line item report, so I have created a data model like 0FI_GL_4 -> DSO -> cube. Everything tested fine, but when the report is executed in production its performance is very bad.
    The report contains the document number, GL account, company code and posting date objects.
    I have decided to do the following to improve reporting performance:
    ·         Create an aggregate on the Document and GL characteristics
    ·         Compression
    Can I fill the aggregates first and then do the compression?
    Please let me know if I am missing anything.
    Regards,
    Naani.

    Hi Naani,
    First fill the aggregates, then do the compression. Run SAP_INFOCUBE_DESIGNS and check the size of the dimensions; set the line item / high cardinality flags on large dimensions, and set the cache mode for the query in RSRT.
    Try to reduce the number of navigational attributes in the report. The document below may help you.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6071ed5f-1057-2e10-deb6-d3426fec0219?QuickLink=index&…
    Regards,
    Jagadeesh

  • 2004s Web report performance is not good ,though that of 3x web is OK.

    Hi,
    I feel that 2004s web report performance is bad, while the 3.x web runtime has no problem (the same query is used); it is even worse than the BEx Analyzer.
    This query returns more than 1,000 records, and other queries that return many records show the same bad performance.
    Of course there can be many reasons for bad performance; please tell me how you solved a similar problem.
    The SIDs of EP and BI are different here.
    CPU is not being consumed while the 2004s web report is executed.
    And I have already disabled the virus scan for this web report...
    Kind regards,
    Masaaki

    It is bad; I am sure it's down to the new .NET and Java based technology. Aggregates are a way forward, though from what I've heard of the BI Accelerator, that is the real way forward.

  • Bex Report Performance

    Dear Friends,
    I would like to know whether complex authorizations can also affect BEx report performance.
    In one of my scenarios there are two users, A and B.
    A has the relevant authorizations required for reporting, drill-down etc.
    B has SAP_ALL authorization.
    When the same report is executed by both users on the same system,
    the data is retrieved by user B (SAP_ALL authorization) quite a bit faster than by user A.
    The difference is about 10 minutes.
    There are some exclude selections in the report.
    So my conclusion is that complex authorizations do also hamper query performance.
    Please confirm and share your views.
    Thanks & Best Regards,
    Vivek Tripathi
    +91-9372313000

    Hi Vivek
    Can you help us understand what the exact problem was and how you resolved it (at the extraction / modeling / reporting end)?
    I have quite a similar issue with my report: I have a header + item report on an InfoSet.
    • The header report takes seconds and the item report takes minutes.
    • The same report executed with the exact same parameters shows inconsistent performance: one time it takes 1 minute, the next time the same report with the same user and the same authorizations takes 5 minutes.
    Any help on this would be really great. I suspect it is not an issue with the report itself, as no changes happened between the pre- and post-check.
    Additional information:
    We create secondary bitmap indexes every weekend; I do not see that as one of the root causes.
    Apart from that we only have our regular daily loads, running master data and transaction data loads in series.
    Thanks in advance.
    Much Regards
    Jagadish Thirumalachetty.
    Edited by: Jagadish Thirumalachetty on Jul 14, 2010 1:35 PM

  • FRS report performance issue

    Hello,
    We have a report developed in FRS in the below style.
    http://postimg.org/image/bn9dt630h/b9c2053d/
    Basically, all the dimensions are asked for in the POV. In the rows of the report we have two sparse dimensions that are drilled down to level 0, as shown in the report above. The report works fine when run in local currency (local currency is a stored member). When the report is run in a different currency (a dynamic member), it keeps on running for ages. We waited for 45 minutes and then had to cancel the report; when the same report was run in local currency, it gave us our results in 30 seconds.
    My thinking is that there should be a better way of showing level 0 members than using "Descendants of Current Point of View for Total_Entity AND System-defined member list Lev0,Entity", as I presume what it does is get the descendants as well as the level 0 members and then compare them. I have alternate hierarchies, hence I am using this. Isn't there a simpler way of saying: just give me the level 0 members of the member selected in the POV?
    I have used the below parameters:
    Connection - Essbase
    Suppress rows on the database connection server
    Regards,

    Hello,
    >> The report works fine when run in local currency (Local currency is a stored member). When the report runs in a different currency (dynamic member) then it keeps on running for ages.
    You are focusing on the report, but the most likely reason is the performance of the database. Of course you can reduce the query size and get your report performing again, but the root cause is likely the database design.
    I do not know of a function to drill down to the level 0 members of the selected POV member.
    If this is something different per user, then you might think about metaread filters. They would remove everything that is not granted.
    Regards,
    Philip Hulsebosch

  • Interactive report performance problem over database link - Oracle Gateway

    Hello all;
    This is regarding a thread Interactive report performance problem over database link that was posted by Samo.
    The issue I am facing is that when I use an Oracle function like apex_item.checkbox, the query slows down by about 45 seconds.
    The query is like this (due to sensitivity issues I cannot disclose the real table names):
    SELECT apex_item.checkbox(1,b.col3)
    , a.col1
    , a.col2
    FROM table_one a
    , table_two b
    WHERE a.col3 = 12345
    AND a.col4 = 100
    AND b.col5 = a.col5
    table_one and table_two are remote tables (non-oracle) which are connected using Oracle Gateway.
    Now, if I run the above query without the apex_item.checkbox function, the response time is less than a second, but with apex_item.checkbox the query runs for more than 30 seconds. I have worked around the issue by creating a collection, but that is not a good practice.
    I would like to get ideas on how to resolve this or speed up the query.
    Any idea how to use sub-factoring for the above scenario, or other methods (creating a view or materialized view is not an option)?
    Thank you.
    Shaun S.

    Hi Shaun
    Okay, I have a million questions (could you tell me if both tables are from the same remote source, it looks like they're possibly not?), but let's just try some things first.
    By now you should understand the idea of what I termed 'sub-factoring' in a previous post. This is to do with using the WITH blah AS (SELECT... syntax. In most circumstances this 'materialises' the results of the inner select statement, meaning we 'get' the results and then do something with them afterwards. It's a handy trick when dealing with remote sites, as sometimes you want the remote database to do the work. The reason that I ask you to use the MATERIALIZE hint for testing is just to force this; in 99.99% of cases it can be removed later. The WITH statement is also handled differently to an inline view like SELECT * FROM (SELECT..., but the same result can be mimicked with a NO_MERGE hint.
    Looking at your case, I would be interested to see the explain plans and results for something like the following two statements (sorry, you're going to have to check them, it's late!):
    -- Statement 1
    WITH a AS
      (SELECT /*+ MATERIALIZE */ *
       FROM table_one),
    b AS
      (SELECT /*+ MATERIALIZE */ *
       FROM table_two),
    sourceqry AS
      (SELECT b.col3 x
            , a.col1 y
            , a.col2 z
       FROM a
          , b
       WHERE a.col3 = 12345
       AND   a.col4 = 100
       AND   b.col5 = a.col5)
    SELECT apex_item.checkbox(1, x), y, z
    FROM sourceqry;

    -- Statement 2
    WITH a AS
      (SELECT /*+ MATERIALIZE */ *
       FROM table_one),
    b AS
      (SELECT /*+ MATERIALIZE */ *
       FROM table_two)
    SELECT apex_item.checkbox(1, b.col3), a.col1, a.col2
    FROM a
       , b
    WHERE a.col3 = 12345
    AND   a.col4 = 100
    AND   b.col5 = a.col5;

    If the remote tables are at the same site, then you should have the same results. If they aren't, you should get the same results as each other, but different to the original query.
    We aren't being told the real cardinality of the inner selects here, so the explain plan is distorted (this is normal for queries on remote and especially non-Oracle sites). That normally hinders tuning, but I don't think it is your problem at all. How many distinct values do you normally get for the column aliased 'x', and how many rows are returned in total? Also, how are you testing the response times: in APEX, SQL Developer, Toad, SQL*Plus, etc.?
    Sorry for all the questions but it helps to answer the question, if I can.
    Cheers
    Ben
    http://www.munkyben.wordpress.com
    Don't forget to mark replies helpful or correct ;)

  • Database migrated from Oracle 10g to 11g Discoverer report performance issu

    Hi All,
    We are now getting an issue with Discoverer report performance: the report keeps on running ever since the database was upgraded from 10g to 11g.
    In the 10g database the report works fine, but the same report does not work fine in 11g.
    I have changed the query: I passed the date format 'DD-MON-YYYY' via TO_CHAR and removed the NVL and TRUNC functions from the existing query.
    The query now works fine directly against the 11g database, but when I use the same query in Discoverer it does not; the report keeps on running.
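    For reference, the kind of change is roughly like this (the table and column names below are made up for illustration):

    -- Before: TRUNC() on the date column prevents a normal index on it from
    -- being used, so the optimizer tends to full-scan the table.
    SELECT COUNT(*)
    FROM   orders
    WHERE  TRUNC(order_date) = TO_DATE('15-NOV-2008', 'DD-MON-YYYY');

    -- After: compare the raw column against a date range instead.
    SELECT COUNT(*)
    FROM   orders
    WHERE  order_date >= TO_DATE('15-NOV-2008', 'DD-MON-YYYY')
    AND    order_date <  TO_DATE('15-NOV-2008', 'DD-MON-YYYY') + 1;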
    Please advise.
    Regards,

    Please post the exact OS, database and Discoverer versions. After the upgrade, have statistics been gathered? Have you traced the Discoverer query to determine where the performance issue is?
    How To Find Oracle Discoverer Diagnostic and Tracing Guides [ID 290658.1]
    How To Enable SQL Tracing For Discoverer Sessions [ID 133055.1]
    Discoverer 11g: Performance degradation after Upgrade to Database 11g [ID 1514929.1]
    HTH
    Srini

  • Hyperion Interactive reporting performance issue.

    Hi,
    We created a report in Hyperion Interactive Reporting using a Hyperion Essbase database connection file.
    Report performance was good in Interactive Reporting Studio; we don't have any problem in Studio.
    When we open the report in Hyperion Workspace we face a performance issue, and when I hit the refresh button to refresh the data in Workspace I get the following error message:
    *"An Interactive Reporting Service error has occurred - Failed to acquire requested service. Error Code : 2001"*
    Any suggestions to resolve this will be really helpful.
    Thanks in advance
    Thanks
    Vamsi
    Edited by: user9363364 on Aug 24, 2010 7:49 AM
    Edited by: user9363364 on Sep 1, 2010 7:59 AM

    Hi
    I also faced such an issue and then found the answer on Metalink:
    Error: "An Interactive Reporting Service Error has Occurred. Failed to Acquire Requested Service. Error Code: 2001" when Processing a bqy Report in Workspace. [ID 1117395.1]     
    Applies to:
    Hyperion BI+ - Version: 11.1.1.2.00 and later [Release: 11.1 and later ]
    Information in this document applies to any platform.
    Symptoms
    Obtaining the following error when trying to process a BQY that uses an Essbase data source in Workspace:
    "An Interactive Reporting Service error has occurred. Failed to acquire requested service. Error Code: 2001".
    Cause
    The name of the data source in the CMC contained the machine name in fully qualified name format whereas the OCE contained the machine name only. This mismatch in machine names caused the problem. Making the machine name identical in both cases resolved the problem.
    Solution
    Ensure that the name of the data source as specified in the OCE in Interactive Reporting Studio matches the name specified in the CMC tool in the field "Enter the name of the data source".
    In fact, all fields need to match between the OCE and the CMC Data Source.
    regards
    alex

  • 2014 SSRS Reports Performance issues

    Hi All,
    After upgrading SQL Server 2008 reports to SQL Server 2014, I observed a performance lag in the 2014 SSRS reports.
    Reports which used to render in under 2 seconds in 2008 are now taking over 50 seconds.
    After doing some checks on why this lag occurs, I found that it is because of the expressions in the reports: if I remove all the expressions, the report renders in under 2 seconds, otherwise it takes over 50 seconds.
    My question is: we used the same expressions in the 2008 version, which displays the report in under 2 seconds, so why does the same thing take more time in the 2014 version?
    Is expression handling different between 2008 and 2014?
    Below are the expressions used in both versions:
    IIF(ISNOTHING(Fields!Comp.Value),"-",Fields!Comp.Value) 
    IIF(ISNOTHING(Fields!Base.Value),"-",Fields!Base.Value)
    IIF(ISNOTHING(Fields!Var.Value),"-",Fields!Var.Value)
    iif(Fields!check.Value=true,"yellow","Transparent")
    Thanks in advance 
    Chandra.

    Hi Chandra,
    According to your description, the same report renders more slowly in SQL Server 2014 than in SQL Server 2008.
    In both SSRS 2008 and SSRS 2014, expressions are processed in the same way. In Reporting Services, the total time to generate a report includes TimeDataRetrieval, TimeProcessing and TimeRendering. To analyze which section takes the most time, we can check the ExecutionLog3 view in the ReportServer database. For more information, please refer to this article:
    More tips to improve performance of SSRS reports.
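    For example, a query along these lines against the ReportServer catalog database shows the breakdown per execution (the report name filter below is only a placeholder; the time columns are in milliseconds):

    SELECT TOP (20)
           ItemPath,
           TimeStart,
           TimeDataRetrieval,
           TimeProcessing,
           TimeRendering,
           Status
    FROM   ExecutionLog3
    WHERE  ItemPath LIKE '%YourReportName%'   -- replace with the real report path
    ORDER  BY TimeStart DESC;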
    After checking which section costs most of the time, you can refer to this article to optimize your report:
    Troubleshooting Reports: Report Performance.
    If possible, please share some information about your report migration.
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • Webi Reports - Performance Issues

    Hi Experts,
    Right now we are using the BO XI R2 version. We have two servers: server 1 is old and server 2 is new (an AIX server, a new upgrade of the old server).
    When I schedule the report (Webi) on both servers, the reports run successfully. But the problem is that the report scheduling time is longer on the new server (AIX) than on the old server (server 1).
    There are some performance issues.
    Example:
    Old server: 1 hr (time taken)
    New server: 2 hrs (time taken)
    Could you please tell me how to improve the Webi report performance on the new server?
    Regards,
    Sridharan Krishnan

    Hi,
    How do I enable the Excel and PDF options under 'Save as file' in InfoView?
    When I click the modify option under the public folder reports, the report opens, but I am not able to save it as Excel or PDF, since those options are disabled in InfoView.
    However, they are enabled for reports in the user's private folder.
    We have just upgraded the objects from XI R2 to BO 3.1; since there are some differences in security rights in 3.1, please tell me how to fix it.
    BO version: 3.1
    Regards,
    Sridharan

  • Web reporting performance

    Hi,
    We are experiencing very slow response times when viewing reports over the web, and we get a connection timeout in most cases. Can somebody provide guidance on how to optimize/improve web reporting performance? Thanks for your help.
    SC

    Hello:
    Since you stated explicitly "when viewing reports over the web", I assume that your performance issue has been isolated to some extent (web queries instead of general query problems).  Therefore things like aggregates and the timeout parameter are not likely your main bottleneck (you would notice the issue for all queries).
    I would start here - on SDN, Business Information Warehouse > Performance Tuning section.  The whitepaper on the performance tuning topic is available there, which is quite comprehensive.
    Since the issue appears to be web-related, I would speculate that there may be an IGS issue.  When analyzing the performance data, make sure that the front-end times for the queries you have problems with are high.  Network overhead is a component of this, although since BW query results are compressed it shouldn't be a big overhead.
    In terms of optimizing/improving web performance, your killer apps are precalculated templates / information broadcasting.  These techniques allow you to run the query ahead of time in the background and store the result set, so it can be delivered immediately when the user runs the query.
    Good luck with the performance analysis.  If you hit a wall you may consider contacting SAP Active Global Support or SAP Consulting for further assistance.
    Regards -
    Ron Silberstein, SAP

  • Report Performance - timeout short dump

    Hello Experts,
    I am trying to improve the performance of a report that was developed a long time ago.
    Issues I found:
    1. The report has many SELECT ... ENDSELECT combinations, and SELECTs inside LOOP statements.
    2. Most of the SELECTs use the addition 'INTO CORRESPONDING FIELDS OF' to select just a few fields, without the TABLE addition.
    3. A few SELECTs also use the 'SELECT * FROM' syntax.
    data: begin of itab occurs 0,
            f1,
            f2,
            f3,
            ...
            fn,
          end of itab.
    Ex:
    loop at itab.
      select f1 f2 f3 from table1
             into corresponding fields of itab1.
        collect itab1.
      endselect.
      select f4 f5 from table2
             into corresponding fields of itab2.
      endselect.
    endloop.
    All this leads to performance issues.
    I have checked ST05 and got the details of the error.
    My question is: which of the reasons I mentioned above is the major factor in slowing down the report?
    Which of the above should I concentrate on first to get the long runtime down? My goal is to keep my changes to a minimum and still improve the performance. Please advise.

    > My question is which one of the reasons i mentioned above are a major factor in delaying the report
    > performance?
    Don't ask people for guesses, if you can see the facts!
    Run the SQL trace several times, then go to 'Trace List' -> 'Summarize Trace by SQL Statement'.
    => This shows you the total DB time and the time per statement (all executions); the problems are at the top of the list.
    Check ABAP, detail, and explain!
    Read more here:
    /people/siegfried.boes/blog/2007/09/05/the-sql-trace-st05-150-quick-and-easy
    Siegfried
