About report performance

Hi Friends,
I created a report with 45 ref cursors. All the ref cursors are in a package, and the package is on the database side. The report runs on the report server.
When I run the report through the application, it takes about 50% of the CPU for around 40 seconds.
Is this a report performance problem?
If I have more ref cursors in the report, will there be a problem with report performance?
Can somebody help me?
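
For context, the setup described above usually means the report binds its queries to packaged ref cursors on the database side. A minimal sketch of that pattern, with all names invented for illustration (not from the original post):

-- Hypothetical packaged ref cursor of the kind the report would call.
create or replace package report_data as
  type t_cur is ref cursor;
  function get_invoices(p_from date, p_to date) return t_cur;
end report_data;
/
create or replace package body report_data as
  function get_invoices(p_from date, p_to date) return t_cur is
    c t_cur;
  begin
    open c for
      select invoice_no, invoice_date, amount
      from   invoice
      where  invoice_date >= p_from
      and    invoice_date <  p_to + 1;
    return c;
  end get_invoices;
end report_data;
/

The number of ref cursors by itself matters less than what each cursor's query does: 45 cursors that each run an efficient query can be fine, while a single cursor with a bad plan can account for most of the 40 seconds.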

One performance consideration: try to avoid multiple similar queries, or even repeats of the same query. Is

select <...> into v_comm        -- the SELECT line was cut off in the original post
from invoice
where trunc(invoice_date) between :date1 and :date2
and currency_code = '$' -- sometimes 'euro' and so on
and ISSUE_PLACE = 'xx'
and investor_code = :investor_code;
return (v_comm * 5.5137);

in the main query? Can those formulas be included in, or replaced by, the main query? Are appropriate indexes created for the joins?
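
To make that concrete: a hedged sketch of folding such a formula into the main query so it runs as part of one statement instead of once per record (column and alias names are placeholders, not from the post). Note also that trunc(invoice_date) stops an index on invoice_date from being used; an equivalent range on the raw column is sargable:

-- Hypothetical rewrite of the formula as a plain aggregate in the main query.
select inv.investor_code,
       sum(inv.commission) * 5.5137 as commission_converted  -- was return(v_comm*5.5137)
from   invoice inv
where  inv.invoice_date >= :date1
and    inv.invoice_date <  :date2 + 1   -- same rows as trunc(invoice_date) between
                                        -- :date1 and :date2, assuming whole dates
and    inv.currency_code = '$'
and    inv.issue_place   = 'xx'
group  by inv.investor_code;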

Similar Messages

  • Report performance while creating report on BEx

    Hi all!
    I am creating a report in BOE 4.0 on top of a BEx connection as the source. I have developed reports on top of a universe in the past, and I know that if we keep calculations on the reporting end it hampers report performance. Is this the same case with BEx? If we follow best practices, is it fair to say that we should keep all heavy calculations/aggregations in BEx or on the backend for better report performance?
    Can you please share your opinions based on your experience and knowledge. Any feedback will help! Thanks.

    Hi,
    It is definitely best practice to delegate as many CKFs as possible to the cube, put RKFs in the BEx query, and filters too.
    Also, add default values to your variables (this will speed up generation of the BICS transient universe).
    Also, since Patch 2.10 we are seeing some significant performance improvements, reducing 'document initialization' and 'time to prompts' by up to 50% (steps such as these often took 1.5 minutes, even on properly sized systems).
    Also, make sure you have BW corrections like this implemented: 1593802 - Performance optimization when loading query views.
    In the BusinessObjects landscape - especially with BI 4.0 - it's all about sizing and tuning. Here is your bible, the 'sizing companion' guide: http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000738725&_OBJECT=011000358700000307202011E
    Pay particular attention to the BICSChunkSize registry settings.
    Also, the -Xmx JVM heap size for the Adaptive Processing Server that is running the DSL_Bridge service.
    Regards,
    H

  • Bad reporting performance after compressing infocubes

    Hi,
    as I learned, we should compress requests in our InfoCubes. And since we're using Oracle 9.2.0.7 as the database, we can use partitioning on the E fact table to further increase reporting performance. So far the theory...
    After getting complaints about worsening reporting performance we tested this theory. I created four InfoCubes (same data model):
    A - no compression
    B - compression, but no partitioning
    C - compression, one partition for each year
    D - compression, one partition for each month
    After loading 135 requests and compressing the cubes, we get this amount of data:
    15.6 million records in each cube
    Cube A: 135 partitions (one per request)
    Cube B:   1 partition
    Cube C:   8 partitions
    Cube D:  62 partitions
    Now I copied one query onto each cube and tested the performance with it (transaction RSRT, without aggregates and cache, comparing the database times QTIMEDB and DMTDBBASIC). In the query I always selected one month, some hierarchy nodes and one branch.
    With this selection on each cube, I expected that cube D would be fastest, since we only have one (small) partition with relevant data. But reality shows a different picture:
    Cube A is fastest with an avg. time of 8.15, followed by cube B (8.75, +8%), cube C (10.14, +24%) and finally cube D (26.75, +228%).
    Does anyone have an idea what's going wrong? Are there some db parameters to "activate" the partitioning for the optimizer? Or do we have to do some other customizing?
    Thanks for your replies,
    Knut

    Hi Björn,
    thanks for your hints.
    1. After compressing the cubes I refreshed the statistics in the InfoCube administration.
    2. Cube C is partitioned using 0CALMONTH, cube D is partitioned using 0FISCPER.
    3. Here we are: all queries are filtered using 0FISCPER. Therefore I could increase the performance on cube C, but still not on D. I will change the query on cube C and do a retest at the end of this week.
    4. The loaded data spans 10 months. The records are nearly equally distributed over these 10 months.
    5. Partitioning was done for the period 01.2005 - 14.2009 (01.2005 - 12.2009 on cube C). So I have 5 years - the 8 partitions on cube C are the result of a slight miscalculation on my side: 5 years + 1 partition before + 1 partition after => I set the max. no. of partitions to 7, not thinking of BI, which always adds one partition for the data after the requested period... So each partition on cube C does not contain one full year but something like 8 months.
    6. Since I tested the cubes one after another without much time in between, the system load should be nearly the same (on top of that: it was a Friday afternoon...). Our BI is clustered with several other SAP installations on a big Unix server, so I cannot see the overall system load. But I did several runs with each query, and the mentioned times are averages over all runs - and the average shows the same picture as the single runs (cube A is always fastest, cube D always the worst).
    Any further ideas?
    Greets,
    Knut
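
    A hedged aside on the "activate the partitioning" question above (not from the original thread): Oracle decides partition pruning per query from the predicates, so it is worth confirming in the execution plan that a restricted query really touches only the expected partitions. A minimal check, with a placeholder table name standing in for the cube's E fact table:
    explain plan for
      select count(*)
      from   "/BIC/EZSALES"           -- hypothetical E fact table name
      where  sid_0fiscper = :p;       -- restriction on the partitioning column
    select * from table(dbms_xplan.display);
    -- Pstart = Pstop (PARTITION RANGE SINGLE) means pruning worked;
    -- PARTITION RANGE ALL means every partition was scanned.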

  • Bex Report Performance

    Dear Friends,
    I would like to know whether complex authorizations can also affect BEx report performance.
    One of my scenarios is like this: there are two users, A and B.
    A has the relevant authorizations for reporting, drill-down etc. which are required.
    B has the SAP_ALL authorization.
    When the same report is executed by both users on the same system,
    the data retrieved by user B (SAP_ALL authorization) comes back quite a bit faster than for user A.
    The difference is about 10 minutes.
    There are some exclude selections in the report.
    So my conclusion is that complex authorizations do hamper query performance.
    Please confirm & share your views.
    Thanks & Best Regards,
    Vivek Tripathi
    +91-9372313000

    Hi Vivek
         Can you help us understand what the exact problem was and how you resolved it / the solution at the extraction / modeling / reporting end?
         I have a quite similar issue with my report: I have a Header + Item report on an InfoSet.
    •     The header report takes seconds and the item report takes minutes.
    •     The same report executed with the exact same parameters has inconsistent performance results, meaning one time it takes 1 minute and the next time the same report, same user and same authorization takes 5 minutes.
        Any help on this would be really great. I suspect it is not an issue with the report at all, as no changes happened between the pre and post check.
    Additional information:
    We create a secondary bitmap index every weekend; I do not see that as one of the root causes.
    Apart from that, we have our regular daily loads for master data and transaction data running in series.
       Thanks in advance.
    Much regards,
    Jagadish Thirumalachetty.
    Edited by: Jagadish Thirumalachetty on Jul 14, 2010 1:35 PM

  • FRS report performance issue

    Hello,
    We have a report developed in FRS in the style below.
    http://postimg.org/image/bn9dt630h/b9c2053d/
    Basically, all the dimensions are asked for in the POV. In the rows of the report, we have two sparse dimensions that are drilled down to level 0, as shown in the report above. The report works fine when run in local currency (local currency is a stored member). When the report runs in a different currency (a dynamic member), it keeps on running for ages. We waited for 45 minutes and then had to cancel the report; when the same was run in local currency, it gave us our results in 30 seconds.
    My thinking is that there should be a better way of showing level 0 members than using "Descendants of Current Point of View for Total_Entity AND System-defined member list Lev0,Entity", as I presume what it does is get the descendants as well as the level 0 members and then intersect them. I have alternate hierarchies, hence I am using this. Isn't there a simple way of saying: just give me the level 0 members of the member selected in the POV?
    I have used the parameters below:
    Connection - Essbase
    Suppress rows on database connection server
    Regards,

    Hello,
    >> The report works fine when run in local currency (local currency is a stored member). When the report runs in a different currency (a dynamic member), it keeps on running for ages.
    You are focusing on the report. The most likely reason is the performance of the database. Of course, you can reduce the query size and get your report performing again, but the root cause is likely the database design.
    I do not know a function to drill down to the level 0 members of the selected POV member.
    If this is something different per user, then you might think about meta-read filters. They would remove everything that is not granted.
    Regards,
    Philip Hulsebosch

  • Apex report performance is very poor with apex_item.checkbox row selector.

    Hi,
    I'm working on a report that includes some functionality to be able to select multiple records for further processing.
    The report is based on a view that contains a couple of hundred thousand records.
    When I make a selection from this view in SQL*Plus, the performance is acceptable, but the APEX report based on the same view performs very poorly.
    I've noticed that when I omit the apex_item.checkbox from my report query, performance is on par with SQL*Plus (a factor of 10 or so quicker).
    The explain plan appears to be the same with or without the checkbox function in the select.
    My query is:
    select apex_item.checkbox(1, tan_id) "Select",   -- alias quoted: SELECT is a reserved word
           brt_id,
           tan_id,
           message_id,
           conversation_id,
           action,
           to_acn_code,
           information,
           brt_created,
           tan_created
    from  (select brt.id brt_id,                     -- view query
                  max(tan.id) tan_id,
                  brt.message_id,
                  brt.conversation_id,
                  brt.action,
                  tan.to_acn_code,
                  tan.information,
                  brt.created brt_created,
                  tan.created tan_created
           from  (select brt_id, id, to_acn_code, information, created
                  from   xxcjib_transactions
                  where  tan_type = 'DELIVER' and status = 'FINISHED') tan,
                  xxcjib_berichten brt
           where  brt.id = tan.brt_id
           group by brt.id,
                  brt.message_id,
                  brt.conversation_id,
                  brt.action,
                  tan.to_acn_code,
                  tan.information,
                  brt.created,
                  tan.created)
    What could be the reason for the poor performance of the APEX report?
    And is there another way to select multiple report records without the apex_item.checkbox function?
    I'm using APEX 3.2 on an Oracle 10g database.
    Thanks,
    Niels Ingen Housz
    Edited by: user11986529 on 19-Mar-2010 4:06

    Thanks for your reply.
    Unfortunately changing the pagination doesn't make much of a difference in this case.
    Without the checkbox the query takes 2 seconds.
    With the checkbox it takes well over 30 seconds.
    The second report region on this page, based on another view, seems to perform reasonably well with or without the checkbox.
    It has about the same number of records but a different view query.
    There are also a couple of filter items in the where clause of the report queries (the same for both reports) based on date and acn_code, and both reports have a select-list item displayed in their regions based on a simple LOV. These filter items don't seem to influence the performance.
    I have also recreated the report on a separate page without any other page items or where clause, and the same thing occurs.
    With the checkbox it is very, very slow (more like 20 times slower).
    Without it, the report performs well.
    And another thing: when I run the page with debug on, I don't see the actual report query:
    0.08: show report
    0.08: determine column headings
    0.08: activate sort
    0.08: parse query as: APEX_CMA_ONT
    0.09: print column headings
    0.09: rows loop: 30 row(s)
    and then the region is displayed.
    I am using database links in the views, b.t.w.
    Edited by: user11986529 on 19-Mar-2010 7:11
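
    A hedged aside (not from the thread): the database links may matter here. A PL/SQL function such as apex_item.checkbox in the SELECT list of a distributed query can prevent Oracle from shipping the work to the remote site, so rows are pulled across the link and the function is evaluated locally for every candidate row. One workaround sketch, with the view name and row cap as placeholders: materialize the capped result first, then apply the checkbox:
    with base as (
      select /*+ materialize */ brt_id, tan_id, message_id, conversation_id,
             action, to_acn_code, information, brt_created, tan_created
      from   my_report_view           -- the view over the database link
      where  rownum <= 500            -- cap the rows before the PL/SQL call
    )
    select apex_item.checkbox(1, b.tan_id) "Select", b.*
    from   base b;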

  • How can we improve report performance?

    Hi experts,
    I am learning BusinessObjects XI R2. Please let me know: how can we improve report performance?
    Please give the answer in a detailed way.

    First find out why your report is performing slowly. Then fix it.
    That sounds silly, but there's really no single-path process for improving report performance. You might find issues with the report. With the network. With the universe. With the database. With the database design. With the query definition. With report variables. With the ETL. Once you figure out where the problem is, then you start fixing it. Fixing one problem may very well reveal another. I spent two years working on a project where we touched every single aspect of reporting (from data collection through ETL and all the way to report delivery) at some point or another.
    I feel like your question is a bit broad (meaning too generic) to address as you have phrased it. Even some of the suggestions already given...
    Array fetch size - this determines the number of rows fetched in a single pass. You really don't need to modify this unless your network is giving you issues. I have seen folks suggest setting it to one (which results in a lot of network requests) or 500 (which results in fewer, but much MUCH larger, requests). Does either improve performance? They might, or they might make it worse. Without understanding how your network traffic is managed, it's hard to say.
    Shortcut joins? Sure, they can help, as long as they are appropriate. [Many times they are not.|http://www.dagira.com/2010/05/27/everything-about-shortcut-joins/]
    And I could go on and on. The bottom line is that performance tuning doesn't typically fall into a "cookie cutter" approach. It would be better to have a specific question.

  • Is there any benchmarks about the performance of BI web Intelligence

    Need to know any benchmarks about the performance of BusinessObjects Web Intelligence reports against the number of records stored in a relational database.

    There is no such benchmark. However, you can compare the actual time taken by the report query when fired directly on the DB with the time taken by the report to execute; if there is a lot of difference, then those reports would be ideal candidates for tuning.
    E.g. if a month-wise revenue report takes 5 mins and the actual query takes only 1.5 mins when fired directly on the DB, that means something is wrong with the universe/reports.
    I hope this helps you get started.
    Please share your experience; it's interesting to know.
    --Kuldeep
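
    As a minimal illustration of that comparison (the query below is a placeholder; in practice you would paste the exact SQL the report generates, which Web Intelligence shows in the query panel):
    -- SQL*Plus / SQL Developer: time the report's generated query directly on the DB
    set timing on
    select revenue_month, sum(revenue)   -- placeholder for the generated SQL
    from   sales
    group  by revenue_month;
    -- Compare "Elapsed" here with the report's total refresh time; a large gap
    -- points at the universe/report layer rather than at the database.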

  • 2014 SSRS Reports Performance issues

    Hi All,
    After upgrading SQL 2008 reports to SQL 2014, I observed a performance lag in the 2014 SSRS reports:
    2008 reports which used to render in <2 secs are now taking >50 secs.
    After doing some checks on why this lag occurred, I found that it is because of the expressions in the reports. If I remove all the expressions, the report renders in <2 secs; otherwise it takes >50 secs.
    My question here is: we used the same expressions in the 2008 version, which displays the report in <2 secs, so why does the same thing take more time in the 2014 version?
    Is expression handling different in 2008 and 2014?
    Below are the expressions used in both versions:
    IIF(ISNOTHING(Fields!Comp.Value),"-",Fields!Comp.Value)
    IIF(ISNOTHING(Fields!Base.Value),"-",Fields!Base.Value)
    IIF(ISNOTHING(Fields!Var.Value),"-",Fields!Var.Value)
    IIF(Fields!check.Value=true,"yellow","Transparent")
    Thanks in advance,
    Chandra.

    Hi Chandra,
    According to your description, the same report renders more slowly in SQL Server 2014 than in SQL Server 2008.
    In both SSRS 2008 and SSRS 2014, expressions are processed in the same way. In Reporting Services, the total time to generate a report includes TimeDataRetrieval, TimeProcessing and TimeRendering. To analyze which section takes the most time, we can check the view ExecutionLog3 in the ReportServer database. For more information, please refer to this article:
    More tips to improve performance of SSRS reports.
    After checking which section costs the most time, you can refer to this article to optimize your report:
    Troubleshooting Reports: Report Performance.
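    For reference, a minimal query against that log might look like this (standard ExecutionLog3 columns; TOP 20 is just an example cut-off):
    -- Most recent executions, with where the time went (milliseconds).
    SELECT TOP 20
           ItemPath,
           TimeDataRetrieval,   -- query/data time
           TimeProcessing,      -- grouping, sorting, expression evaluation
           TimeRendering,       -- rendering to the output format
           TimeStart
    FROM   ReportServer.dbo.ExecutionLog3
    ORDER BY TimeStart DESC;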
    If possible, please share some information about your report migration.
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • Report about reports

    Hi Gurus
    How can I create a report (using Presentation Services) which contains information about reports and dashboards?
    I mean, to see the content of the Presentation Catalog in report form.
    Thanks
    Laszlo

    Hi Laszlo
    I am not a guru, so this is only my opinion:
    The content of the catalog is stored in files in a folder tree. These files are used by a Java-based service that queries the BI Server, performs some computations and displays the whole thing in the pretty report you asked about.
    So I don't think there is any easy way to query the catalog. I think it would mean digging in the folders and files to try to get some data, storing it in some Oracle table and reporting on that. Tricky. Some files contain XML (the report description) so you can most probably load that. But the attribute files are harder to understand. They most probably contain security and other settings information...
    Maybe another source of information could be the NQ server log files. You would then raise the users' log level and read the log files when you want to analyze some query. Again, not sure this is very easy, and most probably not following best practices.
    Good luck!
    Ced.
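
    In the same spirit as the log-level idea above (a hedged aside, not from the thread): if Usage Tracking is enabled, the BI Server writes one row per query to the S_NQ_ACCT table, which at least makes report usage queryable, even though it does not expose the full catalog structure:
    -- Which reports/dashboards actually get run, from Usage Tracking.
    select saw_dashboard,
           saw_src_path,              -- catalog path of the request
           count(*)            as runs,
           avg(total_time_sec) as avg_secs
    from   s_nq_acct
    group  by saw_dashboard, saw_src_path
    order  by runs desc;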

  • Improving Report Performance

    Hi HTML DB Team,
    I need some of the undocumented HTML DB information to improve the performance of reports; it will be useful to everyone.
    I have a Customer table, which has 1 million records in it. It has the following columns:
    First Name, Last Name, Customer ID, State.
    I have a Report Region (PL/SQL block returning the query), which directly picks up records from this table.
    The user can filter customer records by any of the above-mentioned columns.
    Without any sorting I am getting the response in 2 seconds.
    But if I put a sort on any of these columns, my response time goes to 1 minute.
    I have indexes on all the columns too.
    If the user applies a filter like "State=NY" or "First Name=Balaji" I get the response in 2-3 seconds.
    Now I want to keep the report response time within 2-3 seconds (without any filter condition, and sorting on some column, say "First Name").
    In order to achieve this I want to dynamically add some filter conditions to the original query when there are no filter conditions specified by the user.
    My approach is based on the assumption that the user will be shown only the first 500 records of the query results (this is the normal HTML DB report behavior, and I don't want to increase the report region max row count beyond 500).
    My original report region query is
    SELECT * FROM CUSTOMER ORDER BY FIRST_NAME; -- takes 1 minute and returns all 1 million records, of which we show only the first 500.
    If I rewrite this query as
    SELECT * FROM CUSTOMER WHERE FIRST_NAME < 'B' ORDER BY FIRST_NAME; -- takes 2 seconds and returns more than 1000 records.
    Now I can use the second query to show the records if the user did not specify any filters, and he will be shown only the first 500 records.
    Now my issue is: if the user changes the sort order to "Last Name" then the above query will not work, and I need to change my query to
    SELECT * FROM CUSTOMER WHERE LAST_NAME < 'B' ORDER BY LAST_NAME;
    Similarly, if the user selects State as the sort order:
    SELECT * FROM CUSTOMER WHERE STATE = 'AK'; -- will definitely give me > 500 records.
    Also I need to consider whether the user is sorting in ascending or descending order.
    From where can I get this report sort order information in HTML DB?
    If you can provide this information, it will be of great use to all HTML DB folks.
    Regards
    Balaji. C
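
    A hedged aside (not from the thread): the standard Oracle top-N pattern gives the same 500-row cap for any sort column without per-column WHERE hacks, and with an index on the sort column the optimizer can stop after the first 500 rows:
    -- Top-N for an arbitrary sort column; 500 matches the report's max row count.
    select *
    from  (select c.*
           from   customer c
           order  by first_name)     -- swap in last_name / state, asc or desc
    where rownum <= 500;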

    Hi!
    What exactly did you find at the mentioned link regarding report performance? It is not about performance issues for reports; it is about navigation from a report row to a page.
    Please, can you tell us more exactly what solved your problem?
    I have the same situation: a query running in about 1 second directly on the database from SQL Developer, but from the APEX page it takes about 55 seconds.
    Thank you.
    Edited by: bustiuci on Nov 15, 2008 9:46 PM

  • Turning off 504 non reporting performance collection rules in the Exchange 2010 MP

    Hello
    I read the following MS article about tuning the Exchange 2010 Management Pack:
    http://support.microsoft.com/kb/2592561
    It advises turning off 504 non-reporting performance collection rules, then turning back on only the ones of interest.
    However, it does not state how to identify / locate these 504 rules. I would prefer to find/disable them via PowerShell rather than manually.
    Can someone please advise how to locate these 504 rules and disable them en masse?
    Thanks
    AAnotherUser__

    You'll need a script that performs the following steps:
    Retrieve the management pack in which to store the overrides
    Retrieve the class that will be targeted by the override
    Retrieve the rules or monitors that will be disabled
    Disable the rule or monitor
    To disable rules in bulk, here is a sample that disables all rules matching the "*events/sec*" filter:
    $MP = Get-SCOMManagementPack -DisplayName "Exchange 2010 Overrides" |
          Where-Object { $_.Sealed -eq $False }
    $Class = Get-SCOMClass -DisplayName "Exchange Performance rule"
    $Rule  = Get-SCOMRule -DisplayName "*Events/sec*"
    Disable-SCOMRule -Class $Class -Rule $Rule -ManagementPack $MP -Enforce
    Also, you can refer to the link below:
    http://www.systemcentercentral.com/opsmgr-2012-disabling-rules-and-monitors-in-bulk-in-powershell/
    Please remember, if you see a post that helped you please click "Vote As Helpful", and if it answered your question, please click "Mark As Answer".

  • Report Performance - timeout short dump

    Hello Experts,
    I am trying to improve the performance of a report that was developed a long time ago.
    Issues I found:
    1. The report has many SELECT...ENDSELECT combinations, and SELECTs inside LOOP statements.
    2. Most of the SELECTs have the addition 'INTO CORRESPONDING FIELDS OF' for selecting a few fields, without the TABLE addition.
    3. A few SELECTs also use the 'SELECT * FROM' syntax.
    data: begin of itab occurs 0,
            f1,
            f2,
            f3,
            ...
            fn,
          end of itab.
    Ex:
    loop at itab.
      " selects inside the loop run once per itab row
      select f1 f2 f3 from table1
        into corresponding fields of itab1.
        collect itab1.
      endselect.
      select f4 f5 from table2
        into corresponding fields of itab2.
      endselect.
    endloop.
    All this leads to performance issues.
    I have checked ST05, and I have the details of the error.
    My question is: which of the reasons I mentioned above is the major factor in delaying the report performance?
    Which of the above should I concentrate on first to get the long runtime down? My goal is to keep my changes to a minimum and improve the performance. Please advise.

    > My question is which one of the reasons i mentioned above are a major factor in delaying the report
    > performance?
    Don't ask people for guesses, if you can see the facts!
    Run the SQL trace several times, then go to 'Trace List' -> 'Summarize Trace by SQL Statement'.
    => This shows you the total DB time and the time per statement (all executions); the problems are at the top of the list.
    Check ABAP, detail, and explain!
    Read more here:
    /people/siegfried.boes/blog/2007/09/05/the-sql-trace-st05-150-quick-and-easy
    Siegfried

  • Report Performance degradation

    hi,
    We are using around 16 entities in CRM On Demand R16, which includes both default and custom entities.
    Since custom entities are not visible in the historical subject area, we decided to stick to real-time reporting.
    Now the issue is, we have a total of 4.5 million (45 lakh) records in these entities as a whole. We have reports where we need to retrieve data across all the entities in one report. Initially we tested the reports with a smaller number of records; the report performance was not that bad, but it has gradually degraded as we loaded more and more data over a period of time. The reports now take approx. 5-10 min and then finally display an error message. In fact, after creating a report structure in Step 1 - Define Criteria and moving to Step 2 - Create Layout, it takes an abnormal amount of time to display. As far as the reports are concerned, we have built them using best practices, except for the "historical subject area issue".
    Ideally, for best performance, how many records should there be in one entity?
    What could be the other reasons for such performance?
    We are working in a multi-tenant environment.
    Edited by: Rita Negi on Dec 13, 2009 5:50 AM

    Rita,
    Any report built over the real-time subject areas will time out after a period of 10 minutes. Real-time subject areas are really not suited to large reports, and you'll find running them also degrades application performance.
    Things that will degrade performance are:
    * Joins to other dimensions
    * Custom calculations
    * Number of records
    * Number of fields returned
    There are some things that just can't be done in real time. I would look to remove joins to other dimensions, e.g. Accounts/Contacts/Opportunities all in the same report. Apply more restrictive filters, e.g. current week/month, to reduce the number of records required. Alternatively, have a very simple report, extract to Excel and modify from there. Hopefully in R17 this will be added as a feature, but it seems you're stuck till then.
    Thanks
    Oli @ Innoveer

  • Report Performance for GL item level report.

    Hi All,
    I have a requirement for a GL line items report, so I have created a data model like 0FI_GL_4 -> DSO -> cube and tested it; everything is fine, but when it is executed in production the report performance is very bad.
    The report contains document number, GL account, company code and posting date objects.
    I have decided to do the following to improve reporting performance:
    ·         Create an aggregate on the document and GL characteristics
    ·         Compression
    Can I fill the aggregates first and then do the compression?
    Please let me know if I am missing anything.
    Regards,
    Naani.

    Hi Naani,
    First fill the aggregates, then do the compression. Run SAP_INFOCUBE_DESIGNS to check the size of the dimensions; maintain the Line Item / High Cardinality flags on large dimensions; set the cache mode for the query in RSRT; and try to reduce navigational attributes in the report. The document below may help you.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6071ed5f-1057-2e10-deb6-d3426fec0219?QuickLink=index&…
    Regards,
    Jagadeesh

Maybe you are looking for

  • How to get the first 4 chars from a var?

    Hi guys, how do I get the first 4 chars from a var? i.e. temp type num20 value '00000000000012345678'. How do I move the first 4 chars to another var? Thanks in advance.

  • External display recommendations?

    I have a black MacBook (2.0ghz, 2 gigs of ram, etc.) and am in the market for a new external display. I'm currently running a 19" LCD (VGA) as an extended display, but the color is rather miserable and the image quality is awful. I've been leaning to

  • FlexUnit ant - running tests that make server calls

    Hi, I am using BlazeDS in my project to connect to a Tomcat server. I have configured my services in the services-config.xml file and passed the location of the file as an argument to the compiler in Flash Builder. I have a few unit tests that make a

  • VNX Monitoring MP not working

    Hi, I have imported VNX monitoring MP in our environment. I installed the navisphere host agent on the same server where the event will be sent to, from the Storage array. I added the storage array and switches through network discovery. I created th

  • IPhoto 10 will not open, 7 gigs worth of pictures, can't see any

    iphoto will not launch, gets hung up and have to force quit to stop it, tried reinstalling but did not help.