BW Report Performance, Accuracy of Data

Hi,
Can someone help give explanations to the following questions:
1.) Does my BW Report show how current my data is?
2.) What are the reasons why the performance of my BW Report is slow?
3.) What are the reasons why my BW Report has missing data?
4.) Why does my BW Report have incorrect data?
5.) Why doesn't my BW Report data match SAP R/3 data?
Thanks,
Milind
Locked - duplicate post and very generic questions
Report performance and data quality
Edited by: Arun Varadarajan on Apr 9, 2010 2:07 AM

Hi,
1.) Does my BW Report show how current my data is?
Yes. The last refresh of your data is shown in the query properties. Run the report and check the details for the last refresh.
2.) What are the reasons why the performance of my BW Report is slow?
Reasons could be:
Poor design
Business logic (transformations)
Navigational attributes used in the reports
Time-dependent master data
Missing aggregates
Data volume in the cubes or DSOs
http://wiki.sdn.sap.com/wiki/display/BI/SomeusefulT-CodeforBIperformancetuning
3.) What are the reasons why my BW Report has missing data?
Check the source system data and check the mapping in the transformation together with all the business logic (see the start-routine sketch below).
4.) Why does my BW Report have incorrect data?
It depends on whether you are loading from flat files or from R/3, and whether you are cleansing the data once it enters BW.
5.) Why doesn't my BW Report data match SAP R/3 data?
Check the source system data in RSA3, pick one document, and trace the same document in BI.
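As a purely hypothetical illustration of answers 3.) and 4.): a filter or hard-coded mapping in the transformation itself is a common cause. A minimal sketch of a start-routine line that would silently drop records during the load (the field name RECORDTYPE and the value 'X' are invented for illustration only):
* Body of a BW 7.x start routine - records deleted here never reach the
* InfoProvider, so the query shows "missing" data even though the load
* request finished green.
DELETE SOURCE_PACKAGE WHERE recordtype = 'X'.
* Similarly, a wrong unit or currency conversion, or a hard-coded constant
* in a field or end routine, produces "incorrect" rather than missing data.
Comparing a single document in RSA3 against the loaded data, as suggested for question 5.), is the quickest way to spot both cases.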
Thanks!
@AK

Similar Messages

  • Report performance and data quality

    Hi,
    Can someone help give explanations to the following questions:
    1.) Does my BW Report show how current my data is?
    2.) What are the reasons why the performance of my BW Report is slow?
    3.) What are the reasons why my BW Report has missing data?
    4.) Why does my BW Report have incorrect data?
    5.) Why doesn't my BW Report data match SAP R/3 data?
    Thanks,
    Milind
    Please do not raise generic questions across multiple forums
    Edited by: Arun Varadarajan on Apr 9, 2010 2:08 AM

    Milind,
    1.) Does my BW Report show how current my data is?
    You should be able to see the data currency when you run it on the web - which method are you using, BEx or Web...?
    2.) What are the reasons why the performance of my BW Report is slow?
    It could be due to anything - please search the forums for how to identify possible performance bottlenecks.
    3.) What are the reasons why my BW Report has missing data?
    It depends - missing data loads etc. etc.
    4.) Why does my BW Report have incorrect data?
    You should know that...? I can just as well say that it has incorrect data because "the sun rises in the east"...!!! It is more akin to asking "Why did the chicken cross the road?"
    5.) Why doesn't my BW Report data match SAP R/3 data?
    You should ask SAP that question...
    Honestly, I am not sure what the reason behind such generic questions is... If you are looking for answers, then you need to be more specific. If these are more like interview questions asked to you, I guess you should be able to answer them, or ask further questions to clarify the question further...

  • Slow report performance when filter by 1 date

    Post Author: poh_michelle
    CA Forum: Formula
    Record selection formula : {PartTran.TransDate} in {?FromDate} to {?ToDate}
    Report performance was okay when I selected #08/10/2007# for {?FromDate} and #10/10/2007# for {?ToDate}; the report displayed within 5 seconds. However, the report slowed down when I selected the same date (#10/10/2007#) for both {?FromDate} and {?ToDate}; the report displayed only after 20 minutes.
    What is the reason, and what can be done to improve it?
    Thanks for any advice.

    Post Author: foghat
    CA Forum: Formula
    Not sure why selecting a single date would slow down your report. Are you sure something else wasn't/isn't going on? You could give this a try: {PartTran.TransDate} >= {?FromDate} and {PartTran.TransDate} <= {?ToDate}

  • Report Performance - timeout short dump

    Hello Experts,
    I am trying to improve the performance of a report that was developed a long time ago.
    Issues I found:
    1. The report has many SELECT ... ENDSELECT combinations, and SELECTs inside LOOP statements.
    2. Most of the SELECTs use the addition 'INTO CORRESPONDING FIELDS OF' to select a few fields, without the TABLE addition.
    3. A few SELECTs use the 'SELECT * FROM' syntax.
    DATA: BEGIN OF itab OCCURS 0,
            f1,
            f2,
            f3,
            " ... further fields ...
            fn,
          END OF itab.
    Ex:
    LOOP AT itab.
      SELECT f1 f2 f3 FROM table1
             INTO CORRESPONDING FIELDS OF itab1.
        COLLECT itab1.
      ENDSELECT.
      SELECT f4 f5 FROM table2
             INTO CORRESPONDING FIELDS OF itab2.
      ENDSELECT.
    ENDLOOP.
    All this leads to performance issues.
    I have checked ST05, and I have the details of the error.
    My question is: which of the reasons I mentioned above is the major factor in delaying the report performance?
    Which of the above should I concentrate on first to get the long runtime down? My goal is to keep my changes to a minimum and still improve the performance. Please advise.

    > My question is which one of the reasons i mentioned above are a major factor in delaying the report
    > performance?
    Don't ask people for guesses, if you can see the facts!
    Run the SQL Trace several times, then go to 'Trace List' -> 'Summarize Trace by SQL Statement'.
    => This shows you the total DB time and the time per statement (all executions); the problems are at the top of the list.
    Check ABAP, detail, and explain!
    Read more here:
    /people/siegfried.boes/blog/2007/09/05/the-sql-trace-st05-150-quick-and-easy
    Siegfried
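    If the summarized trace does point at the SELECT ... ENDSELECT blocks inside the loop, the usual rework is one array fetch before the loop plus a sorted read inside it. A minimal sketch, keeping the poster's placeholder names (itab, table1, f1-f3) and assuming f1 is the field that links itab to table1:
    TYPES: BEGIN OF ty_t1,
             f1 TYPE c LENGTH 10,                 " placeholder types only
             f2 TYPE c LENGTH 10,
             f3 TYPE c LENGTH 10,
           END OF ty_t1.
    DATA: lt_t1 TYPE STANDARD TABLE OF ty_t1,
          ls_t1 TYPE ty_t1.
    IF itab[] IS NOT INITIAL.
      SELECT f1 f2 f3
        FROM table1
        INTO TABLE lt_t1
        FOR ALL ENTRIES IN itab
        WHERE f1 = itab-f1.                       " assumed link field
    ENDIF.
    SORT lt_t1 BY f1.
    LOOP AT itab.
      READ TABLE lt_t1 INTO ls_t1
           WITH KEY f1 = itab-f1 BINARY SEARCH.   " no database access inside the loop
      IF sy-subrc = 0.
        " work with ls_t1 here instead of the old SELECT ... ENDSELECT
      ENDIF.
    ENDLOOP.
    The same pattern applies to the table2 block; whether these loops are really where the time goes is exactly what the summarized trace list will show.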

  • Report Performance degradation

    hi,
    We are using around 16 entities in CRM On Demand R16, which includes both default as well as custom entities.
    Since custom entities are not visible in the historical subject area, we decided to stick to real-time reporting.
    Now the issue is that we have a total of 45 lakh (4.5 million) records in these entities as a whole. We have reports where we need to retrieve data across all the entities in one report. Initially we tested the reports with a smaller number of records and the report performance was not that bad, but it has gradually degraded as we loaded more and more data over time. The reports now take approx. 5-10 minutes and then finally display an error message. In fact, after creating a report structure in Step 1 - Define Criteria and moving to Step 2 - Create Layout, it takes an abnormal amount of time to display. As far as the reports are concerned, we have built them using best practice except for the "Historical Subject Area" issue.
    Ideally, for best performance, how many records should there be in one entity?
    What could be the other reasons for such performance?
    We are working in a multi-tenant environment.
    Edited by: Rita Negi on Dec 13, 2009 5:50 AM

    Rita,
    Any report built over the real-time subject areas will time out after a period of 10 minutes. Real-time subject areas are really not suited for large reports, and you'll find that running them also degrades the application performance.
    Things that will degrade performance are:
    * Joins to other dimensions
    * Custom calculations
    * Number of records
    * Number of fields returned
    There are some things that just can't be done in real time. I would look to remove joins to other dimensions, e.g. Accounts/Contacts/Opportunities all in the same report. Apply more restrictive filters, e.g. current week/month, to reduce the number of records required. Alternatively, have a very simple report, extract it to Excel and modify it from there. Hopefully in R17 this will be added as a feature, but it seems like you're stuck till then.
    Thanks
    Oli @ Innoveer

  • Report Performance for GL item level report.

    Hi All,
    I have a requirement to produce a GL line item report, so I have created a data model like 0FI_GL_4 -> DSO -> cube and tested it; everything is fine, but when it is executed in production the report performance is very bad.
    The report contains document number, GL account, company code and posting date objects.
    I have decided to do the following to improve reporting performance:
    ·         Create an aggregate on the document and GL characteristics
    ·         Compression
    Can I fill the aggregates first and then do the compression?
    Please let me know if I am missing anything.
    Regards,
    Naani.

    Hi Naani,
    First fill the aggregates, then do the compression. Run SAP_INFOCUBE_DESIGNS and check the size of the dimensions; maintain the line item / high cardinality flags on the dimensions; set the cache for the query in RSRT.
    Try to reduce navigational attributes in the report. The document below may help you.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6071ed5f-1057-2e10-deb6-d3426fec0219?QuickLink=index&…
    Regards,
    Jagadeesh

  • Bad reporting performance after compressing infocubes

    Hi,
    As I learned, we should compress requests in our InfoCubes. And since we're using Oracle 9.2.0.7 as the database, we can use partitioning on the E fact table to further increase reporting performance. So far all theory...
    After getting complaints about worsening reporting performance, we tested this theory. I created four InfoCubes (same data model):
    A - no compression
    B - compression, but no partitioning
    C - compression, one partition for each year
    D - compression, one partition for each month
    After loading 135 requests and compressing the cubes, we get this amount of data:
    15.6 million records in each cube
    Cube A: 135 partitions (one per request)
    Cube B:   1 partition
    Cube C:   8 partitions
    Cube D:  62 partitions
    Now I copied one query onto each cube and used it to test the performance (transaction RSRT, without aggregates and cache, comparing the database times QTIMEDB and DMTDBBASIC). In the query I always selected one month, some hierarchy nodes and one branch.
    With this selection on each cube, I expected that cube D would be the fastest, since we only have one (small) partition with relevant data. But reality shows a different picture:
    Cube A is fastest with an avg. time of 8.15, followed by cube B (8.75, +8%), cube C (10.14, +24%) and finally cube D (26.75, +228%).
    Does anyone have an idea what's going wrong? Are there some DB parameters to "activate" the partitioning for the optimizer? Or do we have to do some other customizing?
    Thanks for your replies,
    Knut

    Hi Björn,
    thanks for your hints.
    1. After compressing the cubes I refreshed the statistics in the InfoCube administration.
    2. Cube C is partitioned using 0CALMONTH, cube D is partitioned using 0FISCPER.
    3. Here we are: all queries are filtered using 0FISCPER. Therefore I could increase the performance on cube C, but still not on D. I will change the query on cube C and do a retest at the end of this week.
    4. The loaded data comes from 10 months. The records are nearly equally distributed over these 10 months.
    5. Partitioning was done for the period 01.2005 - 14.2009 (01.2005 - 12.2009 on cube C). So I have 5 years - the 8 partitions on cube C are the result of a slight miscalculation on my side: 5 years + 1 partition before + 1 partition after => I set the max. no. of partitions to 7, not thinking of BI, which always adds one partition for the data after the requested period... So each partition on cube C does not contain one full year but something like 8 months.
    6. Since I tested the cubes one after another without much time in between, the system load should be nearly the same (on top of that: it was a Friday afternoon...). Our BI is clustered with several other SAP installations on a big Unix server, so I cannot see the overall system load. But I did several runs with each query, and the mentioned times are averages over all runs - and the average shows the same picture as the single runs (cube A is always fastest, cube D always the worst).
    Any further ideas?
    Greets,
    Knut

  • Bex Report Performance

    Dear Friends,
    I would like to know whether complex authorizations can also affect BEx report performance.
    One of my scenarios is this: there are two users, A & B.
    A has the relevant authorizations for reporting, drill-down etc. which are required.
    B has the SAP_ALL authorization.
    When the same report is executed by both users on the same system,
    the data is retrieved by user B (SAP_ALL authorization) quite a bit faster than by user A.
    The difference is about 10 minutes.
    There are some exclude selections in the report.
    So my conclusion is that complex authorizations do also hamper query performance.
    Please confirm & share your views.
    Thanks & Best Regards,
    Vivek Tripathi
    +91-9372313000

    Hi Vivek
    Can you help us understand what the exact problem was and how you resolved it - the solution at the extraction / modeling / reporting end?
    I have quite a similar issue with my report: I have a header + item report on an InfoSet.
    •     The header report takes seconds and the item report takes minutes.
    •     The same report executed with the exact same parameters has inconsistent performance results, meaning one time it takes 1 minute, and the next time the same report with the same user and the same authorization takes 5 minutes.
    Any help on this would be really appreciated. I suspect it is not an issue with the report at all, as no changes happened between the pre and post checks.
    Additional information:
    We create secondary bitmap indexes every weekend; I do not see that as one of the root causes.
    Apart from that, we have our regular daily master data and transaction data loads, which run in series.
       Thanks in Advance.
    Much Regards
    Jagadish Thirumalachetty.
    Edited by: Jagadish Thirumalachetty on Jul 14, 2010 1:35 PM

  • Oracle Report Performance Slow

    I have a report which extracts data from a few complex queries, and most of the tables that this report uses are large tables.
    The problem I'm facing is that the entire application becomes very, very slow or hangs whenever this report is run.
    I have indexed all the necessary fields and the report is processed using those indexed fields, but it still doesn't help.
    Is there any way to improve the performance? Will it help if I combine all the queries into one query?
    How about the database? Are there any configurations that can be set?
    Regards,
    Cheong

    The execution of the query doesn't depend on Reports; it is done in the database. So check your query using the normal performance tuning tools.
    If just your report is slow (i.e. not the query), it could be because you have extensive formatting in your report. Formatting is done by Reports, not by the database. Try to do as much of the formatting as possible in the query.

  • ALV Report performance & export problem

    Hi,
    We have developed a complex ALV report which accesses data from FI tables like BKPF, BSEG, BSAK etc. There are almost 100 lakh (10 million) records in BSEG, and every day around 20,000 records are added to it. Even though I have used specific search criteria, the system takes a lot of time. Due to this I am forced to run the report in the background. In the background it also takes around 4-6 hours.
    1) How can I improve the performance of the report, especially accessing data from a huge database table like BSEG with a lot of conditions? Any best practices?
    2) I want to have an option (on the selection screen) to save the report directly to an Excel file at a desired path.
    Please help me.
    Thanks in advance,
    Mallik

    Hi Mallik,
    I already faced this problem before. At that time I followed some precautions:
    1) Check the estimated cost for that report with the Basis people.
    2) The fields mentioned in the SELECT statement and the field order in the internal table should match the order of the database fields.
    3) Define TYPES statements and then refer the internal tables to those types.
    4) Make proper use of secondary indexes in the WHERE condition.
    5) Add BINARY SEARCH to READ TABLE statements (on sorted tables).
    6) If possible, add PACKAGE SIZE n to the SELECT statement (see the sketch after this list).
    7) Avoid nested loops and nested SELECTs.
    8) After populating the final internal table, FREE all the intermediate internal tables.
    9) Check how much time each SELECT statement takes using ST05.
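    A minimal sketch of points 2), 5) and 6) applied to BSEG follows; the field list, the select-options S_BUKRS / S_GJAHR and the package size are assumptions for illustration, not taken from the actual report:
    TYPES: BEGIN OF ty_bseg,
             bukrs TYPE bseg-bukrs,               " field order matches the SELECT list (point 2)
             belnr TYPE bseg-belnr,
             gjahr TYPE bseg-gjahr,
             buzei TYPE bseg-buzei,
             dmbtr TYPE bseg-dmbtr,
           END OF ty_bseg.
    DATA: lt_package TYPE STANDARD TABLE OF ty_bseg,
          lt_bseg    TYPE STANDARD TABLE OF ty_bseg.
    SELECT bukrs belnr gjahr buzei dmbtr          " only the fields really needed
      FROM bseg
      INTO TABLE lt_package
      PACKAGE SIZE 50000                          " point 6: bounded memory per round trip
      WHERE bukrs IN s_bukrs                      " assumed select-options from the screen
        AND gjahr IN s_gjahr.
      APPEND LINES OF lt_package TO lt_bseg.      " or process each package right here
    ENDSELECT.
    SORT lt_bseg BY bukrs belnr gjahr buzei.
    " later reads: READ TABLE lt_bseg ... BINARY SEARCH (point 5) instead of nested SELECTs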
    Hope this helps. Reply if you have further queries.
    Regards,
    Kumar.

  • BI 7 Report performance slow

    Hi All.
    In BI 7, the sales delivery report performance is pretty slow. Previously it took 5 minutes to execute the report, but nowadays it takes more than 30 minutes. The problem is the OLAP time.
    Vasu

    Hi,
    Please run the query in RSRT -> Execute and Debug -> Display Statistics Data,
    and also search the forum - there is a lot of useful material available on query performance.
    Check these threads:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/70f4bb1ffb591ae10000000a1553f7/frameset.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3f66ba90-0201-0010-ac8d-b61d8fd9abe9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e0e6a6e3-0601-0010-e6bd-ede96db89ec7
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/cccad390-0201-0010-5093-fd9ec8157802
    Thanks-RK

  • Database migrated from Oracle 10g to 11g Discoverer report performance issue

    Hi All,
    We are now seeing an issue with Discoverer report performance: the report keeps on running since the database was upgraded from 10g to 11g.
    On database 10g the report works fine, but the same report does not work fine on 11g.
    I have changed the query: I passed the date through TO_CHAR with the format 'DD-MON-YYYY' and removed the NVL & TRUNC functions from the existing query.
    The report now works fine on the database 11g back end, but when I use the same query in Discoverer it does not work and the report keeps on running.
    Please advise.
    Regards,

    Please post the exact OS, database and Discoverer versions. After the upgrade, have statistics been updated? Have you traced the Discoverer query to determine where the performance issue is?
    How To Find Oracle Discoverer Diagnostic and Tracing Guides [ID 290658.1]
    How To Enable SQL Tracing For Discoverer Sessions [ID 133055.1]
    Discoverer 11g: Performance degradation after Upgrade to Database 11g [ID 1514929.1]
    HTH
    Srini

  • Crystal Reports performance is too slow

    Dear SDNers,
    I have designed a Crystal report which fetches data from a custom (Z) function module. On the Crystal Reports side I have used filters, but while executing the report, it calls the function module multiple times.
    Due to this the performance is very bad. Why does the report call the function module multiple times? Do I need to modify something on the Crystal Reports side or on the function module side? Please clarify.
    Regards,
    Venkat

    A similar issue was seen a year ago. It was regarding a function module call being executed multiple times from the CR4Ent tool. It involved the usage of subreports inside that report, and the issue was generic for any function module used for testing.
    At that time, the issue was resolved by upgrading to the latest available patch of CR4Ent and also by applying the latest patch on the SAP R/3 end.
    If you are able to post the exact support package and patch level of CR4Ent and also of the SAP R/3 system, then someone can tell you whether it is the latest or not.
    -Prathamesh

  • Apex report performance is very poor with apex_item.checkbox row selector.

    Hi,
    I'm working on a report that includes some functionality to select multiple records for further processing.
    The report is based on a view that contains a couple of hundred thousand records.
    When I make a selection from this view in SQL*Plus, the performance is acceptable, but the Apex report based on the same view performs very poorly.
    I've noticed that when I omit the apex_item.checkbox from my report query, performance is on par with SQL*Plus (a factor of 10 or so quicker).
    The explain plan appears to be the same with or without the checkbox function in the select.
    My query is:
    select apex_item.checkbox(1, tan_id) "Select",
           brt_id,
           tan_id,
           message_id,
           conversation_id,
           action,
           to_acn_code,
           information,
           brt_created,
           tan_created
      from (select brt.id brt_id,                 -- view query
                   max(tan.id) tan_id,
                   brt.message_id,
                   brt.conversation_id,
                   brt.action,
                   tan.to_acn_code,
                   tan.information,
                   brt.created brt_created,
                   tan.created tan_created
              from (select brt_id, id, to_acn_code, information, created
                      from xxcjib_transactions
                     where tan_type = 'DELIVER' and status = 'FINISHED') tan,
                   xxcjib_berichten brt
             where brt.id = tan.brt_id
             group by brt.id,
                      brt.message_id,
                      brt.conversation_id,
                      brt.action,
                      tan.to_acn_code,
                      tan.information,
                      brt.created,
                      tan.created)
    What could be the reason for the poor performance of the Apex report?
    And is there another way to select multiple report records without the apex_item.checkbox function?
    I'm using Apex 3.2 on an Oracle 10g database.
    Thanks,
    Niels Ingen Housz
    Edited by: user11986529 on 19-mrt-2010 4:06

    Thanks for your reply.
    Unfortunately, changing the pagination doesn't make much of a difference in this case.
    Without the checkbox the query takes 2 seconds.
    With the checkbox it takes well over 30 seconds.
    The second report region on this page, which is based on another view, seems to perform reasonably well with or without the checkbox.
    It has about the same number of records but a different view query.
    There are also a couple of filter items in the where clause of the report queries (the same for both reports) based on date and acn_code, and both reports have a select list item displayed in their regions based on a simple LOV. These filter items don't seem to influence the performance.
    I have also recreated the report on a separate page without any other page items or where clause, and the same thing occurs.
    With the checkbox it is very, very slow (more like 20 times slower).
    Without it, the report performs well.
    And another thing: when I run the page with debug on, I don't see the actual report query:
    0.08: show report
    0.08: determine column headings
    0.08: activate sort
    0.08: parse query as: APEX_CMA_ONT
    0.09: print column headings
    0.09: rows loop: 30 row(s)
    and then the region is displayed.
    I am using database links in the views, by the way.
    Edited by: user11986529 on 19-mrt-2010 7:11

  • How Can we improve the report performance..?

    Hi experts,
    I am learning Business Objects XI R2. Please let me know how we can improve report performance.
    Please give the answer in a detailed way.

    First find out why your report is performing slowly. Then fix it.
    That sounds silly, but there's really no single-path process for improving report performance. You might find issues with the report. With the network. With the universe. With the database. With the database design. With the query definition. With report variables. With the ETL. Once you figure out where the problem is, then you start fixing it. Fixing one problem may very well reveal another. I spent two years working on a project where we touched every single aspect of reporting (from data collection through ETL and all the way to report delivery) at some point or another.
    I feel like your question is a bit broad (meaning too generic) to address as you have phrased it. Even some of the suggestions already given...
    Array fetch size - this determines the number of rows fetched in a single pass. You really don't need to modify this unless your network is giving you issues. I have seen folks suggest setting this to one (which results in a lot of network requests) or to 500 (which results in fewer requests, but they're much MUCH larger). Does either improve performance? They might, or they might make it worse. Without understanding how your network traffic is managed, it's hard to say.
    Shortcut joins? Sure, they can help, as long as they are appropriate. [Many times they are not.|http://www.dagira.com/2010/05/27/everything-about-shortcut-joins/]
    And I could go on and on. The bottom line is that performance tuning doesn't typically fall into a "cookie cutter" approach. It would be better to have a specific question.
