Performance Analysis section

Hi,
We are on 10g R2 on Windows 2003 Server. In 2 Day DBA, page 10-9 (B14196-02, http://download.oracle.com/docs/cd/B19306_01/server.102/b14196.pdf), I read the following about the Performance Analysis section:
1. Under the Performance Analysis section on the Database Home page, as shown in Figure 10-6, you can drill down by clicking the finding. The Performance Findings Details page appears, describing the findings and recommended actions.
2. The Diagnostic Summary next to Performance Findings shows the number of findings, if any. Click this link to go to the ADDM page.
But on my Grid Control database home page I cannot see these. Does this mean that ADDM has found nothing? How can I be sure that ADDM runs every hour? How do I enable ADDM?
Thank you.

Click on Run ADDM.
Then click the hour checkbox on the graph where it shows a spike to see the performance issues.
Ss
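
If the Performance Analysis section stays empty, it is also worth confirming that the AWR snapshots ADDM depends on are actually being taken every hour. A minimal sketch of the checks, run as a DBA user in SQL*Plus (the views and parameter are standard in 10g; treat the queries as illustrative, not an official procedure):

-- ADDM runs automatically after each AWR snapshot when STATISTICS_LEVEL
-- is TYPICAL or ALL (TYPICAL is the default).
SHOW PARAMETER statistics_level

-- The snapshot interval should be 1 hour (the default) for hourly ADDM runs.
SELECT snap_interval, retention FROM dba_hist_wr_control;

-- Confirm snapshots are actually being created.
SELECT snap_id, begin_interval_time, end_interval_time
  FROM dba_hist_snapshot
 ORDER BY snap_id DESC;

-- Recent automatic ADDM tasks; if this returns nothing, ADDM has not been running.
SELECT task_name, status, created
  FROM dba_advisor_tasks
 WHERE advisor_name = 'ADDM'
 ORDER BY created DESC;

If STATISTICS_LEVEL has been set to BASIC, the automatic hourly snapshots and ADDM runs are disabled; setting it back to TYPICAL re-enables them.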

Similar Messages

  • Performance Analysis in Oracle Enterprise Manager console

    Hi,
    Following is a finding from the Performance Analysis section of Oracle Enterprise Manager:
    SQL statements consuming significant database time were found.
    ======
    Clicking on the above link opens another page, which shows the following details.
    Action Run SQL Tuning Advisor on the SQL statement with SQL_ID "5z50bh2wj1svt".
    SQL Text SELECT /*+FIRST_ROWS INDEX(a ttdpur041730$idx4)*/ a.t$amta,a.t$bqua,a.t$btsp,a.t...
    SQL ID 5z50bh2wj1svt
    Rationale SQL statement with SQL_ID "5z50bh2wj1svt" was executed 732 times and had an average elapsed time of 2.9 seconds.
    The impact of this finding is 77% ...
    This is a packaged ERP application, so we don't have access to the source code.
    Any suggestions to reduce the impact from 77%?
    SSM
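
    One option, if you prefer to act on the finding from SQL*Plus rather than through Enterprise Manager, is to run the recommended SQL Tuning Advisor task yourself with DBMS_SQLTUNE. A minimal sketch (the task name is arbitrary; the SQL_ID is the one from the finding and must still be in the cursor cache or AWR):

    DECLARE
      l_task VARCHAR2(64);
    BEGIN
      -- Create a tuning task for the statement reported by ADDM.
      l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
                  sql_id     => '5z50bh2wj1svt',
                  scope      => 'COMPREHENSIVE',
                  time_limit => 600,
                  task_name  => 'tune_5z50bh2wj1svt');
      DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => l_task);
    END;
    /

    -- Review the recommendations (SQL profile, indexes, restructuring advice).
    SET LONG 100000 LONGCHUNKSIZE 100000 LINESIZE 200
    SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('tune_5z50bh2wj1svt') FROM dual;

    Since the application code cannot be changed, accepting a recommended SQL profile or creating a recommended index is usually the realistic way to reduce the finding's impact.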

    I don't think the system tablespace datafile being 92% full is a problem, but here is what you can do:
    set oracle_sid=<your SID>
    I take it the path includes the correct <oracle_home>\bin
    sqlplus / as sysdba
    ALTER DATABASE DATAFILE 'E:\path_to\system.dbf' AUTOEXTEND ON NEXT 1M MAXSIZE 2000M;
    Replace 'E:\path_to' with the correct disk and path.
    Changing a datafile size (resize) or changing a datafile so it is able to autoextend can also be done from EM itself.
    The controlfile can grow over time, but I haven't seen any problems with its size.
    Sorry, I can't explain the backup issue.
    Eric
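
    To verify the datafile's current size, growth headroom, and autoextend setting before and after such a change, a simple check against the standard DBA_DATA_FILES view (run as a DBA user; illustrative only):

    SELECT file_name,
           autoextensible,
           ROUND(bytes / 1024 / 1024)    AS size_mb,
           ROUND(maxbytes / 1024 / 1024) AS max_mb
      FROM dba_data_files
     WHERE tablespace_name = 'SYSTEM';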

  • Performance Analysis Notes

    Hi,
    I want some performance analysis notes or material.
    Please give me some links, or send them to my id: [email protected]
    Thanks...

    Hi, this may help you.
    1) Don't use nested SELECT statements.
    2) If possible, use the FOR ALL ENTRIES addition instead.
    3) In the WHERE clause, make sure you supply all the primary key fields.
    4) Use an index for the selection criteria.
    5) You can also use inner joins.
    6) You can put the data from the first SELECT statement into an internal table and then select the data from the second table using FOR ALL ENTRIES.
    7) Use the Runtime Analysis (SE30) and SQL Trace (ST05) to measure performance and to identify where the load is heavy, so that you can change the code accordingly.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5d0db4c9-0e01-0010-b68f-9b1408d5f234
    ABAP performance depends upon various factors and is divided into three parts:
    1. Database
    2. ABAP
    3. System
    Run any program using SE30 (Runtime Analysis). To improve performance, refer to the Tips and Tricks section of SE30. Always remember that ABAP performance improves when there is the least load on the database.
    You can get an interactive graph of this in SE30, along with a file.
    If you want to measure the runtime of specific parts of the code, you can switch runtime analysis on dynamically within the ABAP code:
    * To turn runtime analysis on within ABAP code, insert the following statement:
    SET RUN TIME ANALYZER ON.
    * To turn runtime analysis off within ABAP code, insert the following statement:
    SET RUN TIME ANALYZER OFF.
    Always check that the driver internal table is not empty when using FOR ALL ENTRIES.
    Avoid combining FOR ALL ENTRIES with JOINs.
    Try to avoid joins and use FOR ALL ENTRIES instead.
    Try to restrict joins to one level only, i.e. only two tables.
    Avoid using SELECT *.
    Avoid having multiple SELECTs from the same table in the same object.
    Try to minimize the number of variables to save memory.
    The sequence of fields in the WHERE clause must follow the primary/secondary index, if any (see the SQL sketch after this list).
    Avoid creating new indexes as far as possible.
    Avoid operators like <>, >, < and LIKE '%...' in WHERE clause conditions.
    Avoid SELECT / SELECT SINGLE statements inside loops.
    Try to use BINARY SEARCH in READ TABLE. Ensure the table is sorted before using BINARY SEARCH.
    Avoid using aggregate functions (SUM, MAX etc.) in SELECTs (GROUP BY, HAVING).
    Avoid using ORDER BY in SELECTs.
    Avoid nested SELECTs.
    Avoid nested loops over internal tables.
    Try to use FIELD-SYMBOLS.
    Try to avoid INTO CORRESPONDING FIELDS OF.
    Avoid using SELECT DISTINCT; use DELETE ADJACENT DUPLICATES instead.
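    As a generic SQL illustration of the index-related points above (the ORDERS table and its index on (CLIENT, ORDER_NO) are hypothetical; the same idea applies to Open SQL):

    -- Index-friendly: explicit column list, equality conditions on the leading index fields.
    SELECT order_no, order_date, amount
      FROM orders
     WHERE client = '100'
       AND order_no = '0000004711';

    -- Index-hostile: SELECT *, a negation and a leading-wildcard LIKE typically force a full scan.
    SELECT *
      FROM orders
     WHERE order_no <> '0000004711'
        OR customer_name LIKE '%GMBH';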
    Check the following Links
    http://www.sapgenie.com/abap/performance.htm
    http://www.thespot4sap.com/Articles/SAPABAPPerformanceTuning_PerformanceAnalysisTools.asp
    check the below link
    http://www.sap-img.com/abap/performance-tuning-for-data-selection-statement.htm
    See the following link if it's any help:
    http://www.thespot4sap.com/Articles/SAPABAPPerformanceTuning_PerformanceAnalysisTools.asp
    Check also http://service.sap.com/performance
    and
    books like
    http://www.sap-press.com/product.cfm?account=&product=H951
    http://www.sap-press.com/product.cfm?account=&product=H973
    http://www.sap-img.com/abap/more-than-100-abap-interview-faqs.htm
    Performance tuning for Data Selection Statement
    http://www.sap-img.com/abap/performance-tuning-for-data-selection-statement.htm
    Debugger
    http://help.sap.com/saphelp_47x200/helpdata/en/c6/617ca9e68c11d2b2ab080009b43351/content.htm
    http://www.cba.nau.edu/haney-j/CIS497/Assignments/Debugging.doc
    http://help.sap.com/saphelp_erp2005/helpdata/en/b3/d322540c3beb4ba53795784eebb680/frameset.htm
    Run Time Analyser
    http://help.sap.com/saphelp_47x200/helpdata/en/c6/617cafe68c11d2b2ab080009b43351/content.htm
    SQL trace
    http://help.sap.com/saphelp_47x200/helpdata/en/d1/801f7c454211d189710000e8322d00/content.htm
    CATT - Computer Aided Testing Too
    http://help.sap.com/saphelp_47x200/helpdata/en/b3/410b37233f7c6fe10000009b38f936/frameset.htm
    Test Workbench
    http://help.sap.com/saphelp_47x200/helpdata/en/a8/157235d0fa8742e10000009b38f889/frameset.htm
    Coverage Analyser
    http://help.sap.com/saphelp_47x200/helpdata/en/c7/af9a79061a11d4b3d4080009b43351/content.htm
    Runtime Monitor
    http://help.sap.com/saphelp_47x200/helpdata/en/b5/fa121cc15911d5993d00508b6b8b11/content.htm
    Memory Inspector
    http://help.sap.com/saphelp_47x200/helpdata/en/a2/e5fc84cc87964cb2c29f584152d74e/content.htm
    ECATT - Extended Computer Aided testing tool.
    http://help.sap.com/saphelp_47x200/helpdata/en/20/e81c3b84e65e7be10000000a11402f/frameset.htm
    You can go to transaction SE30 to run a runtime analysis of your program. Also try transaction SCI, which is the SAP Code Inspector.
    Regards
    Madhu

  • What is the difference between runtime analysis & performance analysis?

    Hi,
    I am Siva Reddy,
    I am new to ABAP,
    I have a doubt; please clarify it:
    what is the difference between runtime analysis and performance analysis?
    Points will be given to perfect answers.
    Regards,
    Siva Reddy.

    Hi Siva!
    The runtime analysis tool allows the ABAP/4 programmer to trace the tables used by SAP dialog/report programs.  After you press the Analyze button, you can see four more buttons:
    Hit List
          Displays the execution time of each statement in the program.
    Tables
          Displays the tables accessed during run time.
    Group hit list
          Displays the execution time of all the statements and grouping them based on the type of command.  e.g. performs, SQL and internal tables used.
    Hierarchy
          Displays the execution time of each statement in the actual order in which they were executed.  Uses indentation to indicate the level of nesting of statements within subroutines.
    Any table used by the transaction or program can easily be traced with the runtime analysis tool.
    Go to transaction SE30
    Type in the transaction code you want to analyze
    4.6x
          In the Restriction section: select the TMP -> Temporary variant
          Click the Change button
          Click the Duratn/type
          Select None for Aggregation
          Save the variant and execute again
          After finishing the process, click back to SE30
          Use F3 to move back to the initial screen of SE30
          Click the Analyze Button
          Click Goto -> Object-centered hit list -> Database tables
    3.0x
          Click Execute
          After finishing the process, click back to SE30
          Click the Analyze Button
          Click the Table Button
    After retrieving the table names, you can check the raw data with transaction SE16 - Data Browser or SE11 - Dictionary.
    For example, if you want to display the data for MSEG  - Material Document table
          Transaction SE16
          Type in MSEG for Table name and click execute.
          Data Browser will display the default selection for you to display data.  If you did not change the default and click execute, the Data Browser will display the first 500 records.
          Click Settings to change the List formats, User parameters and Fields for selection.
          In 4.6x, you can use SE16N.
          Transaction SE11 or SE12 (4.6x)
          Type in MSEG for Object name and click the Display button.
          Click Utilities -> Table contents for the default selection screen.  If you did not change the default and click execute, the Dictionary will display the first 500 records.
          Click Settings to change the List formats, User parameters and Fields for selection.
          Transaction SE17
          Type in the Table Name, put in the Selection value,  put a 'X' in the Output column to display the data field and put in the Sort number from 01..99 (if you want to sort).
    Tools provided for Performance Analysis
    Following are the different tools provided by SAP for performance analysis of an ABAP object
       1. Run time analysis transaction SE30
    This transaction gives all the analysis of an ABAP program with respect to the database and the non-database processing.
       2. SQL Trace transaction ST05
        The trace list has many lines that are not related to the SELECT statement in the ABAP program. This is because the execution of any ABAP program requires additional administrative SQL calls. To restrict the list output, use the filter on the trace list.
    The trace list contains different SQL statements that relate simultaneously to the one SELECT statement in the ABAP program. This is because the R/3 Database Interface - a sophisticated component of the R/3 Application Server - maps every Open SQL statement to one or a series of physical database calls and brings it to execution. This mapping, crucial to R/3's performance, depends on the particular call and database system. For example, the SELECT-ENDSELECT loop on the SPFLI table in our test program is mapped to a PREPARE-OPEN-FETCH sequence of physical calls in an Oracle environment.
        The WHERE clause in the trace list's SQL statement is different from the WHERE clause in the ABAP statement. This is because in an R/3 system, a client is a self-contained unit with separate master records and its own set of table data (in commercial, organizational, and technical terms). With ABAP, every Open SQL statement automatically executes within the correct client environment. For this reason, a condition with the actual client code is added to every WHERE clause if a client field is a component of the searched table.
    To see a statement's execution plan, just position the cursor on the PREPARE statement and choose Explain SQL. A detailed explanation of the execution plan depends on the database system in use.
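    As an illustration of the two points above, an Open SQL statement such as SELECT * FROM spfli WHERE carrid = 'LH' typically appears in the ST05 trace as a native SQL statement of roughly this shape (illustrative only; the exact text and bind variable names depend on the database and release):

    SELECT *
      FROM "SPFLI"
     WHERE "MANDT" = :A0     -- client condition added automatically by the database interface
       AND "CARRID" = :A1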

  • Vendor performance analysis report

    Hi all,
    There is a requirement to create an ALV report for vendor performance analysis. What are the tables and fields to be used for this? Any help is appreciated.

    Refer to these links:
    vendor performance report !!!
    vendor performance report
    vendor performance report
    I need standard vendor performance report

  • NAM Report Analyze -- Wan Optimization -- Application Performance Analysis Report

    Hello,
    The Transaction Time (Client Experience) report does not show optimized traffic. What we expect from this graph is to show the transaction time of optimized and non-optimized traffic side by side, to give information about the baseline and the optimized application performance.
    Could it be that optimization should be disabled for some time for the WAN sites and then manually re-enabled in order to compare these values? That would be difficult for large deployments, since manually disabling and re-enabling just for performance analysis is a time-consuming task.
    Best Regards,

    Mohammed,
    It is common, in many of the Cisco Express 8.5 environments we have looked at, for the Total Incoming Calls given on a Traffic Analysis report to be a higher number than an Application Report.
    The Traffic Analysis Report counts every unique sessionID (unique call) that is inbound (contact type of 1).  The Application Reports do a similar thing but qualify (filter) only the records that have an application assigned.
    There are simply times where inbound calls have been directed to an "agent" without having an application assigned.
    The best thing the reporting user can do is to run a query on his or her database such as: 
         select * from ContactCallDetail where contactType = 1 and startDateTime > '2012-11-16 10:00:00' and startDateTime < '2012-11-16 11:00:00';
    Usually when an application is not assigned to the record in the ContactCallDetail table it is because the destination type is equal to 1, which is an 'Agent' instead of a 'route point'.
    So if you modify your select statement to filter by destinationType, you can quickly find the records that don't have the application assigned. 
    Example:  select * from ContactCallDetail where contactType = 1 and destinationType = 1 and startDateTime > '2012-11-16 10:00:00' and startDateTime < '2012-11-16 11:00:00';
    When you look at these records, you will see the agent that took the call from the destinationID field.  The number in that field should match up with the field called 'resourceID' in a table called 'resource'.
         Example:  select * from resource where resourceID = 6011; where 6011 is the number you found in the destinationID field.
    If there is still confusion about the source of the call, then talk to that agent and find out what it was.
    Good Luck and let me know if you need further help.
    Ron Reif
    [email protected]
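
    A hedged sketch that combines the two lookups above into a single query, using only the column names mentioned in this reply (the actual UCCX/Informix schema may differ slightly):

    select ccd.sessionID, ccd.startDateTime, r.*
      from ContactCallDetail ccd
      join resource r on r.resourceID = ccd.destinationID
     where ccd.contactType = 1
       and ccd.destinationType = 1
       and ccd.startDateTime > '2012-11-16 10:00:00'
       and ccd.startDateTime < '2012-11-16 11:00:00';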

  • Import Performance Analysis report

    Hi,
    I ran the Import Performance Analysis report in FDM and it tells me the time taken by each step of the import process. My question is: why does "Memo Reassign Time" take so long when I am not doing anything with memos? Is there a way to avoid the time it takes to reassign memos?
    Thanks, AJ

    Import Performance Analysis
    File Archive Time: 1 second
    Memo Reassign Time: 53 seconds
    Total Import Process: 254 seconds
    Total Map Process: 65 seconds
    Almost 1/5th of the total time is taken by the Memo Reassign step for nothing. Is there a way to bring it down to 0 seconds?

  • Performance analysis and monitoring of a Forte application

    Hello,
    It would be good if one could do some performance analysis and monitoring of
    a Forte application at production time.
    By performance analysis I mean measure the time some selected methods take
    to return. In a CS application such a method would be some selected method
    of a key remote service representative of the application's activity.
    One would like to measure the min, max and average time, and also give a
    threshold which, when reached, will automatically generate an alarm or some
    pre-defined processing.
    The most powerful way would be to use a SNMP application through a Forte -
    SNMP gateway. (see the G. Puterbaugh paper "Building a Forte-SNMP gateway"
    in the 96' Forte Forum proceedings).
    But before going that far some simple means accessible through EConsole
    would already be great.
    A colleague of mine went to that Forum and reported that Puterbaugh said that
    such an agent is currently missing, but that its implementation is not
    difficult.
    I looked at all the agents and their instruments. I came to the following
    conclusions :
    1) instrumented data are available at the granularity level of a partition,
    not at a smaller granularity. For example, the DistObjectMgr agent gives you
    very useful information : the number of events (sent/received) and the
    number of (remote) methods (called/invoked), but this for the entire
    partition. Thus it prevents making tuned observations (unless you partition
    in a special way your application to put in a dedicated partition the thing
    you want to observe and only this thing).
    2) there is no instrumented data related to processing time.
    This leads me to the point that no information observed by the standard
    agents helps me figure out my performance. Thus I have to add, at
    development time, some lines of code to the methods I potentially want to
    observe later at production time, to generate the appropriate information a
    custom agent will then display (process) with the appropriate instruments.
    Does someone share this position?
    Has someone implemented such an agent and associated means?
    PS: I will probably implement my own if there is no other way around it.
    best regards,
    Pierre Gelli
    ADP GSI
    Payroll and Human Resources Management
    72-78, Grande Rue, F-92310 SEVRES
    phone : +33 1 41 14 86 42 (direct) +33 1 41 14 85 00 (reception desk)
    fax : +33 1 41 14 85 99


  • [request] Intel VTune Performance Analyser

    Hello, I am wondering if someone has made a PKGBUILD for the Intel VTune Performance Analyser: http://www.intel.com/cd/software/produc … 239143.htm. At the moment I do not have the skills or time to make a PKGBUILD, so I am hoping that someone already has done it. I am planning to learn the PKGBUILD system, so if I do not have any success with this request, I may provide a PKGBUILD in the distant future!


  • BOM Performance Analysis using SAT or SE30 t-code?

    Hi,
        Currently I'm doing a performance analysis of a BOM to find out the bottlenecks. I'm doing the analysis using the SAT or SE30 t-code.
    From my understanding, BOM performance depends upon
    1. no. of BOM items
    2. BOM Level
    3. BOM Evolution over time
    Can anyone provide me a hint to find the bottlenecks in BOM performance? Thanks!!
    Regards,
    Saravana

    I have done the BOM performance analysis. Based on the steps with the most execution time and the top performance consumers, please find my observations below.
    Functionally, Performance of BOM execution depends upon
    Number of BOM items available in the BOM
    Number of Assemblies present in the BOM
    BOM Levels
    BOM Evolution over time
    Technically, BOM execution time depends purely upon:
    fetching the data from the SAP tables
    hitting the same tables again and again when going down the levels
    loops over the tables / nested function calls used in the SAP program
    Hence, I'm closing this thread.....

  • Tuxedo Performance Analysis - Query - tried txrpt not receiving report

    Hi All,
    We have currently migrated from Solaris-8 - Tuxedo 8 environment to Solaris 10 - Tuxedo 11gr1 Staging environment.
    All the application binaries were running fine.
    However, when we conducted performance testing with 500 users, we could see the performance coming down compared to our old performance report from the
    Solaris 8 - Tuxedo 8 environment.
    We suspect it may be an issue with the network, or with some other server via which the requests reach, or the responses leave, our system.
    Can you suggest any utility, and its usage, by which we can check that our current Tuxedo version is working as well as the older one?
    For example, how much time Tuxedo is now taking to process requests, and the throughput.
    Can you please suggest some way to monitor, or generate some sort of report, to compare Tuxedo 8 performance and Tuxedo 11gR1 performance for the same application?
    I tried txrpt. The server definition is:
    I5_RQMG_0002_00
    SRVGRP=AQUREQGRP
    SRVID=2000
    CLOPT="-r -s PSY_MG_0002_00:I5_RQMG_0002_00 -o logs/slog/I5_RQMG_0002_00.out -e logs/slog/I5_RQMG_0002_00.err"
    MIN=1 MAX=1
    RQADDR="I5_RQMG_0012_00Q2"
    REPLYQ=Y
    For the above server, the following file is generated:
    -rwxrwxrwx 1 tuxedo tuxedo 71022518 May 29 14:56 I5_RQMG_0002_00.err
    Executed the command:
    tuxconsolas1:/apps/tuxapps/intserv/logs/slog>txrpt -nPSY_MG_0002_00 -d05/29 -s02:00 -e14:00 I5_RQMG_0002_00.err > txrpt.out
    I am just receiving :
    tuxconsolas1:/apps/tuxapps/intserv/logs/slog>less txrpt.out
    START AFTER: Tue May 29 02:00:00 2012
    END BEFORE: Tue May 29 14:00:00 2012
    in the report, and then the process hangs.
    Am I doing something wrong?

    Hi,
    If you want detailed performance analysis of your Tuxedo 11gR1 application, you can use TSAM (Tuxedo System and Application Monitor) to get very detailed performance information including time spent on network links, time spent waiting in IPC queues, etc. Unfortunately you won't be able to use the same tool for your Tuxedo 8.0 application as it is too old for TSAM to support.
    Regards,
    Todd Little
    Oracle Tuxedo Chief Architect

  • Performance analysis program

    hello gurus!
    Can somebody send me the program name for doing a performance analysis of queries and InfoCubes, please?
    Thanks a lot.
    BR.

    I think there is no particular program for performance analysis; there are different scenarios based on the particular performance issue.
    Check this link:
    http://www.google.co.in/url?sa=t&rct=j&q=%20performance%20analysis%20in%20queries%20and%20infocube%2C%20%20in%20sap%20bi%20in%20sdn&source=web&cd=9&ved=0CFoQFjAI&url=http%3A%2F%2Fcsc-studentweb.lr.edu%2Fswp%2FBerg%2Farticles%2FTeched2010%2FBerg_TechEd_2010_Performance_Tuning_v5.pptx&ei=e1zwTvWmJ4ayrAfA5KXSDw&usg=AFQjCNGy7RVynsLUPqKInmQGlVCSbaxlQA&cad=rja

  • Performance analysis with SE30

    Hi,
      I want to know about performance analysis with SE30. What should the percentages be for DATABASE, SYSTEM and ABAP? How can I reduce the load on the system, and which load is more critical, on the DATABASE or on ABAP? Also, for my report I have checked with different parameters according to the posting period selected, and the graph changes.
    regards,
      zafar

    There are a variety of materials on the subject of performance analysis and SE30.  If you take the time to search the blogs and articles on SCN, I'm confident you'll find something useful.
    Percentages vary depending upon what we're doing, which tables we're using, how many tables, etc., so there are no hard and fast rules about what the percentages should be (so far as I know).  Generally, a very high database percentage can predict a long runtime if one is obtaining a large number of data rows. It suggests (to me, at least) the need to review the SELECT statements within the ST05 trace tool, to be sure I'm utilizing an index, and doing so optimally, for each table I'm reading, and that I'm not bypassing SAP buffers unnecessarily, etc.

  • Performance Tuning Section in Business Blueprint

    Dear Gurus,
    I am at a client location doing a blueprint. We are at the closure stage of the business blueprint, and one of the sections that needs to be added is Performance Tuning. Can anybody suggest what content should be added to the Performance Tuning section of the blueprint? If anybody has such a content section, please forward it to my mail id, i.e. [email protected]
    Thanks and regards
    Vijay

    Hi,
    Think about all the components BI is made up of and how to tune them.
    Some hints:
    1) Performance tuning for data loads: number range buffering, parallel loading, pseudo delta, changing the technical properties of tables if records > 100,000, IDoc clogging, etc.
    2) Performance tuning for queries: aggregates, cube modelling, OLAP cache, efficient MultiProvider queries, etc.
    Mention the transactions you will require for these things, such as RSRV, RSRTRACE, ST03N, ST05, etc.
    gaurav

  • How to do Performance analysis of any ABAP report

    Hello Friends,
    I have to do a performance analysis of an ABAP report that takes a long time to execute. I am not aware of how to do a performance analysis; I only know that we have to check indexes and do ST04 and ST05 analysis, but I don't know how to do these things.
    Can anyone please guide me in this regard?
    Thanks

    HI,
    Please Check this,
    System Trace: Transaction ST01 lets you do various levels of system trace such as authorization checks, SQL traces, table/buffer trace etc. It is a general Basis tool but can be leveraged for BW.
    Workload Analysis: You use transaction code ST03
    Database Performance Analysis: Transaction ST04 gives you all that you need to know about what's happening at the database level.
    Performance Analysis: Transaction ST05 enables you to do performance traces in different areas, namely SQL trace, enqueue trace, RFC trace and buffer trace.
    ABAP Runtime Analysis Tool: Use transaction SE30 to do a runtime analysis of a transaction, program or function module. It is a very helpful tool if you know the program or routine that you suspect is causing a performance bottleneck.
    Rgds
    Sabu
