Performance Optimization for Collection

I have a query which fetches data as a record type in the form of an array, like below:
SELECT <Long_Column_List> FROM
TABLE(<Schema_Name>.<Package_Name>.<Function_Name>) y
where the function returns a PL/SQL table type.
We took this approach because using the collection improved performance for all the other queries, except this one. It takes around 20 minutes to execute, and its explain plan shows only two lines, with
'COLLECTION ITERATOR PICKLER FETCH' and a very minimal cost.
Can anybody help in optimizing the performance of this query?
Thanks
Arnab
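
For what it's worth, the very low cost on the COLLECTION ITERATOR PICKLER FETCH line only means the optimizer has no idea how many rows the collection returns; it falls back on a block-size-based guess (typically 8168 rows with an 8K block size), so the plan tells you nothing about where the 20 minutes actually go. If the collection is ever joined to other tables, you can supply the real row count with the (undocumented) CARDINALITY hint. A sketch, reusing the placeholder names from the post above:

SELECT /*+ CARDINALITY(y 100) */ <Long_Column_List>
FROM TABLE(<Schema_Name>.<Package_Name>.<Function_Name>) y;

For a straight select from the function like this one, though, the time is almost certainly spent inside the function body itself rather than in the fetch.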

Yes, the function is pipelined and it returns rows via PIPE ROW.
It handles around 50 columns in the SELECT clause, and each column is fetched into the record-type PL/SQL table. The output result set is only 100 rows of data, for which it is taking around 20 minutes to execute.
Thanks for explaining the explain plan output.
Is the approach of using a pipelined function causing the performance problem?
Can we use some other approach to make the performance better, or is there any way in which we can optimize the behaviour of pipelined functions?
Thanks
Arnab
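
A pipelined function adds very little overhead for 100 rows; the cost is whatever work the function does to produce each row, and that work is invisible to the explain plan. Below is a minimal self-contained sketch of the pattern with crude timing added, so you can confirm where the time goes; all type, function, and table names here are hypothetical, not taken from the original code:

-- SQL object types so the function can be queried with TABLE()
CREATE OR REPLACE TYPE demo_row AS OBJECT (id NUMBER, name VARCHAR2(128));
/
CREATE OR REPLACE TYPE demo_tab AS TABLE OF demo_row;
/
CREATE OR REPLACE FUNCTION demo_fn RETURN demo_tab PIPELINED IS
  t0 PLS_INTEGER := DBMS_UTILITY.GET_TIME;  -- centiseconds
BEGIN
  -- if this cursor (or any per-row lookup inside the loop) is slow,
  -- the whole TABLE() query is slow, whatever the plan's cost says
  FOR r IN (SELECT object_id, object_name
            FROM all_objects
            WHERE ROWNUM <= 100) LOOP
    PIPE ROW (demo_row(r.object_id, r.object_name));
  END LOOP;
  DBMS_OUTPUT.PUT_LINE('demo_fn elapsed: '
    || (DBMS_UTILITY.GET_TIME - t0) || ' cs');  -- set serveroutput on to see this
  RETURN;
END;
/
SELECT * FROM TABLE(demo_fn);

If the instrumentation shows the time is in the function's own SQL, tune that SQL directly (e.g. with a 10046 trace on the session); swapping the pipelined wrapper for another collection approach will not help.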

Similar Messages

  • Performance Optimization for Cubes

    Hi All,
    In our project, we have a daily process chain which refreshes four reporting cubes, and it takes 8-10 hours to complete the refresh. We have suggested archiving the historical data to a new cube to improve the performance of the daily load.
    In UAT, the performance of the daily load did not improve after we performed the archiving.
    Kindly suggest performance improvements for the cubes.
    Regards
    Suresh Kumar

    Hi,
    Before loading the cube, you need to delete the index, and once the load is complete, recreate it. For this, go to the manage screen of the InfoCube ----> Performance tab.
    Also create the DB statistics, in the same place: manage screen of the InfoCube ----> Performance tab. This will reduce the load time by a considerable amount.
    Also increase the maximum size of the data packet in the InfoPackage. For this, go to the InfoPackage --> Scheduler in the menu bar --> Data S. Default Data Transfer, and increase the size to a reasonable amount (not very high). Also increase the number of data packets per info IDoc; this field is available just after the maximum data packet size in the InfoPackage.
    Hope It Helps,
    Regards,
    Amit

  • Performance optimization related.

    Hi.
    I am doing performance optimization on code.
    Specifically, I am optimizing old code written for JDK 1.4, and I ran into a doubt while optimizing it for JDK 1.5.
    Problem statement:
    Collection errors = new ArrayList();
    errors.add(new GenericException(ErrorCodes.EMPLOYEE_INVALID_PERMISSION));
    setErrorsInRequest(request, errors);
    In the above code, the compiler tells us to parameterize the Collection type reference. If we don't parameterize the Collection type, will that cause a performance issue?
    Please help me out to resolve the problem statement.
    Thanks and regards,
    Leslie V
    www.googlestepper.blogspot.com
    www.scrollnroll.blogspot.com

    "If we don't parameterize the Collection type, will that cause a performance issue?" No, not really. But performance isn't really the issue... it's runtime type safety which is at issue. There's nothing to prevent me from adding an Integer (like an error number) to your collection of exceptions.
    And "GenericException"... Sheesh, come down from the trees already. WTF am I (the user of this class/method/package) supposed to do with a friggin "GenericException"... you may as well have thrown a raw RuntimeException and saved all that cumbersome intervening try/catch code.

  • Insert, search, delete performance overhead for different collections

    Hi,
    I am trying to create a table which compares the performance overheads of different collection data structures. Does anyone want to help me out? I want to put a number from 1 to 9 in each of the question marks, 1 being very poor performance and 9 being very good performance. (The reason I am doing this is that I had this question in a job interview test and I didn't pass it.)
    Anyone have any comments?
    anyone have any comments?
                   Searching   Inserting   Deleting
    ArrayList          ?           ?           ?
    LinkedList         ?           ?           ?
    TreeSet            ?           ?           ?
    TreeMap            ?           ?           ?
    HashMap            ?           ?           ?
    HashSet            ?           ?           ?
    Stack              ?           ?           ?

    Sorry, the formatting got screwed up a bit when I posted it. It should be a list of the collection types and three columns (searching, inserting, deleting).

  • Batch fetch optimization for lazy collections

    Hi,
    I feel like this question must have been asked by somebody already but couldn't find any answers in the forum.
    We're trying to evaluate migrating from Hibernate to Kodo JDO. One feature we use extensively in Hibernate is the batch fetch optimization for lazy collections.
    For instance, there's a class User with a collection "Set<String> permissions". In the DB, there are tables USER(id, name, etc.) and USER_PERMISSIONS(user_id, permission) with a one-to-many relationship between them.
    Suppose the code is the following:
    query = ....;
    List<User> users = (List<User>) query.execute();
    for (User user : users) {
        System.out.println("User: " + user.getName());
        System.out.println("Permissions: " + user.getPermissions());
    }
    I've set EagerFetchMode to parallel. For the field "permissions" I had to specify default-fetch-group="true" because I wanted this to happen lazily. When I look through the logs, I see that the permissions SQL query is executed for each user.
    With Hibernate, we were able to set up the mapping so that permissions are not fetched at first, but when they are requested for the first user (user.getPermissions()) they are automatically selected for several users in one query using a SQL IN clause (similar to the parallel mode).
    Is it possible to recreate the same behavior in Kodo JDO? (JPA?) If it is, this would greatly simplify the migration. Please note that we can't use explicit fetch groups for this, because we don't know ahead of time which collection will be navigated (in real life the User class has many relationships).
    thanks in advance,
    Dimitry

    Kodo doesn't have a direct analog for that behavior. The typical way to solve that problem is to keep the field out of the default fetch group, and then explicitly include the desired field in the current fetch group at runtime. In pure JDO, you can do this by creating a separate fetch group that includes the relationship field, and designating that the query should use that fetch group:
    <pre>
    query = ...;
    query.getFetchPlan().addGroup("relationshipGroup");
    List<User> users = (List<User>) query.execute();
    </pre>
    Kodo has JDO extensions that allow you to do this a bit more easily:
    <pre>
    query = ...;
    ((kodo.jdo.KodoFetchPlan) query.getFetchPlan()).addField("com.example.User.permissions");
    List<User> users = (List<User>) query.execute();
    </pre>
    Finally, you can do this with Kodo's JPA extensions like so:
    <pre>
    import org.apache.openjpa.persistence.OpenJPAQuery;
    query = (OpenJPAQuery) ...;
    query.getFetchPlan().addField("com.example.User.permissions");
    List<User> users = (List<User>) query.getResultList();
    </pre>
    Note that in all cases, you could also make this change to the current PersistenceManager / EntityManager's fetch plan instead, to make the change happen for the duration of the use of that manager. In that environment, the change would need to happen before the query was created.
    (And no, I have no idea why the edit box has a 'pre' button but does not seem to do anything with the resulting tags.)
    -Patrick

  • Should I Optimize for "Memory" or "Performance" in Preferences?

    I've been rendering the timeline prior to export and finding that it renders in just over an hour for a 30-minute project. Then, almost miraculously, the MPEG2-DVD export only takes about two hours or so. This is with "Maximum Render Quality" selected in the Sequence Settings. When this option is selected, a pop-up warns that it is "highly recommended" to set "Optimize for Memory" in Preferences > General. I did this, and my timeline render time increased dramatically (I estimate it tripled, since I stopped it after it ran for half an hour and it said there were still about two hours to go). So I prefer the "Performance" setting for rendering the timeline.
    But export may be a different matter. I'd rather not experiment with this if I don't have to, so I'm asking if anyone knows whether Preferences > General should be set to "Optimize for Memory" for export, since that may differ from rendering the timeline for some important reason.
    This question is really about time and quality in the final MPEG2-DVD. Are either affected, one way or the other, by the various options for settings in both the timeline render and the export encode? In the past, I've always used Max Render Quality with Optimize set to Performance and never had any issues. This latest discovery of reducing my export time (maybe in the range of 80%+) by rendering the timeline first is tempting to continue, since the final MPEG2-DVD quality appears identical to exporting without first rendering the timeline. I did a test today exporting without rendering the timeline first (after deleting all the preview files), and that export took 4-1/4 hours, a net loss of about an hour.
    Thanks, everyone.
    Update on a statement in paragraph one: since writing this, I exported after deleting the preview files and using Optimize for Memory in Preferences > General. Total export time was 4:15.

    When it comes to exporting, the type of encoding you use greatly affects how much time it takes to render the file. For example, I recently tried to export a 12-minute file. It takes me about 45 minutes in AVI format but over 8 hours in FLV format. (FLV is a poor example, but nonetheless the point can be made from this.)
    When it comes to optimizing for memory vs. performance, it all depends on what you have available on your computer. If your memory is in the range of, say, 2-4 GB and you're using Windows 7 or Vista, it's probably in your best interest to optimize for memory. This allows a machine with less memory to render much more smoothly than it would with a performance-based setting.
    Sometimes what happens with the performance setting is that the program tries to render the video more quickly than the memory your computer can allocate can tolerate. Try it out; it might help with some of the "skipping over frames" errors.
    Cheers,
    -MBTV

  • (10g) Automatic Optimizer Statistics Collection

    Product: ORACLE SERVER
    Date written: 2006-07-21
    PURPOSE
    This document introduces the automatic optimizer statistics collection (Automatic Optimizer Statistics Collection) feature, new in 10g, and describes how it works.
    Explanation
    1. Overview
    Optimizer statistics are collected automatically by GATHER_STATS_JOB. This job is owned by SYS and its OBJECT_TYPE is JOB.
    The job collects statistics for all objects in the database that have missing or stale statistics.
    2. Configuration and behaviour of automatic statistics collection
    1) STATISTICS_LEVEL = TYPICAL | ALL
    2) Statistics are collected by the predefined GATHER_STATS_JOB.
    3) When the job runs, it determines the following:
    - which objects have missing or stale statistics;
    - an appropriate sampling percentage needed to generate good statistics;
    - the appropriate columns that require histograms, and the histogram sizes;
    - the degree of parallelism for statistics collection;
    - the priority order in which objects' statistics should be gathered.
    3. About GATHER_STATS_JOB
    This job is created when the database is created and is managed by the Scheduler.
    GATHER_STATS_JOB collects statistics by calling the DBMS_STATS.GATHER_DATABASE_STATS_JOB_PROC procedure.
    This procedure behaves very much like the DBMS_STATS.GATHER_DATABASE_STATS procedure with the 'GATHER AUTO' option. The difference is that GATHER_DATABASE_STATS_JOB_PROC prioritizes the objects whose statistics need to be gathered and processes them in order, i.e. the objects whose statistics most need updating are processed first.
    This is so that the most-needed statistics are gathered before the maintenance window closes.
    4. Statistics on dictionary objects
    1) Starting with Oracle Database 10g, statistics can also be gathered on the dictionary tables to obtain optimal performance.
    At any time, it is possible to gather statistics on the dictionary tables using the DBMS_STATS.GATHER_SCHEMA_STATS procedure, with the GATHER_SYS argument set to TRUE.
    2) A new procedure called DBMS_STATS.GATHER_DICTIONARY_STATS can also be used. Using it requires a new system privilege called ANALYZE ANY DICTIONARY.
    This privilege allows a user to analyze dictionary objects and fixed objects even without the SYSDBA privilege.
    3) The GATHER_DATABASE_STATS procedure has a new argument called GATHER_FIXED. It defaults to FALSE, i.e. by default no statistics are generated for the fixed tables.
    Analyzing the fixed tables once, during a typical system workload, is sufficient.
    4) It is also possible to gather statistics on the fixed tables using the GATHER_FIXED_OBJECTS_STATS procedure. Likewise, it is possible to delete the statistics on all fixed tables, and to export or import fixed-table statistics.
    Example
    none
    Reference Documents
    <Note:266040.1>
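    For reference, a minimal sketch of the calls described in section 4 above, run as a suitably privileged user (the procedure names are as documented; everything else is left at its defaults):
    BEGIN
      DBMS_STATS.GATHER_DICTIONARY_STATS;     -- requires ANALYZE ANY DICTIONARY (or SYSDBA)
      DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;  -- run once during a representative workload
    END;
    /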

    Hi,
    Please see here,
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/stats.htm#i41448
    If the tables are changing very frequently, it is better to gather the stats manually; otherwise the volatile tables would keep coming up in the stats job again and again.
    As for system stats and data dictionary stats, they are not collected by default, so there is no choice but to gather them manually.
    Aman....
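    A minimal sketch of gathering stats manually for one volatile table (the schema and table names here are hypothetical):
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'APP',
        tabname          => 'VOLATILE_ORDERS',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        cascade          => TRUE);  -- also gather the index stats
      -- optionally lock the stats so the nightly job leaves this table alone
      DBMS_STATS.LOCK_TABLE_STATS(ownname => 'APP', tabname => 'VOLATILE_ORDERS');
    END;
    /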

  • Performance Optimization Self Service- SAP help requirement

    Hi,
    I want to know whether SAP's help is required for performing the Performance Optimization self-service.
    If we collect an ST12 trace and use it to perform the self-service, is the report generated from the self-service sufficient to take further action, or will I need SAP expertise to implement / take corrective actions?
    In short, can I do the Performance Optimization by myself, or do I need help from SAP?
    Regards,
    Vishal

    Hi,
    1) Is this service available to all customers? (by all customers I mean "Max Attention", "Enterprise Support" etc.)
    I answered this above in my earlier reply; have you checked?
    Enterprise Support customers can get five EGI sessions free per year. Please check
    http://service.sap.com/esacademy
    - click Browse EGIs.
    Your second question ("Does the report itself give suggestions, or do we need to provide the report to SAP?") I also answered above:
    because the guided procedure itself is a proven methodology from SAP, the report provides lots of suggestions measured against SAP best practices.
    You can use it yourself most of the time. If you still need expert guidance from SAP, book EGI sessions; they are called Expert Guided Implementations, delivered as remote support. The duration may vary based on the session.
    Again, the service report is the source, and you have to review it yourself; if you are in an EGI, SAP uses that report for guiding. Please review.
    Thanks
    Jansi

  • Performance Tuning for a report

    Hi,
    We have developed a program which updates two fields, Reorder Point and Rounding Value, on the MRP1 tab in transaction MM03.
    To update the fields, we use the BAPI BAPI_MATERIAL_SAVEDATA.
    The problem is that when we upload the data using a txt file, the program takes a very long time. Recently, when we uploaded a file containing 2,00,000 records, it took 27 hours. Below is the main portion of the code (I have omitted the OPEN DATASET handling etc.). Please help us fine-tune this so that we can upload these 2,00,000 records in 2-3 hours.
    select matnr from mara into table t_mara.
    select werks from t001w into corresponding fields of table t_t001w.
    select matnr werks from marc into corresponding fields of table t_marc.
    loop at str_table into wa_table.
      if not wa_table-partnumber is initial.
        " add leading zeros so the key matches MARA-MATNR
        CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
          EXPORTING
            INPUT  = wa_table-partnumber
          IMPORTING
            OUTPUT = wa_table-partnumber.
      endif.
      clear wa_message.
      read table t_mara into wa_mara with key matnr = wa_table-partnumber.
      if sy-subrc is not initial.
        concatenate 'material ' wa_table-partnumber ' does not exist'
          into wa_message.
        append wa_message to t_message.
      endif.
      read table t_t001w into wa_t001w with key werks = wa_table-HostLocID.
      if sy-subrc is not initial.
        concatenate 'plant ' wa_table-HostLocID ' does not exist'
          into wa_message.
        append wa_message to t_message.
      else.
        case wa_t001w-werks.
          when 'DE40' or 'DE42' or 'DE44' or 'CN61'
            or 'US62' or 'SG70' or 'FI40'.
            read table t_marc into wa_marc
              with key matnr = wa_table-partnumber
                       werks = wa_table-HostLocID.
            if sy-subrc is not initial.
              concatenate 'material ' wa_table-partnumber
                ' not extended to plant ' wa_table-HostLocID
                into wa_message.
              append wa_message to t_message.
            endif.
          when others.
            concatenate 'plant ' wa_table-HostLocID ' not allowed'
              into wa_message.
            append wa_message to t_message.
        endcase.
      endif.
      if wa_message is initial.
        data: wa_headdata   type BAPIMATHEAD,
              wa_PLANTDATA  type BAPI_MARC,
              wa_PLANTDATAx type BAPI_MARCX.
        wa_headdata-MATERIAL     = wa_table-PartNumber.
        wa_PLANTDATA-plant       = wa_table-HostLocID.
        wa_PLANTDATAX-plant      = wa_table-HostLocID.
        wa_PLANTDATA-REORDER_PT  = wa_table-ROP.
        wa_PLANTDATAX-REORDER_PT = 'X'.
        wa_plantdata-ROUND_VAL   = wa_table-EOQ.
        wa_plantdatax-round_val  = 'X'.
        CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
          EXPORTING
            HEADDATA   = wa_headdata
            PLANTDATA  = wa_PLANTDATA
            PLANTDATAX = wa_PLANTDATAX
          IMPORTING
            RETURN     = t_bapiret.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
        write t_bapiret-message.
      endif.
      clear: wa_mara, wa_t001w, wa_marc.
    endloop.
    loop at t_message into wa_message.
      write wa_message.
    endloop.
    Thanks in advance.
    Peter

    Hi Peter,
    I would suggest a few changes in your code. Please follow the procedure below to optimize it.
    Steps:
               Run an SE30 runtime analysis and find out whether the ABAP code or the database fetch is taking the time.
               Run the extended program check or Code Inspector to remove any errors and warnings.
               A few code changes that I would suggest:
              For the select queries from t001w and marc, remove the CORRESPONDING FIELDS clause, as this also reduces performance. (For this, define an internal table with only the required fields, in the order they are specified in the database table, and execute a select query to fetch those fields.)
              Also put an initial check that str_table[] is not initial before you execute the loop.
              Wherever you have used READ TABLE, sort those tables first and use BINARY SEARCH.
              Clear the work areas after every APPEND statement.
              As I don't have an SAP system handy, I would also check whether the BAPI's importing parameter structure is a table. In case it is a table, I would pass all the records to that table directly and then pass it to the BAPI, rather than looping over every record and updating one at a time.
    Hope this helps to resolve your problem.
    Have a nice day
    Thanks

  • Event ID 1008 The Open Procedure for service "BITS" in DLL "C:\Windows\System32\bitsperf.dll" failed. Performance data for this service will not be available. The first four bytes (DWORD) of the Data section contains the error code.

    I keep getting the above error on all my SQL 2012 deployments (Standard and Enterprise) on all Windows Server 2012 Standard machines. I have already tried the following commands in administrator mode to resolve it, without success:
    lodctr bitsperf.dll
    lodctr /R
    Any other suggestions?
    Diane

    Hi Diane Sithoo,
    You posted the same question two times. Please avoid this practice on the forum; I have merged the two threads. Thanks for your understanding.
    According to your description, we need to verify when the error occurs, and whether your SQL Server stops working when it happens. If so, please help us collect the detailed error log in SQL Server Management Studio (SSMS). Refer to the following steps for collecting the error log:
    In SSMS, expand Management, then SQL Server Logs, right-click a log and click View SQL Server Log.
    If SQL Server runs well, there is some error in a Windows Server service, and I recommend you post the question on the Windows Server General forum (http://social.technet.microsoft.com/Forums/windowsserver/en-US/home?forum=winservergen); it is more appropriate, and more experts will assist you there.
    In addition, regarding Event ID 1008, you may need to reload the performance library when it is not properly initialized during installation. Then you can use Windows Reliability and Performance Monitor to verify that performance counters are properly collected and displayed in a Performance Monitor graph. For more information, you can review the following article:
    http://technet.microsoft.com/en-us/library/cc774913(v=ws.10).aspx
    Regards,
    Sofiya Li
    TechNet Community Support

  • One performance view for two classes, possible?

    Hi there,
    My system is still running 2007 R2. I am writing an MP now which contains two classes. There are a few performance collection rules targeting those two classes. I want to create one performance view to display performance data for BOTH classes. Is that possible?
    I have already created an instance group and added both classes as members of the group. By using the group, I was able to create one alert view that displays alerts from either class. Can I use the same trick for the performance view? Thanks!

    In addition, we can also add a dashboard view with two columns for the two classes, and add a performance widget to each column.
    Regards,
    Yan Li

  • Optimizer Statistics collection after upgrade from 8i to 10R2

    I just upgraded a database from 8.1.7 to 10gR2.
    What would be the best approach for optimizer statistics collection? We would like to open the database for testing, but I am afraid some queries are going to run slow without up-to-date stats. Should I gather them manually, or let Oracle's default stats collection job run later on?
    Any suggestions?

    user594143
    You really need a strategy before an upgrade like this, but you have two options -
    a) try to make the 10g stats collection identical to the 8i stats collection. Check the code you used to run, check the 8i default values for the parameters in your current dbms_stats() calls, and write them in explicitly when you run the code under 10g.
    OR
    b) do a full 10g conversion. Get rid of your own collection code, clear out most of the old settings you had in your parameter file for fiddling with the optimizer, do a 'gather_schema_stats' then leave 10g to do its default thing and fix any problems that appear.
    If you have testing time on a non-production system, then (b) is the strategic option - although personally I think it tends to collect too many histograms and still needs some refinement; if you don't have any testing time and you're going straight into production then (a) is the least threatening option (and if someone's made you do that, you might also set the optimizer_features_enable to 8.1.7 until you can do some proper upgrade tests).
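    As a rough sketch of option (b), together with the fallback from option (a) (the schema name here is hypothetical, and you should test the effect before going anywhere near production):
    -- option (b): a plain 10g-style gather for one schema, defaults included
    BEGIN
      DBMS_STATS.GATHER_SCHEMA_STATS(
        ownname          => 'APP',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        method_opt       => 'FOR ALL COLUMNS SIZE AUTO',
        cascade          => TRUE);
    END;
    /
    -- option (a)'s safety net: keep 8.1.7 optimizer behaviour until tested
    ALTER SYSTEM SET optimizer_features_enable = '8.1.7';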
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk

  • Top 5 Performance Counters for SSAS Multidimensional

    I have a SharePoint 2013 and SQL Server 2014 based reporting solution on Azure.
    I have a dedicated VM for SSAS.
    I would like to collect performance data (CPU, RAM, etc.).
    What are the top 5 most important performance counters for SSAS (in PERFMON)?
    Kenny_I

    Hi,
    Every counter has its own use and significance, and which are most important will depend on your requirement, i.e. what exactly you want to monitor.
    Apart from the above, the most useful counters here would be in the MSASXX:Processing group: Rows read/sec and Rows written/sec, which will give you an idea of how fast SSAS is reading data from the data source. As a general rule you should expect to get 40-60,000 rows/sec from SQL Server, and if you're getting less you should try to tune the database you're getting the data from rather than SSAS itself. Partitioning measure groups in SSAS and processing partitions in parallel is also a good way of improving processing performance.
    Please refer to the link below for a better understanding of the performance counters:
    http://ms-olap.blogspot.in/2009/04/performance-counters-for-ssas-2008.html
    Thanks

  • 1636215 - Performance monitoring for the Advanced Adapter Engine

    Hi ALL,
    How can we monitor the performance by changing the parameter to true, as described in the above note? Please help me.
    Thanks,
    Raj

    Hi Dheeraj,
    Since you've gone through the note, I'm assuming you know how to activate performance monitoring for the Adapter Engine.
    However, there are reservations against doing this in any system (Dev/Quality/Production), the reason being that it becomes performance/load intensive when performance monitoring is on.
    Still, you can diagnose bottlenecks and execution trends in Dev/Quality using performance monitoring as suggested in the note, but make sure your box is coping well with it. Please note that once it is activated, you don't have to do anything else: the system will automatically collect performance data, and you will be able to browse through it using the performance monitoring link.
    How to use it?
    You will see a performance monitoring link in Monitoring Home (navigate via Configuration and Monitoring Home from the PI/PO homepage).
    It will help you analyze the message flow on a periodic basis (hourly/daily etc.) and will give you more stats on the number of messages in that time window, the average time taken, message sizes (min, max and avg), interface-by-interface detail of runtime consumption at every stage, etc.
    Please go through standard SAP help for more clarity:
    Monitoring Performance - Advanced Adapter Engine - SAP Library
    Apart from this, there is a great blog by Suseelan on monitoring:
    http://scn.sap.com/docs/DOC-35805
    Hope this helps.
    Regards,
    Abhishek

  • ABAP Performance Optimizer

    dear abapers,
    I have a problem optimizing ABAP code, especially report programs.
    Is there any documentation / step-by-step guide / tips and tricks for best practices in ABAP performance optimization?
    If there is, could you please send it to my mail: [email protected]
    many thanks in advance
    regards
    eddhie

    Hi,
    Take a look at the links below; they have useful info and tips.
    http://www.sapdevelopment.co.uk/perform/performhome.htm
    http://www.thespot4sap.com/Articles/SAPABAPPerformanceTuning_PerformanceAnalysisTools.asp
    http://help.sap.com/saphelp_nw2004s/helpdata/en/6b/4b5ceb971afa469a02268df33f545f/content.htm
    http://www.asug.com/client_files/Calendar/Upload/ACF3DBF.ppt#264,1,Slide 1
    Cheers
    VJ
