Report Performance Testing in BEx or Portal + Process Chain Testing

Hi Gurus,
I am looking for a tool in BW with which BEx reports or Portal reports can be tested.
What I am looking for is a way to log on virtual users (100+) for a report and check how fast that report can be run.
Is there a tool with which I can test process chain performance too?
A how-to guide would be very helpful as well.
Thanks in advance
SLT

I found a solution.

Similar Messages

  • Pointers for optimizing system performance (run time) while running DP process chain with parallel processing

    Hi Experts,
    We are running an APO DP process chain with parallel processing in our company. We are experiencing some issues regarding the run time of the process chain and need your help on the points below:
    - What are the ways we can optimize process chain run time?
    - What special points do we need to take care of when parallel processing profiles are used in the process chain?
    - Is there any specific sequence to be followed for the different processes in the process chain, if there is a best practice?
    - Are there any notes suggesting ways to improve system performance for APO version 7 with enhancement packs 1 and 2?
    Any help will be really appreciated.
    Regards

    Hi Neelesh,
    There are many ways to optimize the performance of process chains (background jobs) in an APO system.
    Firstly, I would recommend identifying the pain areas (steps) that are completing with longer runtimes. Each of those steps then has its own approaches for decreasing the runtime.
    For example, you may end up with steps such as InfoPackage executions, DTPs, DP mass processing jobs etc. that run with longer runtimes. Target each of them separately and find ways to optimize. At the same time, the approach you follow should also be technically feasible from a Basis perspective (system load and utilization).
    Coming to parallel processing, you can use it for different kinds of jobs, such as loading an InfoCube, mass processing, InfoPackage execution, DTP, TSCOPY etc., and explore that option further for each of them.
    Check the link below for more info:
    Performance problems in DP mass processing
    Let me know if you require further info.
    Regards,
    Raj

  • Including an Event Data Change in a Process Chain

    There is a step called "Execution with Data Change in InfoProvider" in the process of including an event data change in a process chain.
    How do we carry this out? Should this be done in BEx or in the process chain itself?

    Dear Eshwari,
    1) To get the 'Execution on data change' option, you need a process chain with the 'Event data change' process type that contains the InfoProvider (the InfoProvider on which the query you are trying to broadcast is created) in its process variant.
    2) Furthermore, additional authorization is needed in order for the user to see such scheduling options in the broadcaster.
    Please assign this authorization object to the affected users.
    BEx Broadcasting Authorization to Schedule: S_RS_BCS
         Activity                              *
         Event ID in Broadcasting Framework    *
         Event Type in Broadcasting Framework  * <== here you should make your selection
         ID of a BI Reporting Object           *
         Type of BI Reporting Object           *
    3) Please also ensure the user has not only the Business Explorer role but also the Business Intelligence role in the Portal.
    Regards,
    Arvind

  • Question about repeating in the process chain

    Hello Gurus,
            My DTP is set to "valid records update, no reporting (request red)" and put into a process chain. The process chain fails on this DTP due to some erroneous records. My questions are:
       (1) Based on the DTP configuration, all the records are transferred to the target successfully except for the erroneous ones. Is that right?
       (2) After I correct those erroneous records in the PSA, I repeat this DTP process in the process chain. What data exactly is transferred by this repeat operation on the DTP? Do we need to take care of all the previously loaded data for this subsequent repeat load?
    Many thanks.

    Hi,
    Please refer to my article for your answer; it also covers a lot of good DTP features.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10339304-7ce5-2c10-f692-fbcd25915c36?quicklink=index&overridelayout=true
    (1) Based on the DTP configuration, all the records are transferred to the target successfully except for the erroneous ones. Is that right?
    Yes. All the erroneous records go into the error stack; you can fix them there and execute the error DTP. This setting ensures that incorrect/incomplete data is not available for reporting.
    (2) After I correct those erroneous records in the PSA, I repeat this DTP process in the process chain. What data exactly is transferred by this repeat operation on the DTP? Do we need to take care of all the previously loaded data for this subsequent repeat load?
    No records are transferred when repeating the job; we need to manually set the status of the request to green. And yes, we need to take care of all the subsequent steps.
    -Vikram

  • To find the time required by the process chain to complete

    Hi Experts,
    I am calculating the average time required by the process chain to complete.
    Is there any way to find the time required by the process chain to complete?
    Thanks in advance.
    Regards,
    Ashwin

    Hi,
    There is a tool provided by SAP to do process chain analysis.
    It is basically an ABAP program, /SSA/BWT, which provides the following BW tools:
    a) Process Chain Analysis: used to perform runtime analysis of process chains. The analysis can be performed not only at the process chain level but also at the process type level.
    b) Detailed Request Analysis
    c) Aggregate Toolset
    d) InfoProvider BPPO Analysis
    So you can run the program and analyse the runtime of your process chains.
    Regards,
    Abhishek

  • Transportation of process chains with ABAP variants

    Hi,
    Could you please help me in finding out how to transport process chains with ABAP variants.
    I can transport them; however, when transporting them I overwrite the already existing variants, which is very unfortunate...
    Could you please advise.
    Thank you in advance!
    /Brian


  • PSA Deletion Process in Process Chains Transport Question

    BI Gurus,
    I am investigating a method of deleting PSA requests. This will be a standalone process, so no loads will be done in this process chain. My question: I cannot use the object type InfoPackage or DTP, since these objects won't be in the process chain, which leaves me the PSA Table option. But since I will be developing this in Dev and transporting through the landscape into Prod, how will this be handled, given that PSA table names change from system to system? Does the system automatically translate them for me, or will the value stay the same?

    Hi Alex,
    The system automatically identifies the table in the Quality and then in the Production system, so you do not have to worry. I had the same question and tested it by transporting the process chain from Development to Quality; in Quality, the system automatically identified the table corresponding to the data target/master data.
    Regards,
    Kams

  • Transportation of Process chains

    Hi Experts,
        Earlier we did not transport InfoPackages along with InfoSources & source systems from Dev --> Production, so we created the InfoPackages directly in Production.
        Now my question: I have created one test process chain in Development and would like to transport it to Production. Please suggest the best possible way to create a process chain in Dev and transport it to Production.
    Regards,
    Ramesh.

    Hi Ramesh,
    Option 1:
    You can create the process chain directly in Production. Most of the time people do have access to create a process chain in Prod; kindly check whether you have the authorization to do so.
    Option 2:-
    I am not suggesting deleting the old InfoPackages in QA and Prod. Let them continue to exist, even though you may no longer use them once the process chain is in place. Create the process chain, test it and move it up to Prod. I don't think the new InfoPackages will hamper your delta load; the delta is not dependent on the InfoPackage.
    If I were you, I would try the second option, and if it does not work during testing, then go for the first option.
    Bye
    Dinesh.

  • Generic Process-Chain to load many individual files - How?

    Hello All,
    There is a requirement to extract 5 CSV files to a single ODS every week.
    The files are individually named and can appear all at the same time or spread out over the week.
    The current loading practice is to employ 5 process chains, one for each file. The process chains are very similar, except that each calls a different InfoPackage which points to a specific file.
    I would like to be able to perform the load with only 1 process chain.
    Initially I thought I could achieve this by running a script that renames the source files to, for example, genericname.csv, and then creating an InfoPackage that uses a logical filename. However, for audit purposes, the source file names must remain unchanged.
    So, does anyone have any suggestions on how to achieve the objective, preferably without using OS-level scripting?
    Many thanks,

    Hi,
    There is no simple way to achieve this. You can create a master process chain that contains the 5 parallel CSV file loads, including the DTPs and upstream data loads. Inside the process chain, you can specify that the next steps start irrespective of errors (versus only on success, which is the default). Upon a successful data load, delete the CSV file so that any time the PC is run again it is not available; you can add an OS command step inside the PC at the end to achieve this.
    That way only the file that needs to be extracted is available. For example, if csv1 needs to be loaded to DSO1, then csv2-csv5 (which load to DSO2-5, or possibly the same DSO) will not be available. So process step 1 will be successful while the remaining steps error out; however, the PC will still complete, since the next steps aren't dependent on the success/failure of each extract.
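    The deletion step could also be done as a small ABAP program step in the chain instead of an OS command. A minimal sketch, assuming the file sits on the application server (the report name and path below are just placeholders):

    REPORT z_delete_loaded_csv.
    " Minimal sketch: delete the already-loaded CSV from the application server
    " so that a rerun of the chain does not pick it up again.
    " The path is a placeholder for the file used by the InfoPackage.
    DATA lv_file TYPE string VALUE '/interface/in/csv1.csv'.

    DELETE DATASET lv_file.
    IF sy-subrc = 0.
      WRITE: / 'Deleted:', lv_file.
    ELSE.
      WRITE: / 'File not found or could not be deleted:', lv_file.
    ENDIF.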
    Alternatively, if you have an external job scheduling agent like Maestro, you can schedule the individual PCs (the 5 PCs as they exist today, each run by the Maestro scheduler) from there.
    Alternatively, if the data is being loaded from a flat file generated from R/3 or another SAP system, you can trigger an event via RFC which will start the respective process chain.
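    For the event option, a minimal ABAP sketch, assuming the chain's start process is scheduled on a background event; the event name Z_START_PC_LOAD and the report name are placeholders, and BP_EVENT_RAISE is the standard function module for raising background events:

    REPORT z_trigger_pc_event.
    " Minimal sketch: raise a background event so that a process chain whose
    " start process waits on this event begins to run.
    " Z_START_PC_LOAD is a placeholder; the event must exist (SM62) and be
    " assigned to the chain's start variant.
    CALL FUNCTION 'BP_EVENT_RAISE'
      EXPORTING
        eventid = 'Z_START_PC_LOAD'
      EXCEPTIONS
        OTHERS  = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Event could not be raised, sy-subrc =', sy-subrc.
    ENDIF.

    The same call could be wrapped in an RFC-enabled function module if the trigger has to come from the R/3 side, as described above.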
    Hope it helps.
    Samir

  • Report performance while creating report on BEx

    Hello all!
    I am creating a report in BOE 4.0 on top of a BEx connection as the source. I have developed reports on top of a universe in the past, and I know that if we keep calculations on the reporting end it hampers report performance. Is this the same case with BEx? If we are following best practices, is it correct to say that we should keep all heavy calculations/aggregations in BEx or the backend for better report performance?
    Can you please share your opinions based on your experience and knowledge? Any feedback will help! Thanks.

    Hi,
    It is definitely best practice to delegate as many CKFs as possible to the cube, and to put RKFs, and filters too, in the BEx query.
    Also, add default values to your variables (this will speed up generation of the BICS transient universe).
    Also, since Patch 2.10 we are seeing some significant performance improvements, reducing 'document initialization' and 'time to prompts' by up to 50% (steps such as these often took 1.5 minutes, even on properly sized systems).
    Also, make sure you have BW corrections like this one implemented: Note 1593802, Performance optimization when loading query views.
    In the BusinessObjects landscape - especially with BI 4.0 - it's all about sizing and tuning. Here is your bible, the 'sizing companion' guide: http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000738725&_OBJECT=011000358700000307202011E
    Pay particular attention to the BICSChunkSize registry settings.
    Also check the -Xmx JVM heap size for the Adaptive Processing Server that is running the DSL_Bridge service.
    Regards,
    H

  • Bex Report Performance

    Dear Friends,
    I would like to know whether complex authorizations can also affect BEx report performance.
    In one of my scenarios there are two users, A & B:
    A has the relevant authorizations required for reporting, drill-down etc.
    B has the SAP_ALL authorization.
    When the same report is executed by both users on the same system,
    user B (SAP_ALL authorization) retrieves the data considerably faster than user A.
    The difference is about 10 minutes.
    There are some exclude selections in the report.
    So my conclusion is that complex authorizations do also hamper query performance.
    Please confirm & share your views.
    Thanks & Best Regards,
    Vivek Tripathi
    +91-9372313000

    Hi Vivek
         Can you help us understand what the exact problem was and how you resolved it (the solution at the extraction / modeling / reporting end)?
         I have quite a similar issue with my report: I have a header + item report on an InfoSet.
    • The header report takes seconds and the item report takes minutes.
    • The same report executed with the exact same parameters has inconsistent performance results, meaning one time it takes 1 minute and the next time the same report, with the same user and same authorization, takes 5 minutes.
        Any help on this would be really appreciated. I suspect it is not an issue with the report at all, as no changes happened between the pre and post check.
    Additional information:
    We create secondary bitmap indexes every weekend; I do not see that as one of the root causes.
    Apart from that, we have our regular daily loads running for master data and transaction data in series.
       Thanks in Advance.
    Much Regards
    Jagadish Thirumalachetty.

  • BI content to view reports in BEx and Portal

    Hi,
    I want to install all the BI content related to Query and Reporting.
    Can you please help with how to proceed with this, i.e. what all I need to activate?
    My requirement is to generate all the reports on 'Vendor Evaluation' (from ERP) and 'Tender Evaluation' (Bidding Engine, from SRM). Customers should see all the reports in the NetWeaver Portal as well as in BEx.
    Please help.
    I am also quite confused by the terms Web Template, BEx Web Template, BEx Web Item and Web Item.
    To view the reports (in BEx and the Portal) whose data is extracted from ERP, what all do we need to activate?
    To view the reports (in BEx and the Portal) whose data is extracted from SRM, what all do we need to activate?
    Please help in this regard.
    Regards
    Sharif.

    Hi Praveen,
    Thanks for your immediate response.
    My requirement is that I will have to develop reports on Vendor Evaluation (SS is ERP) and Tender Evaluation (Bidding Engine; SS is SRM).
    To develop the reports for both functionalities in BI and to view them in Excel mode and in the Portal, what is the BI Content I need to activate in BI? This is my question.
    Do we need to activate the Web Template alone, or both the Web Template and the Web Item?
    In case we need to activate both, which of the following should I activate:
    Web Item or BEx Web Item,
    Web Template or BEx Web Template?
    Please provide me the necessary information.
    thanks in advance..
    Regards
    Sharif.

  • Reg: Process Chain, query performance tuning steps

    Hi All,
    I came across a question like this: there is a process chain of 20 processes, of which 5 have completed; at the 6th step an error occurred and it cannot be rectified, so I should start the chain again from the 7th step. If I go to a particular step I can run that particular step, but how can I restart the entire chain from step 7? I know that I need to use a function module, but I don't know the name of the FM. Please help me out.
    Please also let me know the steps involved in query performance tuning and aggregate tuning.
    Thanks & Regards
    Omkar.K

    Hi,
    Process Chain
    Method 1 (when it fails in a step/request)
    /people/siegfried.szameitat/blog/2006/02/26/restarting-processchains
    How is it possible to restart a process chain at a failed step/request?
    Sometimes it doesn't help to just set a request to green status in order to run the process chain from that step on to the end.
    You need to set the failed request/step to green in the database, and you also need to raise the event that forces the process chain to run to the end from the next request/step on.
    To do this, open the messages of the failed step by right-clicking on it and selecting 'Display messages'.
    In the popup that opens, click on the tab 'Chain'.
    In a parallel session, go to transaction SE16 for table RSPCPROCESSLOG and display the entries with the following selections:
    1. Copy the variant from the popup to field VARIANTE of table RSPCPROCESSLOG.
    2. Copy the instance from the popup to field INSTANCE of table RSPCPROCESSLOG.
    3. Copy the start date from the popup to field BATCHDATE of table RSPCPROCESSLOG.
    Press F8 to display the entries of table RSPCPROCESSLOG.
    Now open another session and go to transaction SE37. Enter RSPC_PROCESS_FINISH as the name of the function module and run the FM in test mode.
    Now copy the entries of table RSPCPROCESSLOG into the input parameters of the function module as follows:
    1. rspcprocesslog-log_id -> i_logid
    2. rspcprocesslog-type -> i_type
    3. rspcprocesslog-variante -> i_variant
    4. rspcprocesslog-instance -> i_instance
    5. enter 'G' for parameter i_state (sets the status to green).
    Now press F8 to run the FM.
    The current process will now be set to green, the following process in the chain will be started, and the chain can run to the end.
    Of course you can also set the state of a specific step in the chain to any other possible value like 'R' = ended with errors, 'F' = finished, 'X' = cancelled ....
    Check out the value help on field rspcprocesslog-state in transaction se16 for the possible values.
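    If you prefer to script this instead of using the SE37 test mode, here is a minimal sketch under the same assumptions. The selection values are placeholders; take them from the 'Chain' tab of the failed step's message popup, exactly as described above.

    REPORT z_restart_chain_step.
    " Minimal sketch: set a failed process chain step to green so that the
    " chain continues from the next step (same logic as the manual SE37 call).
    " The WHERE values below are placeholders; error handling is omitted.
    DATA ls_log TYPE rspcprocesslog.

    SELECT SINGLE * FROM rspcprocesslog INTO ls_log
      WHERE variante  = 'ZPC_STEP_VARIANT'   " placeholder: variant from the popup
        AND instance  = 'ABC123'             " placeholder: instance from the popup
        AND batchdate = '20240101'.          " placeholder: start date from the popup

    IF sy-subrc = 0.
      CALL FUNCTION 'RSPC_PROCESS_FINISH'
        EXPORTING
          i_logid    = ls_log-log_id
          i_type     = ls_log-type
          i_variant  = ls_log-variante
          i_instance = ls_log-instance
          i_state    = 'G'.                  " G = green; also possible: R, F, X (see above)
    ELSE.
      WRITE: / 'No matching entry found in RSPCPROCESSLOG.'.
    ENDIF.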
    Query performance tuning
    General tips
    Use aggregates and compression.
    Use fewer and less complex cell definitions where possible.
    1. Avoid using too many navigational attributes.
    2. Avoid RKFs and CKFs where possible.
    3. Avoid too many characteristics in the rows.
    Use T-codes ST03 or ST03N:
    Go to transaction ST03 > switch to expert mode > from the left-side menu, under system load history and distribution for a particular day, check the query execution time.
    /people/andreas.vogel/blog/2007/04/08/statistical-records-part-4-how-to-read-st03n-datasets-from-db-in-nw2004
    /people/andreas.vogel/blog/2007/03/16/how-to-read-st03n-datasets-from-db
    Try table RSDDSTATS to get the statistics.
    Using cache memory will decrease the loading time of the report.
    Run the reporting agent at night and send the results by email. This ensures use of the OLAP cache, so later report executions will retrieve the result faster from the OLAP cache.
    Also try
    1. Use different parameters in ST03 to see the two important figures: the aggregation ratio and the records transferred to the front end vs. records selected from the DB.
    2. Use the program SAP_INFOCUBE_DESIGNS (Performance of BW InfoCubes) to see the aggregation ratio for the cube. If the cube does not appear in the list of this report, try running RSRV checks on the cube and its aggregates.
    Go to SE38 and run the program SAP_INFOCUBE_DESIGNS (see the sketch at the end of this reply).
    It will show the dimension vs. fact table sizes in percent. If you mean the speed of queries on a cube as the performance metric of the cube, measure the query runtime.
    3. The +/- signs are the system's valuation of the aggregate's design and usage. '++' means its compression is good and it is accessed often (in effect, performance is good); if you check its compression ratio, it should be good. '--' means the compression ratio is not so good and access is also low (performance is not so good). The more plus signs, the more useful the aggregate is and the more queries it satisfies; the greater the number of minus signs, the worse the evaluation of the aggregate.
    If the valuation is "-----", the aggregate is just overhead and can potentially be deleted; "+++++" means the aggregate is potentially very useful.
    Refer.
    http://help.sap.com/saphelp_nw70/helpdata/en/b8/23813b310c4a0ee10000000a114084/content.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/60/f0fb411e255f24e10000000a1550b0/frameset.htm
    4. Run your query in RSRT in debug mode. Select "Display Aggregates Found" and "Do not use cache" in the debug options. This will tell you whether the query hit any aggregates while running; if it does not show any aggregates, you might want to redesign your aggregates for that query.
    Your query performance can also depend on the selection criteria, and since you have a selection on only one InfoProvider, just check whether you are selecting a huge amount of data in the report.
    Check the query read mode in RSRT (whether it is A, X or H); the advisable read mode is X.
    5. In BI 7, statistics need to be activated for ST03 and the BI Administration Cockpit to work.
    This is done by implementing the BW Statistics Business Content: you need to install it, feed it data, and then analyse it through the ready-made reports.
    http://help.sap.com/saphelp_nw70/helpdata/en/26/4bc0417951d117e10000000a155106/frameset.htm
    /people/vikash.agrawal/blog/2006/04/17/query-performance-150-is-aggregates-the-way-out-for-me
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/ce7fb368-0601-0010-64ba-fadc985a1f94
    http://help.sap.com/saphelp_nw04/helpdata/en/c1/0dbf65e04311d286d6006008b32e84/frameset.htm
    You can go to T-code DB20, which gives you all the performance-related information such as:
    Partitions
    Databases
    Schemas
    Buffer Pools
    Tablespaces etc
    Use the tool RSDDK_CHECK_AGGREGATE in SE38 to check for corrupt aggregates.
    If aggregates contain incorrect data, you must regenerate them.
    Note 646402 - Programs for checking aggregates (as of BW 3.0B SP15)
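    As a small convenience, the InfoCube design check from point 2 can also be submitted from your own code instead of SE38. A minimal sketch; the wrapper report name is a placeholder, while SAP_INFOCUBE_DESIGNS is the standard report named above:

    REPORT z_run_infocube_checks.
    " Minimal sketch: submit the standard report that lists the dimension vs.
    " fact table size ratios, then return to the calling program.
    SUBMIT sap_infocube_designs AND RETURN.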
    Thanks,
    JituK

  • Performance Management - Transporting "Define Tabs and Process config"

    Good Morning Experts,
    We are moving from Development into our PPT (pre-production test) environment and have successfully transported the Performance management category settings/values, as well as the Template itself.
    Upon executing the Performance Management toolset from the portal side, I see that my tab configuration for the template did not transport.  I have searched high and low for any reference to transporting this config step, but have been unsuccessful, thus this post.
    Does anyone have any suggestions for transporting this configuration? As you know, the IMG path Personnel Management > Personnel Development > Objective Setting and Appraisals > Define Tabs and Process Configuration for Template launches a Web Dynpro app to make this configuration step, so saving these setup selections is the only option. How can I force these changes across? I look forward to your responses!
    Chris Thomas
    Duke University and Health System

    All - the solution:
    Implement the corrections according to SAP Note 1428054.
    Run the report RHHAP_Transport_tab_config, which will generate the transport containing entries from the tables:
    hrhap_tab
    hrhap_tab_data
    hrhap_tab_t
    Basis implemented the note, I ran the report and transported the tabs, and all is well.
    Chris

  • Report Performance for GL item level report.

    Hi All,
    I have a requirement for a GL line item report, so I have created a data model like 0FI_GL_4 -> DSO -> cube and tested it; everything is fine, but when executed in production the report performance is very bad.
    The report contains document number, GL account, company code and posting date objects.
    I have decided to do the following to improve reporting performance:
    ·         Create an aggregate on the document and GL characteristics
    ·         Compression
    Can I build the aggregates first and then do the compression?
    Please let me know if I am missing anything.
    Regards,
    Naani.

    Hi Naani,
    First fill the aggregates, then do the compression. Run SAP_INFOCUBE_DESIGNS and check the size of the dimensions; maintain the line item and high cardinality flags on large dimensions, and set the cache for the query in RSRT.
    Try to reduce the navigational attributes in the report. The document below may help you.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6071ed5f-1057-2e10-deb6-d3426fec0219?QuickLink=index&…
    Regards,
    Jagadeesh
