Portal Performance Trend Report

Hi Portal Gurus,
I have a request.
For an ABAP-based system we can get historical performance reports, but how can we get daily, weekly, or monthly performance reports for the Portal?
We have the Mercury tool to get reports for a few pages based on users and response time.
I have also checked the response summary in monitoring, but we have multiple dialog instances and server nodes, and collecting the data from each of them is not a practical option.
I'm looking for a better way to measure portal performance.
Regards
Sumanta Chatterjee

Hi
There are some portal activity reports available in the portal: check User Administration -> Activity Report. You can also customize these reports to your requirements; basically they are user activity reports.
Also, in NetWeaver Administrator (http://<portal url>:<port>/nwa) you can access all kinds of usage data.
Shankar

Similar Messages

  • Portal performance monitoring scripts : (Unable to generate reports)  HELP

    Hi,
    Using 10.1.2.0.0
    I followed README.html document to load the logs files to generate reports for Portal Performance.
    First of all, while running loadlogs.pl I keep getting the following error. I even tried adding -nodirect but still get the same error, and I don't know why. It does look like some data was loaded into the OWA_LOGGER table.
    C:\ORACLE_PRODUCTS\PORTAL_AS\portal\admin\plsql\perf\loader>perl loadlogs.pl -logical_host localhost -connection owa_perf/owa_perf@orcl -http_logfile C:\ORACLE_PRODUCTS\PORTAL_AS\Apache\Apache\logs\error_log.1130457600 -webcache_logfile C:\ORACLE_PRODUCTS\PORTAL_AS\webcache\logs\access_log -oc4j_logfile C:\ORACLE_PRODUCTS\PORTAL_AS\j2ee\OC4J_Portal\application-deployments\portal\OC4J_Portal_default_island_1\application -nodirect
    25-Oct-05 13:20:17, Copying abc:C:\ORACLE_PRODUCTS\PORTAL_AS\Apache\Apache\logs\error_log.1130241600
    25-Oct-05 13:20:17, Loading C:\DOCUME~1\whitesox\LOCALS~1\Temp\abc_error_log.1130241600.20051025.132017
    25-Oct-05 13:20:21, Copying abc:C:\ORACLE_PRODUCTS\PORTAL_AS\j2ee\OC4J_Portal\application-deployments\portal\OC4J_Portal_default_island_1\application
    25-Oct-05 13:20:21, Loading C:\DOCUME~1\whitesox\LOCALS~1\Temp\abc_application.20051025.132021 -nodirect
    SQL*Loader-350: Syntax error at line 127.
    Token longer than max allowable length of 258 chars
             end",
            ^
    25-Oct-05 13:20:22, Copying abc:C:\ORACLE_PRODUCTS\PORTAL_AS\webcache\logs\access_log
    25-Oct-05 13:20:31, Loading C:\DOCUME~1\whitesox\LOCALS~1\Temp\abc_access_log.20051025.132022
    Then I ran reports.sql, but I don't see any reports being generated, although running this script did populate some other tables. I tried running some other scripts as well, but I still don't see any reports being generated, as opposed to what the README.html document says, i.e. "A sample web page (reports.html) is included which provides links to the generated reports." How do I actually get to see the reports, and where are they generated? Is there something else I am missing? No matter what script I run, I don't see any report being generated, and the documentation is not very clear on this. Can someone please help me out here? Thanks

    Hi!
    You have to change to the directory
    $ORACLE_HOME/portal/admin/plsql/perf/scripts
    (you can find reports.sql in it) before you run the reports.sql script.
    It will produce several .txt files.
    After running the script, just open reports.html, which points to the generated files.
    A better place to ask questions like this:
    Portal Performance and Scalability
    http://forums.oracle.com/forums/forum.jspa?forumID=15

  • Portal performance report

    Hi Portal GURUs,
    We have configured Portal Activity Reports for monitoring hits, but we also need to monitor Portal performance.
    Please suggest whether there are any built-in features to get these reports.
    Regards
    Kiran

    Hi,
    You can monitor the Portal performance by using CCMS (and GRMG).
    The links below will also help you:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/48fead90-0201-0010-6e83-b43f5dd4d338
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3ba6a290-0201-0010-d684-c94b1c765ae9
    Raghu

  • Monthly Trend Report with YTD.

    Hi gurus,
    In my query, I want the previous year's sales to be displayed along with the current year's sales and the growth.
    Now I have to create a monthly trend report with YTD values at the end. Months should be drilled across by default, and at the end there should be YTD values as well (for the same key figures).
    The report output should look as follows (if run for only two months, say April 09 and May 09, YTD would cover April 09 to May 09):
                           April 09                May 09               YTD Value
    Sales District 1       KF1   KF2  KF3          KF1  KF2  KF3        KF1  KF2  KF3
    Considering that for the previous year I just want net sales (not the other key figures), and that the report should have the complete month-wise trend along with YTD, I cannot simply create a structure either.
    Can anyone please tell me how to go about achieving this?
    Thanks & regards,
    Sree

    Hi,
    This can be done using a customer exit; see the blogs and articles related to customer exit variables:
    http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Using Text Variables with Customer Exits in Report Headings
    https://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/13221 (original link is broken)
    Customer Exit Variables in BW/BI Reports
    Using Customer Exit Variables in BW or BI Reports Part - 1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/using%20customer%20exit%20variables%20in%20bw%20or%20bi%20reports%20part%20-%201.pdf
    How to use Customer Exit Variables in BW Reports: Part - 2
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/g-i/how%20to%20use%20customer%20exit%20variables%20in%20bw%20reports%3a%20part%202.pdf
    Using Customer Exit Variables in BW/BI Reports Part - 3
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/10fc4382-afa6-2c10-1380-fa224fe4324f&overridelayout=true
    Using Customer Exit Variables in BW/BI Reports: Part - 4
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f0fefc77-40e3-2c10-8da3-d4bfcb013387?quicklink=index&overridelayout=true 
    Calculating the Ageing of the Materials
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/a-c/calculating%20the%20ageing%20of%20the%20materials.pdf
    Thanks
    Reddy
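    To illustrate the customer-exit approach these articles describe, here is a minimal sketch, assuming an enhancement project on RSR00001 (include ZXRSRU01 of EXIT_SAPLRRS0_001); the variable names ZV_YTD and ZV_MONTH are placeholders, not standard objects:
    * Sketch only: fill a hypothetical exit variable ZV_YTD with the range
    * from period 001 of the fiscal year up to the period the user entered
    * in the hypothetical entry variable ZV_MONTH (0FISCPER format YYYYPPP).
    DATA: l_s_range     TYPE rrrangesid,
          l_s_var_range TYPE rrrangeexit.
    CASE i_vnam.
      WHEN 'ZV_YTD'.
        IF i_step = 2.  "processed after user input
          READ TABLE i_t_var_range INTO l_s_var_range
               WITH KEY vnam = 'ZV_MONTH'.
          IF sy-subrc = 0.
            l_s_range-sign = 'I'.
            l_s_range-opt  = 'BT'.
            CONCATENATE l_s_var_range-low(4) '001' INTO l_s_range-low.
            l_s_range-high = l_s_var_range-low.
            APPEND l_s_range TO e_t_range.
          ENDIF.
        ENDIF.
    ENDCASE.
    The same i_step = 2 pattern can also fill text variables for the column headings, as the first article above describes.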

  • Portal performance KPIs

    I need to define an SLA for Portal performance.
    What would be the top KPIs or metrics for this?
    What are the tools/reports to measure it?

    Hi,
    Usually customers expect the SLA to define:
    a) an average response time (or processing time) KPI
    This KPI should be carefully defined for well-described process dialog steps (for UI-based scenarios), as well as for a well-defined amount of processed data.
    For background jobs, processing time can be defined in direct relation to the volume of data to be processed.
    The network environment (LAN/WAN) has a significant impact on response time, as does the system hardware (CPU, memory, disk I/O). This means you have to describe the conditions (fast CPU, enough memory, good network, etc.) with parameters that are as concrete as possible, directly in the SLA together with the KPI.
    b) a processing throughput or capacity KPI, e.g. number of concurrent users, parallel tasks, orders per hour, and so on
    How many users can run on the system is a KPI that customers sometimes require. Like the response time KPI, it should be documented together with the hardware requirements and the data volume requirements.
    Regards,
    Markus

  • Popularity Trends report always returns zero

    Hello,
    I have SharePoint installed on "Windows Server 2008 R2".
    I have a SQL database installed on the same machine.
    I created a new web application on port "2020", and then I created a new "Publishing" site collection.
    I activated the "Reporting" feature at the site collection level.
    I opened Central Administration, "Monitoring >> Configure usage and health data collection", checked the "Enable usage data collection" check box, and checked all the "Events to Log" check boxes.
    I have configured the following service applications:
           - Business Connectivity Service
           - Excel Services Application
           - Search Service Application
           - Security Token Service Application
           - Application Discovery and Load Balancer Service Application
           - WSS_UsageApplication
    I ran the search crawl, and it completed successfully.
    The search service account is a member of the "WSS_WPG" group.
    I have checked the following values from the SharePoint PowerShell console:
           AppEventTypeId              : 00000000-0000-0000-0000-000000000000
           EventTypeId                 : 1
           EventName                   : Views
           LifeTimeManagedPropertyName : ViewsLifeTime
           RecentManagedPropertyName   : ViewsRecent
           ApplicationName             :
           RecommendationWeight        : 1
           RelevanceWeight             : 1
           RecentPopularityTimeframe   : 14
           AggregationType             : Count, UniqueUsers
           Rollups                     : SiteSubscriptionId, SiteId, ScopeId
           TailTrimming                : 2
           Options                     : AllowAnonymousWrite
           IsReadOnly                  : False
    I opened the "default.aspx" page on the portal (I have opened it more than 10 times in different browser windows).
    The next day I opened the "Popularity Trends" report and found that it returns zero.
    I opened the "Analytics Report" database and then the "AnalyticsItemData" table; there are already items in the table.
    So I need to know why the "Popularity Trends" Excel report returns zeros all the time.
    ASk

    Hi,
    According to your description, the Popularity Trends report always returns no records.
    Please check the status of these three timer jobs: Microsoft SharePoint Foundation Usage Data Import, Microsoft SharePoint Foundation Usage Data Processing, and Web Analytics Trigger Workflows, and see whether they are configured to run at regular intervals.
    You can also take a look at these two links about a similar issue for more information:
    http://www.myriadtech.com.au/blog/Ben/Lists/Posts/Post.aspx?ID=7
    http://sharepoint.stackexchange.com/questions/66476/whats-popular-webpart-is-empty
    Feel free to reply if there is any progress.
    Best regards,
    Patrick
    Patrick Liang
    TechNet Community Support

  • Improve Portal performance with BI-Java by adding additional server node.

    Hi SAP Expert
    I wonder whether anyone has attempted this before.
    In an environment with BI-Java and the Portal Java stack installed on the same server, is it possible to improve system performance by adding an additional portal server node? Has anyone seen an improvement from adding the extra server node?
    I am after suggestions and experience on how portal performance can be improved, taking into account that BI-Java shares the Portal's resources.
    Any comments will be most appreciated.

    Hi Jim,
    We have this configuration at our site, with Portal and BI running together (not federated). Recommendations would be to:
    1. Set a sensibly large max heap size on each Portal server node (at least 2 GB, if not larger) and implement at least several nodes. For example, we have 4 physical nodes, each running 3 server nodes with a 2 GB max heap apiece, so 12 server nodes in total.
    2. Implement the BI Safety Belt for large result sets (note 1127156).
    3. Implement the latest JVM/JDK you can in your environment. We found much improved performance after implementing JDK SR10 with the J9/2.3 options enabled, e.g. -Xjvm:j9vm23 -Xsoftrefthreshold0.
    4. Patch your BI components on the Portal (BIBASES, BIWEBAPP, etc.) to the latest available patch level for your Portal SP level.
    5. Make sure all your RFC connections are load balanced to the backend and tuned for the kind of load you expect on your reports, and that the BI backend is sized appropriately in terms of application servers, dialog work processes, RFC-enabled logon groups in SMLG, etc.
    6. Consult notes 1048691 (especially useful; for example, the options &PROFILING=X&TRACE=X can be added to reports for performance tracing), 937697, 1021921, and 948158 for information about problem analysis in this scenario.
    7. Implement SAP Web Dispatcher for the Portal to reduce the load from static MIME files.
    8. Tune the Portal application. There's plenty of information on SDN related to Portal performance tuning.
    I hope this helps!
    Cheers,
    Marc

  • How to improve query performance when reporting on ods object?

    Hi,
    Can anybody tell me how to improve query performance when reporting on an ODS object?
    Thanks in advance,
    Ravi Alakuntla.

    Hi Ravi,
    Check these links, which may cover your requirement:
    Re: performance issues of ODS
    Which criteria to follow to pick InfoObj. as secondary index of ODS?
    PDF on BW performance tuning,
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cccad390-0201-0010-5093-fd9ec8157802
    Regards,
    Mani.

  • No Daily Sales/Trend Report for today?

    Has anyone received their iTunes Connect Daily Sales/Trend Report for today (1/25/10)? I have not received mine yet. This is the first time it hasn't been up at this hour.
    It usually arrives at 2:30am (it's 9:30am here now) or if it's delayed, usually in another region, there's a note in big red text on the reports page.

    As I recall (back when I only had a few apps just starting in the store), yes, no daily sales means no daily report. Wait for the weekly report to confirm.

  • Portal Performance is Very Slow..Pls Help

    Hi to all.
    Our machine is a Sun 280R running Sun Solaris 8.
    We are using Oracle 9iAS 9.0.2.0.1.
    Our Portal is 9.0.2.0.1 Release 2.
    Everything is installed on one machine (infrastructure/apps).
    We are having difficulties with Portal performance.
    Logging in and viewing pages is slow, and when we click the Customize button it is even slower (50 seconds).
    Can anybody help?
    How can we check what is wrong here?
    Please help, this is urgent.
    Thanks.

    You need to type in the name of a valid tablespace in the database where you are trying to install the owa_perf tables.
    As it says in the setup guide (readme.htm) that comes with the scripts:
    "During the installation process, you will be asked for three tablespace destinations for:
    * the log tables
    * the log indices
    * the log materialized views
    It is recommended, though not mandatory, that you create and specify individual tablespaces for the storage of these elements. You cannot use DEFAULT as a valid tablespace for these responses."
    ...so you need to type in the name of a valid tablespace for each entry. You can type the same name for each, but it asks for three choices because each one can get large when your logs are large; asking for them separately allows you to place them on separate disks or in tablespaces that you can drop independently of other data.

  • AR Aging Trend report data through backend tables

    Hi,
    We have the AR Aging Trend report live and running. The report is optimized to the maximum, but during the month-end closing period it generates a lot of timeout/proxy errors. Later we understood that the users are actually using the report to dump the data out to Excel. As this report is causing problems, a requirement has come up to provide the AR aging data from the back-end tables. In our current flow, the 0FI_AR_4 DataSource feeds a DSO called Customer Line Items. Now I need to load the data from this DSO into another DSO, filtered on a particular company code and on open items, and the key date should always point to last Wednesday; even with the AR Aging Trend report, users always run it with last Wednesday as the key date. If I want to load the data from the one DSO to the other, how can I select the data for last Wednesday given any system date?
    Regards
    Vijay

    solved
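    For anyone landing here later, one way to derive "last Wednesday" from the system date (a sketch only, not the poster's actual solution) is a small routine like the following, e.g. inside a DTP filter or start routine:
    * Sketch: DATE_COMPUTE_DAY returns the weekday number (1 = Monday ... 7 = Sunday).
    * Wednesday = 3; the offset rolls the system date back to the most recent Wednesday
    * (today itself if today is a Wednesday; subtract a further 7 days in that case
    * if the previous Wednesday is wanted instead).
    DATA: l_day    TYPE scal-indicator,
          l_offset TYPE i,
          l_keydat TYPE sy-datum.
    CALL FUNCTION 'DATE_COMPUTE_DAY'
      EXPORTING
        date = sy-datum
      IMPORTING
        day  = l_day.
    l_offset = ( l_day - 3 + 7 ) MOD 7.
    l_keydat = sy-datum - l_offset.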

  • Can the Performance Detail reports be exported as PDF without the Detail Table?

    We previously generated significant numbers of Performance Detail reports out of SCOM 2007 R2 as PDF files, and the 'Detail Table' did not show.
    Now that we are on SCOM 2012 R2, the same reports show the 'Detail Table' when exported to PDF. We report on 6 to 12 months of data, so these tables are huge, and are also useless for our purposes.
    Is there some way to suppress the Detail Table when exporting a Performance Detail report to PDF?

    Hello,
    Please see if the method in the following post can meet your requirements:
    SCOM reports on performance counters for large groups of servers
    http://www.bictt.com/blogs/bictt.php/2010/11/28/scom-reports-on-performance-counters-for-large-groups-of-servers

  • Sale and trend report is too late?

    Hello every body!
    I usually get Apple's sales and trends report around the middle of the month (around the 10th to the 15th). My friend got the sales and trends report for January 2012 a week ago, on the 9th, but I still have nothing.
    This makes me a little bit worried. Is this normal, and does it happen to you often?
    Sorry about my bad English

    The Monthly Financial Reports always show up around the same time....don't panic yet.
    It is best to be patient...always

  • Multiple lines for same app on daily trend report?

    Hey All,
    This is my first app and my first time reading the reports, so pardon me if I am confused. I noticed that there are several lines (about 20) on my trend report, all for the same app. From reading about the trend reports, it seemed like each app was supposed to have only one line. Can anyone tell me what's going on? The only thing I can think of is that I switched my primary category, so maybe strange things happened while the app was transitioning. Thanks.

    Typical behaviour is that it'll start to drop off a bit. You'll then probably get a boost at the first update assuming your release date gets bumped. Big boosts come from being featured but for most developers that remains a dream.
    Basically it's all about visibility. When you're on the first page of your category people are a lot more likely to buy than when you're on the 10th.

  • Split of Cubes to improve the performance of reports

    Hello friends. We are now implementing Finance GL line items for global automobile operations at BMW, with services outsourced to Japan, which has increased the data volume to 300 million records over the 2 years since go-live. We have 200 company codes.
    How can we improve performance?
    1. Please advise on splitting the cubes based on year and on company codes, which are region based; that would mean the European users run the report from one cube and the same report for America runs on another cube.
    The question here is: if I make 8 cubes (2 for each year: 1 for current-year company code group ABC and 1 for current-year DEF; 2 for the previous year: ABC and DEF; 2 for the archive year: ABC and DEF),
    then how can I tell the query which cube to read the data from? Since company code is an authorization variable, picking up that company code value and building a customer exit variable for the InfoProvider would add a lot of work.
    Is there a good way to do this? Does splitting the cubes by company code make sense, or should it just be by year?
    Please suggest a good step-by-step approach to splitting cubes for 60 million records over 2 years; growth will be the same for the next 4 years since more company codes are coming.
    2. Please advise whether splitting the cube will improve report performance, or whether it will make it worse since the query now needs to go through 5-6 different cubes.
    Thanks
    Regards
    Soniya

    Hi Soniya,
    There are two ways in which you can split your cube: either based on year or based on company code (i.e. region). While loading the data, write code in the start routine to filter the data. For example, if you are loading data for three regions, say 1, 2, and 3, your code for the region-1 cube would be something like:
    DELETE SOURCE_PACKAGE WHERE REGION EQ '2' OR
    REGION EQ '3'.
    This will load only the data corresponding to region 1 into that cube.
    You can build your reports either directly on these cubes, or you can create a MultiProvider on top of them and build the report there.
    Thanks..
    Shambhu
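    For the year-plus-company-code split described in the question, the same start-routine idea extends to both conditions; a minimal sketch (the field names COMP_CODE and FISCYEAR and the literal values are placeholders for whatever each cube is meant to hold):
    * Sketch only: keep in this cube just the company codes and the fiscal
    * year it is meant to hold; everything else is removed before the update.
    DELETE SOURCE_PACKAGE WHERE comp_code <> 'ABC1' AND comp_code <> 'ABC2'.
    DELETE SOURCE_PACKAGE WHERE fiscyear <> '2009'.
    A MultiProvider over the per-year/per-region cubes, as suggested above, then lets one query definition serve all of them.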
