SP3 - Secure portal SSL performance improvements?

Portal Gurus,
One of the major enhancements I was looking for in SP3 was an improvement in
the SSL gateway performance. However, testing I've done so far shows only a
30% improvement in requests/second, and in open mode SP3 actually seems a
little slower than SP2. I realize there are environment- and workload-specific
factors at work, but under near-identical conditions comparing SP2 to SP3 in
secure mode, the performance increase wasn't what I had hoped for.
I followed the tuning instructions in the SP3 release notes and noticed a
small improvement, ~5%, and was wondering what other people are seeing.
Given the numbers I'm seeing, I have to wonder whether SSL is really viable
for a busy portal site.
Anyone seeing a big improvement in SSL performance with SP3?
Cheers,

I would recommend applying SP3a. The SSL code changed only in SP3a, and it should give you much better performance.
Otherwise, these tuning parameters should help performance. Go to Admin Console | Gateway Management | Manage Gateway Profile, select "Show Advanced Options" at the bottom of the page, and change the following...
1) Increase the value of "Maximum Thread Pool Size". The default is 200, and it can be increased to 800.
2) Also increase the Gateway Timeout. The default is 120000; this can be increased to 125000. Then click Submit.
3) Finally, on the Gateway server, modify the /opt/SUNWips/bin/ipsgateway script. Find the line that defines the CMD environment variable and change the '-mx128m' parameter to '-mx256m'.
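For reference, the edit in step 3 looks roughly like the following. Treat it as an illustrative before/after only; the actual contents of the CMD line vary by installation:

    # /opt/SUNWips/bin/ipsgateway (illustrative excerpt -- your CMD line will differ)
    # Before:
    CMD="... -mx128m ..."
    # After: raise the JVM maximum heap from 128 MB to 256 MB
    CMD="... -mx256m ..."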

Similar Messages

  • Webcenter Portal Application Performance improvement

    Hi All,
    We are using JDeveloper version 11.1.1.6.0 and WebLogic Server 11g.
    We have deployed a WebCenter Portal application on it.
    We want to know how to improve the performance of this web application. What are the ways to improve its performance?

    Generally, you need to look into each area of your WebCenter Portal application individually to increase its performance.
    Check the links below and see if you can follow and implement them:
    http://download.oracle.com/docs/cd/E14571_01/core.1111/e10108/adf.htm
    http://download.oracle.com/docs/html/B25947_01/bcadvvo002.htm#sm0342
    http://www.oracle.com/technetwork/developer-tools/jdev/introduction-best-practices-131743.pdf
    http://technology.amis.nl/blog/1385/worst-practices-when-using-oracle-jdbc-drivers
    http://download.oracle.com/docs/cd/E11035_01/wls100/perform/topten.html
    http://download.oracle.com/docs/cd/E13222_01/wls/docs92/perform/WLSTuning.html
    http://download.oracle.com/docs/cd/E16764_01/core.1111/e10108/toc.htm
    http://download.oracle.com/docs/html/B25947_01/bcadvvo002.htm#sm0342
    http://amulyamishras-tech-blog.blogspot.com/2011/04/fine-tune-adf-faces-ui-layerperformance.html
    http://andrejusb.blogspot.com/2009/08/oracle-adf-tuning-preventing-sql-query.html
    http://andrejusb.blogspot.com/2011/01/adf-11g-performance-tuning-select-one.html

  • Performance Improvement between GDK and EDK portlets

    Are there any performance improvements to be expected from migrating a portlet from the GDK library to the EDK library? I'm not looking at what the GDK and EDK offer functionally, but rather at whether we would improve the load time of a portal page by changing a portlet from GDK to EDK.

    With GDK, my pages inherit from "Plumtree.Remote.Csp.UI.Page" and under the hood, the context is created (SettingsManager) automatically. Apparently, this is not the case anymore with the EDK. Am I correct?
    According to the EDK doc, I need to call "PortletContextFactory.CreatePortletContext(Request,Response)" for that purpose. Is that still correct?
    -- Yes, correct. In the EDK, no SettingsManager is used, and the functionality is wrapped into IPortletRequest and IPortletResponse.
    The other, more important change is that with the GDK, the language of the current thread is automatically set to the language passed by the portal in the "Accept-Language" HTTP header. To my knowledge this is no longer the case, and I found that I need to insert this:
    String sLanguage = HttpContext.Current.Request.UserLanguages[0];
    System.Threading.Thread.CurrentThread.CurrentCulture = new System.Globalization.CultureInfo(sLanguage);
    Is this correct or did I miss something?
    -- You do not need to use the HttpContext object of .NET. The Plumtree EDK allows you to retrieve the language as follows: the portal language is stored in a User Pref named "strLocale", and a remote portlet can read this User Pref. The only point to note is that, as with all User Prefs, you must ensure that the specific prefs are sent to the portlet in the Portlet Web Service registration.
    PortletRequest.GetSettingValue(Plumtree.Remote.Portlet.SettingType.User, "strLocale")
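    Putting the two replies together, a minimal sketch of the EDK equivalent of the old GDK behaviour could look like the lines below. It assumes portletRequest is the IPortletRequest obtained from the portlet context created above, and that the "strLocale" User Pref is sent to the portlet in the web service registration:
    // Read the portal language from the "strLocale" User Pref and apply it to the current thread.
    // Assumption: portletRequest is the IPortletRequest from PortletContextFactory.CreatePortletContext(Request, Response).
    String sLocale = portletRequest.GetSettingValue(Plumtree.Remote.Portlet.SettingType.User, "strLocale");
    System.Threading.Thread.CurrentThread.CurrentCulture = new System.Globalization.CultureInfo(sLocale);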

  • Webi performance improvement

    Hi All, can anyone share your experience of the steps you took to improve performance when Webi documents are opened on the CRM portal? We are trying to display data for a customer, and we have integrated 3 Webi documents on the CRM portal. The back end is HANA. For a few big customers it takes a lot of time and the portal times out. Can any steps be taken on the Webi side to improve performance?

    Optimize the query in Webi.
    When you run the Webi document in the launch pad, how much time does it take?
    You can also limit the number of rows returned in the universe.

  • Securing file download with standard web security and SSL

    Hi,
    I want to put some files up for download in my web app. At the same time, I want to protect these files using standard servlet security and SSL. So I added a <security-constraint> in my web.xml (an illustrative excerpt is shown after the stack trace below) and configured Tomcat to allow SSL connections. Now I have the files protected as I expected. When I try to access a file directly from the browser, Tomcat shows me the login page. However, after a correct login, IE pops up an error saying something like "Internet Explorer cannot download XXX from XXX. The file could not be written to the cache.". The log file showed the following exception:
    javax.net.ssl.SSLException: Connection has been shutdown: javax.net.ssl.SSLException: java.net.SocketException: Connection reset by peer: socket write error
         at com.sun.net.ssl.internal.ssl.SSLSocketImpl.checkEOF(SSLSocketImpl.java:1154)
         at com.sun.net.ssl.internal.ssl.AppInputStream.available(AppInputStream.java:40)
         at org.apache.tomcat.util.net.TcpConnection.shutdownInput(TcpConnection.java:90)
         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:752)
         at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:526)
         at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
         at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
         at java.lang.Thread.run(Thread.java:595)
    Caused by: javax.net.ssl.SSLException: java.net.SocketException: Connection reset by peer: socket write error
         at com.sun.net.ssl.internal.ssl.Alerts.getSSLException(Alerts.java:166)
         at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1476)
         at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1443)
         at com.sun.net.ssl.internal.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1407)
         at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:64)
         at org.apache.coyote.http11.InternalOutputBuffer.realWriteBytes(InternalOutputBuffer.java:747)
         at org.apache.tomcat.util.buf.ByteChunk.flushBuffer(ByteChunk.java:403)
         at org.apache.coyote.http11.InternalOutputBuffer.endRequest(InternalOutputBuffer.java:400)
         at org.apache.coyote.http11.Http11Processor.action(Http11Processor.java:961)
         at org.apache.coyote.Response.action(Response.java:182)
         at org.apache.coyote.Response.finish(Response.java:304)
         at org.apache.catalina.connector.OutputBuffer.close(OutputBuffer.java:281)
         at org.apache.catalina.connector.Response.finishResponse(Response.java:473)
         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:825)
         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:738)
         ... 4 more
    Caused by: java.net.SocketException: Connection reset by peer: socket write error
         at java.net.SocketOutputStream.socketWrite0(Native Method)
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at com.sun.net.ssl.internal.ssl.OutputRecord.writeBuffer(OutputRecord.java:283)
         at com.sun.net.ssl.internal.ssl.OutputRecord.write(OutputRecord.java:272)
         at com.sun.net.ssl.internal.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:663)
         at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:59)
         ... 15 more
    I've tried separating the concerns, for example protecting the files without requiring SSL, and enabling SSL without protecting the files. Each works on its own, but not together. I also tried using download4j's DownloadServlet. It still doesn't work.
    Has any of you encountered the same situation? If so, could you enlighten me as to what I did wrong? It may be just a simple SSL configuration issue or something. Thanks in advance!
    Jack
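    For reference, a constraint of the kind described above typically looks like the excerpt below in web.xml; the URL pattern and role name are only illustrative and will differ in your application:
        <!-- Illustrative only: protect the download area and require SSL (CONFIDENTIAL) -->
        <security-constraint>
            <web-resource-collection>
                <web-resource-name>Protected downloads</web-resource-name>
                <url-pattern>/downloads/*</url-pattern>
            </web-resource-collection>
            <auth-constraint>
                <role-name>user</role-name>
            </auth-constraint>
            <user-data-constraint>
                <transport-guarantee>CONFIDENTIAL</transport-guarantee>
            </user-data-constraint>
        </security-constraint>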

    My environment setup is:
    JDK 1.5.01
    Tomcat 5.5.7
    For downloading files, I just use the plain old <a href> method. I simply right-click the link and choose "save target as...".
    Thanks,
    Jack

  • Tabular Model Performance Improvements

    Hi!
    We have built a tabular model inline, which has a fact table and 2 dimension tables. The performance of the SSRS report is very slow, and this is a bottleneck in deciding on SSRS as the reporting tool.
    Can you help us with performance improvements for the inline tabular model?
    Regards,

    Hi Bhadri,
    As Sorna said, it is hard to give you detailed tips to improve tabular model performance from the limited information. Here is a useful link about performance tuning of tabular models in SQL Server 2012 Analysis Services; please refer to the link below.
    http://msdn.microsoft.com/en-us/library/dn393915.aspx
    If this is not what you want, please elaborate with more detail so that we can make a further analysis.
    Regards,
    Charlie Liao
    TechNet Community Support

  • DS 5.2 P4 performance improvement

    We have +/- 300,000 users that regularly authenticate using our DS. The user ou is divided into ou=internal (20,000 uids) and ou=external (280,000 uids). Approximately 85-90% of the traffic happens on the internal ou. The question is: could I get any performance improvement by separating the internal branch into its own suffix/database? Would running two databases adversely affect performance instead? We see performance impacts when big searches are performed on the ou=external branch. Would the separation isolate the issue, or would those searches most likely affect the DS as a whole?
    Thanks for your help!
    Enrique.

    "Thank you for the info. Are you a Sun guy - do you work for Sun?"
    Yes I am. I'm the Architect for Directory Server Enterprise Edition 6.0. Previously I worked on all DS 5 releases (mostly on Replication).
    "You are getting the Dukes!"
    Thanks.
    Ludovic.

  • Performance improvement in a function module

    Hi All,
    I am using SAP version 6.0. I have a function module to retrieve POs; for just 10,000 records it is taking a long time.
    Can anyone suggest ways to improve the performance?
    Thanks in advance.

    Moderator message - Welcome to SCN.
    But
    Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting
    Just 10,000 records? The first rule of performance improvement is to reduce the amount of selected data. If you cannot do that, it's going to take time.
    I wouldn't bother with a BAPI for so many records. Write some custom code to get only the data you need.
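    For example, a minimal sketch of reading only the fields you need (the table, fields, and selection range below are assumptions for illustration, not taken from the original post):
    " Illustrative only: select just the fields you need, with a selective WHERE clause,
    " instead of calling a heavyweight function module per record.
    TYPES: BEGIN OF ty_po_item,
             ebeln TYPE ekpo-ebeln,
             ebelp TYPE ekpo-ebelp,
             matnr TYPE ekpo-matnr,
             menge TYPE ekpo-menge,
           END OF ty_po_item.
    DATA lt_po_items TYPE STANDARD TABLE OF ty_po_item.

    SELECT ebeln ebelp matnr menge
      FROM ekpo
      INTO TABLE lt_po_items
      WHERE ebeln IN s_ebeln.   " s_ebeln: a SELECT-OPTIONS range limiting the POs read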
    Tob

  • Please help me to modify the query for performance improvement

    Hi,
    I have the below initialization
    DECLARE @Active bit = 1;
    DECLARE @id int;
    SELECT @Active = CASE WHEN id = @id AND [Rank] = 'Good' THEN 0 ELSE 1 END FROM dbo.Students;
    I have to change this query in such a way that the conditions id = @id and [Rank] = 'Good' go into the WHERE clause of the query. In that case, how can I use a CASE statement to return 1 or 0? Can you please help me modify this initialization?

    I don't understand your query... Maybe something like the one below? Or provide us with sample data and your expected output...
    SELECT * FROM dbo.Students
    WHERE @Active = CASE WHEN id = @id AND [Rank] = 'Good' THEN 0 ELSE 1 END
    But I doubt you will get a performance improvement here.
    Do you have an index on id?
    If you are looking to get the data for @id with [Rank] = 'Good', then use the query below. Make sure you have an index on the (id, Rank) combination.
    SELECT * FROM dbo.Students
    WHERE id = @id
      AND [Rank] = 'Good'
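    If the goal is specifically to keep the filters in the WHERE clause while still assigning 1 or 0 to @Active, one possible sketch (assuming @Active should become 0 when a matching 'Good' row exists for @id, as the original CASE suggests) is:
    -- Sketch only: @Active = 0 if a matching 'Good' row exists for @id, else 1
    SELECT @Active = CASE WHEN EXISTS (
                              SELECT 1 FROM dbo.Students WHERE id = @id AND [Rank] = 'Good'
                          ) THEN 0 ELSE 1 END;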

  • Performance improvement in OBIEE 11.1.1.5

    Hi all,
    In OBIEE 11.1.1.5, reports take a long time to load. Kindly provide me with some performance improvement guides.
    Thanks,
    Haree.

    Hi Haree,
    Steps to improve the performance.
    1. implement caching mechanism
    2. use aggregates
    3. use aggregate navigation
    4. limit the number of initialisation blocks
    5. turn off logging
    6. carry out calculations in database
    7. use materialized views if possible
    8. use database hints
    9. alter the NQSConfig.INI parameters
    Note: calculate all the aggregates in the repository itself and create a fast refresh for the MVs (materialized views).
    You can also schedule an iBot to run the report every hour or so, so that the report data is cached and, when a user runs the report, the BI Server retrieves the data from the cache.
    This is the latest tuning guide for OBIEE 11g:
    http://blogs.oracle.com/pa/resource/Oracle_OBIEE_Tuning_Guide.pdf
    Report level:
    1. Enable the cache -- in NQSConfig.INI, change the cache ENABLE parameter from NO to YES (see the excerpt after this list).
    2. Go to the Physical layer --> right-click the table --> Properties --> check Cacheable.
    3. Try to implement an aggregate mechanism.
    4. Create indexes/partitions at the database level.
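    For reference, the cache setting from step 1 lives in the [ CACHE ] section of NQSConfig.INI and, in a minimal illustrative form, looks something like the excerpt below (parameter names and defaults can vary slightly between releases, so check your own file; the storage path and size are examples only):
    [ CACHE ]
    ENABLE = YES;   # default is NO; set to YES to enable the BI Server query cache
    DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB;   # example location and size only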
    There are multiple other ways to fine-tune reports from the OBIEE side itself:
    1) You can check the granularity of your measures in reports and have level-based measures created in the RPD using the OBIEE Aggregate Persistence Wizard.
    http://www.rittmanmead.com/2007/10/using-the-obiee-aggregate-persistence-wizard/
    This will make queries hit your aggregate tables rather than the detailed tables.
    2) You can use cache-seeding options, using an iBot or the NQCMD command-line utility:
    http://www.artofbi.com/index.php/2010/03/obiee-ibots-obi-caching-strategy-with-seeding-cache/
    http://satyaobieesolutions.blogspot.in/2012/07/different-to-manage-cache-in-obiee-one.html
    OR
    http://hiteshbiblog.blogspot.com/2010/08/obiee-schedule-purge-and-re-build-of.html
    Using one of the above two methods, you can fine-tune your reports and reduce the query time.
    Also, on the safe side, take the physical SQL from the log and run it directly on the DB to see the time taken, and check the explain plan with the help of a DBA.
    Hope this helps.
    Thanks,
    Satya

  • MV Refresh Performance Improvements in 11g

    Hi there,
    the 11g New Features Guide says in section "1.4.1.8 Refresh Performance Improvements":
    "Refresh operations on materialized views are now faster with the following improvements:
    1. Refresh statement combinations (merge and delete)
    2. Removal of unnecessary refresh hint
    3. Index creation for UNION ALL MV
    4. PCT refresh possible for UNION ALL MV"
    While I understand (3) and (4), I don't quite understand (1) and (2). Has there been a change in the internal implementation of the refresh (away from a single MERGE statement)? If yes, which one? Is there a note or something in the knowledge base about these enhancements in 11g? I couldn't find any.
    These considerations matter for our decision on whether or not to migrate to 11g...
    Thanks in advance.

    I am not quite sure what you mean. Do you mean perhaps that the MV logs work correctly when you perform MERGE statements with DELETE on the detail tables of the MV?
    And where is the performance improvement? What is the refresh hint?
    Though I am using MVs and MV logs at the moment, our app performs deletes and inserts in the background (no merges). The MV-log-based fast refresh scales very badly, meaning that performance drops very quickly as the changed data set grows.

  • Why does GN_INVOICE_CREATE show no performance improvement even in a HANA landscape?

    Hi All,
    We have a pricing update program which is used to update the price for a Material-Customer combination (CMC). This update is done using the FM 'GN_INVOICE_CREATE'.
    The logic is designed to loop over customers, and this FM is called for each one, passing all the materials valid for that customer.
    This process takes days (approx. 5 days) to execute and update the CMC for 100 million records.
    Hence we are planning to move towards HANA for a better improvement in performance.
    We built the same program in the HANA landscape and executed it in both systems for 1 customer and 1000 material combinations.
    Unfortunately, both systems gave the same runtime of around 27 seconds.
    This is very disappointing given the performance improvement we expected on the HANA landscape.
    Could anyone throw light on the areas where we are missing out and why no performance improvement was obtained?
    Also, are there any configuration-related changes to be done on the HANA landscape for better performance?
    The details regarding both the systems are as below.
    Suite on HANA:
    SAP_BASIS : 740
    SAP_APPL  : 617
    ECC
    SAP_BASIS : 731
    SAP_APPL  : 606
    (Screenshots of the HANA and ECC system details were attached to the original post.)
    Thanks & regards,
    Naseem

    Hi,
    just to fill in on Lars' already exhaustive comments:
    Migrating to HANA gives you lots of options to replace your own functionality (custom ABAP code) with HANA artifacts - views or SQLScript procedures. This is where you can really gain performance. Expecting ABAP code to automatically run faster on HANA may be unrealistic, since it depends on the functionality of the code and how well it "translates" to a HANA environment. The key to really minimizing run time is to replace DB calls with specific HANA views or procedures, and then call these from your code.
    I wrote a blog on this; you might find it useful as a general introduction:
    A practical example of ABAP on HANA optimization
    When it comes to SAP standard code, like the FM you mentioned, it is true that SAP is migrating some of this functionality to HANA-optimized versions, but this doesn't mean everything will be optimized in one go. This particular FM is probably not among those initially selected for "HANAification", so you basically have to either create your own functionality (which might not be advisable, since it might violate data integrity) or just be patient.
    But again, the beauty of HANA lies in the brand new options for developers to utilize the new ways of pushing code down to the DB server. Check out the recommendations from Lars and you'll find yourself embarking on a new and exciting journey!
    Also - as a good starting point - check out the HANA developer course on open.sap.com.
    Regards,
    Trond

  • Will there be a performance improvement with separate tables vs a single table with multiple partitions?

    Will there be a performance improvement with separate tables vs a single table with multiple partitions? Is it advisable to have separate tables rather than one single big table with partitions? Can we expect the same performance from a single big table with partitions? What is the recommended approach in HANA?

    Suren,
    first off a friendly reminder: SCN is a public forum and for you as an SAP employee there are multiple internal forums/communities/JAM groups available. You may want to consider this.
    Concerning your question:
    You didn't tell us what you want to do with your table or your set of tables.
    As tables are not only storage units but usually carry semantics - read: if data is stored in one table it means something different than the same data in a different table - partitioned tables cannot simply be substituted by multiple tables.
    Looked at on a storage-technology level, table partitions are practically the same as tables. Each partition has its own delta store and can be loaded into and displaced from memory independently of the others.
    Generally speaking there shouldn't be too many performance differences between a partitioned table and multiple tables.
    However, when dealing with partitioned tables, the additional step of determining the partition to work on is always required. If computing the result of the partitioning function takes a major share in your total runtime (which is unlikely) then partitioned tables could have a negative performance impact.
    Having said this: as with all performance related questions, to get a conclusive answer you need to measure the times required for both alternatives.
    - Lars

  • "You do not have security rights to perform this operation" exception in the CreateComputerVariable method

    I am getting an exception near computerSettings.Put(); it throws the exception "You do not have security rights to perform this operation." Can I know exactly when this error occurs?
    Details of Error:
    ConfigMgr Error Object:
    instance of SMS_ExtendedStatus
    CauseInfo = "5";
    Description = "CSspMachineExtProperties: ERROR_ACCESS_DENIED: ";
    ErrorCode = 1112017925;
    File = "e:\\qfe\\nts\\sms\\siteserver\\sdk_provider\\smsprov\\sspmachineextprops.cpp";
    Line = 958;
    ObjectInfo = "";
    Operation = "PutInstance";
    ParameterInfo = "";
    ProviderName = "ExtnProv";
    StatusCode = 2147749889;
    stack trace:
     at Microsoft.ConfigurationManagement.ManagementProvider.WqlQueryEngine.WqlResultObject.Put(ReportProgress progressReport)
       at Microsoft.ConfigurationManagement.ManagementProvider.WqlQueryEngine.WqlResultObject.Put()
       at TestWqlManage.Program.CreateComputerVariable(WqlConnectionManager connection, String siteCode, List`1 variables, Int32 computerId) in /path/path/path
    ComputerVariable Method where exception occurs
    public static string CreateComputerVariable(WqlConnectionManager connection, string siteCode, List<ComputerVariableDC> variables, int computerId)
    {
        try
        {
            // Get the computer settings.
            IResultObject computerSettings = null;
            IResultObject computerSettingsQuery = connection.QueryProcessor.ExecuteQuery(
                "Select * from SMS_MachineSettings where ResourceId = '" + computerId + "'");
            foreach (IResultObject settings in computerSettingsQuery)
            {
                computerSettings = settings;
            }
            if (computerSettings == null) // It doesn't exist, so create it.
            {
                computerSettings = connection.CreateInstance(@"SMS_MachineSettings");
                computerSettings["ResourceID"].IntegerValue = computerId;
                computerSettings["SourceSite"].StringValue = siteCode;
                computerSettings["LocaleID"].IntegerValue = 1033;
                computerSettings.Put();
                computerSettings.Get();
            }
            // Create the computer variable.
            List<IResultObject> computerVariables = computerSettings.GetArrayItems("MachineVariables");
            foreach (ComputerVariableDC variable in variables)
            {
                IResultObject computerVariable = connection.CreateEmbeddedObjectInstance("SMS_MachineVariable");
                computerVariable["Name"].StringValue = variable.Name;
                computerVariable["Value"].StringValue = variable.Value;
                computerVariable["IsMasked"].BooleanValue = false;
                computerVariables.Add(computerVariable);
            }
            computerSettings.SetArrayItems("MachineVariables", computerVariables);
            computerSettings.Put();
            return computerId.ToString();
        }
        catch (SmsException e)
        {
            Console.WriteLine("Failed to create computer variable: " + e.Message);
            //throw;
            //return e.Message;
            throw e;
        }
    }

    Hi,
    What's the error when you create a computer variable manually?
    Please make sure you have given the "Modify Resource" permission to this user. You can see it in the Administration workspace -> Security -> Security Roles -> Full Administrator -> Collections -> Modify Resource.
    Best Regards,
    Joyce

  • DMA Performance Improvements for TIO-based Devices

    Hello!
    DMA Performance Improvements for TIO-based Devices
    http://digital.ni.com/public.nsf/websearch/1B64310FAE9007C086256A1D006D9BBF
    Can I apply the procedure to NI-DAQmx 9? These ini files don't seem to exist anymore in the newer version.
    Best, Viktor

    Hi Viktor,
    this page is 7 years old and doesn't apply to DAQmx.
    Regards, Stephan
