Storing data in a session: best practice

Hi,
We are designing a Servlet/JSP based application that has a web tier separate from the middle tier.
One of our apps has a lot of user input, averaging 500 KB and up to 2 MB of data in the request.
We do not have a way of breaking this application up (i.e. the whole 2 MB of form data must be posted at one time).
We have 2 solutions and want to know which is the better one and why...
1. Use a session and store all the information in the session.
2. Use JavaScript to assemble all the data and submit it at one time.
I prefer #2 because I don't want to use sessions, and also because I don't want to use a database on the web tier...
Please help me explain this to my colleagues, who are convinced that we have to use sessions to store this data.
          -JJ
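For reference, option 1 in plain servlet terms is only a few lines. A minimal sketch follows; the class name, attribute key, and redirect target are illustrative, not from the post:

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    // Accumulates each page's form fields in the session (option 1). The cost:
    // up to ~2 MB of heap stays pinned per active user until the session expires.
    public class WizardStepServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            HttpSession session = req.getSession(true);
            @SuppressWarnings("unchecked")
            Map<String, String> formData =
                    (Map<String, String>) session.getAttribute("formData");
            if (formData == null) {
                formData = new HashMap<String, String>();
                session.setAttribute("formData", formData);
            }
            for (Map.Entry<String, String[]> e : req.getParameterMap().entrySet()) {
                formData.put(e.getKey(), e.getValue()[0]); // keep first value only
            }
            resp.sendRedirect("nextStep.jsp"); // illustrative next wizard page
        }
    }

The main argument against this is exactly the one JJ raises: every active user pins that much memory on the web tier, whereas option 2 (one large POST, processed and passed straight to the middle tier) holds no server-side state at all.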
          

Hello,
When you say you want to load data to the other cube, do you mean one cube holds data for 2 years and the other holds data for another 2 years, so they tend to occupy the same table space?
When you say summarized loading, what exactly do you mean by that?
The data in a cube is summarized as per the characteristics present in it, so if you reduce the number of characteristics in the second cube, the data will get aggregated at that level, giving fewer records and occupying less table space.
You can also reduce the table space by simply compressing the requests in the cube.
Regards,
Shashank

Similar Messages

  • Storing user information into session best practice.

    I am developing a web application where the user first has to log in to be able to enter.
    When the user has logged in correctly, a UserAccount object with all user data (except loginName and password) is stored in the session. A Filter checks the session for the UserAccount object and, if the user is correctly logged in, forwards the request to the next filter in the application.
    When the user logs out, the session object is destroyed via session.invalidate().
    I would like to know if there are better solutions for this.
    Thank you in advance.
    --Nermin B.

    You may want to also re-evaluate your "application" as a whole.
    In most cases - yours may be the exception - session objects are used to, well, associate a particular web browser with a particular user. There is usually little need to retain additional information about that person unless it is frequently accessed. I think Shok used a poor example; a person's address, phone number, credit card number and so on are usually accessed once per visit, whereas the contents of that person's shopping cart are generally accessed every time the user changes web pages.
    The key concept here is that you want to balance memory consumption versus database hits (or file I/O). Authorization info should be in the object, whereas general background info can stay in the data source and be accessed when needed.
    If, on the other hand, you don't have a data source, you don't really have much of a choice, and a session object (or, similarly, a JavaBean) should just go ahead and contain all of the information about the user.
    To change the subject: session objects and JavaBeans are server-side. As long as you keep a careful eye on the interfaces to those objects, you should be fine security-wise. For example, if a person passes a parameter to your web page, make sure the parameter is anticipated and correct before you save it in the object. In other words, don't write a generic function that blindly accepts parameter names and values and sets them accordingly within the session object. The only place you should be able to set the password attribute is from the change-your-password JSP page.
    Cookies, on the other hand, are definitely stored on the client's machine, and yes, you want to be really paranoid and make sure that the cookie you are retrieving is the cookie you are expecting. I think the source of confusion is that session ids corresponding to session objects can be stored within a cookie - so if you can change the id, the server thinks you are someone else and uses that person's session object.
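    To make the "don't blindly accept parameters" point concrete, here is a minimal sketch contrasting the two patterns (class and parameter names are illustrative):

        import java.util.Enumeration;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpSession;

        public final class SessionParams {

            // Risky pattern: copies ANY request parameter into the session, so a
            // crafted request could overwrite attributes such as "password" or "role".
            public static void copyAllBlindly(HttpServletRequest req) {
                HttpSession session = req.getSession();
                for (Enumeration<String> names = req.getParameterNames(); names.hasMoreElements();) {
                    String name = names.nextElement();
                    session.setAttribute(name, req.getParameter(name));
                }
            }

            // Safer pattern: accept only the parameters this page is expected to set,
            // and validate each value before it touches the session object.
            public static void copyExpected(HttpServletRequest req) {
                HttpSession session = req.getSession();
                String displayName = req.getParameter("displayName"); // illustrative field
                if (displayName != null && displayName.length() <= 50) {
                    session.setAttribute("displayName", displayName);
                }
                // "password" is deliberately not settable from here; only the
                // change-your-password page should touch that attribute.
            }
        }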

  • Import data from excel file - best practice in the CQ?

    Hi,
    I have a question about importing data from an Excel file and creating a table from that data on a CQ page. Is there an OOTB component inside CQ that provides this kind of functionality? Has anybody implemented something like this, or is there a best practice for it?
    Thanks in advance for any answer,
    Regards
    kasq

    You can check a working example package [1] (use your Adobe ID to log in).
    After installing it, go to [2] for an immediate example.
    Unfortunately it only supports the old OLE 2 Excel format (.xls, not .xlsx).
    [1] - http://dev.day.com/content/packageshare/packages/public/day/cq540/demo/xlstable.html
    [2] - http://localhost:4502/cf#/content/geometrixx/en/company/news/pressreleases/my_personal_bests.html

  • Data Migration and Consolidation Best Practices?

    Hi guys
    Do you know what the best practice for data migration to FCSvr is? We’re trying to consolidate all media on various FireWire/internal drives to a centralised RAID directly attached to a dedicated server. We’ve found that dragging and dropping an FCP project file uploads its associated media. The problem is that if there are several versions or separate projects linking to the same media, the associated media is re-uploaded every time! This results in that media getting duplicated several times. It appears that the issue is due to FCSvr creating a subfolder for every project file being uploaded, which contains all the project’s media.
    This behaviour is not consistent when caching assets, checking out a project file, making changes and checking it back in. FCSvr is quite happy for a project file to link to media existing at the root level of the media device.
    We are of course running the latest version of everything. Hope you can help as we’re pulling our hair out here!
    Regards
    Gavin

    Hi,
    Do you really need an ETL tool for these loading processes? Have you considered doing it in SQL/PLSQL? If the performance of the application is one of the main priorities, I would definitely consider doing it in SQL/PLSQL.
    Because of the huge amount of data, and because your source and target systems are Oracle DBs, I wouldn't recommend using Informatica.
    Also, because source and target are Oracle DBs and it should be near real time, you should have a look at Oracle Streams.
    Regards
    Maurice

  • Data Element Length - Restriction / Best Practice

    Many databases on the market currently allow table/column names to be 128 characters long. Our developers are suggesting that 30-35 characters is the limit one should strive for.
    One thought is that Business Objects cannot handle queries bigger than 64K – is this true? If a query has several filters or result fields that are very long, will that be a problem?
    Also, is there some potential usage issue / best practice by which using a shorter name would be advantageous in Business Objects?
    Our tool stack is SQL Server 2005 / Informatica 8.1.1 / Business Objects XI R2

    Hi Jennifer,
    I am not sure if the 64K limit is true. However, given that this is the query string that BO passes to your SQL Server, it would be a limit.
    The following query is about 2 KB, so you can imagine that a 64 KB one would be difficult to read:
    SELECT
         A_Long_Table_Name.TestField1,
         A_Long_Table_Name.TestField12,
         A_Long_Table_Name.TestField22,
         A_Long_Table_Name.TestField32,
         A_Long_Table_Name.TestField42,
         A_Long_Table_Name.TestField52,
         A_Long_Table_Name.TestField62,
         A_Long_Table_Name.TestField72,
         Another_Long_Table_Name.TestField1,
         Another_Long_Table_Name.TestField12,
         Another_Long_Table_Name.TestField13,
         Another_Long_Table_Name.TestField14,
         Another_Long_Table_Name.TestField15,
         Another_Long_Table_Name.TestField16,
         Another_Long_Table_Name.TestField17
    FROM
         A_Long_Table_Name,
         Another_Long_Table_Name
    WHERE
         A_Long_Table_Name.TestField1 = Another_Long_Table_Name.TestField1
    UNION ALL
    SELECT
         A_Long_Table_Name2.TestField1,
         A_Long_Table_Name2.TestField12,
         A_Long_Table_Name2.TestField22,
         A_Long_Table_Name2.TestField32,
         A_Long_Table_Name2.TestField42,
         A_Long_Table_Name2.TestField52,
         A_Long_Table_Name2.TestField62,
         A_Long_Table_Name2.TestField72,
         Another_Long_Table_Name.TestField1,
         Another_Long_Table_Name.TestField12,
         Another_Long_Table_Name.TestField13,
         Another_Long_Table_Name.TestField14,
         Another_Long_Table_Name.TestField15,
         Another_Long_Table_Name.TestField16,
         Another_Long_Table_Name.TestField17
    FROM
         A_Long_Table_Name2,
         Another_Long_Table_Name
    WHERE
         A_Long_Table_Name2.TestField1 = Another_Long_Table_Name.TestField1
    On the object names, I do not believe you will receive any noticeable difference in performance. It is better to keep the names of columns and tables legible, and your developers' suggestion of 30-35 characters is about right.
    Regards
    Alan

  • HR data transfer toolkit - SAP Best Practice

    Hi All,
    I got to know about the new toolkit for HR data transfer, in which SAP has provided built-in Excel sheets for certain infotypes. I got the document online, but I am not able to use the transactions. Can anyone help me with where to download the HR Data Transfer Toolkit, and whether there are any patches I have to install in the system?
    Can you please give me the links or steps for downloading the toolkit?
    Thanks in advance for the help.
    Regards
    Rajeshwar

    Rajesh,
    You have to request Basis to apply the SAR file that is specified in the Best Business Practices documentation.
    Also note that the data transfer tool for ECC 6.0 is different from the one for ECC 5.0 or R/3.
    The data transfer tool has now (in ECC 6.0) been shifted to LSMW.
    regards
    Sridhar

  • Data files available in Best Practice document for BPC

    Any idea how to retrieve the data files available under the Miscellaneous folder on the Best Practice CD? These files are not available online and I don't know how to retrieve them. Any help kindly appreciated.
    Sachin

    The download is available through SWDC.

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements: All participants must bring their own laptop to install SAP BusinessObjects Data Services on it. Please see the attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge.
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register, please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements: All participants must bring their own laptop to install SAP BusinessObjects Data Integrator on it. Please see the attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge.
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register, please follow the hyperlink below.
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Item Master Data Best Practice

    Hello all,
    We have been using SBO for more than a year now, and yet we still constantly add new items to our item master data. What is the best practice for maintaining the item master data? To help you understand, this is the scenario: in the factory/mill there are a lot of spare parts and pieces of equipment, and if a piece of equipment is damaged, we have to buy a new one. The problem occurs here because if the replacement differs only in part number, we use another item code for it. With this practice, we later found out that we had more than one item code for a single item because of the naming convention, so we had to put one item code on hold and use the other, since we can't delete it anymore. Sometimes an item code occurs only once in the item history.
    Please suggest the best practice on this matter:
    1. Item grouping
    2. Naming convention
    etc.
    NOTE:
    Our goal is to minimize the adding of items to the item master data.
    FIDEL

    FIDEL,
    From what I understand, you have to replace broken/damaged components of items like bulldozers, payloaders and mill turbines. This is the reason why you defined the parts as new items.
    From your item code examples, I am not clear why you have 2 different names for the same item, nor what you mean by "these two item codes are actually the same".
    If you are just buying parts to replace components, and you do not need to track them, then I would suggest you create generic item codes in the item master and simply change the description when you buy/sell them.
    Example:  Same Item different description.
    REPL101  OIL FILTER
    REPL101  FUEL FILTER
    REPL101  xxxxx
    This way you are not going to keep creating items in the database, and you can still see the description and know what each one was.
    Simply change the ItemName in the marketing document and, instead of pressing Tab to move to the next column, press CTRL+Tab so that SAP does not auto-check the newly typed name against the item master.
    Let me know if your scenario is otherwise.
    Suda

  • Storing data - best practice?

    Hi,
    I wonder if there is any best practice for storing data in my EP 6.0 portal. For instance, on a standard website, if you have a list of events, each event can be stored in a related SQL database and can then be fetched and updated whenever necessary.
    What is the best way to do this when developing portal content? The reason I am asking is that I want to develop a Web Dynpro application where I can select a date and then display all registered events on that day in my portal.
    Best regards
    Øyvind Isaksen

    Okay, and then use an RFC call from the Web Dynpro application to fetch the data from the SAP database?
    This answered my question.
    Best regards
    Øyvind Isaksen

  • Best practice for storing/loading medium to large amounts of data

    I just have a quick question regarding the best medium for storing a certain amount of data. Currently in my application I have a Dictionary<char,int> that I've created and subsequently populate with hard-coded static values.
    There are about 30 items in this Dictionary, so this isn't much of a problem yet, even though it does make the code slightly more difficult to read, and I will be adding more data structures with a similar number of items in the future.
    I'm not sure whether it's best practice to hard-code these values in, so my question is: is there a better way to store this information, and retrieve and load it at run-time?

    You could use one of the following methods:
    Use the application.config file. The upside is that it is easy to maintain; the downside is that a user could edit it manually, as it's just an XML file.
    You could use a settings file. You can specify where the settings file is persisted, including under the user's profile or the application. You could serialize/deserialize your settings to a section in the settings file. See this MSDN help section for details about settings.
    Create a .txt, .json, or .xml file (depending on the format you will be deserializing your data from) in your project and have it copied to the output path with each build. The upside is that you could push out new versions of the file in the future without having to re-compile your application. The downside is that it could be altered if the user has OS permissions to that directory.
    If you really do not want anyone to access it, and you are willing to push out a new application version every time something changes, you could create a .txt, .json, or .xml file just as in the previous step, but this time mark it as an embedded resource in your project (you can do this in the properties of the file in Visual Studio). It will essentially get compiled into your application. Content retrieval is outlined in this how-to from Microsoft, and then you just deserialize the retrieved content the same way as in the previous step.
    As for the format of your data: I recommend either XML or JSON, or a plain text file if it's just a flat list of items (i.e. a list of strings). Personally I find JSON much easier than XML to read and change, and there are plenty of supported serializers out there. XML is great too if you need to be strict about what the schema is.
    Mark as answer or vote as helpful if you find it useful | Igor
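    Incidentally, the embedded-resource approach translates to other stacks as well; in Java (the language used elsewhere in this digest), the analogue is a classpath resource read at startup. A minimal sketch, assuming a hypothetical letters.properties file (lines like a=1) bundled with the application:

        import java.io.IOException;
        import java.io.InputStream;
        import java.util.HashMap;
        import java.util.Map;
        import java.util.Properties;

        public final class LetterScores {

            // Loads "letters.properties" (a hypothetical resource bundled with
            // the app) instead of hard-coding ~30 char-to-int entries in source.
            public static Map<Character, Integer> load() throws IOException {
                Properties props = new Properties();
                try (InputStream in =
                         LetterScores.class.getResourceAsStream("/letters.properties")) {
                    if (in == null) {
                        throw new IOException("letters.properties not found on classpath");
                    }
                    props.load(in);
                }
                Map<Character, Integer> scores = new HashMap<>();
                for (String key : props.stringPropertyNames()) {
                    // Each line looks like: a=1
                    scores.put(key.charAt(0), Integer.valueOf(props.getProperty(key)));
                }
                return scores;
            }
        }

    Swapping in a new file changes the data without touching the source code, which is the main selling point of every option listed above.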

  • Obiee 11g : Best practice for filtering data allowed to user

    Hi gurus,
    I have a table of the allowed areas for each user.
    I want to show only the data facts associated with these allowed areas.
    For instance, my user scott can see France and Italy data.
    I made a session variable and put it in a filter.
    It works OK, but only one value (the first one, I think) is taken into account (for instance, with my solution scott will see only France data).
    I need all the possible values.
    I tried the row-wise parameter of the session variable, but it doesn't work (OBIEE error).
    I've read things on the internet about using STRAGG or VALUELISTOF, but neither worked.
    What would be the best practice to achieve this goal of filtering data, with per-user conditions stored in the database?
    Thanks in advance, Emmanuel

    Check this link
    http://oraclebizint.wordpress.com/2008/06/30/oracle-bi-ee-1013332-row-level-security-and-row-wise-intialized-session-variables/

  • Best practice how to retrieve & update data w/o any jsf-lifecycle-overhead

    I have a request scoped jsf managed bean called "ManagedBean". This bean has a method annotated with "@PostConstruct" that retrieves data from a database. The data is shown in a jsp "showAndEditData.jsp" in <h:inputText /> components - so the data is editable.
    The workflow is as follows:
    First, when navigating to "showAndEditData.jsp", the ManagedBean is created, the "@PostConstruct"-method is invoked, and the data retrieved from the database is shown to the user.
    Second, the user changes the data.
    Third, the user presses the submit button, the ManagedBean is created again, the "@PostConstruct"-method is invoked again, and the data is retrieved from the database again. Then the data is overridden by the changes the user made and passed to the business-tier (where it will be saved to the database).
    Every step that I marked with *again* is completely unnecessary and a huge overhead.
    Is there a way to prevent these unnecessary steps?
    Or, to put it another way: is there a best practice for retrieving and updating data efficiently, without any overhead, using JSF?
    I do not want to use session-scoped managed beans, because that would be a huge overhead as well.

    The first "again" is neccessary, because after successfull validation, you need new object in request to store the submitted value.
    I agree to the second and third, really unneccessary and does not make sense.
    Additionally I think it�s bad practice putting data in session beansTotal agree, its a disadvantage of JSF that we often must use session.
    Think there is also an bigger problem with this.
    Dont know how your apps are working, my apps start an new database transaction per commit on every new request.
    So in this case, if you do an second query on postback, which uses an different database transaction, it could get different data as for the inital request.
    But user did his changes <b>accordingly</b> to values of the first snapshot during the inital request.
    If these values would be queried again on postback, and they have been changed meanwhile, it becomes inconsistent, because values of snapshot two, do not fit to user input.
    In my opionion zebhed has posted an major mistake in JSF.
    Dont now, where to store the data, perhaps page scope could solve this.
    Not very knowledge of that section, but still ask myself, if this data perhaps could be stored in the components and on an postback the data are rendered from components + submittedvalues instead of model.
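    One common way to address the postback inconsistency described above is optimistic locking: carry the version (or timestamp) of the first snapshot along with the form, and compare it before saving. A minimal sketch, with all names illustrative:

        // Illustrative optimistic-lock check for the postback problem above:
        // the version read with the first snapshot travels with the form
        // (e.g. in a hidden field) and is compared against the database on save.
        public class RecordUpdater {

            public static class StaleDataException extends Exception {
                public StaleDataException(String msg) { super(msg); }
            }

            /**
             * @param submittedVersion version shown to the user with the first snapshot
             * @param currentVersion   version currently stored in the database
             */
            public void save(long submittedVersion, long currentVersion, Runnable doUpdate)
                    throws StaleDataException {
                if (submittedVersion != currentVersion) {
                    // Someone changed the row between the initial request and the
                    // postback; re-querying would silently mix two snapshots, so
                    // refuse and let the user reload instead.
                    throw new StaleDataException("Record was modified by another user");
                }
                doUpdate.run(); // perform the update and bump the version
            }
        }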

  • ADF Faces : session timeout best practice

    hi
    I made these small modifications to the web.xml file in the SRDemoSample application:
    (a) I changed the login-config from this ...
      <login-config>
        <auth-method>FORM</auth-method>
        <form-login-config>
          <form-login-page>infrastructure/SRLogin.jspx</form-login-page>
          <form-error-page>infrastructure/SRLogin.jspx</form-error-page>
        </form-login-config>
      </login-config>
    ... to this:
      <login-config>
        <auth-method>BASIC</auth-method>
      </login-config>
    (b) I changed the session-timeout to 1 minute.
      <session-config>
        <session-timeout>1</session-timeout>
      </session-config>
    Please consider this scenario:
    (1) Run the UserInterface project of the SRDemoSample application in JDeveloper.
    (2) Authenticate using "sking" and password "welcome".
    (3) Click on the "My Service Requests" tab.
    (4) Click on a "Request Id" like "111". You should see a detail page titled "Service Request Information for SR # 111" that shows detail data on the service request.
    (5) Wait for at least one minute for the session to timeout.
    (6) Click on the "My Service Requests" tab again. I see the same detail page as in (4), now titled "Service Request Information for SR #" and not showing any detail data.
    question
    What is the best practice to detect such session timeouts and handle them in a user friendly way in an ADF Faces application?
    thanks
    Jan Vervecken

    Hi,
    No. Here's the content, copied from a Word doc:
    A frequent question on the JDeveloper OTN forum, and also one that has been asked by customers directly, is how to detect and gracefully handle user session expiry due to user inactivity.
    The problem with user inactivity is that there is no way in Java EE for the server to call the client when the session has expired. Though you could use JavaScript on the client to count down the session timeout, eventually showing an alert or redirecting the browser, this comes with a lot of overhead. The main concern raised against unhandled session invalidation due to user inactivity is that the next user request leads to unpredictable results and error messages. Because all information stored in the user session gets lost upon session expiry, you can't recover the session and need to start over again. The solution to this problem is a servlet filter that works on top of the Faces servlet. The web.xml file would have the filter configured as follows:
    <filter>
        <filter-name>ApplicationSessionExpiryFilter</filter-name>
        <filter-class>
            adf.sample.ApplicationSessionExpiryFilter
        </filter-class>
        <init-param>
            <param-name>SessionTimeoutRedirect</param-name>
            <param-value>SessionHasExpired.jspx</param-value>
        </init-param>
    </filter>
    This configures the "ApplicationSessionExpiryFilter" filter with an initialization parameter that lets the administrator configure the page the filter redirects the request to. In this example, the page is a simple JSP page that only prints a message so the user knows what has happened. Further on in the web.xml file, the filter is mapped to the JavaServer Faces servlet as follows:
    <filter-mapping>
        <filter-name>ApplicationSessionExpiryFilter</filter-name>
        <servlet-name>Faces Servlet</servlet-name>
    </filter-mapping>
    The servlet filter code compares the session id of the request with the current session id. This nicely handles the issue of the Java EE container implicitly creating a new user session for the incoming request.
    The only special case to be handled is where the incoming request doesn't have an associated session id; this is the case for the initial application request.
    1.     package adf.sample;
    2.     
    3.     import java.io.IOException;
    4.     
    5.     import javax.servlet.Filter;
    6.     import javax.servlet.FilterChain;
    7.     import javax.servlet.FilterConfig;
    8.     import javax.servlet.ServletException;
    9.     import javax.servlet.ServletRequest;
    10.     import javax.servlet.ServletResponse;
    11.     import javax.servlet.http.HttpServletRequest;
    12.     import javax.servlet.http.HttpServletResponse;
    13.     
    14.     
    15.     public class ApplicationSessionExpiryFilter implements Filter {
    16.         private FilterConfig _filterConfig = null;
    17.        
    18.         public void init(FilterConfig filterConfig) throws ServletException {
    19.             _filterConfig = filterConfig;
    20.         }
    21.     
    22.         public void destroy() {
    23.             _filterConfig = null;
    24.         }
    25.     
    26.         public void doFilter(ServletRequest request, ServletResponse response,
    27.                              FilterChain chain) throws IOException, ServletException {
    28.     
    29.     
    30.             String requestedSession =   ((HttpServletRequest)request).getRequestedSessionId();
    31.             String currentWebSession =  ((HttpServletRequest)request).getSession().getId();
    32.            
    33.             boolean sessionOk = currentWebSession.equalsIgnoreCase(requestedSession);
    34.           
    35.             // if the requested session is null then this is the first application
    36.             // request and "false" is acceptable
    37.            
    38.             if (!sessionOk && requestedSession != null){
    39.                 // the session has expired or renewed. Redirect request
    40.                 ((HttpServletResponse) response).sendRedirect(_filterConfig.getInitParameter("SessionTimeoutRedirect"));
    41.             }
    42.             else{
    43.                 chain.doFilter(request, response);
    44.             }
    45.         }
    46.        
    47.     }
    This servlet filter works pretty well, except for sessions that are expired because of active session invalidation, e.g. when nuking the session to log out of container-managed authentication. In this case my recommendation is to extend line 39 to also check whether security is required. This can be done through another initialization parameter that holds the name of a page to redirect the request to upon logout.
    In that case you don't redirect the request to the error page but continue with a newly created session.
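    For illustration, the extended redirect branch might look like this, assuming a second, hypothetical init parameter named LogoutRedirect and an app-specific way to recognize a logout (this is a sketch, not Frank's code):

        // Sketch: drop-in replacement for the redirect branch in doFilter(),
        // reusing _filterConfig from the surrounding filter class.
        if (!sessionOk && requestedSession != null) {
            String logoutPage = _filterConfig.getInitParameter("LogoutRedirect");
            if (logoutPage != null && isLogoutRequest((HttpServletRequest) request)) {
                // Session was nuked by an explicit logout: send the user to the
                // logout page instead of the session-expired page.
                ((HttpServletResponse) response).sendRedirect(logoutPage);
            } else {
                ((HttpServletResponse) response).sendRedirect(
                    _filterConfig.getInitParameter("SessionTimeoutRedirect"));
            }
        } else {
            chain.doFilter(request, response);
        }

        // Hypothetical helper; how a logout is recognized is application-specific,
        // e.g. a marker parameter set by the logout link.
        private boolean isLogoutRequest(HttpServletRequest request) {
            return "true".equals(request.getParameter("loggedOut"));
        }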
    Ps.: For testing and development, set the following parameter in web.xml to 1 so you don't have to wait 35 minutes:
    <session-config>
        <session-timeout>1</session-timeout>
    </session-config>
    Frank
    Edited by: Frank Nimphius on Jun 9, 2011 8:19 AM
