Implementing a "login" using best practices

I have a little bit of time now for my project, so I'd like to refactor it a bit and take the opportunity to learn about the best practices to use with JSP/Servlets, but I'm having some trouble thinking about what goes where and how to organize things.
Here's my current login functionality. As you can see in this simple starting example, I have not separated my "business logic" from my "presentation logic".
index.html:
<html>
<body>
<form action="login.jsp" method="post">
    <h1>Please Login</h1>
    User Name:    <input type="text" name="login"><br>
    Password: <input type="password" name="password"><br>
    <input type="submit" value="Login">
</form>
</body>
</html>
login.jsp:
<jsp:useBean id="db" type="database.DatabaseContainer" scope="session"/>
<%
if (session.getAttribute("authorized") == null || session.getAttribute("authorized").equals("no") || request.getParameter("login") != null)
{
    String login = request.getParameter("login");
    String password = request.getParameter("password");
    if (login != null && db.checkLogin(login, password))
    {
        // Valid login
        session.setAttribute("authorized", "yes");
        session.setAttribute("user", login);
    }
    else
    {
        // Invalid login
        session.setAttribute("authorized", "no");
        %><jsp:forward page="index.html"/><%
    }
}
else if (session.getAttribute("authorized").equals("no"))
{
    //System.out.println("Refresh");
    %><jsp:forward page="index.html"/><%
}
else
{
    System.out.println("Other");
}
%>
<html>
<body>
<h1>Welcome <%= " "+session.getAttribute("user") %></h1>
<!-- links to other JSPs are here -->
</body>
</html>
What should I be doing instead? Should I make the form action a servlet rather than a JSP? I don't want to be writing HTML in my servlets, though. Do I do the authentication in a servlet that I make the form action, and then have the servlet forward to some standard HTML page?

Ok, so I'm starting things off simply by converting what I have to use better practices. For now I just want to get the basic flow of how I transition from page to servlet to page.
Here's my index.html page:
<html>
<body>
<form action="login" method="post">
    <h1>Please Login</h1>
    Phone Number:    <input type="text" name="login"><br>
    Password: <input type="password" name="password"><br>
    <input type="submit" value="Login">
</form>
</body>
</html>
I have a mapping that says login goes to LoginServlet, which is here:
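(For context, a mapping like that would look roughly like this in web.xml; the servlet name and the assumption that LoginServlet sits in the default package are taken from the post:)

```xml
<servlet>
    <servlet-name>LoginServlet</servlet-name>
    <servlet-class>LoginServlet</servlet-class>
</servlet>
<servlet-mapping>
    <servlet-name>LoginServlet</servlet-name>
    <url-pattern>/login</url-pattern>
</servlet-mapping>
```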
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;
import db.DatabaseContainer;
public class LoginServlet extends HttpServlet
{
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException
    {
        HttpSession session = request.getSession();
        if (session.getAttribute("authorized") == null || session.getAttribute("authorized").equals("no") || request.getParameter("login") != null)
        {
            String login = request.getParameter("login");
            String password = request.getParameter("password");
            DatabaseContainer db = (DatabaseContainer) session.getAttribute("db");
            if (login != null && db.checkLogin(login, password))
            {
                // Valid login
                session.setAttribute("authorized", "yes");
                session.setAttribute("user", login);
                //forward to home page
            }
            else
            {
                // Invalid login
                session.setAttribute("authorized", "no");
                //forward back to login page
            }
        }
        else if (session.getAttribute("authorized").equals("no"))
        {
            //forward back to login page
        }
    }

    public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException
    {
        doGet(request, response);
    }
}
If I'm not logged in, I want to simply forward back to the login page for now. If I am logged in, I want to forward to my home page. If my home page is a simple HTML page, though, then what's to stop a person from just typing in the home page URL and getting to it? I would think it would need to be a JSP page, but then the JSP page would have to have code in it to see if the user was logged in, and then I'd be back to where I was before.
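One common answer to that last question is a servlet Filter: it runs before every request to a protected URL, so individual pages stay free of login checks. A minimal sketch, assuming the same "authorized" session attribute the servlet above sets; the class name AuthFilter and the redirect target are my own illustration, and the filter would be mapped in web.xml to whatever URL pattern covers the protected pages (e.g. /app/*):

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

// Mapped in web.xml (via <filter> and <filter-mapping>) to the URLs that require login.
public class AuthFilter implements Filter
{
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException
    {
        HttpServletRequest request = (HttpServletRequest) req;
        // false: don't create a session just to check it
        HttpSession session = request.getSession(false);

        // Reuse the same flag the LoginServlet sets
        if (session != null && "yes".equals(session.getAttribute("authorized")))
        {
            chain.doFilter(req, res); // logged in: let the request through
        }
        else
        {
            // not logged in: send back to the login form
            ((HttpServletResponse) res).sendRedirect(request.getContextPath() + "/index.html");
        }
    }

    public void init(FilterConfig config) { }
    public void destroy() { }
}
```

With a filter like this in place, the home page can stay a plain HTML (or JSP) page with no login code in it, because a direct request for its URL never reaches the page unless the filter lets it through.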
Edited by: JFactor2004 on Oct 21, 2009 7:38 PM

Similar Messages

  • Use Best Practice to Import Master Data

    Hi,
I am an SAP beginner; I'd be glad if someone could guide me on my issue. How can I use Best Practices to import 1000+ material master data records into the SAP system? I have already prepared the data in an Excel spreadsheet. Can anyone guide me on the steps?
    Thanks.

    Hi,
LSMW is a very good tool for master data upload. The tool is very rich in features and is also complex. Being a beginner, you should check with a consultant to learn how you can use LSMW to upload your 1000+ records. The tool is also quite intuitive. After entering the LSMW transaction you create the project, subproject and the object you are going to work on uploading. When you enter the next screen you see several radio buttons. Typically every upload requires all the features behind those radio buttons, in the same sequence. It is not possible to give the details of what is performed behind each of these radio buttons in this forum. Please get a consultant's help in your vicinity.
    thanx
    Bala

  • How to use best practices installation tool?

    Hello!
can anyone share some useful links/docs that can guide me on how to use the Best Practices installation tool (/SMB/BBI)?
    any responses will be awarded,
    Regards,
    Samson

    hi,
will you please share the same?
    thanks in advance

  • Job (C) use best practices

    Experts,
This question is in regard to best practices/common ways that various companies ensure the proper use of the Job (C) object in HCM systems. For example, if there are certain jobs in the system that should only be assigned to a position if that position is in certain areas of the business (i.e., belongs to specific organizational areas), how is this type of restriction maintained? Is it simply through business processes? Is there a way/relationship that can be assigned? Are there typical customizations and/or processes that are followed?
I'm looking to begin organizing jobs into job families, and I'm currently trying to determine and maintain the underlying organization of our company's jobs in order to ensure this process is functional.
Any insight, thoughts, or advice would be greatly appreciated.
    Best regards,
    Joe

    Hi Joe,
You can embed the business area info into the job description, and this would be part of a best practice.
What I mean is this:
    e.g. In your company you have 4 managers:
    HR Manager
    IT Manager
    Procurement Manager
    Production Manager
Then, as part of SAP Best Practice, you will have 4 positions (1 position per person).
My advice is that you should also have 4 jobs that describe the positions.
Then, in order to group all managers, you may have one job family "Managers" and assign all four jobs to that family.
This way you can report on all the managers as well as area-specific managers (e.g. the HR Manager).
    As far as I know, there is no standard relationship that holds business area info.
    For further info check table T778V via SM31.
    Regards,
    Dilek

  • Using Best Practices personalization tool

    hello,
We wish to use the Best Practices personalization tool for customer-specific data for the Baseline package.
I do not understand from the documentation whether it's possible to use it after installing the Baseline, or whether it has to be done simultaneously (meaning the personalized data has to be ready in the files before starting the Baseline installation)?!
    Thank you
    Michal

    Hi,
Please refer to:
http://help.sap.com/bp_bl603/BL_IN/html/index.htm
Your personalized files have to be done before implementation, as you will be using the files during the installation process.
The XML file and the TXT files you create from the personalization tool are used to upload the scenario to the system; otherwise it will upload the default.
Also refer to note 1226570 (here I am referring to IN); you can check the same for other country versions also.
    Thanks & Regards,
    Balaji.S

  • Login ID best practice. Windows 8, Office 365 with SkyDrive and SkyDrive PRO

I have had many confusing issues with multiple and conflicting IDs. I have just spent 3 hours online with MS technical support to discover that a document can be edited in Word and saved to SharePoint 365, but if it is synced with the desktop and that copy is edited, then the local sign-on sets the wrong permissions and the Upload Center fails.
I have two MSOL Office 365 accounts, both Enterprise level with SkyDrive Pro.
I have two Live accounts, both with SkyDrive.
I am using Windows 8, so I need to sign on with a Live ID or a local, non-linked account.
    Assuming I configured;
    1. W8 user account with Live ID AAA and only used MSOL/365 Account BBB in a session. 
    2. W8 user account with Live ID ZZZ and only used MSOL/365 Account XXX in a session.
    Would that be okay?
1. Should I be able to access AAA SkyDrive and BBB SkyDrive Pro in the same session?
    2. If I synch SharePoint folders from a BBB session should I be able to edit them directly in word, with user account BBB active and be able to save okay?
    Cheers, Richard
    RC

In Windows Live you can create an alias account, which means you have one Windows Live account but log in with multiple email addresses. You create one account and then link the Office 365 account to it as an alias. You can have one Windows 8 account and a different Office 365 account too.
You can also configure your Office 365 account to use your Windows Live ID and log in with that account.

  • Implementing a search box - best practices

I'm implementing a simple search box to allow visitors to search for merchandise, which is held in a table. I can see two main approaches, each with their pros and cons:
The merchandise data has several fields that could potentially be employed in the search: long description, short description and title.
A thorough search would look through each long description field, which is 100 chars long; the downside is the speed hit of searching such a large field.
A quick search would look through the title field - quick but not thorough.
Alternatively I could create a separate table, searchTags, which contains a list of keywords for each item of merchandise - quicker but not as thorough.
Just wondering what type of approach people use?

Ah, ok, I'll do it with LIKE.
I plan to get it up and running and record the type of things people are searching for. Having seen some of the things people type into a search box, I'll need to employ some of CF's string and list functions to break the search string down into a series of words, then search for each one.
For example, if someone typed "Silver Jewellery", it wouldn't bring up any results, as there's no occurrence of that string in the database.
However, if I break that down into "Silver" and "Jewellery", that would produce results.
I think I can use CF's string and list functions, such as listToArray, for that.
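The word-splitting idea above (what listToArray does in ColdFusion) can be sketched in Java. The class name, the title column, and the use of one PreparedStatement placeholder per term are my own illustration, not from the original thread:

```java
import java.util.ArrayList;
import java.util.List;

public class SearchTerms {

    /** Split a raw query into lowercase words, dropping blanks. */
    public static List<String> tokenize(String query) {
        List<String> terms = new ArrayList<>();
        for (String word : query.trim().split("\\s+")) {
            if (!word.isEmpty()) {
                terms.add(word.toLowerCase());
            }
        }
        return terms;
    }

    /** Build a WHERE clause with one LIKE placeholder per term,
     *  for use with a PreparedStatement (placeholders avoid SQL injection). */
    public static String buildWhereClause(List<String> terms) {
        StringBuilder sql = new StringBuilder("WHERE ");
        for (int i = 0; i < terms.size(); i++) {
            if (i > 0) sql.append(" OR ");
            sql.append("title LIKE ?");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        List<String> terms = tokenize("Silver Jewellery");
        System.out.println(terms);                   // [silver, jewellery]
        System.out.println(buildWhereClause(terms)); // WHERE title LIKE ? OR title LIKE ?
    }
}
```

Each `?` would then be bound with `setString(i + 1, "%" + terms.get(i) + "%")` before executing the query.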

  • VC table use, best practices?

    Hi,
    I'm updating a table in the back end with an RFC. I would like to send only the rows I've modified or added on the VC client to the RFC and not the whole table. Is this possible?

    Hey Joel,
Add a condition (say, a check box) for changing a table row (the user needs to select which rows he is modifying).
While sending the values to the RFC, set the guard condition as valid only when the check box is selected.
    Regards,
    Pradeep

  • Best practice in implementation of SEM-CPM

Does anyone have experience implementing SEM-CPM using Best Practices? And if so, does it reduce implementation time?

We should be able to adopt the best practices when the software finally gets integrated into NetWeaver.
    Ravi Thothadri

  • Best practice "changing several related objects via BDT" (Business Data Toolset) / Mehrere verbundene Objekte per BDT ändern

Hello,
I want to start a discussion to find a best practice method for changing several related master data objects via BDT. At the moment we are faced with miscellaneous requirements where we have a master data object that uses the BDT framework for maintenance (in our case an insured object). While changing or creating the insured object, several related objects, e.g. a Business Partner, should also be changed or created. So I am searching for a best practice approach to implementing such a solution.
One idea was to call a report via SUBMIT AND RETURN in event DSAVC or DSAVE. Unfortunately this implementation method has only poor options for handling errors. Second, it is also hard to keep the LUW together.
Another idea is to call an additional BDT instance in the DCHCK event via FM BDT_INSTANCE_SELECT with the parameters iv_xpush_classic = 'X' and iv_xpop_classic = 'X'. So far we didn't get this solution working correctly, because there is always something missing (e.g. global memory is not transferred correctly between the two BDT instances).
So hopefully you can report on your implementations, so we can find a best practice approach for facing such requirements.
    BR/VG
    Dominik

  • SAP Best Practice for Chemicals in an oil refinery

    Does anyone have experience in implementing the SAP Best Practice for Chemicals in an oil refinery that will also use IS-OIL?
    What would be the pros and cons?
    What is the implementation price of considering a best practice solution for a large complex organization?
    Oded Dagan
    Oil Refineries LTD.
    Israel

    Hi Oded,
I don't know of any Best Practices Chemicals implementation in a refinery so far.
But you can use Best Practices Chemicals with SAP IS-Oil for the non-IS-Oil functionality as well.
Best Practices Chemicals gives you benefits within standard business processes, but for the IS-Oil business processes you have to consider the traditional implementation methods.
Best Practices Chemicals gives you a broad business process set, usually used in a chemical corporation. If you can cover 50% of your needed business processes out of Best Practices Chemicals, you save approx. 50% implementation time. It is not only implementation; you save a lot in documentation and training material as well. Most of our Best Practices Chemicals implementations used 60-80% out of Best Practices Chemicals. At a large corporation the percentage of standard ERP processes is normally smaller, because of other additionally needed SAP solutions, e.g. APO, SRM, CRM etc.
    regards Rainer

  • Storage Server 2012 best practices? Newbie to larger storage systems.

    I have many years managing and planning smaller Windows server environments, however, my non-profit has recently purchased
    two StoreEasy 1630 servers and we would like to set them up using best practices for networking and Windows storage technologies. The main goal is to build an infrastructure so we can provide SMB/CIFS services across our campus network to our 500+ end user
    workstations, taking into account redundancy, backup and room for growth. The following describes our environment and vision. Any thoughts / guidance / white papers / directions would be appreciated.
    Networking
    The server closets all have Cisco 1000T switching equipment. What type of networking is desired/required? Do we
    need switch-hardware based LACP or will the Windows 2012 nic-teaming options be sufficient across the 4 1000T ports on the Storeasy?
    NAS Enclosures
    There are 2 StoreEasy 1630 Windows Storage servers. One in Brooklyn and the other in Manhattan.
    Hard Disk Configuration
Each of the StoreEasy servers has 14 3TB drives for a total raw storage capacity of 42TB. By default the StoreEasy servers were configured with 2 RAID 6 arrays with 1 hot standby disk in the first bay. One RAID 6 array is made up of disks 2-8 and presents two logical drives to the storage server: a 99.99GB OS partition and a 13872.32GB NTFS D: drive. The second RAID 6 array resides on disks 9-14 and is partitioned as one 11177.83GB NTFS drive.
    Storage Pooling
    In our deployment we would like to build in room for growth by implementing storage pooling that can be later
    increased in size when we add additional disk enclosures to the rack. Do we want to create VHDX files on top of the logical NTFS drives? When physical disk enclosures, with disks, are added to the rack and present a logical drive to the OS, would we just create
    additional VHDX files on the expansion enclosures and add them to the storage pool? If we do use VHDX virtual disks, what size virtual hard disks should we make? Is there a max capacity? 64TB? Please let us know what the best approach for storage pooling will
    be for our environment.
    Windows Sharing
    We were thinking that we would create a single Share granting all users within the AD FullOrganization User group
    read/write permission. Then within this share we were thinking of using NTFS permissioning to create subfolders with different permissions for each departmental group and subgroup. Is this the correct approach or do you suggest a different approach?
    DFS
    In order to provide high availability and redundancy we would like to use DFS replication on shared folders to
mirror storage01, located in our Brooklyn server closet, and storage02, located in our Manhattan server closet. Presently there is a 10TB DFS replication limit in Windows 2012. Is this replication limit per share, or for the total of all files under DFS? We have been informed that HP will provide an upgrade to 2012 R2 Storage Server when it becomes available. In the meanwhile, how should we design our storage and replication strategy around the limits?
    Backup Strategy
    I read that Windows Server backup can only backup disks up to 2TB in size. We were thinking that we would like
    our 2 current StoreEasy servers to backup to each other (to an unreplicated portion of the disk space) nightly until we can purchase a third system for backup. What is the best approach for backup? Should we use Windows Server Backup to be capturing the data
    volumes?
    Should we use a third party backup software?

    Hi,
    Sorry for the delay in reply.
    I'll try to reply each of your questions. However for the first one, you may have a try to post to Network forum for further information, or contact your device provider (HP) to see if there is any recommendation.
    For Storage Pooling:
From your description you would like to create VHDX files on the RAID 6 disks for future growth. That is fine, and as you said, VHDX is limited to 64TB. See:
    Hyper-V Virtual Hard Disk Format Overview
    http://technet.microsoft.com/en-us/library/hh831446.aspx
Another possible solution is using Storage Spaces - a new function in Windows Server 2012. See:
    Storage Spaces Overview
    http://technet.microsoft.com/en-us/library/hh831739.aspx
It can add hard disks to a storage pool and create virtual disks from the pool. You can add disks to this pool later and create new virtual disks if needed.
    For Windows Sharing
Generally we will have different sharing folders later. Creating all shares in a root folder sounds good, but in practice we may not be able to accomplish that. So it really depends on the actual environment.
    For DFS replication limitation
    I assume the 10TB limitation comes from this link:
    http://blogs.technet.com/b/csstwplatform/archive/2009/10/20/what-is-dfs-maximum-size-limit.aspx
I contacted the DFSR department about the limitation. Actually DFS-R can replicate more data and does not have an exact limit. As you can see, the article was created in 2009.
    For Backup
As you said, there is a backup limitation (2TB per single backup). So if it cannot meet your requirement, you will need to find a third-party solution.
    Backup limitation
    http://technet.microsoft.com/en-us/library/cc772523.aspx
    If you have any feedback on our support, please send to [email protected]

  • Best practice steps of configuration

    Hello All
Can anyone write back on what the best practice steps of configuration in HCM are for a new implementation?
    Thanks

    Hi,
SAP Best Practices are prepackaged, ready-to-use solutions for small and medium-size businesses (SMBs).
SAP Best Practices are based on the Building Block methodology.
SAP Best Practices are fully documented, including preconfigured business processes, training material, data conversion tools, and test catalogs.
Building Blocks contain:
Business Configuration Sets (BC Sets)
Sample master data
Configuration documentation
Print forms or reports
Business Configuration Sets are groups of configuration settings; you can think of them as groups of tables for a specific business process.
For more details, go to help.sap.com, then to Best Practices; you will find very useful information there.
    Regs,
    Brahma

  • CRM  - how to work with Best Practices

    Hi All,
    we will start with the implementation of mySAP CRM during the next weeks.
I'm a bit confused - how should I work with Best Practices, and what are the differences between the 3 ways to get Best Practices?
    1) we have a 'Solution Manager' System where I can use Best Practices for CRM
    2) Best Practices on help.sap.com: http://help.sap.com/bp_crmv250/CRM_DE/html/index_DE.htm (Buliding Blocks!)
    3) Best Practices DVD to install on the CRM System
    Are the 3 ways exchangeable? Is there some information provided by SAP?
We have already installed the Best Practices DVD, but now I don't know how to use this add-on: is there a special transaction code to use them, or an extension for the IMG?
    regards
    Stefan

    Hi Stefan Kübler,
If the Solution Manager is in place, then the suggested (also the best) method is to use the Best Practices on it.
If you want to install and use the Best Practices with the CRM system, then the procedure is given on the Best Practices CD/DVD. You can also download the installation procedure from this link: http://help.sap.com/bp_crmv340/CRM_DE/index.htm. Click on 'Installation' on the left and then 'Quick Guide' on the right, and download the document.
Though the Best Practices give you a way to start, they can't replace your requirements. You have to configure the system as per your exact business requirements.
I never installed Best Practices before, but I have used them extensively as a reference in all my projects.
Do not forget to reward if it helps.
    Regards,
    Paul Kondaveeti

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business user, IT and even legal. The deployment documents always include recovery steps so that if something goes wrong or the deployment can't proceed, there is a documented procedure for restoring the system to a valid working state.
The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree, for a simple 5 new table and small amount of data scenario it may seem like overkill.
But, despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention what changes are being made to actually USE what you are adding.
