Best practices to secure out-of-band management access

What are the best practices to secure Out-of-Band Management (OOBM) access?
I am planning to put in a DSL link for OOBM. I have a console switch that supports SSH and IPsec-based VPN with NAT traversal. My questions are:
Is it secure enough?
Do I need to have a router/firewall in front of the console switch?
I'm planning to use a Cisco 1841 router as the edge router. What do you think?
Any suggestions would be greatly appreciated.

Hi,
You're going to have OOB access via VPN?
That is pretty secure (assuming you're talking about IPsec).
An 1841 should work fine.
You can check the design recommendations here:
www.cisco.com/go/srnd
Choose the security section...
Hope it helps.
Federico.
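
A general hardening note for the SSH leg of the OOBM path: use key-based authentication and strict host-key checking rather than passwords and blindly accepted keys. As a loose illustration (not a Cisco configuration; the hostname, user and key path are made up), a Python/Paramiko sketch of a scripted console-switch session hardened that way could look like this:

    # Illustrative sketch: hardened SSH access to the console switch over the
    # OOB link. Key-based auth only; unknown host keys are rejected rather
    # than silently accepted. Hostname, user and key path are hypothetical.
    import paramiko

    client = paramiko.SSHClient()
    client.load_system_host_keys()                        # trust only known hosts
    client.set_missing_host_key_policy(paramiko.RejectPolicy())

    client.connect(
        hostname="console-switch.oobm.example.net",
        username="oobadmin",
        key_filename="/home/oobadmin/.ssh/id_ed25519",
        allow_agent=False,
        look_for_keys=False,
        timeout=10,
    )

    stdin, stdout, stderr = client.exec_command("show users")
    print(stdout.read().decode())
    client.close()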

Similar Messages

  • What is the best practice in securing deployed source files

    hi guys,
Just yesterday, I developed a simple image cropper using Ajax
and Flash. After compiling the package, I noticed the
package/installer delivers the exact same source files as
developed to the install folder.
This didn't concern me much at first, but come to think of it,
this question keeps coming back to me:
"What is the best practice in securing deployed source
files?"
How do we secure an application's installed source files from
being tampered with, especially after they have been installed?
E.g. modifying the spraydata.js file can be done easily with
an editor.

    Hi,
You could compute a SHA or MD5 hash of your source files on
first run and save these hashes to the EncryptedLocalStore.
On startup, recompute and verify. (This, of course, does not
address the case where the main app's SWF / SWC / HTML itself
is decompiled.)
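
To illustrate the hash-and-verify idea in general terms (the answer above is AIR-specific, where EncryptedLocalStore would hold the hashes): a small Python sketch that records SHA-256 hashes of the deployed files on first run and flags any change on later startups. The file list and manifest path are made up, and a plain JSON file stands in for an encrypted store.

    # Sketch of first-run hashing and startup verification. In AIR the
    # manifest would live in EncryptedLocalStore; the JSON file here is
    # only a stand-in, and the protected file list is hypothetical.
    import hashlib
    import json
    import os

    FILES = ["spraydata.js", "cropper.swf"]   # deployed files to protect
    MANIFEST = "hashes.json"

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    if not os.path.exists(MANIFEST):
        # First run: record the known-good hashes.
        with open(MANIFEST, "w") as f:
            json.dump({p: sha256_of(p) for p in FILES}, f)
    else:
        # Later startups: recompute and compare.
        with open(MANIFEST) as f:
            expected = json.load(f)
        tampered = [p for p in FILES if sha256_of(p) != expected.get(p)]
        if tampered:
            raise SystemExit(f"Tampered files detected: {tampered}")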

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same oracle application server.
Defining an agent for each?
A gateway for all?
The BPEL security extension?
The top-level service that is defined as the business process is itself secured through OWSM with usernames and passwords, but what is the best practice for establishing security for each of the lower-level services?
    Regards
    Farbod

It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, it should be secured.
The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
Today you may have secured your parent services, and tomorrow you could come up with a new service that uses one of the existing lower-level services.
If all the services are on one application server, you can make the configuration/development environment a lot easier by securing them with the Gateway.
The typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
You can enforce rules at your network layer to allow access to the app server only from the Gateway (a quick sketch of that idea follows this reply).
When you have the liberty to use OWSM or other WS-Security products, I would stay away from any extensions. Two things to consider:
The next BPEL developer on your project may not be aware of the security extensions.
Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram
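
The network-layer rule is the real control here. Purely to illustrate the idea (this is not OWSM- or BPEL-specific, and the gateway address and port are hypothetical), a minimal Python sketch of an application-level guard that rejects requests not coming from the gateway might look like:

    # Minimal sketch: refuse requests that do not originate from the gateway
    # host, as a backstop to the network-layer ACL. IP and port are made up.
    from wsgiref.simple_server import make_server

    ALLOWED_GATEWAYS = {"10.0.0.5"}   # hypothetical gateway address

    def gateway_only(app):
        def wrapper(environ, start_response):
            client = environ.get("REMOTE_ADDR", "")
            if client not in ALLOWED_GATEWAYS:
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Direct access is not allowed; call the gateway.\n"]
            return app(environ, start_response)
        return wrapper

    def service(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"business-critical operation\n"]

    if __name__ == "__main__":
        make_server("0.0.0.0", 8080, gateway_only(service)).serve_forever()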

  • Best Practice for Security Point-Multipoint 802.11a Bridge Connection

I am trying to get the best practice for securing a point-to-multipoint wireless bridge link: point A to B, C, & D; and B, C, & D back to A. Which authentication and configuration available in the Aironet 1410 IOS are best? Thanks for your assistance.
    Greg

The following document on the types of authentication available on the 1400 should help you:
    http://www.cisco.com/univercd/cc/td/doc/product/wireless/aero1400/br1410/brscg/p11auth.htm

  • Best practice for checking out a file

    Hello,
What is the SAP best practice to check out a file? (DTR -> Edit or DTR -> Edit Exclusive?)
    What are pros and cons of checking out a file exclusively?
    Thanks
    MLS

    Thanks Pascal.
Also, I think if a developer checks out exclusively, makes changes, and leaves the company without checking in, the only option is to revert those files, in which case all his changes will be gone.

  • Basic Strategy / Best Practices for System Monitoring with Solution Manager

    I am very new to SAP and the Basis group at my company. I will be working on a project to identify the best practices of System and Service level monitoring using Solution Manager. I have read a good amount about SAP Solution Manager and the concept of monitoring but need to begin mapping out a monitoring strategy.
We currently utilize the RZ20 transaction and basic CCMS monitors, such as watching for update errors, availability, short dumps, etc. What else should be monitored in order to proactively find possible issues? Are there any best practices you have found when implementing monitoring for new solutions added to the SAP landscape? What are common things we would want to monitor over, say, ERP, CRM, SRM, etc.?
    Thanks in advance for any comments or suggestions!

    Hi Mike,
    Did you try the following link ?
    If not, it may be useful to some extent:
    http://service.sap.com/bestpractices
    ---> Cross-Industry Packages ---> Best Practices for Solution Management
    You have quite a few documents there - those on BPM may also cover Solution Monitoring aspects.
    Best regards,
    Srini

  • Best practice for securing confidential legal documents in DMS?

We have a requirement to store confidential legal documents in DMS and are looking at options to secure access to those documents.  We are curious to know: what is the best practice, and how are other companies doing it?
    TIA,
    Margie
    Perrigo Co.

    Hi,
The standard practice for such scenarios is to use the 'authorization' concept. You can give each user the authorization to create, change, or display these confidential documents; in this way, you control access. The SAP DMS system monitors how you work and prevents you from displaying or changing originals if you do not have the required authorization.
The link below will give you a better understanding of the authorization concept and its application in DMS:
    http://help.sap.com/erp2005_ehp_04/helpdata/en/c1/1c24ac43c711d1893e0000e8323c4f/frameset.htm
    Regards,
    Pradeepkumar Haragoldavar

  • Best Practice Internet Security with ADO / OraMTS / OraOLEDB and 9i?

    Hi people,
    I have the following scenario to support and I URGENTLY need some information regarding the security model vs performance envelope of these platforms.
We are currently developing a web application using IE 5.0+ as our browser, IIS 5.0 as our server, ASP (JScript) as our component glue, and custom C++ COM+ middle-tier components using ADO / Oracle OLE DB to talk to a Solaris-based Oracle 9i instance.
Now it comes to light from the application requirements that the system should, if at all possible, support Virtual Private Databases for subscribers [plus we need to ease backend data service development, and row-level security combined with fine-grained audit seems the way to go].
    How does one use Oracle's superior row-level security model in this situation?
    How does one get the MS middle tier to authenticate with the database given that our COM+ ADO components are all required to go through ONE connection string? [Grrrr]
    Can we somehow give proxy rights to this identity so that it can "become" and authenticate with an OID/LDAP as an "Enterprise User"? If so, how?
    I have seen a few examples of JDBC and OCI middle-tier authentication but how does one achieve the same result as efficiently as possible from the MS platform?
It almost appears, due to connection pooling, that each call to the database on each open connection could potentially require a different application context - how does one achieve this efficiently? (A sketch of one common approach follows below.)
    If this is not the way to go - how could it work?
    What performance tradeoffs do we have using this architecture? (And potentially how will we migrate to .Net on the middle tier?)
    As you can see, my questions are both architectural and technical. So, are there any case studies, white papers or best practice monographs on this subject that are available to either Technet members or Oracle Partners?
    Alternatively, anyone else come up against this issue before?
    Thanks for your attention,
    Lachlan Pitts
    Developer DBA (Oracle)
    SoftWorks Australia Pty Ltd

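One common way to reconcile a single pooled connection string with per-user row-level security is to tag each borrowed connection with the end user's identity before doing any work, and have the VPD policy read it via SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER'). Purely as an illustration of that pattern (the original stack is ADO/COM+; the sketch below uses Python and cx_Oracle, and the DSN, credentials and table are made up):

    # Illustrative sketch: set a per-request identity on a pooled connection
    # so a VPD policy can filter rows via SYS_CONTEXT('USERENV',
    # 'CLIENT_IDENTIFIER'). DSN, credentials and table are hypothetical.
    import cx_Oracle

    pool = cx_Oracle.SessionPool(user="app_user", password="app_pw",
                                 dsn="dbhost/orcl", min=2, max=10, increment=1)

    def run_for_subscriber(subscriber_id, sql, params=None):
        conn = pool.acquire()
        try:
            # Tag the session with the real end user before any work.
            conn.client_identifier = subscriber_id
            cur = conn.cursor()
            cur.execute(sql, params or {})
            rows = cur.fetchall()
            cur.close()
            return rows
        finally:
            # Clear the identity so the next borrower starts clean.
            conn.client_identifier = ""
            pool.release(conn)

    rows = run_for_subscriber("subscriber_42",
                              "SELECT doc_id, title FROM documents")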

  • Best practices for EasyDMS Public Folder usage/management

    Hi,
We are implementing EDMS and are looking for best practices on the use of the Public Folder in EDMS.  We have different sites with different business models, such as Engineer-to-Order or a "projects"-based business, while other sites run a large flow operation of standard catalog products with ordering options.  Our initial thought is to put only documents that are common to all users at a site in the public folders, such as document templates or procedures.  Others suggest putting project folders there, where anybody can browse through the different documents for a project.  That raises the question of who owns or manages that public folder.  We don't want the masses to be able to create random folders, so that the structure of the Public Folder soon becomes a big unorganized mess.  Any thoughts on best practices you have implemented or seen in practice are appreciated.
    Thanks,
    Joseph Whiteley

    Hi!
My suggestion is to skip the folders altogether! It will end up a total mess after a couple of years. My recommendation is to use the classification of the document type and classify the document with the right information. You can then search for the documents, and you don't need to look through tons of folders to find the right document.
I know that you have to put the document in a folder to be able to create it in EasyDMS, but at my customers we have a year folder and then month folders underneath where they just dump the documents. We then work with either object links or classification to find the right documents in the business processes. Another recommendation is to implement the TREX engine to be able to find your documents. I don't know if this was the answer you wanted to get, but I think this is the way forward if you would like a DMS system that can still be used in 10+ years. Imagine replacing Google with a file browser!
    Best regards,
    Kristoffer P

  • Best Practices of security for develop applications

I need information about a model to use for developing applications with Forms and Reports. I have read many documents about database security best practices, but I can't find information about how to tie the database security into my software, and how to establish a standard for my programmers.
    Thanks you for your help.

    There are a number of levels of implementation pain here-- best practices in a Fortune 500 company, for example, are likely to require a lot more infrastructure than best practices in a 5000 person organization. A Fortune 500 is also much more likely to have requirements based on the needs of a security team separate from the DBA group, requirements about auditing, etc.
    At the high end, everyone in your organization might be an enterprise user authenticated against a LDAP repository (such as Active Directory) with a variety of functional roles granted to those users and potentially something like fine-grained access control in the database. Depending on how applications are deployed, you might also be using proxy authentication to authenticate these individual users.
    Deploying this sort of infrastructure, though, will be somewhat time intensive and will create a degree of administrative overhead that you may not need. It will also potentially require a decent investment in development costs. Your needs may be far simpler (or more complex), so your security model ought to reflect that.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
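
As a concrete illustration of the proxy authentication Justin mentions (not part of his answer; user names, password and DSN are hypothetical, and the target account must have been granted CONNECT THROUGH the application account), a minimal Python/cx_Oracle sketch:

    # Sketch of proxy authentication with a heterogeneous session pool: the
    # pool connects as the application account, but each session is acquired
    # "as" a specific end user, so privileges and auditing apply to that user.
    # Requires: ALTER USER alice GRANT CONNECT THROUGH app_proxy;
    import cx_Oracle

    pool = cx_Oracle.SessionPool(user="app_proxy", password="app_proxy_pw",
                                 dsn="dbhost/orcl", min=1, max=8, increment=1,
                                 homogeneous=False)  # needed for per-user acquire

    conn = pool.acquire(user="alice")     # no password; proxied through app_proxy
    cur = conn.cursor()
    cur.execute("SELECT USER, SYS_CONTEXT('USERENV', 'PROXY_USER') FROM dual")
    print(cur.fetchone())                 # e.g. ('ALICE', 'APP_PROXY')
    cur.close()
    pool.release(conn)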

  • Best Practices- Number of users in Contract Manager

What is the best practice for the number of users operating in Primavera Contract Manager? We currently have v13.0.3.0.

There are no real limits to the number of users working in PCM, but your server(s) must be sized correctly to work effectively.  I've seen PCM instances with hundreds of users.  Sizing requirements are typically included with your installation materials; otherwise you may want to look in the knowledge base for this document.

  • What is the best practice to roll out ApEx to production?

    Hi,
    My first ApEx application :) What is the best practice to deploy an ApEx application to production?
Also, I created end-user accounts and use the end users' accounts to log in to ApEx via the URL (http://xxx.xxx.xxx:8080/apex/f?p=111:1). However, how come sometimes it's still in development mode (i.e., with the Home|Application#|Edit Page#|Create|Session|... toolbar showing at the bottom), and sometimes not?
    Thanks much :)
    Helen

When you set up your users, make sure the radio buttons for both workspace admin and developer are set to No. This makes them an "end user" and they should not see the links; only developers and workspace admins can see the developer toolbar.

  • Best practice followed on shopping cart error management

Hi All,
In the application monitor via the Web, the shopping cart error log shows:
Shopping cart :- backend application errors
Shopping cart :- not transferred to backend ("SC xxxx follow-on document(s) not transferred")
Shopping cart :- spooler communication errors
Shopping cart :- local errors
SC xxxx has actually created its follow-on documents, so why does the SRM system still show "SC xxxx follow-on document(s) not transferred"? It misleads others.
Once the erroneous SC is resolved and the follow-on documents are created, the error log entry should be deleted automatically, right? Why does that not happen sometimes? Should we manually delete these errors?
What is the best practice you guys follow to clear these errors?
Muthu

    Hi Muthu,
It seems we two are the only ones with web monitor problems,
but my situation is different from yours - the web monitor shows me real errors. I am only investigating how to get the system to transfer shopping carts automatically later on, once the SM12 locks in the backend are gone - as I describe in:
    Re: SRM 5.0: CLEAN_REQREQ_UP, BBP_REQEQ_TRANSFER and Webmonitor
    kind regards,

  • A must read best practices when starting out in Designer

    Hi,
    Here is a link to a blog by Vishal Gupta on best practices when developing XFA Forms.
    http://www.adobe.com/devnet/livecycle/articles/best-practices-xfa-forms.html
    Please go read it now; it is excellent :-)
    Niall

I followed the two links below. I think it should be the same even though the links describe 2008 R2 migration steps.
http://kpytko.pl/active-directory-domain-services/adding-first-windows-server-2008-r2-domain-controller-within-windows-2003-network/
http://blog.zwiegnet.com/windows-server/migrate-server-2003-to-2008r2-active-directory-and-fsmo-roles/
Hope this helps!

  • Best practice for secure zone various access

    I am setting up a new site with a secure zone.
Once logged in, users will have access to search and browse medical articles/resources.
This is how an example may go:
The admin user signs up Doctor XYZ to the secure zone.
Doctor XYZ is a heart specialist, so he only gets access to web app items that are classified as "heart".
However, he may also be given access to other items, e.g. "lung" items.
Or even all items. It will vary from user to user.
Is there any way to separate areas within the secure zone and give access to those separate areas (without having to give access to individual items, which will be a pain because there will be hundreds of records, and also without having the user log out and log into another secure area)?

    my only issue with this is that I have no idea how to open up File Sharing to ONLY allow users who are connecting from the VPN
    Simple - don't expose your server to the outside world.
As long as you're running on a NAT network behind some firewall or router that's filtering traffic, no external traffic can get to your server unless you set up port forwarding - this is the method used to run, say, a public web server, where you tell the router/firewall to allow incoming traffic on port 80 through to your server.
If you don't set up any port forwarding, no external traffic can get in.
There are additional steps you can take - such as running the software firewall built into Mac OS X and telling it to only accept network connections from the local network - but that's not necessary in most cases (a sketch of that idea follows at the end of this reply).
    And 2. The best way to ensure secure AND encrypted file sharing via the server...
    VPN should take care of most of your concerns - at least as far as the file server is concerned. I'd be more worried about what happens to the files once they leave the network - for example have you ensured that the remote user's local system is sufficiently secured so that no one can get the documents off his machine once they're downloaded?
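
Purely to illustrate "only allow connections that come in over the VPN" at the application level (this is not how macOS File Sharing itself is configured; the subnet and port below are hypothetical), a minimal Python sketch:

    # Minimal sketch: accept TCP clients only when their source address is
    # inside the assumed VPN subnet. The real control should still be your
    # firewall/VPN configuration; subnet and port are made up.
    import ipaddress
    import socket

    VPN_SUBNET = ipaddress.ip_network("10.8.0.0/24")   # assumed VPN address pool

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", 9000))
    server.listen(5)

    while True:
        client, (addr, port) = server.accept()
        if ipaddress.ip_address(addr) not in VPN_SUBNET:
            client.close()      # reject anything not coming over the VPN
            continue
        client.sendall(b"hello from the file service\n")
        client.close()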
