Best practice for securing confidential legal documents in DMS?

We have a requirement to store confidential legal documents in DMS and are looking at options to secure access to those documents. We are curious to know: what is the best practice, and how are other companies doing it?
TIA,
Margie
Perrigo Co.

Hi,
The standard practice for such scenarios is to use the authorization concept. You can grant each user the authorization to create, change, or display these confidential documents, and in this way control access. The SAP DMS system checks these authorizations as you work and prevents you from displaying or changing originals if you do not have the required authorization.
The link below will give you a better understanding of the authorization concept and its application in DMS:
http://help.sap.com/erp2005_ehp_04/helpdata/en/c1/1c24ac43c711d1893e0000e8323c4f/frameset.htm
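To make the authorization idea concrete, here is a minimal sketch in Python. It is purely illustrative: the activity codes mirror SAP's create/change/display activities, but the names and data structures are stand-ins, not real SAP APIs. In DMS itself you would maintain this through roles and profiles rather than code.

```python
# Simplified model of activity-based authorization for documents.
# Activity codes follow the SAP convention: 01 create, 02 change, 03 display.
# Users and their grants below are hypothetical.

user_authorizations = {
    "MARGIE": {"01", "02", "03"},  # full access to confidential documents
    "AUDITOR": {"03"},             # display only
}

def is_authorized(user: str, activity: str) -> bool:
    """Return True if the user holds the requested activity code."""
    return activity in user_authorizations.get(user, set())

print(is_authorized("AUDITOR", "03"))  # display is allowed -> True
print(is_authorized("AUDITOR", "02"))  # change is blocked  -> False
```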
Regards,
Pradeepkumar Haragoldavar

Similar Messages

  • Best Practice for Security Point-Multipoint 802.11a Bridge Connection

    I am trying to find the best practice for securing a point-to-multipoint wireless bridge link: point A to B, C, & D; and B, C, & D back to A. What authentication and configuration included in the Aironet 1410 IOS are best? Thanks for your assistance.
    Greg

    The following document on the types of authentication available on the 1400 should help you:
    http://www.cisco.com/univercd/cc/td/doc/product/wireless/aero1400/br1410/brscg/p11auth.htm

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same oracle application server.
    Defining agent for each?
    Gateway for all?
    BPEL security extension?
    The top level service that is defined as business process is secure itself through OWSM and username and passwords, but what is the best practice for security establishment for each low level services?
    Regards
    Farbod

    It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, then it should be secured.
    The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which uses one of the existing lower-level services.
    If all the services are on one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    The typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from any extensions. Two things to consider:
    The next BPEL developer on your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram
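    The network-layer restriction mentioned above can also be mirrored inside the service as a backstop. A minimal sketch in Python; the gateway address is a made-up example, and in practice the primary enforcement belongs in your firewall rules:

```python
# Backstop check: accept calls only when the TCP peer is the gateway.
# 10.0.0.5 is a hypothetical gateway address for illustration.
GATEWAY_ADDRS = {"10.0.0.5"}

def allow_request(remote_addr: str) -> bool:
    """Permit only requests whose source address is a known gateway."""
    return remote_addr in GATEWAY_ADDRS

print(allow_request("10.0.0.5"))     # via gateway -> True
print(allow_request("203.0.113.7"))  # direct external caller -> False
```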

  • Best Practice For Secure File Sharing?

    I'm a newbie to both OS X Server and file sharing protocols, so please excuse my ignorance...
    My client would like to share folders in the most secure way possible. I was considering that the best way might be for them to VPN into the server and then view the files through the VPN tunnel; my only issue with this is that I have no idea how to open up File Sharing to ONLY allow users who are connecting from the VPN (i.e. from inside the internal network)... I don't see any options in Server Admin to restrict users in that way.
    I'm not afraid of the command line, FYI, I just don't know if this is:
    1. Possible!
    And 2. The best way to ensure secure AND encrypted file sharing via the server...
    Thanks for any suggestions!

    my only issue with this is that I have no idea how to open up File Sharing to ONLY allow users who are connecting from the VPN
    Simple - don't expose your server to the outside world.
    As long as you're running on a NAT network behind some firewall or router that's filtering traffic, no external traffic can get to your server unless you set up port forwarding - this is the method used to run, say, a public web server, where you tell the router/firewall to allow incoming traffic on port 80 through to your server.
    If you don't set up any port forwarding, no external traffic can get in.
    There are additional steps you can take - such as running the software firewall built into Mac OS X to tell it to only accept network connections from the local network, but that's not necessary in most cases.
    And 2. The best way to ensure secure AND encrypted file sharing via the server...
    VPN should take care of most of your concerns - at least as far as the file server is concerned. I'd be more worried about what happens to the files once they leave the network - for example have you ensured that the remote user's local system is sufficiently secured so that no one can get the documents off his machine once they're downloaded?
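    For the "only allow users connecting from the VPN" part, the underlying check is just subnet membership. A hedged sketch in Python (the ranges are invented examples; substitute your own LAN and VPN pools, and note the real enforcement would live in the firewall or the service's access controls, not a script):

```python
import ipaddress

# Hypothetical internal ranges -- replace with your own addressing.
INTERNAL_NETS = [
    ipaddress.ip_network("192.168.1.0/24"),  # office LAN
    ipaddress.ip_network("10.8.0.0/24"),     # VPN client pool
]

def is_internal(client_ip: str) -> bool:
    """True if the connecting address falls inside an internal range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in INTERNAL_NETS)

print(is_internal("10.8.0.12"))     # VPN client -> True
print(is_internal("198.51.100.9"))  # internet host -> False
```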

  • Best practice for secure zone various access

    I am setting up a new site with a secure zone.
    There will be a secure zone. Once logged in, users will have access to search and browse medical articles/resources
    This is how an example may go:
    The admin user signs up Doctor XYZ to the secure zone.
    The Doctor XYZ is a heart specialist, so he only gets access to web app items that are classified as "heart".
    However, he may also be given access to other items, eg: "lung" items.
    Or, even all items. It will vary from user to user.
    Is there any way to separate areas within the secure zone and give access to those separate areas (without having to give access to individual items - which will be a pain because there will be hundreds of records; and also without having the user log out and log into another secure area)
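    One common pattern for this is category-level grants: tag each web app item with a category and give each user a set of allowed categories, so access is decided by set membership rather than per-item grants. A minimal sketch in Python (the users and categories are hypothetical, and your CMS would express this through its own item classifications rather than code):

```python
# Category-level access: users get categories, not individual items.
# All names below are hypothetical.
user_categories = {
    "dr_xyz": {"heart"},          # heart specialist
    "dr_abc": {"heart", "lung"},  # broader access
    "admin":  {"*"},              # wildcard: everything
}

def can_view(user: str, item_category: str) -> bool:
    """Decide access from the user's category set."""
    cats = user_categories.get(user, set())
    return "*" in cats or item_category in cats

print(can_view("dr_xyz", "heart"))  # True
print(can_view("dr_xyz", "lung"))   # False
```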


  • Best Practices for securing VTY lines?

    Hi all,
    The thread title makes this sound like a big post but it's not. 
    If my router has, say, 193 VTY lines as a maximum, but by default the running-config mentions only a portion of those, should I apply any configs I do to all lines, or just to the lines sh run shows? Example:
    sh run on a router I have with the default config shows:
    line vty 0 4
    access-class 23 in
    privilege level 15
    login local
    transport input telnet ssh
    line vty 5 15
    access-class 23 in
    privilege level 15
    login local
    transport input telnet ssh
    Yet, I have the option of configuring up to 193 VTY lines:
    Router(config)#line vty ?
      <0-193>  First Line number
    It seems lines 16-193 still exist in memory, so my concern is that they are potentially exposed somehow to exploits or what not. So my practice is to do any configs using VTY 0 193 to ensure universal configuration. But by enabling the extra lines, am I using more memory, and how secure is this against somebody trying to, say, connect to my router 193 times simultaneously? Does it increase the likelihood of success of a DoS attack, for example?

    Hi guys, thanks for the replies and excellent information. I'm excited to look at the IOS Hardening doc and the other stuff too.
    Just to clarify, I don't actually use the default config; I only pasted it from a new router to illustrate the default VTY line count.
    I never use telnet from inside or outside; anything snooping a line will pick up the cleartext, as you both know of course. SSH is always version 2, etc.
    I was considering a console server from the inside as the only access method, which I do have set up, but I have to remote into it. It's just that with power outages at times, the console PC won't come back up (no BIOS setting to return to previous state, no WOL solution in place), so now I have both that plus the SSH access. I have an ACL on both the VTY lines themselves as well as a ZBFW ACL governing SSH. Perhaps a bit redundant in some ways, but if there's a zero-day out there for turning off the ZBFW I might still be protected.
    Regretfully I haven't learned about AAA yet; I believe that's in my CCNA Security book, but first I need to get other things learned.
    And with regard to logging in general, both enabling the right kind and monitoring it properly, that's a subject I need to work on big time. I still get port 25 outbound sometimes from a spam bot, but by the time I manually do my sh logging | i :25 I have missed it (due to cyclic logging with a buffer at 102400). Probably this would be part of that CCNA Security book as well.
    So back to the number of VTY lines: I will see what I can do to reduce the line count. I suppose something like "no line vty 16 193" might work; if not, it'll take some research.
    But if an attacker wants to jam up my VTY lines so I can't connect in, once they've fingerprinted the unit a bit to find out that I don't have an IPS running for example, wouldn't it be better that they have to jam up 193 lines simultaneously (with, I presume, 193 source IPs) instead of 16? Or am I just theorizing too much here? It's not that this matters much; anybody who cares enough to hack this router will get a surprise when they find out there's nothing worth the effort on the other side. But this is more so I can be better armed for future deployments. Anyway, I will bookmark the info from this thread and am looking forward to reading it.

  • Best Practices for Securing Oracle e-Business Suite -Metalink Note 189367.1

    OK, we have reviewed our financials setup against the titled Metalink document, but we want to focus on security and configuration specific to the Accounts Payable module of Oracle Financials. Can you point me in the direction of any useful documents for this, or give me some pointers?


  • Best practices for securely storing environment properties

    Hi All,
    We have a legacy security module that is included in many different applications. Historically, the settings (such as database/LDAP username and password) were stored directly in the files that use them. I'm trying to move towards a more centralized and secure method of storing this information, but need some help.
    First of all, I'm struggling a little bit with proper scoping of these variables. If another application does a cfinclude on one of the assets in this module, these environment settings must be visible to the asset, but preferably not visible to the 'calling' application.
    Second, I'm struggling with the proper way to initialize these settings. If other applications run a cfinclude on these assets, the application.cfm in the local directory of the included script does not get processed. I'm left with running an include statement in every file, which I would prefer to avoid if at all possible.
    There are a ton (>50) of applications using this code, so I can't really change the external interface. Should I create a component that returns the private settings and then set the 'public' settings with Server scope? Right now I'm using application scope for everything because of a basic misunderstanding of how the application.cfm files are processed, and that's a mess.
    We're on ColdFusion 7.
    Thanks!


  • Best practices for securing communication to internet based SCCM clients ?

    What type of SSL certs does the community think should be used to secure traffic from internet-based SCCM clients? Should 3rd-party SSL certs be used? When doing an inventory of the clients' configuration, for example, in order to run reports later, how will the data be protected during transit?

    From a technical perspective, it doesn't matter where the certs come from, as there is no difference whatsoever: a cert is a cert is a cert. The certs are *not* what provide the protection; they simply enable the use of SSL to protect the data in transit and also provide an authentication mechanism.
    From a logistics and cost perspective, though, there is a huge difference. You may not be aware, but *every* client in IBCM requires its own unique client authentication certificate. This will get very expensive very quickly and is a recurring cost, because certs expire (most commercial cert vendors rarely offer certs valid for more than 3 years). Also, deploying certs from a 3rd party is not a trivial endeavor -- you more or less run into chicken-and-egg issues here. With an internal Microsoft PKI, if designed properly, there is zero recurring cost and deployment to internal systems is trivial. There is still certainly some cost and overhead involved, but it is dwarfed by what comes with using a third-party CA for IBCM certs.
    Jason | http://blog.configmgrftw.com | @jasonsandys

  • Enterpise Best Practices for iPad

    Is anyone aware of any documentation identifying best practices for securely deploying iPads in an enterprise environment?

    There is some information out there, though not as much as I think we are typically used to for enterprise environments. (It is a consumer device, and Apple is a consumer-driven company, and I don't fault them for that one bit.)
    Here is some documentation from Apple:
    http://www.apple.com/support/ipad/enterprise/
    Also, Jamf Software has some information regarding their Casper suite.
    We don't use it yet at my workplace, but I have heard good things about them.
    http://www.jamfsoftware.com/solutions/mobile-device-management
    Edit:
    And welcome to the forums!

  • Best Practices for a Legal Department Datamart/Datawarehouse

    Hello,
    Does anyone have a best-practices document for a Legal Department Data Mart / Data Warehouse?
    Thanks

    Over and above what I'd consider 'standard' best practices for data modelling, coupled with OBIEE development? Off the top of my head, Steve Hoberman wrote a few books on dimensional modelling with real-world contexts; pretty sure I saw some legal models in there, maybe some crossover into insurance, etc.

  • SAP Best Practice for Document Type./Item category/Acc assignment cat.

    What is the best practice for document type & item category?
    I want to use NB with item categories B & K (blanket PO), D (service), and T (text).
    Does SAP recommend using FO only for blanket purchase orders?
    We want to use service contracts (with/without service entry sheet) for all our services.
    We also want to buy assets for our office equipment.
    Which is the best one to use, NB or FO?
    Please give me any OSS notes or references for this.
    Thanks
    Nick

    Thank you very much for your response. 
    I hope I can provide some clarity on how the accounting needs to be handle per FERC  Regulations.  The G/L balance on the utility that is selling the assets will be in the following accounts (standard accounts across all FERC Regulated Utilities):
    101 - Acquisition Value for the assets
    108 - Accumulated Depreciation Value for the assets
    For an example, there is Debit $60,000,000 in FERC Account 101 and a credit $30,000,000 in FERC Account 108.  When the purchase occurs, the net book value for the asset will be on our G/L in FERC Account 102.  Once we have FERC Approval to acquire the plant assets, we will need to enter the Acquisition Value and associated Accumulated Depreciation onto our G/L to FERC Account 101 and FERC Account 108 respectively with an offset to FERC Account 102.
    The method that I came up with is to purchase the NBV of the assets to a clearing account.  I then set up account assignments that will track the Acquisition Value and respective Accumulated Depreciation for each asset that is being purchased.  I load the respective asset values using t-code AS91 and then make an entry to the 2 respective accounts with the offset against the clearing account using t-code OASV.  Once my company receives FERC approval, I will transfer the asset to new assets that has the account assignments for FERC Account 101 and FERC Account 108 using t-code ABUMN or FB01.

  • Best practice for external but secure access to internal data?

    We need external customers/vendors/partners to access some of our company data (view/add/edit). It's not as easy as segmenting those databases/tables/records out from the other existing ones (and putting separate database(s) in the DMZ where our server is). Our current solution is to have a port 1433 hole from the web server into our database server. The user credentials are not in any sort of web.config but rather compiled into our DLLs, and that SQL login has read/write access to a very limited number of databases.
    Our security group says this is still not secure, but how else are we to do it? Even with a web service, there still has to be a hole somewhere. Any standard best practice for this?
    Thanks.

    Security is mainly about mitigation rather than being 100% secure: "we have unknown unknowns". The component needs to talk to SQL Server. You could continue to use HTTP to talk to SQL Server, perhaps even get SOAP transactions working, but personally I'd have more worries about using such a 'less trodden' path, since that is exactly the area where more security problems are discovered. I don't know your specific design issues, so there might be even more ways to mitigate the risk, but in general using a DMZ is a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com
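    Whatever transport you settle on, two cheap mitigations apply regardless: keep the SQL login's rights as narrow as possible, and never build SQL text from user input. A sketch in Python, with sqlite3 standing in for SQL Server purely for illustration:

```python
import sqlite3

# sqlite3 stands in for SQL Server here; the principle is identical:
# a narrowly scoped login plus parameterized queries limits what a
# compromised web tier can do.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'acme')")

def orders_for(customer: str) -> list:
    # Parameter binding: user input never becomes SQL text.
    cur = conn.execute(
        "SELECT id FROM orders WHERE customer = ?", (customer,)
    )
    return [row[0] for row in cur]

print(orders_for("acme"))          # [1]
print(orders_for("x' OR '1'='1")) # [] -- injection attempt matches nothing
```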

  • Best practice for the test environment  &  DBA plan Activities    Documents

    Dear all,
    In our company, we made a sizing exercise for hardware.
    We have three environments (Test/Development, Training, Production).
    However, the test environment has fewer servers than the production environment.
    My question is:
    What is the best practice for the test environment?
    (Are there any recommendations from Oracle related to this? Any PDF files would help me ............)
    Also, can I have a detailed document regarding the DBA plan activities?
    I appreciate your help and advice.
    Thanks
    Edited by: user4520487 on Mar 3, 2009 11:08 PM

    Follow your build document for the same steps you used to build production.
    You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin

  • Best Practice for Storing Sharepoint Documents

    Hi,
    Is there a best practice for where to store the documents of a SharePoint site? I heard some people say it is best to store SharePoint documents directly in the file system; others said that it is better to store SharePoint documents in SQL Server.

    What you are referring to is the difference between SharePoint's native storage of documents in SQL, and the option/ability to use SQL's FILESTREAM functionality for Remote BLOB Storage (also known as RBS). Typically you are much better off sticking with SQL storage for BLOBs, except in a very few scenarios.
    This page will help you decide if RBS is right for your scenario:
    https://technet.microsoft.com/en-us/library/ff628583.aspx?f=255&MSPPError=-2147217396
    -Corey
