Best practices for securing developed applications

I need information about a model to use when developing applications with Forms and Reports. I have read many documents about database security best practices, but I can't find information about how to integrate database security with my software, or how to establish a standard for my programmers.
Thank you for your help.

There are a number of levels of implementation pain here-- best practices in a Fortune 500 company, for example, are likely to require a lot more infrastructure than best practices in a 5000 person organization. A Fortune 500 is also much more likely to have requirements based on the needs of a security team separate from the DBA group, requirements about auditing, etc.
At the high end, everyone in your organization might be an enterprise user authenticated against an LDAP repository (such as Active Directory), with a variety of functional roles granted to those users and potentially something like fine-grained access control in the database. Depending on how applications are deployed, you might also be using proxy authentication to authenticate these individual users.
Deploying this sort of infrastructure, though, will be somewhat time intensive and will create a degree of administrative overhead that you may not need. It will also potentially require a decent investment in development costs. Your needs may be far simpler (or more complex), so your security model ought to reflect that.
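To make the proxy-authentication idea above concrete: Oracle clients can pass the end user's identity through a shared pooled account using the `pool_user[end_user]` syntax (the end user must have been granted CONNECT THROUGH the pool account). A minimal sketch, with hypothetical account names; it only builds the user string, not the actual connection:

```python
def proxy_connect_string(pool_user: str, end_user: str) -> str:
    """Build Oracle's proxy-authentication user string.

    The application authenticates with pool_user's credentials, but the
    session runs as end_user, so auditing and roles track the real person.
    """
    return f"{pool_user}[{end_user}]"

# With a driver such as python-oracledb this would be used roughly as:
#   oracledb.connect(user=proxy_connect_string("app_pool", "jsmith"),
#                    password=pool_password, dsn=dsn)
# ("app_pool", "jsmith", pool_password and dsn are all placeholders.)
```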
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC

Similar Messages

  • Any known security best practices to follow for FMS deployment

    Hi all,
    We have recently deployed Flash Media Streaming Server 3.5.2 and Flash Media Encoder on a Windows 2003 machine. Do you know of any security best practices to follow for an FMS server deployment on a Windows machine? If so, could you please point me to that resource?

    Hi
    I will add some concepts. I am not sure how all of them work technically, but there should be enough here for you to dig deeper. Also, a lot of this depends on your environment and how you want to deploy it.
    I have done a 28 server deployment, 4 origin and 24 edge servers.
    On all the edge servers we disabled file and printer sharing in the TCP/IP properties. This is basically a way in for hackers, and we disabled it only on the edge servers because these are the ones exposed to the public.
    We also allowed only ports 1935, 80, and 443 on our NICs, with IP protocol numbers 6 and 17 (TCP and UDP). Definitely test your TCP/IP port filtering until you are comfortable that all your connection types are working and secure.
    Use RTMPE rather than plain RTMP; it is there to be used, and I am surprised more people don't. As with any other encryption protocol, it may cause higher resource overhead on the servers holding the connections.
    You may also want to look at SWF verification. As I understand it, it works as follows: you publish a SWF file on a website, and your player uses it for authentication. If you configure your edge servers to accept connections only from that verified SWF file, you greatly reduce the chances of your streams being hijacked.
    If you are doing encoding via FME then I would suggest that you download the authentication plugin that is available on the Flash Media Encoder download site.
    There are other things you can look at to tighten security further, such as Adaptor.xml settings, a front-end load balancer, HTML domain and SWF domain restrictions, firewalls, and DRM.
    I hope this helps you out.
    Roberto
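The port-filtering rule Roberto describes can be stated as a tiny predicate. A sketch of that check (the port and protocol numbers come from the post above; the function and constant names are mine):

```python
ALLOWED_PORTS = {1935, 80, 443}   # RTMP/RTMPE, HTTP, HTTPS
ALLOWED_PROTOCOLS = {6, 17}       # IP protocol numbers: 6 = TCP, 17 = UDP

def packet_allowed(port: int, protocol: int) -> bool:
    """Mimic the edge-server NIC filter: drop anything not on the
    streaming/web ports, and any protocol other than TCP or UDP."""
    return port in ALLOWED_PORTS and protocol in ALLOWED_PROTOCOLS
```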

  • What is the best practice in securing deployed source files

    hi guys,
    Just yesterday I developed a simple image cropper using Ajax and Flash. After compiling the package, I noticed the package/installer delivers the exact same source files as developed to the installed folder.
    This didn't concern me much at first, but thinking it over, one question keeps coming back to me:
    "What is the best practice in securing deployed source files?"
    How do we protect an application's installed source files from being tampered with after installation? For example, a file like spraydata.js can easily be modified with an editor.

    Hi,
    You could compute a SHA or MD5 hash of your source files on first run and save those hashes to the EncryptedLocalStore. On startup, recompute the hashes and verify them. (This, of course, doesn't help when the main app's SWF/SWC/HTML itself is decompiled.)
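The hash-and-verify idea above translates directly outside AIR as well. A sketch, assuming a plain dict stands in for the EncryptedLocalStore (in AIR you would persist the hashes encrypted; the file names and function names here are illustrative):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """SHA-256 of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def record_hashes(paths, store: dict) -> None:
    """First run: remember a hash for every shipped source file."""
    for p in paths:
        store[str(p)] = file_hash(p)

def verify_hashes(store: dict) -> list:
    """Startup: return the files whose contents changed since first run."""
    return [name for name, h in store.items()
            if file_hash(Path(name)) != h]
```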

  • Best practices to secure out of band management access

    What are the best practices to secure Out Of Band Management (OOBM) access?
    I'm planning to put in a DSL link for OOBM. I have a console switch which supports SSH and IPsec VPN with NAT traversal. My questions are:
    Is it secure enough?
    Do I need to have a router/firewall in front of the console switch?
    I'm planning to put in a Cisco 1841 as an edge router. What do you think?
    Any suggestions would be greatly appreciated.

    Hi,
    You're going to have OOB access via VPN?
    That is pretty secure (if we are talking about IPsec).
    An 1841 should work fine.
    You can check the design recommendations here:
    www.cisco.com/go/srnd
    Choose the security section.
    Hope it helps.
    Federico.

  • Best Practice: Export unrendered for internet

    Hi
    As I recall, in previous versions of FCPX it was considered best practice, at least for exports for the web, to export unrendered.
    Is that still true? Was it ever true for other export purposes?
    I want to post a rough of a film on youtube, unlisted, for a few friends to comment on.
    best
    elmer
    Btw, it always seems like when I open my browser while FCPX is open, I get problems and have to delete my prefs to get back to normal. Any reason why? Just curious.


  • Best Practices Building Blocks for CRM 5.0 & CRM 2007

    Hi Experts,
    Where can I find Best Practices Building Blocks for CRM 5.0 & CRM 2007?
    Thanks in advance,
    Vishwa.

    Hi
    Go to: http://help.sap.com/
    Click on the Best Practices Tab,
    Then Cross-Industry Packages,
    Then Customer Relationship Management
    They should all be under there.
    Regards
    Arden

  • Looking for best practice / installation guide for grid agent for RAC

    I am looking for best practice / installation guide for grid agent for RAC, running on windows server.
    Thanks.

    Please refer :
    MOS note Id : [ID 378037.1] -- How To Install Oracle 10g Grid Agent On RAC
    http://repettas.wordpress.com/2007/10/21/how-to-install-oracle-10g-grid-agent-on-rac/
    Regards
    Rajesh

  • Need best practice configuration document for ISU CCS

    I am working on an ISU CCS project. I need best practice configuration documents for:
    Contract management
    Collections management
    Invoicing
    Work Management as it relates to ERP Billing.
    Thanks
    Priya
    priyapandey.sapcrmatgmailcom


  • [More information] 'SAP Best Practices Baseline package for Brazil V3.607'

    Hi.
    While studying 'SAP Best Practices Baseline package for Brazil V3.607', I ran into a problem I need a solution for.
    ---------Problem---------
    In the '100: SAP Best Practices Installation' document, point 3.4 Define Tax Jurisdiction Code says:
    Enter the Jurisdiction Codes according to the document SMB41_J_1BTXJURV_B020_NFE.TXT.
    I have searched the internet for this document, and the only hit is the actual Best Practice document itself.
    Does anybody know where to get this document?
    Please reply as soon as you can.
    Thanks.


  • Install Best Practices- Baseline Package for ECC 6.0 EHP4

    Hi All,
    I know this is not the right forum for this message, but I'm posting here as I didn't get the info from the related forum; apologies for that.
    We are now planning to install the SAP Best Practices Baseline Package for ECC 6.0 EHP4 on our new server.
    Can anybody help me out with the steps to be carried out? We have completed the installation of Linux, and we want to install the BP for General, not for any specific industry.
    B/regds,
    CB

    Please post this question in SAP Basis Forum.
    Alternatively, you can check the documents on Scribd or help.sap.com.
    Raghavan

  • SAP Best Practices Baseline package for Russia V3.607

    Dear colleagues,
    My partner, BearingPoint Russia, is interested in the SAP Best Practices Baseline package for Russia V3.607.
    Could you please help me find a contact whom they can ask about content and pricing?
    Best regards,
    Dmitry Popov

    Dear Dimitry,
    the Best Practice baseline content is freely available to anyone w/o any charge.
    You find the whole content about it at:
    SAP Best Practices package for Russia V3.607 (English)
    SAP Best Practices package for Russia V3.607 (Russian)
    Kind Regards,
    Jan

  • Best practice on sqlite for games?

    Hi Everyone, I'm new to building games/apps, so I apologize if this question is redundant...
    I am developing a couple games for Android/iOS, and was initially using a regular (un-encrypted) sqlite database. I need to populate the database with a lot of info for the games, such as levels, store items, etc. Originally, I was creating the database with SQL Manager (Firefox) and then when I install a game on a device, it would copy that pre-populated database to the device. However, if someone was able to access that app's database, they could feasibly add unlimited coins to their account, unlock every level, etc.
    So I have a few questions:
    First, can someone access that data in an APK/IPA app once downloaded from the app store, or is the method I've been using above secure and good practice?
    Second, is the best solution to go with an encrypted database? I know Adobe Air has the built-in support for that, and I have the perfect article on how to create it (Ten tips for building better Adobe AIR applications | Adobe Developer Connection) but I would like the expert community opinion on this.
    Now, if the answer is to go with encrypted, that's great - but, in doing so, is it possible to still use the copy function at the beginning or do I need to include all of the script to create the database tables and then populate them with everything? That will be quite a bit of script to handle the initial setup, and if the user was to abandon the app halfway through that population, it might mess things up.
    Any thoughts / best practice / recommendations are very appreciated. Thank you!

    I'll just post my own reply to this.
    What I ended up doing was creating a script that creates the database itself and then populates the tables (unencrypted; the encryption portion is commented out until store publishing). It's a tremendous amount of code, completely repetitive except for the values being inserted, but you can't do an insert loop or a multi-row insert statement in AIR's SQLite, so the best move is to create everything line by line.
    This creates the database, and since it's not encrypted, it can be tested with Firefox's SQLite Manager or some other database tool. Once you're ready for deployment to the app stores, you simply modify that step to use encryption instead of the unencrypted method used for testing.
    So far this has worked best for me. If anyone needs some example code, let me know and I can post it.
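For comparison, outside AIR most SQLite bindings do support batch inserts, which removes the line-by-line repetition described above. A sketch of the same seed-the-database approach using Python's sqlite3 (the table name and seed values are made up for illustration):

```python
import sqlite3

# Hypothetical seed data: (id, name, unlock_cost)
SEED_LEVELS = [(1, "Forest", 0), (2, "Cave", 100), (3, "Castle", 250)]

def build_game_db(path: str) -> None:
    """Create and populate the pre-shipped game database."""
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS levels ("
                "id INTEGER PRIMARY KEY, name TEXT, unlock_cost INTEGER)")
    # executemany replaces hundreds of repetitive single INSERT statements
    con.executemany("INSERT INTO levels VALUES (?, ?, ?)", SEED_LEVELS)
    con.commit()
    con.close()
```

Encryption would still have to be layered on separately (e.g. an encrypted-database extension), as plain sqlite3 files remain readable with any SQLite browser.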

  • Best Practice Internet Security with ADO / OraMTS / OraOLEDB and 9i?

    Hi people,
    I have the following scenario to support and I URGENTLY need some information regarding the security model vs performance envelope of these platforms.
    We are currently developing a web application using IE 5.0 and up as the browser, IIS 5.0 as the server, ASP (JScript) as the component glue, and custom C++ COM+ middle-tier components using ADO / Oracle OLE DB to talk to a Solaris-based Oracle 9i instance.
    Now it comes to light from the application requirements that the system should, if at all possible, be supporting Virtual Private Databases for subscribers [plus we need to ease backend data service development and row-level security combined with fine grained audit seems the way to go].
    How does one use Oracle's superior row-level security model in this situation?
    How does one get the MS middle tier to authenticate with the database given that our COM+ ADO components are all required to go through ONE connection string? [Grrrr]
    Can we somehow give proxy rights to this identity so that it can "become" and authenticate with an OID/LDAP as an "Enterprise User"? If so, how?
    I have seen a few examples of JDBC and OCI middle-tier authentication but how does one achieve the same result as efficiently as possible from the MS platform?
    It almost appears, due to connection pooling that each call to the database on each open connection could potentially be requiring a different application context - how does one achieve this efficiently?
    If this is not the way to go - how could it work?
    What performance tradeoffs do we have using this architecture? (And potentially how will we migrate to .Net on the middle tier?)
    As you can see, my questions are both architectural and technical. So, are there any case studies, white papers or best practice monographs on this subject that are available to either Technet members or Oracle Partners?
    Alternatively, anyone else come up against this issue before?
    Thanks for your attention,
    Lachlan Pitts
    Developer DBA (Oracle)
    SoftWorks Australia Pty Ltd
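To make the Virtual Private Database mechanics in the question concrete: a VPD policy function returns a WHERE predicate that the database transparently appends to every statement against the protected table, based on the session's application context. A conceptual sketch only (column and context names are invented; real VPD does this inside the database via the DBMS_RLS package, not in application code):

```python
def vpd_predicate(ctx: dict) -> str:
    """Stand-in for a VPD policy function: one predicate per session context.

    int() guards the illustrative string-building against injection; a real
    policy function builds its predicate server-side from SYS_CONTEXT.
    """
    return f"subscriber_id = {int(ctx['subscriber_id'])}"

def apply_policy(sql: str, ctx: dict) -> str:
    """What the database effectively executes for a protected table."""
    return f"{sql} WHERE {vpd_predicate(ctx)}"
```

This is why the connection-pooling question matters: each pooled connection must have the correct application context set before the query runs, or the predicate filters for the wrong subscriber.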

  • Best Practice Table Creation for Multiple Customers, Weekly/Monthly Sales Data in Multiple Fields

    We have a homegrown Access database, originally designed in 2000, that now has a SQL back-end.  The database has not yet been converted to a newer format such as Access 2007, since at least two users are still on Access 2003.  It is fine if suggestions will only work with Access 2007 or higher.
    I'm trying to determine if our database is the best place to do this or if we should look at another solution.  We have thousands of products each with a single identifier.  There are customers who provide us regular sales reporting for what was
    sold in a given time period -- weekly, monthly, quarterly, yearly time periods being most important.  This reporting may or may not include all of our product identifiers.  The reporting is typically based on calendar-defined timing although we have
    some customers who have their own calendars which may not align to a calendar month or calendar year so recording the time period can be helpful.
    Each customer's sales report can contain anything from 1,000-20,000 rows of products for each report.  Each customer report is different and they typically have between 4-30 columns of data for each product; headers are consistently named.  The
    product identifiers included may vary by customer and even within each report for a customer; the data in the product identifier row changes each week.  Headers include a wide variety of data such as overall on hand, overall on order, unsellable on hand,
    returns, on hand information for each location or customer grouping, sell-through units information for each location or customer grouping for that given time period, sell-through dollars information for each location or customer grouping for that given time
    period,  sell-through units information for each location or customer grouping for a cumulative time period (same thing for dollars), warehouse on hands, warehouse on orders, the customer's unique categorization of our product in their system, the customer's
    current status code for that product, and so on.
    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division, and time period).  Due to the overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to set up tables for our largest customers so I can create queries and pivot tables to look more quickly at sales-related information by category, by specific product(s), by partner, by specific products or categories across partners, by specific products or categories across specific weeks/months/years, etc.?  We have a separate product table, so only the product identifier or a junction table may be needed to pull in additional information from the product table with queries.  We do need to maintain the sales reporting information indefinitely.
    I welcome any suggestions, best practice or resources (books, web, etc).
    Many thanks!

    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to
    set-up tables .....
    I assume you want to migrate to SQL Server.
    Your best course of action is to hire a professional database designer for a short period like a month.
    Once you have the database, you need to hire a professional DBA to move your current data from Access & Excel into the new SQL Server database.
    Finally you have to hire an SSRS professional to design reports for your company.
    It is also beneficial if the above professionals train your staff while building the new RDBMS.
    Certain senior SQL Server professionals may be able to do all 3 functions in one person: db design, database administration/ETL & business intelligence development (reports).
    Kalman Toth Database & OLAP Architect
    SELECT Video Tutorials 4 Hours
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012
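Whichever platform ends up holding the data, the usual best practice for reporting like this is a tall, narrow fact table keyed by customer, product, and period, rather than one wide table per customer layout; new measures then need no schema change. A hedged sqlite3 sketch of that shape (table and column names are illustrative, not a finished design):

```python
import sqlite3

def build_sales_schema(con: sqlite3.Connection) -> None:
    """One row per (customer, product, period, measure) instead of the
    4-30 customer-specific columns described in the question."""
    con.executescript("""
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE product  (id INTEGER PRIMARY KEY, identifier TEXT UNIQUE);
        CREATE TABLE sales_fact (
            customer_id  INTEGER REFERENCES customer(id),
            product_id   INTEGER REFERENCES product(id),
            period_start TEXT,   -- stored explicitly: customer calendars
            period_end   TEXT,   -- may not align to calendar months
            measure      TEXT,   -- e.g. 'on_hand', 'sell_through_units'
            value        REAL,
            PRIMARY KEY (customer_id, product_id, period_start, measure)
        );
    """)
```

Queries and pivot tables can then slice by customer, product, measure, or period without touching the schema when a customer adds a new column to their report.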

  • Best Practice transport procedure for SRM-MDM Catalogue repositories change

    Hi,
    I have a question regarding SRM-MDM Catalogue repository change transports.
    We currently have two QA servers and two Production servers (main and fail-over).
    We are investigating the need for a Development server.
    Changes are being made to the repositories on the system, and I see the need for a dev server.
    What is best practice for SRM-MDM Catalogue?
    With only QA and Prod environments, I guess repository schema transport is the best option, since no reference file (which is needed for change-file transport) has been created.
    Any other options?
    We are running MDM as well, with dev, QA and prod environments. Here we use CTS+ for transports.
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    KR,
    Thomas

    Hi Thomas.
    What is best practice for SRM-MDM Catalogue?
    SAP recommends a DEV-QA-PROD landscape model.
    Following the same approach for the catalog as well will help you have a successful implementation.
    Any other options?
    To proceed with CTS+, you need to create a reference file.
    Refer the Link: [CTS+|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0dd1ae0-36e5-2b10-f8b4-e6365e643c0b?quicklink=index&overridelayout=true] for more details
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    It depends on your requirements. If you expect many changes to the catalog XML schema across the various phases and want them handled automatically, you can go ahead with CTS+; otherwise you can stay with the existing method of exporting and importing the schema to the repository.
    Hope it helps.
    Best Regards
    Bala
    Edited by: chandar_sap on Sep 28, 2011 12:17 PM
