Looking for a hardening guide or best practices in production for WLS 8.1.6

Hi,
I'm working to deliver best practices to my government customer in the form of a hardening guide that conforms to NIST SP 800-44. I am aware of http://edocs.bea.com/wls/docs81/lockdown/practices.html, which offers some great operational tips.
Is anyone aware of other resources, or has anyone delivered anything similar to their customer? I would greatly appreciate any guidance here.
Thanks,
Rich

Hi, I would take a guide that covers any version, or other Oracle products.
-Rich

Similar Messages

  • Any known security best practices to follow for FMS deployment

    Hi all,
    We have recently deployed Flash Media Streaming Server 3.5.2 and Flash Media Encoder on a Windows 2003 machine. Do you know of any security best practices to follow for an FMS server deployment on a Windows machine? If so, could you please point me to that resource?

    Hi
    I will add some concepts. I am not sure how all of them work technically, but there should be enough here for you to dig deeper, and a lot of this depends on your environment and how you want to deploy it.
    I have done a 28-server deployment: 4 origin and 24 edge servers.
    On all the edge servers we disabled file and printer sharing in the TCP/IP properties. This is basically a way in for hackers, and we disabled it only on the edge servers because these are the ones exposed to the public.
    We also allowed only ports 1935, 80, and 443 on our NICs, restricted to protocol numbers 6 and 17 (TCP and UDP). Definitely test your TCP/IP port filtering until you are comfortable that all your connection types are working and secure.
    Use RTMPE instead of RTMP; it is there to be used and I am surprised more people do not use it. As with any other encryption protocol, the trade-off is that it may cause higher resource overhead on the servers holding the connections.
    You may want to look at SWF verification. In my understanding, it works as follows: you publish a SWF file on a website, and that file is what your player uses for authentication. If you configure your edge servers to only accept requests from that SWF file, then hopefully you greatly lessen the chances of your streams being hijacked.
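    A related control is vetting each incoming connection in the application's main.asc. The following is only a minimal sketch in Server-Side ActionScript (application.onConnect, Client.referrer and acceptConnection/rejectConnection are the stock FMS server-side API; the allowed URL list is hypothetical), and since referrer values can be spoofed it is defense in depth alongside SWF verification, not a replacement for it:
    application.onConnect = function(client) {
        // Accept only players served from pages/SWFs under domains we control.
        var allowed = ["http://www.example.com/", "https://www.example.com/"];
        for (var i = 0; i < allowed.length; i++) {
            if (client.referrer && client.referrer.indexOf(allowed[i]) == 0) {
                return this.acceptConnection(client); // "this" is the application
            }
        }
        return this.rejectConnection(client);
    };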
    If you are doing encoding via FME then I would suggest that you download the authentication plugin that is available on the Flash Media Encoder download site.
    There are other things you can look at to make the deployment more secure, such as Adaptor.xml, a front-end load balancer, HTML domain and SWF domain restrictions, firewalls, and DRM.
    I hope this helps you out.
    Roberto

  • Best Practice transport procedure for SRM-MDM Catalogue repository changes

    Hi,
    I have a question regarding SRM-MDM Catalogue repository change transports.
    We currently have two QA servers and two Production servers (main and fail-over).
    We are investigating the need for a Development server.
    Changes are being made to the repositories on the system, and I see the need for a dev server.
    What is best practice for SRM-MDM Catalogue?
    With only QA and Prod environments I guess Repository schema transport is the best option, since no Reference file has been created (which is needed for change-file transport).
    Any other options?
    We are running MDM as well, with dev, QA and prod environments. Here we use CTS+ for transports.
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    KR,
    Thomas

    Hi Thomas.
    What is best practice for SRM-MDM Catalogue?
    SAP recommends a DEV-QA-PROD landscape model.
    If we follow the same approach for the catalogue as well, it will help you to have a successful implementation.
    Any other options?
    To proceed with CTS+ you need to create a reference file.
    Refer the Link: [CTS+|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0dd1ae0-36e5-2b10-f8b4-e6365e643c0b?quicklink=index&overridelayout=true] for more details
    Is it best practice to use CTS+ also for SRM-MDM Catalogue transports?
    It depends on the requirement. If there are many changes to the catalogue XML schema across the various phases and you want them transported automatically, you can go ahead with CTS+; otherwise you can keep the existing method of exporting and importing the schema to the repository.
    Hope it helps.
    Best Regards
    Bala

  • Best Practice: Export unrendered for internet

    Hi
    As I recall, in previous versions of FCPX it was considered best practice, at least for web exports, to export unrendered.
    Is that still true? Was it ever true for other kinds of exports?
    I want to post a rough cut of a film on YouTube, unlisted, for a few friends to comment on.
    best
    elmer
    Btw, it always seems that when I open my browser while FCPX is open, I run into problems and have to delete my prefs to get back to normal. Any reason why? Just curious.

    Steve: If these are bitmaps inside a PDF that's going to be viewed on the iPad, you cannot rely on their "native resolution". Think about this: what if the original page size of this PDF is 5.5" x 8"? What if it is 20" x 32"? Which one will show the images "at their native resolution"? A 1650-pixel-wide image, for example, spans a 5.5" page at 300 ppi but a 20" page at only about 82 ppi.

  • Best Practices Building Blocks for CRM 5.0 & CRM 2007

    Hi Experts,
    Where can I find Best Practices Building Blocks for CRM 5.0 & CRM 2007?
    Thanks in advance,
    Vishwa.

    Hi
    Go to: http://help.sap.com/
    Click on the Best Practices Tab,
    Then Cross-Industry Packages,
    Then Customer Relationship Management
    They should all be under there.
    Regards
    Arden

  • Need best practice configuration document for ISU CCS

    I am working on an ISU CCS project. I need a best-practice configuration document for:
    Contract management
    Collections management
    Invoicing
    Work Management as it relates to ERP Billing.
    Thanks
    Priya
    priyapandey.sapcrmatgmailcom

    Which version are you setting up and what are the requirements? If you are discussing the use of NIC bonding for high availability: beginning in 11.2.0.2 there is a concept of "High Availability IP", or HAIP, as discussed in the pre-installation chapters,
    http://docs.oracle.com/cd/E11882_01/install.112/e22489/prelinux.htm, section 2.7.1 Network Hardware Requirements.
    In essence, using HAIP eliminates the need to use NIC bonding to provide for redundancy.

  • [More information] 'SAP Best Practices Baseline package for Brazil V3.607'

    Hi.
    When studying 'SAP Best Practices Baseline package for Brazil V3.607', I came across something I wonder about, and I would like a solution to the following problem.
    ---------Problem---------
    In the '100: SAP Best Practices Installation' document, point 3.4 Define Tax Jurisdiction Code says:
    Enter the Jurisdiction Codes according to the document SMB41_J_1BTXJURV_B020_NFE.TXT.
    I have searched the internet for this document, and the only hit is the actual Best Practice document itself.
    Does anybody know where to get this document?
    Please reply as soon as you can.
    Thanks.

  • Install Best Practices - Baseline Package for ECC 6.0 EHP4

    Hi All,
    I know this is not the right forum for this message, but I am posting here as I didn't get the info from the related forum; apologies for that.
    We are now planning to install the SAP Best Practices Baseline Package for ECC EHP4 on our new server.
    Can anybody help me out with the steps to be carried out? We have completed the installation of Linux, and we want to install the BP for General, not for any specific industry.
    B/regds,
    CB

    Please post this question in SAP Basis Forum.
    Alternatively, you can check the documents on Scribd or help.sap.com.
    Raghavan

  • SAP Best Practices Baseline package for Russia V3.607

    Dear colleagues,
    My partner - BearingPoint Russia - is interested in the SAP Best Practices Baseline package for Russia V3.607.
    Could you please help me find the contact they can ask about content and price?
    Best regards,
    Dmitry Popov

    Dear Dimitry,
    the Best Practice baseline content is freely available to anyone, without any charge.
    You will find the whole content at:
    SAP Best Practices package for Russia V3.607 (English)
    SAP Best Practices package for Russia V3.607 (Russian)
    Kind Regards,
    Jan

  • Best Practice Table Creation for Multiple Customers, Weekly/Monthly Sales Data in Multiple Fields

    We have a homegrown Access database, originally designed in 2000, that now has an SQL back-end. The database has not yet been converted to a higher format such as Access 2007, since at least 2 users are still on Access 2003, but it is fine if suggestions only work with Access 2007 or higher.
    I'm trying to determine if our database is the best place to do this or if we should look at another solution.  We have thousands of products each with a single identifier.  There are customers who provide us regular sales reporting for what was
    sold in a given time period -- weekly, monthly, quarterly, yearly time periods being most important.  This reporting may or may not include all of our product identifiers.  The reporting is typically based on calendar-defined timing although we have
    some customers who have their own calendars which may not align to a calendar month or calendar year so recording the time period can be helpful.
    Each customer's sales report can contain anything from 1,000-20,000 rows of products for each report.  Each customer report is different and they typically have between 4-30 columns of data for each product; headers are consistently named.  The
    product identifiers included may vary by customer and even within each report for a customer; the data in the product identifier row changes each week.  Headers include a wide variety of data such as overall on hand, overall on order, unsellable on hand,
    returns, on hand information for each location or customer grouping, sell-through units information for each location or customer grouping for that given time period, sell-through dollars information for each location or customer grouping for that given time
    period,  sell-through units information for each location or customer grouping for a cumulative time period (same thing for dollars), warehouse on hands, warehouse on orders, the customer's unique categorization of our product in their system, the customer's
    current status code for that product, and so on.
    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period). Due to the overall volume of information and the number of Excel sheets, cross-referencing can take considerable time. Is it possible to set up tables for our largest customers so I can create queries and pivot tables to more quickly look at sales-related information by category, by specific product(s), by partner, by specific products or categories across partners, by specific products or categories across specific weeks/months/years, etc.? We do have a separate product table, so only the product identifier or a junction table may be needed to pull in additional information from the product table with queries. We do need to maintain the sales reporting information indefinitely.
    I welcome any suggestions, best practice or resources (books, web, etc).
    Many thanks!

    Currently all of this data is stored in a multitude of Excel spreadsheets (by customer, division and time period).  Due to overall volume of information and number of Excel sheets, cross-referencing can take considerable time.  Is it possible to
    set-up tables .....
    I assume you want to migrate to SQL Server.
    Your best course of action is to hire a professional database designer for a short period like a month.
    Once you have the database, you need to hire a professional DBA to move your current data from Access & Excel into the new SQL Server database.
    Finally you have to hire an SSRS professional to design reports for your company.
    It is also beneficial if the above professionals train your staff while building the new RDBMS.
    Certain senior SQL Server professionals may be able to do all 3 functions in one person: db design, database administration/ETL & business intelligence development (reports).
    Kalman Toth Database & OLAP Architect

  • Best Practice - Hardware requirements for an Exchange test environment

    Hi Experts,
    I'm new to Exchange and I want to set up a test environment for learning, testing, patches and updates.
    In our environment we have Exchange 2010 and 2013 in coexistence, and I need a close replica of that scenario in my test environment.
    I was thinking of having an isolated (not domain-joined) high-end workstation laptop (quad-core i7, 32 GB RAM, 1 TB SSD) to implement the environment on, but management refused and replied "do it on one of the free servers within the live production environment at the Data Center"...!
    I'm afraid that doing so could corrupt the production environment through some mistake in my configuration - I'm not such an Exchange expert that I could revert if something went wrong.
    Is there a documented Microsoft recommendation on how and where to do this, so that I can send it to them?
    Or could someone help with the best practice on where to have my test environment and how to set it up?
    Many Thanks
    Mohamed Ibrahim

    I think this may be useful - it's Microsoft's official test lab setup guide:
    http://social.technet.microsoft.com/wiki/contents/articles/15392.test-lab-guide-install-exchange-server-2013.aspx
    Also, your spec should be fine as long as you run the VMs within their means.

  • Is there any best practice or standard for database object naming?

    Hi
    Thank you for reading my post.
    Is there any standard or best practice for naming database objects?
    For example, how should we name the columns of a table? Should it be TOTAL_VOTE or TOTALVOTE? The same question applies to many other items.
    Thanks

    what does Oracle suggest as a naming schema for tables, fields, views, indexes, tablespaces, ...? If you look at the data dictionary you will see that not even Oracle keeps rigidly to any specific standard, although there are tendencies :)
    "The nice thing about standards is that there are so many of them to choose from."      
    -- Andrew Tanenbaum
    Cheers, APC

  • Set filter criteria on page 1 for page 2 OData model - "best practice"?

    Hello, I have a problem with an app where I want to filter data on a second page based on settings from the first page. I use an OData model.
    The collections on the two pages are not related in terms of "navigation" properties - that is my problem, and I cannot change the data source...
    So I am looking for ideas/best practices to solve this, because sometimes my filtering doesn't work and I get the error "The following problem occurred: Request aborted".
    I have a page with a sap.m List with items="{/tabWorkPlace}", and a local JSON model where I store relevant data during the app lifecycle.
    handleListSelect - first page:
    // Store the selected workplace ID in the shared JSON model, then navigate.
    var context = evt.getParameter("listItem").getBindingContext();
    var dataModel = sap.ui.getCore().getModel("dataModel");
    var workplace = context.getProperty("WORKPLACE_ID");
    dataModel.setProperty("/WORKPLACE_ID", workplace);
    this.nav.to("SubMaster", context);
    The general App.controller.js handles the nav.to function:
    var app = this.getView().app;
    var page = app.getPage(pageId);
    // Trigger the filtering when navigating to the second page.
    if (pageId == "secondPage") {
        page.getController().filterData();
    }
    And the controller of the second page:
    filterData: function() {
        var oModel = sap.ui.getCore().getModel("odata"); // not used directly here
        var dataModel = sap.ui.getCore().getModel("dataModel");
        var workplace = dataModel.getProperty("/WORKPLACE_ID");
        var items = this.getView().byId("list");
        // Filter the list's aggregation binding by the stored workplace ID.
        var oFilter = new sap.ui.model.Filter("WORKPLACE_ID", sap.ui.model.FilterOperator.EQ, workplace);
        items.getBinding("items").filter(oFilter);
    }
    I don't put this code into the onInit() or beforeRendering() functions, because those are called only once; I am navigating back and forth between the two pages, and since the pages are created only once, "just" the data is changed.
    The desired page looks like this, with another collection bound to it:
    <List
      id="list"
      select="handleListSelect"
      items="{/tabWorkstep_Status}"
    >
    But when I call it, the request gets aborted:
    The following problem occurred: Request aborted
    But despite the fact that the request is aborted, the list on the second page is filtered!
    The filter criteria work when I type the request into the browser as a URL. Maybe this fails because the data binding for the list hadn't taken place at that phase?
    I use this pattern (filter criteria on one page, results on the second page) in several more places (I think a data model with navigation properties would be better, but I cannot change it).
    In another constellation the filtering doesn't work at all - same error: "The following problem occurred: Request aborted".
    I also don't want to change the pattern (page 1 to page 2) into popup lists or the fancy new filtering possibilities, because they are not suitable for my use case.
    Is there maybe a more elegant solution? Sometimes the filtering works, sometimes it doesn't... Do I have an error in my solution (general approach)?
    Many thanks for any input!
    BR,
    Denise

    Hello, yeah you are right, but it works without the "odata>" prefix because of this in App.controller.js:
    // Create the OData model once and register it both as the named model
    // "odata" and as the view's default (unnamed) model - that is why
    // bindings work without the "odata>" prefix.
    var uri = "http://localhost:32006/JsonOdataService.svc";
    var oModelMS = new sap.ui.model.odata.ODataModel(uri);
    sap.ui.getCore().setModel(oModelMS, "odata");
    oView.setModel(oModelMS);
    So my question is: how do I navigate from one page to another, and on the second page first bind a collection to a select, and then, on selection, bind certain elements (a text field) to the selected, filtered entity?
    The usual approach with contexts and bindings won't work, because the two collections don't have a navigation/association property between them...
    So for example:
    page1
    select a list item with, for example, the properties color: red and year: 1985. Press one of the buttons and pass these criteria to another page.
    page 2:
    show a dropdown box with all car names which fulfill these criteria, and when one car is selected, display the data for THIS car in several text fields.
    This is not a master->detail navigation example, because on page 1 I select certain criteria and then navigate with the buttons to several pages using those criteria.
    But since the OData model has no relationships it is really hard to do this manually... With a dummy mock.json, as in DJ Adams' Fiori-like SAPUI5 apps, it is no problem... but with OData and nothing related to anything else it is hard...
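    One way to do it without navigation properties is to filter the dropdown's aggregation binding yourself and then element-bind the detail controls to whatever entity is selected. The sketch below only illustrates that approach, it is not code from this thread: the control IDs ("carSelect", "carDetails"), the property names (COLOR, YEAR) and the assumption that the Select is bound to a collection like {/tabCars} are all made up.
    // Page-2 controller (sketch). Assumes a sap.m.Select with id "carSelect"
    // bound to items="{/tabCars}" and a container with id "carDetails" whose
    // children use relative bindings such as {NAME} or {PRICE}.
    filterCars: function() {
        var dataModel = sap.ui.getCore().getModel("dataModel");
        // Combine the criteria stored on page 1 into filters on the dropdown.
        var aFilters = [
            new sap.ui.model.Filter("COLOR", sap.ui.model.FilterOperator.EQ, dataModel.getProperty("/COLOR")),
            new sap.ui.model.Filter("YEAR", sap.ui.model.FilterOperator.EQ, dataModel.getProperty("/YEAR"))
        ];
        this.getView().byId("carSelect").getBinding("items").filter(aFilters);
    },
    handleCarChange: function(evt) {
        // Element binding: point the detail container at the chosen entity so
        // its children's relative bindings resolve against that context.
        var sPath = evt.getParameter("selectedItem").getBindingContext().getPath();
        this.getView().byId("carDetails").bindElement(sPath);
    }
    Calling the filter only once the target page and its bindings exist (as with filterData above) should also render the "Request aborted" message harmless - it typically just means a pending request was replaced by the new, filtered one.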

  • Best practice RAID configuration for UCS C260 M2 for Unified Communications?

    I have two UCS C260 M2 servers with 16 drives (PID: C260-BASE-2646) and I am trying to figure out what the best practice is for setting up the RAID.
    I will be running CUCM, CUP, CUC, Prime, etc. for an environment of about 2000 phones.
    If anyone can offer real-world suggestions, that would be great. I also have a redundant server.

    The RAID setup depends a bit on your specific configuration; however, there is a guide for Cisco Collaboration on Virtual Servers that you can review here:
    http://www.cisco.com/c/en/us/td/docs/voice_ip_comm/cucm/virtual/CUCM_BK_CF3D71B4_00_cucm_virtual_servers/CUCM_BK_CF3D71B4_00_cucm_virtual_servers_chapter_010.html#CUCM_TK_C3AD2645_00
    If your server is spec'd as a Tested Reference Configuration (TRC), then the C260 M2 TRC1 would have 16 HDDs that you would configure/split into 2 x 8-HDD RAID 5 arrays.
    Hailey
    Please rate helpful posts!

  • Trade-offs for spreading organizations across suffixes - best practices?

    Hey everyone, I am trying to figure out some best practices here. I've looked through the docs but have not found anything that quite touches on this.
    In the past, here is how I created my directory (basically using dsconf create-suffix for each branch I needed)
    dsconf list-suffixes
    dc=example,dc=com
    ou=People,dc=example,dc=com
    ou=Groups,dc=example,dc=com
    o=Services,dc=example,dc=com
    ou=Groups,o=Services,dc=example,dc=com
    ou=People,o=Services,dc=example,dc=com
    o=listserv,dc=example,dc=com
    ou=lists,o=listserv,dc=example,dc=com
    A few years later, after learning more and setting up replication, it seems I may have made my life a bit more complicated than it should be. It seems I would need many more replication agreements to get every branch of the tree replicated. It also means that different parts of the directory are stored in different backend database files.
    It seems like I should have something like this:
    dsconf list-suffixes
    dc=example,dc=com
    Instead of creating all the branches as suffixes or sub-suffixes, maybe I should have just created organization and organizational unit entries within a single suffix, "dc=example,dc=com". This way I can replicate all data by replicating just one suffix. Is there a downside to having one backend db file containing all the data instead of spreading it across multiple files (we're talking possibly 90K entries across the entire directory)?
    Can anyone confirm the logic here or provide any insight?
    Thanks much in Advance,
    Deejam

    Well, there are a couple of dimensions to this question. The first is simply whether your DIT ought to have more or less depth. This is an old design debate that goes back to problems with changing DNs in X.500-style DITs with lots of organizational information embedded in the DN. Nowadays DITs tend to be flatter, even though there are more tools for renaming entries. You still can't rename entries across backends, though. The second dimension is, given a DIT, how you should distribute the containers in your DIT across the backend databases.
    As you have already determined, the principal design consideration for your backend configuration will be replication, though scalability and backup configuration might also come into it. From what you have posted, though, it does not look like you have that much data. So yes, you should configure database backends and associated suffixes with sufficient granularity to support your replication requirements. So, if a particular suffix needs to be replicated differently than another suffix, they need to be defined as distinct suffixes/backends. Usually we define the minimal number of suffixes and backends needed to satisfy the topological requirements, though I can imagine there might be cases where suffixes might be more fine grained.
    For large, extensible Directory topologies, I usually look for data that's sensibly divisible into "building blocks". So for instance you might have a top-level suffix "dc=example,dc=com" with a bunch of global ACIs, system users and groups that are going to need to be everywhere. Then you might have a large chunk of external customer data, and a small amount of internal employee data. I would consider putting the external users in a distinct suffix from the employees, because the two types of entries are likely to be quite different. If I have a need to build a public Directory somewhere, all I have to do is configure the external suffix and replicate it. The basic question I would be asking there is if I might ever need to expose a subset of the Directory, will the data already be partitioned for me or will I have to do data reorganization.
    In your case, it does not look likely you will need to chop up your data much, so it's probably simpler to stay monolithic and use only one backend.
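    Concretely, the monolithic layout recommended here reduces the earlier listing to a single suffix created with the same command the question mentions; the former branches then become ordinary organization/organizationalUnit entries inside that one backend (a sketch, not output from a real system):
    dsconf create-suffix dc=example,dc=com
    dsconf list-suffixes
    dc=example,dc=com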
