OS X Server 3.0 new setup -- best practices?

Alright, here's what I'm after.
I'm setting up a completely new OS X Server 3.0 environment.  It's on a fairly new (1.5-year-old) Mac Mini with plenty of RAM and disk space.  This server will ONLY be used internally.  It will have a private IP address such as 192.168.1.205, which is outside my DHCP server's range (192.168.1.10 to .199) to prevent any IP conflicts.
I am using Apple's Thunderbolt-to-Ethernet dongle for the primary network connection.  The built-in NIC will be used strictly for a direct iSCSI connection to a brand new Drobo B800i storage device.
This machine will provide the following services, roughly in order of importance:
1.  A Time Machine backup server for about 50 Macs running Mavericks.
1a.  Those networked Macs will authenticate individually to this computer for the Time Machine service
1b.  This server will get its directory information from my primary server via LDAP/Open Directory
2.  Caching server for the same network of computers
3.  Serve a NetInstall image which is used to set up new computers when a new employee arrives
4.  Maybe calendaring and contacts service, still considering that as a possibility
Can anyone tell me the recommended "best practices" for setting this up from scratch?  I've done it twice so far and have faced problems each time.  My most frequent problem, once it's set up and running, is with Time Machine Server.  With nearly 100 percent consistency, when I get Time Machine Server set up and running, I can't administer it.  After a few days, I'll try to look at it via the Server app.  About half the time, there'll be the expected green dot by "Time Machine" indicating it is running and other times it won't be there.  Regardless, when I click on Time Machine, I almost always get a blank screen simply saying "Loading."  On rare occasion I'll get this:
Error Reading Settings
Service functionality and administration may be affected.
Click Continue to administer this service.
Code: 0
Either way, sometimes if I wait long enough, I'll be able to see the Time Machine server setup, but not every time.  When I am able to see it, I'll have usability for a few minutes and then it kicks back to "Loading."
I do see this apparently relevant entry in the logs as seen by Console.app (happens every time I see the Loading screen):
servermgrd:  [71811] error in getAndLockContext: flock(servermgr_timemachine) FATAL time out
servermgrd:  [71811] process will force-quit to avoid deadlock
com.apple.launchd: (com.apple.servermgrd[72081]) Exited with code: 1
If I fire up Terminal and run "sudo serveradmin fullstatus timemachine" it'll take as long as a minute or more and finally come back with:
timemachine:command = "getState"
timemachine:state = "RUNNING"
I've tried to do some digging on these issues and have been greeted with almost nothing to go on.  I've seen some rumblings about DNS settings, and here's what that looks like:
sudo changeip -checkhostname
Primary address = 192.168.1.205
Current HostName = Time-Machine-Server.local
The DNS hostname is not available, please repair DNS and re-run this tool.
dirserv:success = "success"
If DNS is a problem, I'm at a loss how to fix it.  I'm not going to have a hostname because this isn't on a public network.
I have similar issues with Caching, NetInstall, etc.
So clearly I'm doing something wrong.  I'm not upgrading; again, this is an entirely clean install.  I'm about ready to blow it away and start fresh again, but before I do, I'd greatly appreciate any insight from others on some "best practices" or an ordered list of the best way to get this thing up and running smoothly and reliably.

Everything in OS X is dependent on proper DNS.  You should probably start there.  It is the first service you should configure and the most important one to keep right.  Don't configure any services until you have DNS straight.  In OS X, DNS really stands for Do Not Skip.
This may be your toughest decision.  Decide what name you want the machine to be.  You have two choices.
1: Buy a valid domain name and use it on your LAN devices.  You may not have a need for external use now, but in the future when you use VPN, Profile Manager, or Web Services, at least you are prepared.  This method is called split-horizon DNS.  An example would be apple.com: internally you might name the server tm.apple.com, then alias vpn.apple.com to it.  Externally, users can access the service via vpn.apple.com, but tm.apple.com remains a private address only.
2: Create an invalid private domain name.  This will never route on the web, so if you decide to host content for internal/external use you may run into trouble, especially with services that require SSL certificates.  Examples might be ringsmuth.int or andy.priv.  These types of domains are non-routable and can result in issues of trust when communicating with other servers, but it is possible.
Once you have the name sorted out, you need to configure DNS.  If you are on a network with other servers, just have the DNS admin create an A and PTR record for you.  If this is your only server, then you need to configure and start the DNS service on Mavericks.  The DNS service is the best Apple has ever created.  A ton of power in a compact tool.  For your needs, you likely need to just hit the + button and fill out the New Device record.  Use a fully qualified host name in the first field and the IP address of your server (LAN address).  You did use a fixed IP address and disabled the wireless card, right?
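For reference, the A and PTR records that "New Device" entry creates boil down to something like this in BIND-style notation (the name tm.example.priv and the 192.168.1.205 address are assumptions for illustration; the GUI creates both records at once):

```
; forward zone "example.priv": name -> address (A record)
tm.example.priv.              IN A     192.168.1.205

; reverse zone "1.168.192.in-addr.arpa": address -> name (PTR record)
205.1.168.192.in-addr.arpa.   IN PTR   tm.example.priv.
```

`sudo changeip -checkhostname` passes only when both lookups resolve and agree with each other.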
Once you have DNS working, then you can start configuring your other services.  Time Machine should be pretty simple.  A share point will be created automatically for you.  But before you get here, I would encourage starting Open Directory.  Don't do that until DNS is right and you pass the sudo changeip -checkhostname test.
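As a sketch of that ordering, a tiny pre-flight check can gate everything else on DNS being right.  This is only illustration, not a supported Apple procedure: the hostname below is invented, and the string grepped for is the message `changeip -checkhostname` prints when the forward and reverse records agree.

```shell
#!/bin/sh
# dns_ok reads the output of `sudo changeip -checkhostname` on stdin and
# succeeds only when changeip reports that the names match.
dns_ok() {
  grep -q "The names match"
}

# On the server itself, the flow would be roughly:
#   sudo scutil --set HostName tm.example.priv       # assumed name
#   sudo changeip -checkhostname | dns_ok || { echo "fix DNS first" >&2; exit 1; }
#   sudo serveradmin start dns                       # only if this box hosts its own DNS
#   sudo serveradmin fullstatus timemachine          # then configure services
```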
R-
Apple Consultants Network
Apple Professional Services
Author, "Mavericks Server – Foundation Services" :: Exclusively in the iBooks Store

Similar Messages

  • Workflow setup: Best Practices

    Hi All,
    Could anyone please share knowledge related to Oracle Workflow setup best practices? What are the high-level steps?
    I am looking at an embedded workflow setup for R11 or R12.
    Thanks for your time!
    Regards,

    This is a very broad question - narrowing it to specifics might help folks respond better.
    There are a lot of documents on MOS that refer to best practices from a technology stack perspective.
    Oracle Workflow Best Practices Release 12 and Release 11i          (Doc ID 453137.1)
    As far as functional practices are concerned, these may vary from module to module, as functionality and workflow implementation vary from module to module.
    FAQ: Best Practices For Custom Order Entry Workflow Design          (Doc ID 402144.1)
    HTH
    Srini

  • Report server setup best practice info needed -SOLVED-

    Hello, I'm looking for some best practice info on how to set up the report server to handle multiple reports from multiple apps (hopefully in separate directories).
    We are converting Forms 5 apps to 10g. Currently reports live in the same dir as the form files, each in their own application directory. Moving to 10g, the report.conf file specifies a reports dir. It does not seem that multiple directories can be listed in the sourceDir parameter, in order to handle the multiple directories where reports can live. Is it possible to set it up so it can find reports in any of our 20 application directories? Do we have to have only one directory where all reports are served from (if so we'll have an issue, as reports from different apps could be named the same)?
    How have you folks solved this situation?
    Thanks for any info,
    Gary

    Got it working! Thanks to all for your input! I found a reference on Metalink to a known issue with running on Sun Solaris, which was causing me problems.
    Bottom line, here's what I did to get it all working:
    1) Report server .conf file:
    - Comment out sourceDir line in the engine entry.
    - Add environment entries for each app before the </server> line at the end.
    <environment id="prs">
    <envVariable name="REPORTS_PATH" value="(path to dir where reports live for this app)"/>
    </environment>
    - Bounce the server (not sure if this is necessary)
    2) $ORACLE_HOME/bin/reports.sh:
    - Comment out line that sets REPORTS_PATH
    This was necessary for Sun Solaris (the bug as mentioned on Metalink)
    3) The app .fmb that calls the report:
    - Set the report object property to specify the environment ID before calling
    run_report_object():
    set_report_object_property(rpt_id, REPORT_OTHER, 'ENVID="prs"');
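    Putting step 1 together for more than one application, the environment entries in the report server .conf file would look roughly like this (`prs` comes from the post above; `gl` and both paths are invented placeholders):

```xml
<!-- inside the report server .conf file, just before </server> -->
<environment id="prs">
  <envVariable name="REPORTS_PATH" value="/apps/prs/reports"/>
</environment>
<environment id="gl">
  <envVariable name="REPORTS_PATH" value="/apps/gl/reports"/>
</environment>
```

    Each form then selects its directory by passing the matching ENVID to set_report_object_property() before run_report_object(), so identically named reports in different app directories don't collide.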
    Blue Skies,
    Gary

  • Reason for creating New Version Best practice

    Hello SAP Gurus,
    We create a new version of a DIR when a change to the original is required while keeping the older file intact for reference.
    We want to know how, when a new version of the document is created, we can specify why it was created, and capture that reason in SAP so a user can easily see why the version exists.
    Can you please let us know where these details are captured while creating the new version?
    I know there are a lot of free-entry fields (the generic object text field, the long text field, changing the description, placing text in the WR log field of the new version),
    but which is the best practice, so that the user can easily see the reason this new version was created?
    Ideally they would see this reason in a CV04N search as well, or immediately when opening the new version.
    Thanks and regards
    Kumar

    Hello Kumar
    Hope the following explanation clarifies your questions.
    We have to classify documents broadly into two categories. 1) Controlled documents - documents that undergo various levels of inspection/review/approval within the organization, e.g. engineering drawings or standard operating procedures for business purposes. The best approach here is the Engineering Change Management (ECM) function: when you create a new version, you can track and record the key document history in ECM.
    Also, when you version up, say, a Microsoft Word document, you can maintain a simple table tracking the purpose of the change, where the change was made, and its justification.
    For 2D/3D engineering drawings you can use the redlining functions to create layers, so that reviewers' observations are captured and addressed in the next version.
    2) Uncontrolled documents - generally less significant from a document change control point of view.
    Hope this is useful
    Regards
    GIRISH

  • ESA setup - best practices

    I have 2 ESAs (C370) running in an active/active setup. Currently each ESA is configured to use only a single port for both inbound and outbound email. The other 3 ports are not in use. What are the best practices for setting up the ESA?
    Sent from Cisco Technical Support iPhone App

    Interfaces- Only one of the three available Ethernet interfaces on the Cisco appliance is required
    for most network environments. However, you can configure two Ethernet interfaces and segregate
    your internal network from your external Internet network connection.
    Source: ESA_8.0_User_Guide.pdf
    You could potentially have low security, high security, and a management interface. Some networks need a physical connection into a dmz but if your environment doesn't need it there is no reason to use multiple interfaces.

  • Converting Physical Exchange Server into virtual environment (P2V) – Best Practices & Tips from the field (Guide)

    Hi, I wrote a 10-page guide on the Exchange P2V process, which I often perform at organizations.
    I would like to hear your opinion about this guide, if you have additional information - feel free to contact me at netanel [at] ben-shushan [dot] net.
    Here's the link to the P2V guide:
    http://blogs.microsoft.co.il/files/folders/898538/download.aspx
    Netanel Ben-Shushan, MCSA/E, MCTS, MCITP, Windows Expert-IT Pro MVP. IT Consultant & Trainer | Website (Hebrew): http://www.ben-shushan.net | IT Services: http://www.ben-shushan.net/services | Weblog (Hebrew): http://blogs.microsoft.co.il/blogs/netanelb
    | E-mail: [email protected]

    Hi Netanel, thank you for the useful guidelines. 
    Do you have a similar guide for an Exchange 2010 P2V process? Or would the basic guidelines be generally the same? 
    Best regards
    LJ

  • Syncing, best practice?

    Still with MobileMe, I'm about to move everything over (iMac, MacBook 10.7.2 and iPhone iOS5) to iCloud.
    I've always found the multiplicity of ways to sync (Wi-Fi, via iTunes, push, via iSync, from the MobileMe site, in each app, etc.) confusing.
    Does the apparent association of data with the MobileMe account still apply?
    Is there a site, link, PDF or other doc which lays out clearly what the new iCloud best practice is to keep only iCal and Contacts (Apple Address Book) in perfect synch (no delays, no duplicates, no fuss), please?
    TIA!

    Mark Sealey wrote:
    1) What's the difference between http://www.me.com/migrate and http://www.icloud.com/move?
    Should be the same. For some MM users the first works, for others the second.
    It simply moves Mobile Me data over to icloud and changes the Mail Server settings.
    I never migrated, because I never used MM. I am just a new iCloud user, so no need for these facilities. You can also opt to simply create a new account from the iOS 5 setup wizard. Apple doesn't seem to care how many accounts one creates in iCloud. But of course only one is usable at a time - better: one per "up to 5 devices".
    2) Yes, iCloud is ruled system-wide, not on a per-app setting for OS X.
        But I currently have 'Documents' checked in my iCloud System Preference (without yet having set up iCloud) .     If I uncheck it on my iMac (without having yet set up iCloud), I get a warning,
                            'If you turn off Documents & Data, all documents stored in iCloud will be removed from                                this Mac'!
        Really? :-(
    Can safely be ignored on OS X - nothing gets deleted, because there is nothing to delete. As I said already, no single app makes use of "Documents and Data" in OS X currently, only selected apps in iOS 5 (and there, in iOS 5, the warning is not to be ignored, as it really does delete all documents).
    3) iTunes 10.5 supports Wifi Sync, must be enabled in iTunes' settings. Whenever the device is on Wifi, and there     is something to sync with iTunes, it will do.
       This is the same as traditional USB-cable-sync ( which is at all times also available ) .
    But I should not synch this way, should I? I should not let iTunes do it, should I?
    iCloud sync per se has nothing to do with iTunes sync. These are two facilities for two very different purposes: iCloud will sync addresses, bookmarks, and iCloud Mail, but for example not your private (non-store-purchased) movies or music.
    In other words : you use iTunes only to sync to the device what iCloud wont do. iTunes-syncs will not affect the iCloud and vice versa - basically the two don't know about each other.
    I can't tell you if you "should" use it. I do, because I want my music on my iPod, I also sync garageband and iMovie projects from and to it. All this icloud doesn't for me.
    So the old (MM) way with calendars associated with mobileme accounts is gone?
    I don't know what MobileMe did different . iCloud calendar syncs your local OSX iCal entries to the cloud and thus also to the iOS5 devices. All google/hotmail/yahoo entries will not sync to the cloud, only "Mac" local calendar.
    You need to be aware that iCloud is not MobileMe but a new product. You won't have an iDisk to store files in iCloud, you won't have Galleries to share pictures online, and there is no iWeb hosting space either. It's really all about syncing. The only "storage," if one can even call it that, is for the mobile iWork apps, so that they can store office files in iCloud. From a Mac one can then up- and download via iCloud's web interface at www.icloud.com.
    And that option exists only because OS X apps won't sync/mirror the changes you made to a document in Pages for iOS 5, for example. You must always download to the Mac, edit there, upload again, edit on iOS, sync to iCloud, download on the Mac again... it is very clumsy as it is now. Most people thought Apple would have made the "Documents and Data" feature available in OS X as well (as you were wondering about your 100GB space), but they did not. Maybe a future version of iWork will access this, but right now no.

  • Best practices exposing AM (OAF 11.5.10) as webservice to external systems

    I have a customer who is developing extensions to their E-Business Suite install base using OAF 11.5.10, and they have approached me with questions on how they could expose some of the business services they developed (mainly AM/VO) as web services to be used in a BPEL/web service framework. The BPEL service is SeeBeyond (not sure how it is spelled), not Oracle's.
    I have outlined 2 ways, but since I have not developed anything on OAF I have no idea it is possible.
    First was to migrate the ADF BC (or BC4J) projects from OAF with JDeveloper 10.1.3 and just more or less create a simple facade layer of a session bean right-click and deploy as webservice.
    Second: was to use a webservice library such as axis to be used in JServ directly to expose them on the "target" server.
    Does anyone have any best practices on this topic?

    For recognition and stable functionality of USB devices, your B&W G3 should be running OS 8.6 minimally. The downloadable OS 8.6 Update can be run on systems running OS 8.5/8.5.1. If (after updating to 8.6), your flash drive still isn't recognized, I'd recommend downloading the OS 9.1 Update for the purpose of extracting its newer USB support drivers, using the downloadable utility "TomeViewer." These OS 9.1 USB support files can be extracted directly to your OS 8.6 Extensions folder and are fully compatible with the slightly older OS software. It worked for me, when OS 8.6's USB support files lacked a broad enough database to support my first USB flash drive.

  • Advice for Soon-to-be MacPro Owner. Need Recs for Best Practices...

    I'll be getting a Quad Core 3 Ghz with 1GB of RAM, a 250Gig HD, the ATI X1900 card. It will be my first mac after five years (replacing a well-used G4 Tibook 1Ghz).
    First the pressing questions: Thanks to the advice of many on this board, I'll be buying 4GB of RAM from Crucial (and upgrading the HD down the road when needs warrant).
    1) Am I able to add the new RAM with the 1G that the system comes with? Or will they be incompatible, requiring me to uninstall the shipped RAM?
    Another HUGE issue I've been struggling with is whether or not to batch migrate the entire MacPro with everything that's on my TiBook. I have so many legacy apps, fonts that I probably don't use any more and probably have contributed to intermittent crashes and performance issues. I'm leaning towards fresh installs of my most crucial apps: photoshop w/ plugins, lightroom, firefox with extensions and just slowly and systematically re-installing software as the need arises.
    Apart from that...I'd like to get a consensus as to new system best practices. What should I be doing/buying to ensure and establish a clean, maintenance-lite, high-performance running machine?

    I believe you will end up with 2x512MB RAM from the Apple Store. If you want to add 4GB more, you'll want to get 4x1GB RAM sticks. 5GB is never an "optimal" amount, but people talk like it's bad or something; it's simply that the last gig of RAM isn't accessed quite as fast. You'll want to change the placement so the 4x1GB sticks are "first" and all paired up nicely, so your other two 512MB sticks only get accessed when needed. A little searching here will turn up explanations of how best to populate the RAM for your situation. It's still better to have 5 gigs, where the 5th gig isn't quite as fast, than 4. The sticks will not be incompatible, but you WILL want to remove the original RAM, put the 4GB sticks into the optimal slots, then add the other two 512MB chips.
    Do fresh installs. Absolutely. Then only add those fonts that you really need. If you use a ton of fonts I'd get some font checking app that will verify them.
    I don't use RAID for my home machine. I use 4 internal 500gig drives. One is my boot, the other is my data (although it is now full and I'll be adding a pair of external FW). Each HD has a mirror backup drive. I use SuperDuper to create a clone of my Boot drive only after a period of a week or two of rock solid performance following any system update. Then I don't touch it till another update or installation of an app followed by a few weeks of solid performance with all of my critical apps. That allows me to update quicktime or a security update without concern...because some of those updates really cause havoc with people. If I have a problem (and it has happened) I just boot from my other drive and clone that known-good drive back to the other. I also backup my data drive "manually" with Superduper.
    You will get higher performance with Raid of course, but doing that requires three drives (two for performance and one for backup) just for data-scratch, as well as two more for boot and backup of boot. Some folks can fit all their boot and data on one drive but photoshop and many other apps (FCP) really prefer data to be on a separate disk. My setup isn't the absolute fastest, but for me it's a very solid, low maintenance,good performing setup.

  • Looking for Some Examples / Best Practices on User Profile Customization in RDS 2012 R2

    We're currently running RDS on Windows 2008 R2. We're controlling user's Desktops largely with Group Policy. We're using Folder Redirection to configure their Start Menus as well.
    We've installed a Server 2012 R2 RDS box and all the applications that users will need. Should we follow the same customization steps for 2012 R2 that we used in 2008 R2? I would love to see some articles from someone who has customized a user profile/Desktop in 2012 R2, to see what's possible.
    Orange County District Attorney

    Hi Sandy,
    Here are some related articles below for you:
    Easier User Data Management with User Profile Disks in Windows Server 2012
    http://blogs.msdn.com/b/rds/archive/2012/11/13/easier-user-data-management-with-user-profile-disks-in-windows-server-2012.aspx
    User Profile Best Practices
    http://social.technet.microsoft.com/wiki/contents/articles/15871.user-profile-best-practices.aspx
    Since you want to customize user profile, here is another blog for you:
    Customizing Default users profile using CopyProfile
    http://blogs.technet.com/b/askcore/archive/2010/07/28/customizing-default-users-profile-using-copyprofile.aspx
    Best Regards,
    Amy
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Best practices for search service in a sharepont farm

    Hi
    In a SharePoint web application many BI dashboards are deployed, and we also plan to
    configure enterprise search for this application.
    In our SharePoint 2010 farm we have
    2 application servers
    2 WFE servers
    One application server runs Central Administration + the Web Analytics service and is itself a domain controller;
    the second application server runs only the Secure Store service + the PerformancePoint service.
    1 - If we run the Search Server service on the second application server, could that cause any issues for BI performance?
    2 - Is it best practice to run the PerformancePoint service and the search service on one server?
    3 - Is it best practice to run the search service on an application server where other services are already running,
    given that we have only one SharePoint web application to be crawled and indexed with the crawl schedule below?
    We run one full crawl per week and an incremental crawl at midnight daily.
    adil

    Hi adil,                      
    Based on your description, you want to know the best practices for search service in a SharePoint farm.
    Different farms have different search topologies, for the best search performance, I recommend that you follow the guidance for small, medium, and large farms.
    The article is about the guidance for different farms. 
    The search service can run alongside other services on the same server. If conditions permit and you want better performance for the search service and for other services (including BI), you can deploy the search service on a dedicated server.
    If conditions permit, I recommend combining a query component with a front-end web server, to avoid putting crawl components and query components on the same server.
    In your SharePoint farm, you can deploy the query components on a WFE server and the crawl components on an application server.
    The articles below describe the best practices for enterprise search.
    https://technet.microsoft.com/en-us/library/cc850696(v=office.14).aspx
    https://technet.microsoft.com/en-us/library/cc560988(v=office.14).aspx
    Best regards      
    Sara Fan
    TechNet Community Support

  • Cisco MSE Best Practice

    Hi there,
    I have two MSE 3355s: MSE-1 for wIPS and MSE-2 for Location.
    In the PI (Prime Infrastructure) site maps reports, I can't see the wIPS attacks from the MSE-2 Location server. Is it possible to do this without having the wIPS license on the same server? How can I get the wIPS attack reports from MSE-1 onto the MSE-2 location-based server? What is the best practice for this scenario?
    Thanks

    Using TACACS+ with ACS gives you all of the AAAs - this is best practice for management access to Cisco devices, IMHO.
    Bilal

  • ECC 6.0 Best Practices installation

    Dear Experts,
    I have some questions to ask you regarding the project I am working on.
    Our team is involved in the implementation of an SAP ERP (ECC) solution for our customer. In order to speed up the whole process, the team decided to use the SAP Best Practices technology.
    The project was very successful and our customer has asked us to deploy the solution to other subsidiaries.
    The system has been configured as follows for France:
    1. SAP ERP 6.0
    2. SAP Best Practices for France, with the required support package level to install the BP
    3. Enhancement Package 2
    Questions:
    1. How can we implement the same BP scenario at the other subsidiaries if the system is not at the same support package level as in the previous project?
    2. Are the new BP for Germany and the USA now available only with Support Package 3?
    Thanks in advance for your help.
    Alberto

    Dear ALL,
    Please note below the answer from SAP.
    Regards,
    Alberto
    Regarding your questions:
    1. The official answer from SAP is that we can only guarantee a successful installation of a specific SAP Best Practices package at the specific support package level mentioned in the quick guide. The recommended path in a customer project is to install the SAP ERP release with the required support package level, install and activate the SAP Best Practices scenarios, and then apply any higher support package or enhancement package.
    2. Yes, the new SAP Best Practices baseline packages V1.603 for the US, Germany, and now also France are based on SAP ERP EhP3. The reason for this is that in EhP3 the functionality for running the SAP NetWeaver Business Client, including the preconfigured roles and power lists, is now part of standard SAP ERP and delivered within EhP3.
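    SAP's rule in point 1 above can be sketched as a small pre-activation check. This is only an illustration (the BP package names and required support-package levels below are hypothetical placeholders, not official SAP values):

```python
# Hypothetical mapping: Best Practices package -> required ERP 6.0 support
# package level, per the rule that BP activation is only guaranteed at the
# exact SP level named in the package's quick guide; higher SPs/EhPs are
# applied only after activation.
REQUIRED_SP = {
    "BP-FR V1.600": 2,
    "BP-DE V1.603": 3,
    "BP-US V1.603": 3,
}

def can_activate(bp_package, system_sp_level):
    """Activation is only guaranteed at the exact required SP level."""
    return system_sp_level == REQUIRED_SP[bp_package]

print(can_activate("BP-DE V1.603", 3))  # True
print(can_activate("BP-DE V1.603", 2))  # False -> bring system to SP3 first
```

    Always take the real required levels from the quick guide shipped with the specific Best Practices package.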
