SharePoint DMZ

SharePoint 2010:
Is this possible?
The SharePoint external site will be accessed through a WFE in the DMZ, but the WFE in the DMZ should talk to the application server in the internal corporate network, and the application server will talk to SQL, which is also in the internal corporate network.
SharePoint web front-end server, external site (DMZ) --> application server (internal corporate network) --> SQL (internal corporate network).
We want to avoid direct communication from the DMZ to SQL (internal), which is why we want to route through the application server in the internal corporate network.
Is this possible? Appreciate any help.
Kannalo

Where would your content then populate from?
What you need to consider is a reverse proxy to act as the middleman between your DMZ environment and your internal servers (app, SQL, etc.).
Check out the documentation on Extranet Best practices for SharePoint 2010:
http://technet.microsoft.com/en-us/library/hh204611%28v=office.14%29.aspx
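As a quick sanity check of the proposed topology, the sketch below (hypothetical host names APPSRV01 and SQL01) verifies what the DMZ WFE can actually reach. Note that, as the port-requirement answers further down point out, a WFE talks to SQL directly on TCP 1433, so content cannot simply be relayed through the application server.

    # Run from the DMZ WFE (Windows Server 2012 or later for Test-NetConnection); host names are placeholders.
    Test-NetConnection -ComputerName APPSRV01 -Port 32843   # service-application calls to the app server
    Test-NetConnection -ComputerName SQL01    -Port 1433    # direct WFE -> SQL connectivity that SharePoint requires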

Similar Messages

  • KMS issue – KMS not receiving Activation Requests from Servers

    Hi,
    I've got a very strange problem with KMS.  I have recently inherited a SharePoint DMZ environment built using
    Best Practices for Securing Active Directory and
    CIS Microsoft Windows Server 2008 R2. All the servers were built using MAK keys, which worked well enough until IT Security locked down firewall access. Now all the servers are going non-genuine. My solution was to set up KMS activation and
    convert all the servers over. It seemed simple enough; I even tested it in my dev lab with no issues once the Windows Firewall was configured to allow KMS (TCP-in). So I set it up and started converting servers to KMS (slmgr /ipk). The clients, all Windows 2008 R2 servers,
    started sending activation requests to the KMS machine (Event ID 12288). Here's where the problems start: according to the firewall logs on the KMS server it allows the request in, but no Event ID 12289s are registered and the clients display the
    following: "Error: 0xC004F074 The Software Licensing Service reported that the computer could not be activated. The Key Management Service (KMS) is unavailable."
    Verified the KMS is active and listening on port 1688. Ran it again with Wireshark installed and it showed the DCERPC protocol attempting to connect and getting "status: nca_s_fault_access_denied". Began investigating RPC issues, and found two RPC-related GPO
    entries in our Default Domain Policy, referenced in the CIS benchmark mentioned above.
    Computer Configuration \ <policies> \ Administrative Templates \ System \ Remote Procedure Call
    Restrictions for unauthenticated RPC clients
     RPC endpoint mapper client authentication
    Both are “Enabled” and the first is set with the following option:
     "Authenticated without Exceptions".
    So I proceeded to set these up in my Lab and… BOOM, killed my Lab environment. See
    http://blogs.technet.com/b/askds/archive/2011/04/08/restrictions-for-unauthenticated-rpc-clients-the-group-policy-that-punches-your-domain-in-the-face.aspx
    After I rebuilt my Lab, I determined the culprit was GPO entry #1 -
    Restrictions for unauthenticated RPC clients -
    Authenticated without Exceptions. My attempts at changing this option caused it to "hoop" my lab environment yet again.
    Question time:
    Does anyone know how to change this option safely and to not cause the problems I’ve had?
    Does anyone have any alternate methods or ideas for setting up KMS in an environment such as mine?
    Any help would be greatly appreciated.
    Regards,
    James

    So we determined this past Friday that I was mistaken; the GPOs in question were not in our Default Domain Policy as I thought... my apologies. They were, however, in our DMZ Base Server GPO. I was able to drop our KMS server into a new
    sub-OU (called KMS Activations) and then provide a GPO override in that OU for the following line:
    Computer Configuration \ <policies> \ Administrative Templates \ System \ Remote Procedure Call\Restrictions for unauthenticated RPC clients
    and set it to: "Authenticated".
    Once we did this, plus a stop and start of the Software Protection (SPP) service, we ran slmgr /ato on 5 servers to hit the activation threshold; the rest of the servers activated on their own over the weekend.
    Confirmed this morning: once we flipped the remaining servers over from MAK to KMS they activated on their own. We are modifying our build template to carry the KMS client key now rather than the MAK.
    Still not sure what grants KMS the exception to the GPO; the firewall, I assume?
    The original setting was "Authenticated without Exceptions" versus the current "Authenticated", so the only real difference is that exceptions are allowed if authorized by something.
    I appreciate the responses.
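    For reference, a minimal sketch of the per-server MAK-to-KMS conversion described above (run from an elevated prompt; the KMS host name is a placeholder and the client setup key must be the published GVLK for your OS edition):

        # Switch the machine from MAK to the KMS client setup key, then request activation.
        $gvlk = "<KMS client setup key for your edition>"          # placeholder - use Microsoft's published GVLK list
        cscript //nologo C:\Windows\System32\slmgr.vbs /ipk $gvlk  # install the KMS client key
        Restart-Service sppsvc                                     # stop/start the Software Protection service
        cscript //nologo C:\Windows\System32\slmgr.vbs /ato        # attempt activation against the KMS host
        Test-NetConnection -ComputerName KMS01 -Port 1688          # hypothetical KMS host name; confirms port 1688 is reachable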
      

  • Moving SharePoint Farm to another DMZ zone

    Hi,
    We have a SharePoint application deployed in a DMZ zone, so the entire farm (WFE, APP & DB server) is in the DMZ zone. However, for some reason the client is looking to move the entire farm to another DMZ zone. I would like to know what aspects we need to
    consider for this activity.
    Best Regards,
    Safder

    A few things come to mind:
    Active Directory location & firewall access
    Network routing
    Reverse proxies (if applicable)
    Network load balancers
    Server name / DNS / IP changes
    URL changes, if needed (a minimal sketch for this piece follows below)
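    If the URLs do change as part of the move, a minimal sketch of adjusting alternate access mappings (SharePoint Management Shell; both URLs are placeholders):

        Get-SPAlternateURL                                   # review the current mappings first
        Set-SPAlternateURL -Identity "https://old-dmz.contoso.com" -Url "https://new-dmz.contoso.com"
        # or keep both during the cutover by adding the new URL to another zone:
        New-SPAlternateURL -WebApplication "https://old-dmz.contoso.com" -Url "https://new-dmz.contoso.com" -Zone Internet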
    Dimitri Ayrapetov (MCSE: SharePoint)

  • SharePoint 2010 portal on DMZ with reverse proxy

    Hi,
    I need to publish a SharePoint portal for an extranet; the portal can be accessed over the internet with AD credentials.
    I have one WFE, one App and one DB server. I need to know whether the WFE server needs to be hosted in the DMZ, or whether a new server with a reverse proxy tool should be used instead.
    We are very concerned about security threats.
    Hasan Jamal Siddiqui(MCTS,MCPD,ITIL@V3),Sharepoint and EPM Consultant,TCS

    Check below:
    http://technet.microsoft.com/en-us/library/dn607304%28v=office.15%29.aspx
    Port details:
    Endpoint A (IPs)            | Endpoint B (IPs)                              | Port(s)                                                                                                   | Purpose
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 16500-16519                                                                                           | Search index component
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 22233-22236                                                                                           | AppFabric Caching Service
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 808                                                                                                   | Windows Communication Foundation communication
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 32843, 32844, 32845                                                                                   | Web servers and service applications (the default is HTTP)
    APP\WEB (1.1.1.1, 1.1.1.2)  | AD DS\DNS (1.1.1.3; include all if multiple)  | TCP 5725; TCP&UDP 389 (LDAP); TCP&UDP 88 (Kerberos); TCP&UDP 53 (DNS); UDP 464 (Kerberos Change Password) | Synchronizing profiles between SharePoint 2013 and Active Directory Domain Services (AD DS)
    APP\WEB (1.1.1.1, 1.1.1.2)  | SQL (1.1.1.4)                                 | TCP 1433, UDP 1434                                                                                        | SQL Server communication
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 32846                                                                                                 | SharePoint Foundation User Code Service
    APP\WEB (1.1.1.1, 1.1.1.2)  | SMTP server (1.1.1.5)                         | TCP 25                                                                                                    | SMTP for e-mail integration
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 30000                                                                                                 | Central Admin
    APP\WEB (1.1.1.1, 1.1.1.2)  | APP\WEB (1.1.1.1, 1.1.1.2)                    | TCP 2382                                                                                                  | SQL Server Browser service
    SQL1 (1.1.1.4)              | SQL2 (1.1.1.5)                                | TCP 1433 and TCP 5022                                                                                     | Multiple SQL Servers, if they exist
    APP\WEB (1.1.1.1, 1.1.1.2)  | SQL1 (1.1.1.4)                                | TCP 135                                                                                                   | Integration Services service
    APP\WEB (1.1.1.1, 1.1.1.2)  | All clients (all addresses)                   | TCP 80/443                                                                                                | Client access
    If this helped you resolve your issue, please mark it Answered
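    If the host-based firewalls on the farm servers also need rules for some of these ports, a minimal sketch with the built-in NetSecurity cmdlets (Windows Server 2012 or later; the remote addresses reuse the placeholder IPs from the table above):

        # Inbound rules on a SharePoint server, scoped to the other farm servers.
        New-NetFirewallRule -DisplayName "SP service applications" -Direction Inbound -Protocol TCP -LocalPort 32843,32844,32845 -RemoteAddress 1.1.1.1,1.1.1.2 -Action Allow
        New-NetFirewallRule -DisplayName "SP AppFabric cache"      -Direction Inbound -Protocol TCP -LocalPort "22233-22236"     -RemoteAddress 1.1.1.1,1.1.1.2 -Action Allow
        # Inbound rule on the SQL server, allowing only the SharePoint servers in.
        New-NetFirewallRule -DisplayName "SQL from SharePoint"     -Direction Inbound -Protocol TCP -LocalPort 1433              -RemoteAddress 1.1.1.1,1.1.1.2 -Action Allow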

  • SharePoint Internet publishing: DMZ and LAN

    We have provided the list below to Operations to configure the DMZ and LAN environments.
    The DMZ server was not on the domain; they ran into issues joining it to the domain and had to open "any" on the firewall from the
    DMZ to Active Directory. Is there any port we are missing below if we need communication from the DMZ to the DB/application server?
    MCTS,ITIL

    WFE -> DB only requires 1433 (or the assigned port) and 1434/udp if using a random port. WFE -> WFE communication is what leverages 32843/32844 (service calls).
    Outbound email must be port 25, unless you configure an anonymous relay that SharePoint can communicate with over port 25.
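    If a dedicated relay is used for outbound mail, a minimal sketch of pointing a web application at it via the object model (SharePoint Management Shell; server and addresses are placeholders):

        $wa = Get-SPWebApplication "https://portal.contoso.com"
        # UpdateMailSettings(SMTP server, from address, reply-to address, code page) - 65001 is UTF-8
        $wa.UpdateMailSettings("smtp-relay.contoso.com", "sharepoint@contoso.com", "no-reply@contoso.com", 65001)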
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • Public-facing on-premises SharePoint with NTLM authentication

    I've been searching for authentication best practices for a public-facing SharePoint site, but I haven't found any useful resources on the issue that is troubling me.
    Assume I set up a web application with classic NTLM authentication and enable
    anonymous access on it. This means that users inside the organization's network will be able to authenticate (actually use SSO) against the organization's DC. They will be able to access and administer all content. All other, anonymous, users will be able to see
    published content only, i.e. content which is permitted to anonymous users.
    My question is: Is this kind of setup a security issue because if a potential attacker hacks a WFE then he has direct access to DC?
    Is FBA maybe a better solution for public-facing sites? Or maybe use NTLM, but create a separate domain with one-way trust to organization's domain?

    There are many variations you can take with this - and really you need to consider more than just your content. For true separation:
    I would have a dedicated DC to manage service accounts.
    I would break up my DMZ behind firewall contexts with a reverse proxy publishing SharePoint at the edge.
    proxy/firewall -- SP Server -- Firewall -- SQL/DC
    For true separation you don't want to share any underlying infrastructure with internal either, although in reality logical separation is usually enough.
    Now you have to deal with internal user authentication and how to handle that. First, I would have at minimum two web applications available: your primary for editing and the extended version for public access.
    While a one-way trust would work, you still expose user info to the public, which you may not want. With this configuration you could configure the people picker to only select from a particular OU to minimize this (a sketch follows below).
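    A minimal sketch of that OU restriction for the public site collection (run on a farm server; the OU and URL are placeholders):

        # Scope the people picker for this site collection to a single OU.
        stsadm -o setsiteuseraccountdirectorypath -path "OU=Extranet Users,DC=corp,DC=contoso,DC=com" -url https://public.contoso.com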
    Another option, however, is to look at using ADFS between your domains and creating the trust there. You would have to configure the farm for claims authentication to make this work, but it would eliminate the possibility of probing all the users in AD or in the OU you expose.
    With the ADFS method, when you update documents your user name is still tagged to content; however, if you don't populate the user profiles this will be the only information available about any internal user.
    You may even want to go a step further and, when you extend the public site, use forms authentication but don't provision any users. Then there is no authenticated access from the public URL. And with ADFS/reverse proxy you may even be able to configure some pre-authentication
    for your internal users before they can even reach the internal SharePoint pages.
    I would strongly consider moving to SharePoint 2013 and looking at cross-site publishing (2010 and below have content publishing - but stay away from that; when it works it's great, but when it doesn't it's a PITA to get back in sync). With cross-site
    publishing you have an editing site, the publishing site pulls from the search index, and the permissions are completely separate.

  • Provision Search in SharePoint Foundation 2013 without Domain Controller / Active Directory - Domain accounts

    Hi,
    I have successfully set up SharePoint Foundation 2013 as a single-server farm with a SQL Server Standard database in a DMZ environment using local accounts (since the DMZ doesn't have Active Directory and hence no domain accounts), via PowerShell as described
    in https://theblobfarm.wordpress.com/2012/12/03/installing-sharepoint-2013-without-a-domain-controller
    When I run Farm configuration wizard to provision search service application, I get an error:
    ERROR: "The service application(s) for the service "Search Service Application" could not be provisioned because of the following error: I/O error occurred."
    The log file logged the details of this error as:
    ERROR: "Failed to create file share Analytics_e441aa1c-1a8d-4f0a-a079-58b499eb4c50 at D:\SharePoint Search\Office Server\Analytics_e441aa1c-1a8d-4f0a-a079-58b499eb4c50 (System.ArgumentException: The SDDL string contains an invalid sid or a sid
    that cannot be translated."
    After investigation, I found that potentially the error could be because the timer service is trying to setup a network share for analytics component (as part of provisioning search). It is trying to setup that share with a domain account that happens to
    be a local user instead in this case and fails with error “System.ArgumentException: The SDDL string contains an invalid sid or a sid that cannot be translated”.
    I got some pointer from the below thread
    https://social.technet.microsoft.com/Forums/en-US/c8e93984-f4e5-46da-8e8a-c5c79ea1ff62/error-creating-search-service-application-on-sharepoint-foundation-with-local-account?forum=sharepointadmin
    However, the above thread doesn't state that the solution worked.
    I have tried creating the share manually for the Analytics_<Guid> folder, but it doesn't work, since every time the Farm Configuration Wizard is run it creates a new Analytics_<Guid> folder.
    Since, I have setup SharePoint Foundation 2013 on a production environment I cannot test and trial various solutions.
    Can someone please guide me on how to successfully provision search for SharePoint Foundation 2013 set up as a single-server farm with a SQL Server Standard database in a DMZ environment using local accounts (without Active Directory / domain accounts)?
    Thanks in advance.
    Himanshu

    Microsoft documentation doesn't always specifically call out all products (Project Server isn't there, either). But it does apply. You'll need to stand up at least one Domain Controller, or allow port access back to a DC.
    Preferably, set up SharePoint on the internal network and use a reverse proxy (which will terminate client connections at the reverse proxy) present in the DMZ.
    Trevor Seward

  • Sharepoint Foundation 2010 and Large Media Files?

    I have already seen links giving instructions in how to raise the default 50MB upload limit to up to 2GB, but it doesn't seem like a good idea based on all the caveats and warnings about it.
    If we occasionally need to give external SharePoint users access to files much larger than 50MB (software installation files, audio recordings and video recordings), even though most documents are much less than 50MB (Office documents),
    what is the best solution that does not involve third-party external services such as OneDrive, Azure or Dropbox? We must host all of our files on premises.
    The SharePoint server is internet accessible, but it requires AD authentication to log in and access files.
    Some have recommended file server shares for the larger files, but the internet users only have AD accounts used to access the SharePoint document libraries; they do not have the VPN access that would be needed to reach an internal file share.
    I have heard of FTP and SFTP, but the users need something more user-friendly that doesn't require any application other than their browser, that uses their existing AD credentials, and that gives us auditing of who is uploading and downloading files.
    Is there any other solution other than just raising the file limit to 1 or 2GB just for a handful of large files in a document library full of mostly sub 50MB files?
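    For reference, raising the limit itself is a one-property change per web application; the sketch below (SharePoint 2010 Management Shell, hypothetical URL) shows what the linked instructions boil down to, keeping in mind the caveats above and that IIS request-filtering limits and timeouts may also need attention for very large uploads:

        $wa = Get-SPWebApplication "https://extranet.contoso.com"
        $wa.MaximumUploadSize = 2047      # value is in MB; 2047 MB (~2GB) is the ceiling
        $wa.Update()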

    I had a previous post about performance impacts of uploading/downloading large content on SharePoint.
    Shredded storage has little to do with this case, as it handles Office documents stored in SharePoint and edited in the Office client, saving back only the differences and therefore lightening the throughput.
    These huge files are not going to be edited; they're uploaded once.
    It's a shame to expose this SharePoint farm on the extranet just because of a handful of huge files.
    Doesn't the company have a web server in the DMZ hosting an extranet portal? Can't that extranet portal feature a page that discovers those files intended to be downloaded from the outside and then act as a reverse proxy to stream them out?
    Or, it may be a chance to build a nice Portal site on SharePoint.

  • Can't upload large files to SharePoint site through S-160 proxy

    Hi, I've got the following setup in our Windows AD infrastructure: client admin websites hosted in our DMZ, whose URL is accessed via an externally hosted DNS reference. I've built up an Ironport S-160 to replace a defunct Bluecoat and have trouble uploading files larger than 40KB to the admin sites via SharePoint. After a while the connection times out saying the maximum size limit has been exceeded; however, SharePoint is set to a maximum of 50MB for uploads.
    The S-160 is using NTLMSSP as its authentication scheme on the global identity policy, and I've created an authentication exemption identity for the SharePoint admin sites. But from what I can determine, the only way you can upload is to set the authentication on the admin sites to basic, which means you would be transmitting your login credentials in clear text over the internet.
    Has anyone an idea how I can persuade the S-160 to authenticate to the admin sites while using NTLMSSP and not basic authentication? I found this URL (http://serverfault.com/questions/101127/couldnt-upload-files-to-sharepoint-site-while-passing-through-squid-proxy) which details a similar problem, but they decided to use basic + SSL, which I think is still a bad idea.

    Come on! Surely there is someone out there who has run into this problem before?

  • Adding a secure, internal-only SharePoint Web application / Site collection in existing farm

    Hi,
    We are currently working on creating a new internal-only SharePoint site that will host sensitive information, and we are planning the architecture to provide a secure environment for hosting this information in SharePoint. We will create the new web app on a separate
    database with TDE encryption enabled; we are also planning to encrypt the data through SharePoint (insert third-party vendor here) forms before it gets to the SP DB. And obviously, SharePoint permissions will be set accordingly.
    Additionally, we would like to have the site accessible
    only through our internal network and keep it off the DMZ.
    Our current SharePoint environment consists of two web front-end servers (load-balanced) exposed externally (DMZ), plus one application server and the SQL server, both behind the DMZ (internal-only). Currently all of our SharePoint web apps are accessible externally
    through SSL.
    What is the best way to accommodate this new internal-only web application within our existing farm while providing the security measures explained above?
    I am thinking of adding an extra WFE server to the existing farm and putting it behind the DMZ (internal-only), in a similar way to how our application server is configured right now, but serving exclusively this new internal site's content. I would then
    have the network guys make the site accessible only to users logged in internally on our network, and only through this new dedicated server. My concern is that since all of our other web apps in the farm are exposed externally, and since the new server
    would be part of the same farm, that could open doors for bad guys to access this information. Are there any other topology options I should consider? I have thought about creating a small (one-server only) new farm just for this purpose, but I am trying
    to avoid going that route.
    Any thoughts?
    Thank you,
    Rob

    You're mostly going down the right track.
    A new web application in dedicated SQL DB and web application policies to deny all external accounts access to the sites will go a long way. You can also make sure that the DNS does not resolve externally.
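    A minimal sketch of such a deny policy on the new web application (SharePoint Management Shell; URL, group and display name are placeholders - for a claims web application the identity would need to be the claim-encoded name):

        $wa = Get-SPWebApplication "https://secure-internal.contoso.com"
        # Add a web application policy for the group that holds external accounts and bind it to Deny All.
        $policy  = $wa.Policies.Add("CONTOSO\External Partners", "Deny external users group")
        $denyAll = $wa.PolicyRoles.GetSpecialRole([Microsoft.SharePoint.Administration.SPPolicyRoleType]::DenyAll)
        $policy.PolicyRoleBindings.Add($denyAll)
        $wa.Update()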
    If you want security you will probably be building the web application on https alone, which is my preference for any farms these days. That might negate the need for your encrypted infopath system.
    However, you cannot add a WFE to a farm and dedicate a web app solely to that server. Any server with the SharePoint Foundation Web Application role will host all web applications. You can steer traffic to one
    server or another, but that's not really doing much for security: if it's on one WFE it's on them all. For that reason I would say that the standalone farm is the best, most secure, solution.
    All of what you've been describing will help with security but you'll have to spend hours testing connections, securing files and testing testing testing.  Whilst the standalone will just work.

  • SharePoint and ADFS 2.0

    Hello, how are you doing?
    currently I have the following scenario:
    Organization number 1
    SharePoint 2013
    ADFS 2.0 on the LAN
    ADFS Proxy DMZ
    Organization number 2
    ADFS 2.0 on the lan
    a public certificate for fs.dominio.com
    I want to publish fs.dominio.com, but they do not have an ADFS proxy. My question is:
    Can I publish it and use my public certificate for fs.dominio.com? The drawback is that they do not have additional resources to implement an ADFS proxy.
    I have configured the Fortinet so that I can authenticate directly against the ADFS server on the LAN, but I get an unsafe-site warning, which is why I want to use my purchased certificate.

    Is there a trusted certificate bound to the site used by ADFS (that is, on the ADFS server)?
    Which guide, specifically, did you follow?
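    To answer the certificate question, a quick check on the internal ADFS 2.0 server might look like the sketch below (ADFS 2.0 hosts its endpoints in IIS, so the SSL binding itself is visible via netsh/IIS Manager):

        Add-PSSnapin Microsoft.Adfs.PowerShell -ErrorAction SilentlyContinue
        Get-ADFSCertificate -CertificateType "Service-Communications"   # certificate ADFS presents for service communications
        netsh http show sslcert                                         # SSL bindings registered with HTTP.SYS, including the ADFS site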
    Trevor Seward

  • Setup Sharepoint 2013 Extranet as separate farm

    Hi All
    We currently have a SharePoint 2013 intranet and would now like to set up a SharePoint 2013 extranet/portal to give external customers/clients a shared collaboration area in which to work with internal users. The proposed site would be fairly basic and would most
    likely comprise a site page per customer with a few libraries and maybe some form of announcements. We don't envisage the demand being huge - maybe a few hundred users and 40-50 customer/client areas.
    I just wanted to outline my plan / ideas in case I've missed any pitfalls or have missed anything obvious.
    We plan to setup the extranet separately from our internal Sharepoint Intranet so the two sites will not need to communicate with each other and will be on two different domains.
    The Extranet will be setup in our DMZ and be composed of 2 x SP Servers (App & Web);  2 x SQL;  2 x DC  (all on VMWare).  This follows the MS 'Back to Back' architecture but we do not plan to segment the DMZ area using routers. 
    We will also configure a one-way forest trust from the internal AD to the new external AD so that internal users can authenticate with their existing credentials.
    The Topology diagram that I have seen states that you use a UAG between the DMZ and the Internet - is this necessary?
    Any advice / guidance / tips appreciated
    Andy

    Another issue to consider is that you will need to open ports in order for the SharePoint servers in the DMZ to resolve users via the People Picker. The ports are outlined here: http://blogs.technet.com/b/wbaer/archive/2009/01/21/people-picker-port-protocol-requirements.aspx
    You can use an IPSec tunnel between SharePoint in the DMZ and the internal network to limit the number of ports you need to open.
    Another thing to consider is that while 2012 R2 does have WAP included, it requires ADFS and does not work with SharePoint Apps as it doesn't support wildcard domains.
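    Once the people picker ports are open (or the IPSec tunnel is in place), a minimal sketch of letting the extranet farm resolve users from the internal forest across the one-way trust (run on each extranet SharePoint server; key, account and URLs are placeholders):

        stsadm -o setapppassword -password "Encrypt!onKey1"   # same encryption key on every server in the extranet farm
        stsadm -o setproperty -pn peoplepicker-searchadforests -pv "forest:corp.contoso.com,CORP\svc-peoplepicker,P@ssw0rd" -url https://extranet.contoso.com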
    Trevor Seward

  • Sharepoint 2013 Web server role placement

    We are in the process of deploying SharePoint 2013, and I have a question on placing the web server roles. Where should they ideally be placed: on the internal network or in the DMZ?
    If they are placed in the DMZ, can the web server roles be in workgroups, or do they always have to be on the domain? Can you join a workgroup computer to a SharePoint farm? I assumed it always had to be on the domain.
    The majority of our SharePoint users will be internal, but we have external partners whom we want to access the sites as well; that's why we are considering the DMZ.
    Any advice on the above questions?

    Yes, that is correct. You need those ports.
    The WFEs communicate with every other server in the farm, as well as your Active Directory, DNS, and SMTP servers. This includes all supporting infrastructure services such as LDAP, Kerberos (if you're using it), etc. I believe SMB is only if you are indexing
    fileshares.
    It does seem like quite a lot for a web server, however a SharePoint WFE is not just a web server.
    Here is the list of ports:
    TCP 80, TCP 443 (SSL)
    Custom ports for search crawling, if configured (such as for crawling a file share or a website on a non-default port)
    Ports used by the search index component — TCP 16500-16519 (intra-farm only)
    Ports required for the AppFabric Caching Service — TCP 22233-22236
    Ports required for Windows Communication Foundation communication — TCP 808
    Ports required for communication between Web servers and service applications (the default is HTTP):
    HTTP binding: TCP 32843
    HTTPS binding: TCP 32844
    net.tcp binding: TCP 32845 (only if a third party has implemented this option for a service application)
    Ports required for synchronizing profiles between SharePoint 2013 and Active Directory Domain Services (AD DS) on the server that runs the Forefront Identity Management agent:
    TCP 5725
    TCP&UDP 389 (LDAP service)
    TCP&UDP 88 (Kerberos)
    TCP&UDP 53 (DNS)
    UDP 464 (Kerberos Change Password)
    For information about how to synchronize profiles with other directory stores, see User
    Profile service hardening requirements, later in this article.
    Default ports for SQL Server communication — TCP 1433, UDP 1434. If these ports are blocked on the SQL Server computer (recommended) and databases are installed on a named instance, configure a SQL Server client
    alias for connecting to the named instance.
    Microsoft SharePoint Foundation User Code Service (for sandbox solutions) — TCP 32846. This port must be open for outbound connections on all Web servers. This port must be open for inbound connections on Web servers
    or application servers where this service is turned on.
    Ensure that ports remain open for Web applications that are accessible to users.
    Block external access to the port that is used for the Central Administration site.
    SMTP for e-mail integration — TCP 25
    Jason Warren
    Infrastructure Architect
    Habanero Consulting Group
    habaneroconsulting.com/blog

  • AD-RMS with SharePoint Document Access from Internet

    Hi Guys,
    I have a single AD-RMS Server running on 2008 R2 and SQL 2008 R2
    I have SharePoint 2010 published to the internet.
    I need to integrate AD RMS with SharePoint in order to provide access to my documents through SharePoint over the internet.
    My questions:
    Using my single AD RMS server with SharePoint integration, can all AD users access SharePoint from outside and open encrypted documents?
    Does this article (http://technet.microsoft.com/en-us/library/ee259515(WS.10).aspx) also apply to SP 2010? Are the steps the same?
    Do I need AD FS in my case?
    Do I need SSO in my case?
    Thanks

    Hi Jean,
    1. To make that work, the internet users need to authenticate against the RMS server as well, so you need to publish this server. Depending on your network policy you can just reverse proxy the RMS server, or, in a more complex scenario, have another server
    in a DMZ AD to facilitate that. Because you publish the SharePoint server to the internet, I assume you do not have a complex scenario.
    I hope you have chosen the URL for the RMS cluster wisely, so that it can be addressed from the internet.
    2. I am not a big Sharepoint guy, but it looks familiar.
    3. No, you don't.
    4. No, but it would be nice if the users do not have to authenticate twice. That will require a reverse proxy, e.g. TMG (which is discontinued) or similar.
    Hope it helps,
    Lutz

  • Sharepoint 2010 Accessing online through internet

    Before publishing SharePoint 2010 to the internet, what should be done?
    we have following architecture
    one web app
    one wfe
    one database server
    all are virtual machines on single server
    What are the recommendations and best practices?
    We do have a license for Forefront Protection; is it enough, or do we have to have a license for Symantec Protection for SharePoint 2010?
    We need the best approach.
    MCTS,ITIL

    Hiya, 
    Here are a few considerations:
    1: In regards to AV there are a few solutions:
    a: Use SharePoint integrated AV only. This will scan your SP files and SP files only. No server scanning.
    b: Use Server scanning only. This will scan only files on your server and not files within SharePoint. 
    c: Use both.
    d: Use none.
    Which one you need depends on your usage pattern and users. If you already have AV scanning on your client computers and they are the only ones uploading, you should not need a or c. If you only access your servers from protected computers, you should not
    need b. If you do both, there would be no problem in choosing d.
    When you want to expose a solution to the internet, there are quite a few things to consider if we take everything into account. The network placement of your front-end servers should be the DMZ, with only specified ports open to and from the app server and
    database server. The app server and database server could be placed within your normal LAN. That way you're minimizing your attack surface as well as minimizing the routes available if the second line of defense is broken.
    Now, as for SharePoint, you need a FIS (For Internet Sites) license, which is quite a different price tag than server + CAL licensing.
    In terms of making the solution available you need to create a route from a client to your site. That means: DNS record -> public IP -> your SharePoint server -> SP site name (your web server should be responding on this name).
    I presume that your site is an internet type of site, meaning you're allowing anonymous access rather than an extranet type of site; in that case your site also needs to have anonymous access enabled (a minimal sketch follows below).
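    A minimal sketch of enabling anonymous access at the web-application level (SharePoint 2010 Management Shell; the URL is a placeholder, and this assumes the AllowAnonymous setter behaves as on a default install - otherwise use Central Administration > Authentication Providers). Anonymous access then still has to be granted on the sites/lists themselves:

        $wa = Get-SPWebApplication "http://www.contoso.com"
        $settings = $wa.IisSettings[[Microsoft.SharePoint.Administration.SPUrlZone]::Default]
        $settings.AllowAnonymous = $true     # turn on anonymous authentication for the Default zone
        $wa.Update()
        $wa.ProvisionGlobally()              # push the change to IIS on every web server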
    If you can provide more details about your solution, it would be possible to be more specific in terms of defining the best architecture :)
