OS for a server machine

Hello,
Does anybody know which OS I should install on a server machine?
I need a free version, of course :)
Should it be Fedora? Red Hat?
Can anyone drop a link?
Thanks!

Pick what you know best. Know what you are doing. If you don't know what you're doing, pay extra for managed hosting to someone who does.
If it's just a server for your home network, however, choose whichever OS you want to learn most about; you can only learn by doing, so this is a chance to do a lot with a specific OS. But buy a hardware firewall (if you don't already have one) to put between you and the big bad internet.
Dave.

Similar Messages

  • Download for SQL Server machine running Windows Server 2008 Enterprise SP2

    I am confused about which 11g product to download and how to configure it for the above server. I just need SQL Server 2008 to be able to communicate with some Oracle databases. I don't need to manage the Oracle database, nor do I need to create one. I just need to be able to create linked servers, use SQL*Plus, and create some SSIS or DTS packages that will bring Oracle data into SQL Server. I think I just need the client, but for 11g I see it is less than 1GB in size. I am afraid I may be missing something, because I thought the client downloads were about 2GB, but the 2GB downloads are not called "client". This is a 64-bit machine running an Intel Xeon X7460 chip.
    Our server has C:\, D:\, and E:\ drives on it. The data (.mdf) is on drive E:\. The normal environment is on C:\ as usual. I also need to know whether I need to set environment variables, which I read about but am not sure how to set and make persistent, or whether the software does that.
    As an aside, I tried installing a 10g client on there that the DBA downloaded and could never get it to work. I was successful in getting tnsping to work, but SQLPlusW would close on me when I tried to connect, and SQLPlus failed with some error I can't remember right now. I uninstalled it. I think it may have been the wrong download.
    I would appreciate any info at all, even if you don't know about everything said here.
    I have a link here where I think I am close, but I still am not sure. Let me know if this is correct and which link to click. If I am wrong, please provide the correct link if you can. Thank you.
    [http://www.oracle.com/technetwork/database/enterprise-edition/downloads/112010-win64soft-094461.html]

    Hi,
    To integrate Oracle with SQL Server you need the following:
    1. Install the Oracle 11g Release 2 client software
    2. Install the Oracle 11g Release 2 ODAC software
    3. Restart the SQL Server services
    4. Configure the OraOLEDB.Oracle provider
    5. Create the linked server
    6. Add remote logins for the linked server
    See these links, and the sketch below:
    http://www.mssqltips.com/tip.asp?tip=1433
    http://www.easysoft.com/applications/oracle/database-gateway-dg4odbc.html :)
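    For the linked server and remote login steps (5 and 6), here is a minimal PowerShell sketch, assuming the SqlServer module (Invoke-Sqlcmd) is available on the SQL Server machine and that a TNS alias named ORCL already resolves there; the linked server name and the scott/tiger credentials are placeholders, not values from this thread.
    Import-Module SqlServer

    # Create the linked server with the OraOLEDB.Oracle provider ("ORACLE_LINK" and "ORCL" are placeholders).
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "EXEC master.dbo.sp_addlinkedserver @server = N'ORACLE_LINK', @srvproduct = N'Oracle', @provider = N'OraOLEDB.Oracle', @datasrc = N'ORCL';"

    # Map local logins to placeholder Oracle credentials.
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname = N'ORACLE_LINK', @useself = N'False', @locallogin = NULL, @rmtuser = N'scott', @rmtpassword = N'tiger';"

    # Quick smoke test: four-part name query through the new linked server.
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "SELECT * FROM ORACLE_LINK..SCOTT.EMP"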

  • External or Internal IP for Database server machine?

    We have a web server machine with a firewall controlling our network security. This server gets some critical data from a database server machine located in the same local network. Which IP is more secure for the database server, an internal or an external IP?
    regards
    Siyavuş

    Internal IP.
    For security reasons, it is better if you can use a DMZ, with the web server in the DMZ and the database server in the internal network.
    Something like this:
    Internet <--> external firewall <--> DMZ (web server, mail server, etc.) <--> internal firewall <--> LAN (database, PCs, etc.)
    If you want more granularity, you could even put the internal servers in another DMZ.
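    To make the "database server on the internal network" part concrete, here is a minimal host-firewall sketch for the database server, assuming it runs Windows Server 2012 or later and SQL Server on the default port; the web server address 192.168.10.20 is a placeholder.
    # Hypothetical hardening step: accept SQL Server traffic only from the web server's internal address.
    New-NetFirewallRule -DisplayName "SQL Server from web server only" -Direction Inbound -Protocol TCP -LocalPort 1433 -RemoteAddress 192.168.10.20 -Action Allow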

  • How to get a certificate for SQL Server (virtual machine) on Azure

    Hi,
    I am lost and I don't know what to do to have a certificate available for SQL Server 2014 (data warehousing virtual machine, size A7).
    My purpose is to connect to SQL Server via Power Query and Engagement Studio with the "Encrypt connection" option ticked.
    I have looked at a lot of pages via Google and I wish there were a clear step-by-step guide for me to follow. This is a Microsoft cloud with a predefined virtual machine, so there should be some guide. Can you point me in the right direction please?
    I have a

    Hi,
    Here are some related links below for you:
    a real certificate for a virtual machine
    https://social.msdn.microsoft.com/Forums/azure/en-US/7c48763f-fb04-46c6-a6e6-c21740d007cf/a-real-certificate-for-a-virtual-machine?forum=WAVirtualMachinesforWindows
    Configuring a custom domain name for an Azure cloud service
    http://azure.microsoft.com/en-us/documentation/articles/cloud-services-custom-domain-name/
    Create a Service Certificate for Azure
    https://msdn.microsoft.com/library/azure/gg432987.aspx?f=255&MSPPError=-2147217396
    Since we are not familiar with Azure, if the information above is not helpful, please post another thread in the Azure forums and post feedback regarding your requirement:
    http://feedback.azure.com/forums/34192--general-feedback
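    As a rough starting point alongside those links, here is a sketch of creating a self-signed certificate on the VM for encrypted SQL Server connections, assuming Windows Server 2012 R2 or later; in production a CA-issued certificate is preferable, and the DNS name below is an assumption that must match the name clients connect with.
    # Create a self-signed certificate in the machine store; the FQDN is a placeholder.
    $dnsName = "mysqlvm.cloudapp.net"
    $cert = New-SelfSignedCertificate -DnsName $dnsName -CertStoreLocation "Cert:\LocalMachine\My"

    # Note the thumbprint, then pick this certificate in SQL Server Configuration
    # Manager (Protocols for MSSQLSERVER -> Certificate tab), make sure the SQL
    # Server service account can read its private key, and restart the SQL service.
    $cert.Thumbprint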
    Best Regards,
    Amy

  • Best Setup for Lion Server Time Machine Backup with Drobo?

    I've been thinking about this a lot, yet I don't feel I have a good solution for this, so I'm going to throw it out to the community.
    I have a home server setup using a Mac Mini running Lion Server 10.7.2 with a FireWire 800 Drobo attached. The Drobo is used for both Time Machine backups and files. I also have a PowerBook G4 running Leopard and a MacBook 2.4 GHz Intel Core 2 Duo with Lion 10.7.2, which connect to the server and the Drobo wirelessly through an AirPort Extreme.
    I want to use Time Machine to have all of my computers back up to the server & Drobo, but realize there are several ways to go, each with their pluses and minuses:
    Server Time Machine Backup:
    + Centralizes backup process, rules, and other elements
    + Currently Mac Mini is backing up to the Drobo correctly using this process
    + Have set up a size limit on the server backup so that it does not eat up file space
    - Would combine laptop backups with server backup into one sparse image: this would lead to the computer with the largest backup needs taking up too much space
    Client-Driven Time Machine Backup:
    + Allows for customization of backup processes by computer
    + Can set up specific space requirements for each computer
    + Backups are separate from each other
    - Wireless backup from laptops to Drobo is not functioning currently
    Any thoughts or experiences on how best to set this up? I tend to do most of my work on the MacBook, hence I am concerned about its backup space being eaten up by the server, but that may be more of a theoretical issue than a real one.
    Thanks in advance for your help!

    Well I'm not sure if I am following you but I will explain how I set mine up. When I got the Drobo I inserted 2 drives and selected the highest available volume I could (16TB). My drives are 4 TB each and I knew I would soon add 2 more. Then Drobo did its thing and prepared these drives. The Drobo shows up on my mini desktop as an external drive. When I log into my server from my other computer I can see the mini server volume and the Drobo volume. I can access each no problem. They act as regular volumes. Soon after I added the second two drives and everything stayed the same meaning I could still see and access the Drobo on the desktop of my mini. So it sounds like you used the Drobo dashboard to partition yours for two volumes? Are they both showing on the desktop? 
    "Maybe the Drobo needs to be mounted on the desktop to be considered AFP feature enabled." I could be wrong and hopefully someone will correct me but I think the Drobo (or volumes) have to be mounted on the desktop to work with AFP.

  • Windows OS required for HP ProLiant ML10 server machine

    Dear Sir,
    I planned to buy a new server machine, which is an HP ProLiant ML10,
    so my query is which operating systems I can legally install on it:
    any legal Windows operating system, or only Windows Server editions?
    Thanking you
    nitin

    Hi,
    You can check whether your server model is certified for Windows Server on the Windows Server Catalog website:
    ProLiant ML10
    by Hewlett-Packard Company
    Compatible with the following versions of Microsoft Windows:
    Windows Server 2012 R2 x64 - Certified for Windows
    Windows Server 2012 x64 - Certified for Windows
    Windows Server 2008 R2 x64 - Certified for Windows
    Windows Server 2008 x64 - Certified for Windows
    But the ProLiant ML10 has many model variants; my personal advice is to ask your hardware vendor for further help.
    The related third party information:
    HP ProLiant ML10 Server
    http://www8.hp.com/us/en/products/proliant-servers/product-detail.html?oid=5390723#!tab%3Dmodels
    More information:
    Windows Server Catalog
    http://www.windowsservercatalog.com/item.aspx?idItem=3d94ac2f-058b-25af-7225-f6119e3c25f9&bCatID=1282
    Hope this helps.

  • Windows server 2012 R2 file server for windows XP machines?

    I know that we can't auto-enroll users that use an XP machine, but can an XP machine use Windows Server 2012 R2 as a file server? Users will still be authenticated on a Windows Server 2003 machine.

    Hi,
    I found this article on Symantec website:
    Enterprise Vault 10.0.3 Feature Briefings - FSA support for Windows Server 2012
    http://www.symantec.com/business/support/index?page=content&id=DOC6307
    So the newest version should support deduplication (at least on Windows Server 2012) now.
    As for best practice, do you have any specific requirements? It really depends on what is necessary.
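    As far as plain file shares go, XP clients can only talk to a 2012 R2 file server over SMB1, so one thing worth checking is that SMB1 has not been removed from the server. A minimal sketch, assuming an elevated PowerShell session on the 2012 R2 file server:
    # Check whether the SMB1 protocol (the only dialect Windows XP speaks) is still enabled.
    Get-SmbServerConfiguration | Select-Object EnableSMB1Protocol

    # If it reports False, it can be re-enabled (weigh the security trade-off first).
    Set-SmbServerConfiguration -EnableSMB1Protocol $true -Force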

  • Installing terminal server role on windows server 2012 to use for Windows server 2008 R2 machines.

    Dear all,
    Our current setup is as follows:
    Active Directory is on Windows Server 2003.
    Citrix XenApp version 5.6, with all XenApp servers on Windows Server 2003.
    Now we are upgrading our Citrix farm to 6.5 and all XenApp servers to Windows Server 2008 R2.
    We did the configuration and testing; everything works fine except that the terminal server is not configured, which always gives the pop-up about the expiry date.
    We do have a license for a 2012 terminal server, which it is not possible to downgrade for some reason, so management needs to install a new Windows Server 2012 machine for the terminal server and activate the licenses there.
    My questions are:
    1. Will Windows Server 2008 R2 session hosts get RDS CALs from a Windows Server 2012 terminal (license) server (the RDS CALs are Windows Server 2012 CALs)?
    2. What roles and features do I need to activate on Server 2012 in order to use it for Citrix 6.5 (only the features applicable to issuing the RDS CALs)?
    3. Can I add this 2012 server to the domain? Is it possible to join a 2012 server to a Windows Server 2003 AD?
    Your early reply is highly appreciated.

    Thanks Jeremy.
    I got exactly the right answers I was looking for.
    I successfully activated the terminal services licensing on the newly created Windows Server 2012 machine and installed the retail license packs we had of 50+50+25 keys. It later shows the 125 keys successfully applied.
    When I tried to link the XenApp servers (Windows Server 2008 R2) to the license server, it first gave me an error that the Session Host role service is not installed on the terminal server, so I installed that role on the terminal server as well.
    But now there is another error, as below:
    RDS CALs are not available for this Remote Desktop Session Host server, and Licensing Diagnosis has identified licensing problems for the RD Session Host server.
    1. Was it wrong to install the Session Host role on the terminal (license) server? (It really did give me an error like "the Session Host role service is not running on the license server" when I tried to link the license server.)
    2. Is the problem that Windows Server 2008 R2 will not get licenses from 2012 RDS CALs?
    3. Will the Clearinghouse take time to update the licenses?
    Your reply is highly appreciated. I need to know whether there is any other way I can link the Windows Server 2008 R2 servers (XenApp) to the license server.
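    Regarding another way to link the 2008 R2 session hosts (the XenApp servers) to the license server: this can also be done with WMI, independent of the 2012 RD role wizards. A minimal sketch, run on each 2008 R2 XenApp server; the license server name and Per User mode are assumptions.
    # Point this RD Session Host at the licensing server and set the licensing mode.
    $ts = Get-WmiObject -Namespace "root\cimv2\TerminalServices" -Class Win32_TerminalServiceSetting

    $ts.ChangeMode(4)                                   # 2 = Per Device, 4 = Per User
    $ts.SetSpecifiedLicenseServerList(@("LIC2012SRV"))  # assumed license server name

    # Verify what the session host currently sees.
    $ts.GetSpecifiedLicenseServerList()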

  • How can I know how many processors do I need for a new SQL Server machine?

    My company intends to buy a new server and will dedicate it to SQL Server, which will serve an in-house developed application.
    How can we know how many processors are enough to buy?
    I know that each SQL Server edition and version has its own limits on the number of processors it can support. For example, SQL Server 2012 Standard Edition supports up to 4 sockets or 16 cores per SQL Server instance, but that does not mean I have to buy 4 physical CPUs; two could be enough. How can I know?

    How can we know how many processors are enough to buy?
    I know that each SQL Server edition and version has its own limits on the number of processors it can support. For example, SQL Server 2012 Standard Edition supports up to 4 sockets or 16 cores per SQL Server instance, but that does not mean I have to buy 4 physical CPUs; two could be enough. How can I know?
    This is an off-topic question; how can we answer it? We don't even have any knowledge of your application.
    Please speak to a licensing specialist about servers and cores; the link below will give you more information.
    https://msdn.microsoft.com/en-us/library/ms143760.aspx?f=255&MSPPError=-2147217396
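    Beyond the licensing limits, one practical way to approach sizing is to baseline CPU utilization on the current production server and compare it against the per-core performance of the candidate hardware. A rough sketch, where the server name, sample interval, and sample count are assumptions:
    # Sample total CPU utilization on the existing SQL Server for roughly an hour.
    $samples = Get-Counter -ComputerName "OLDSQL01" -Counter "\Processor(_Total)\% Processor Time" -SampleInterval 5 -MaxSamples 720

    $values = $samples.CounterSamples | ForEach-Object { $_.CookedValue }
    $avg  = ($values | Measure-Object -Average).Average
    $peak = ($values | Measure-Object -Maximum).Maximum
    "Average: {0:N1}%  Peak: {1:N1}%" -f $avg, $peak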

  • Error while running SDK application in XIR2 server machine

    Hi all,
    Thanks for your support provided so far.
    I have developed a .NET SDK application to extract the metadata from XI R2 universe and report files. I'm able to run the application on my machine, but when I run the tool on the server machine, which has XI R2 installed on it, it throws the following error:
    System.Runtime.InteropServices.COMException (0x80040154): Retrieving the COM class factory for component with CLSID {DCDC6F02-5766-11D0-AF14-00A0C912DCDD} failed due to the following error: 80040154.
    I have referenced Busobj.dll and Designer.dll in the application to extract the metadata. These DLLs are embedded along with the application.
    Am I missing any DLLs? Kindly help me.

    Thanks Ted.
    Yes, I looked at the CLSID; it refers to the Desktop Intelligence application, and the keys under the CLSID are as follows:
    1. InProcHandler32 -- Ole32.dll
    2. LocalServer32 -- C:\BusinessObjects\BusinessObjectsEnterprise 11.5\win32_x86\busobj.exe /Automation
    3. ProgID -- BusinessObjects.Application.11
    4. VersionIndependentProgID -- BusinessObjects.Application
    Kindly help me.
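    One thing worth checking, offered as a guess rather than a confirmed fix: error 0x80040154 ("class not registered") on a 64-bit server frequently means the COM server (here the 32-bit Desktop Intelligence automation class) is registered only for 32-bit clients, so a 64-bit build of the .NET tool cannot see it. A quick sketch to compare the two registry views:
    # Is the Desktop Intelligence automation CLSID registered for 64-bit and/or 32-bit clients?
    $clsid = "{DCDC6F02-5766-11D0-AF14-00A0C912DCDD}"
    Test-Path "HKLM:\SOFTWARE\Classes\CLSID\$clsid"              # 64-bit view
    Test-Path "HKLM:\SOFTWARE\Classes\Wow6432Node\CLSID\$clsid"  # 32-bit view
    # If only the 32-bit view exists, try compiling the .NET application for x86.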

  • Remote Server Admin not working from outside of network for 1 server

    Our company recently changed ISPs and I had to change our two 10.4 server's IP addresses. We have a mail server (intel xserve) and a file/web server (quicksilver g4). Both servers have two network cards in them. The problem is two-fold:
    1- I can successfully manage the Xserve machine locally on our network and from my home. However, I can only manage the Quicksilver g4 server locally. Any kind of external access is not even acknowledged.
    2- I'm not sure if I missed any steps when changing IP addresses for these server-based computers. Also, I'm not sure if I correctly set our dns names to the correct IP address.
    For some background, this is the exact IP update process I used for each server:
    Quicksilver G4 (file/web server) - Installed network card #2 and configured it with the new Public IP in the "System Preferences/Network" panel. In Server Admin I set our website to use the new public IP address. (network card #2 has no firewall device in between it and the internet.)
    Then, I configured the default network card #1 to a static, yet private IP address that's behind our DLink firewall device with the rest of our network.
    Intel Xserve (e-mail server) - Network card #1 was the only one set up before our network change. It had a static, public IP address. When we changed ISPs, I configured network card #2 with the new static, public IP address supplied by the new ISP in the "System Preferences/Network" panel. This new IP address is where all email traffic currently gets pointed. (All mail is successfully sent and delivered.) Until our former ISP gets shut down, I still have network card #1 configured with the old static public IP address. After the old ISP is shut off, I plan on changing network card #1 to a static, private address.
    Any assistance would be greatly appreciated.
      Mac OS X (10.4.8)  

    What should I check in AD?  I am by no means an expert with AD.
    Yes, I am using the same client OS.
    I am talking about RDP over the internet, like from home to the office. We have a static IP assigned to the router by the ISP, a static internal IP assigned to the server on the LAN, and the router port-forwards 3389 to the assigned IP (a quick connectivity check is sketched after the steps below).
    It was working fine before we reinstalled Server 2012.  These are the steps I took when reinstalling:
    1. format drive and install OS
    2. rename the server
    3. install SQL server
    4. Install TFS and SharePoint
    5. Add Active Directory role and promote to Domain Controller
    6. Add domain users
    7. Enable remote access on the server and add users to remote access list
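    For the RDP-over-the-internet part, a quick way to see whether the port forward itself is working is to test TCP 3389 against the router's public IP from an outside machine running Windows 8.1 / Server 2012 R2 or later. The address below is a placeholder:
    # Check whether TCP 3389 is reachable on the router's public address from outside the network.
    Test-NetConnection -ComputerName 203.0.113.10 -Port 3389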

  • Increase Performance and ROI for SQL Server Environments

    May 2015
    Explore
    The Buzz from Microsoft Ignite 2015
    NetApp was in full force at the recent Microsoft Ignite show in Chicago, talking about solutions for hybrid cloud, and our proven solutions for Microsoft SQL Server and other Microsoft applications.
    Hot topics at the NetApp booth included:
    OnCommand® Shift. A revolutionary technology that lets you move virtual machines back and forth between VMware and Hyper-V environments in minutes.
    Azure Site Recovery to NetApp Private Storage. Replicate on-premises SAN-based applications to NPS for disaster recovery in the Azure cloud.
    These tools give you greater flexibility for managing and protecting important business applications.
    Chris Lemmons
    Director, EIS Technical Marketing, NetApp
    If your organization runs databases such as Microsoft SQL Server and Oracle DB, you probably know that these vendors primarily license their products on a "per-core" basis. Microsoft recently switched to "per-core" rather than "per-socket" licensing for SQL Server 2012 and 2014. This change can have a big impact on the total cost of operating a database, especially as core counts on new servers continue to climb. It turns out that the right storage infrastructure can drive down database costs, increase productivity, and put your infrastructure back in balance.
    In many customer environments, NetApp has noticed that server CPU utilization is low—often on the order of just 20%. This is usually the result of I/O bottlenecks. Server cores have to sit and wait for I/O from hard disk drives (HDDs). We've been closely studying the impact of all-flash storage on SQL Server environments that use HDD-based storage systems. NetApp® All Flash FAS platform delivers world-class performance for SQL Server plus the storage efficiency, application integration, nondisruptive operations, and data protection of clustered Data ONTAP®, making it ideal for SQL Server environments.
    Tests show that All Flash FAS can drive up IOPS and database server CPU utilization by as much as 4x. And with a 95% reduction in latency, you can achieve this level of performance with half as many servers. This reduces the number of servers you need and the number of cores you have to license, driving down costs by 50% or more and paying back your investment in flash in as little as six months.
    Figure 1) NetApp All Flash FAS increases CPU utilization on your SQL Server database servers, lowering costs.
    Source: NetApp, 2015
    Whether you're running one of the newer versions of SQL Server or facing an upgrade of an earlier version, you can't afford not to take a second look at your storage environment.
    End of Support for Microsoft SQL Server 2005 is Rapidly Approaching
    Microsoft has set the end of extended support for SQL Server 2005 for April 2016—less than a year away. With support for Microsoft Windows 2003 ending in July 2015, time may already be running short.
    If you're running Windows Server 2003, new server hardware is almost certainly needed when you upgrade SQL Server. Evaluate your server and storage options now to get costs under control.
    Test Methodology
    To test the impact of flash on SQL Server performance, we replaced a legacy HDD-based storage system with an All Flash FAS AFF8080 EX. The legacy system was configured with almost 150 HDDs, a typical configuration for HDD storage supporting SQL Server. The AFF8080 EX used just 48 SSDs.
    Table 1) Components used in testing.
    SQL Server 2014 servers: Fujitsu RX300
    Server operating system: Microsoft Windows 2012 R2 Standard Edition
    SQL Server database version: Microsoft SQL Server 2014 Enterprise Edition
    Processors per server: 2 x 6-core Xeon E5-2630 at 2.30 GHz
    Fibre Channel network: 8Gb FC with multipathing
    Storage controller: AFF8080 EX
    Data ONTAP version: Clustered Data ONTAP® 8.3.1
    Drive number and type: 48 SSDs
    Source: NetApp, 2015
    The test configuration consisted of 10 database servers connected through fibre channel to both the legacy storage system and the AFF8080 EX. Each of the 10 servers ran SQL Server 2014 Enterprise Edition.
    The publicly available HammerDB workload generator was used to drive an OLTP-like workload simultaneously from each of the 10 database servers to storage. We first directed the workload to the legacy storage array to establish a baseline, increasing the load to the point where read latency consistently exceeded 20ms.
    That workload was then directed at the AFF8080 EX. The change in storage resulted in an overall 20x reduction in read latency, a greater than 4x improvement in IOPS, and a greater than 4x improvement in database server CPU utilization.
    Figure 2) NetApp All Flash FAS increases IOPS and server CPU utilization and lowers latency.
    Source: NetApp, 2015
    In other words, the database servers are able to process four times as many IOPS with dramatically lower latency. CPU utilization goes up accordingly because the servers are processing 4x the work per unit time.
    The All Flash FAS system still had additional headroom under this load.
    Calculating the Savings
    Let's look at what this performance improvement means for the total cost of running SQL Server 2014 over a 3-year period. To do the analysis we used NetApp Realize, a storage modeling and financial analysis tool designed to help quantify the value of NetApp solutions and products. NetApp sales teams and partners use this tool to assist with return on investment (ROI) calculations.
    The calculation includes the cost of the AFF8080 EX, eliminates the costs associated with the existing storage system, and cuts the total number of database servers from 10 to five. This reduces SQL Server licensing costs by 50%. The same workload was run with five servers and achieved the same results. ROI analysis is summarized in Table 2.
    Table 2) ROI from replacing an HDD-based storage system with All Flash FAS, thereby cutting server and licensing costs in half.
    ROI: 65%
    Net present value (NPV): $950,000
    Payback period: six months
    Total cost reduction: more than $1 million saved over a 3-year analysis period compared to the legacy storage system
    Savings on power, space, and administration: $40,000
    Additional savings due to nondisruptive operations benefits (not included in ROI): $90,000
    Source: NetApp, 2015
    The takeaway here is that you can replace your existing storage with All Flash FAS and get a big performance bump while substantially reducing your costs, with the majority of the savings derived from the reduction in SQL Server licensing costs.
    Replace your existing storage with All Flash FAS and get a big performance bump while substantially reducing your costs.
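    To make the licensing arithmetic explicit, here is a back-of-the-envelope sketch using the server configuration from Table 1; the per-core price is a placeholder, not a quoted Microsoft or NetApp figure.
    # Rough licensing arithmetic for the 10-server vs. 5-server configurations.
    $coresPerServer = 2 * 6        # two 6-core Xeon E5-2630 sockets per server
    $pricePerCore   = 3500         # placeholder per-core license cost

    $before = 10 * $coresPerServer * $pricePerCore   # legacy HDD configuration
    $after  =  5 * $coresPerServer * $pricePerCore   # All Flash FAS configuration

    "Before: {0:N0}  After: {1:N0}  Saved: {2:N0}" -f $before, $after, ($before - $after)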
    Maximum SQL Server 2014 Performance
    In addition to the ROI analysis, we also measured the maximum performance of the AFF8080 EX with SQL Server 2014. A load-generation tool was used to simulate an industry-standard TPC-E OLTP workload against an SQL Server 2014 test configuration.
    A two-node AFF8080 EX achieved a maximum throughput of 322K IOPS at just over 1ms latency. For all points other than the maximum load point, latency was consistently under 1ms and remained under 0.8ms up to 180K IOPS.
    Data Reduction and Storage Efficiency
    In addition to performance testing, we looked at the overall storage efficiency savings of our SQL Server database implementation. The degree of compression that can be achieved is dependent on the actual data that is written and stored in the database. For this environment, inline compression was effective. Deduplication, as is often the case in database environments, provided little additional storage savings and was not enabled.
    For the test data used in the maximum performance test, we measured a compression ratio of 1.5:1. We also tested inline compression on a production SQL Server 2014 data set to further validate these results and saw a 1.8:1 compression ratio.
    Space-efficient NetApp Snapshot® copies provide additional storage efficiency benefits for database environments. Unlike snapshot methods that use copy-on-write, there is no performance penalty; unlike full mirror copies, NetApp Snapshot copies use storage space sparingly. Snapshot copies only consume a small amount of storage space for metadata and additional incremental space is consumed as block-level changes occur. In a typical real-world SQL Server deployment on NetApp storage, database volume Snapshot copies are made every two hours.
    First introduced more than 10 years ago, NetApp FlexClone® technology also plays an important role in SQL Server environments. Clones are fully writable, and, similar to Snapshot copies, only consume incremental storage capacity. With FlexClone, you can create as many copies of production data as you need for development and test, reporting, and so on. Cloning is a great way to support the development and test work needed when upgrading from an earlier version of SQL Server. You'll sometimes see these types of capabilities referred to as "copy data management."
    A Better Way to Run Enterprise Applications
    The performance benefits that all-flash storage can deliver for database environments are significant: more IOPS, lower latency, and an end to near-constant performance tuning.
    If you think the performance acceleration that comes with all-flash storage is cost prohibitive, think again. All Flash FAS doesn't just deliver a performance boost, it changes the economics of your operations, paying for itself with thousands in savings on licensing and server costs. In terms of dollars per IOPS, All Flash FAS is extremely economical relative to HDD.
    And, because All Flash FAS runs NetApp clustered Data ONTAP, it delivers the most complete environment to support SQL Server and all your enterprise applications with capabilities that include comprehensive storage efficiency, integrated data protection, and deep integration for your applications.
    For complete details on this testing look for NetApp TR-4303, which will be available in a few weeks. Stay tuned to Tech OnTap for more information as NetApp continues to run benchmarks with important server workloads including Oracle DB and server virtualization.
    Learn more about NetApp solutions for SQL Server and NetApp All-flash solutions.
    Quick Links
    Tech OnTap Community
    Archive
    PDF


  • Set up Search Service App For SharePoint server 2013 on Windows server 2012 R2 not working

    Hi all,
    I installed SharePoint Server 2013 on Windows Server 2012 R2 using VirtualBox. I created a DC (domain controller) server with a domain set up on one VM, and it has SQL Server 2012 SP1 installed. Then SharePoint 2013 was set up on another VM to access the DC server. Everything seems to work except the Search Service Application, which cannot be successfully set up. The creation process for the Search service app says Successful, and 4 search databases were created and look fine. But when I navigate to the Search service app admin page, it gives this error info:
    System status: The search service is not able to connect to the machine that hosts the administration component. Verify that the administration component '386f2cd6-47ca-4b3a-aeb5-d9116772ef16' in search application 'Search Service Application 1' is in a good state and try again.
    Search Application Topology: Unable to retrieve topology component health states. This may be because the admin component is not up and running.
    From the event viewer, I see the following errors:
    (1) Error from source: SharePoint Server
    Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (b7c72eb8-cbaf-435e-b4c9-963cb6e4e745).
    Reason: The object you are trying to create already exists. Try again using a different name.
    Technical Support Details:
    System.Runtime.InteropServices.COMException (0x80040D02): The object you are trying to create already exists. Try again using a different name.
       at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()
       at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)
    (2) Error From source: SharePoint Server Search
    Could not access the Search database. A generic error occurred while trying to access the database to obtain the schema version info.
    Context: Application '386f2cd6-47ca-4b3a-aeb5-d9116772ef16'
    (3) Warning from source: SharePoint Server Search
    A database error occurred. Source: .Net SqlClient Data Provider Code: 8169 occurred 0 time(s) Description: Error ordinal: 1 Message: Conversion failed when converting from a character string to uniqueidentifier., Class: 16, Number: 8169, State: 2 at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
    (4) Error from source: SharePoint Server
    Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (b7c72eb8-cbaf-435e-b4c9-963cb6e4e745).
    Reason: The gatherer application could not be mounted because the search administration database schema version does not match the expected backwards compatibility schema version. The database might not have been upgraded.
    Technical Support Details:
    System.Runtime.InteropServices.COMException (0xC0041235): The gatherer application could not be mounted because the search administration database schema version does not match the expected backwards compatibility schema version. The database might not have been upgraded.
    Since the separate DC server and SharePoint server did not work, I installed SharePoint 2013 on the DC server (so the DC server has everything on it now), but it gives exactly the same result. Later I installed SharePoint 2013 SP1 and still have the same problem with the Search Service app. I spent two weeks trying all the suggestions available from the Web and Google, but the SharePoint Search Service simply does not work. The config and other databases work, so why does the Search Service have this issue, seemingly related to the search databases?
    Could anybody please help out? You deserve a top SharePoint consultant award if you can find a solution. I am so frustrated and so tired of this issue. This also seems to be a setup issue.
    Thanks a lot.

    Using the new Search Service Application wizard to create the SSA always succeeds. I can delete the existing SSA and recreate it with no problem. It says successful, but when I open the Search Admin page from Central Administration, it gives me the errors mentioned above.
    Now I used the following PowerShell script for creating the SSA, from Max Mercher, but it stays at the last step of the script:
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $IndexLocation = "C:\Search"  # Location must be empty; it will be deleted during the process!
    $SearchAppPoolName = "SSAPool"
    $SearchAppPoolAccountName = "mydomain\admin"
    $SearchServiceName = "SSA"
    $SearchServiceProxyName = "SSA Proxy"
    $DatabaseServer = "W12R2DC1"
    $DatabaseName = "SSA"
    # Create the service application pool if it does not exist yet
    $spAppPool = Get-SPServiceApplicationPool -Identity $SearchAppPoolName -ErrorAction SilentlyContinue
    if (!$spAppPool) {
        $spAppPool = New-SPServiceApplicationPool -Name $SearchAppPoolName -Account $SearchAppPoolAccountName -Verbose
    }
    # Create the Search service application if it does not exist yet
    $ServiceApplication = Get-SPEnterpriseSearchServiceApplication -Identity $SearchServiceName -ErrorAction SilentlyContinue
    if (!$ServiceApplication) {
        # process stays at the following step forever, already one hour now.
        $ServiceApplication = New-SPEnterpriseSearchServiceApplication -Name $SearchServiceName -ApplicationPool $spAppPool.Name -DatabaseServer $DatabaseServer -DatabaseName $DatabaseName
    }
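    One small observation on the script above, offered as a possible follow-up rather than as a fix for the hang: $SearchServiceProxyName is defined but never used, so once the service application is created the proxy still has to be created separately. A minimal sketch:
    # Create the Search service application proxy if it does not exist yet.
    $proxy = Get-SPEnterpriseSearchServiceApplicationProxy -Identity $SearchServiceProxyName -ErrorAction SilentlyContinue
    if (!$proxy) {
        $proxy = New-SPEnterpriseSearchServiceApplicationProxy -Name $SearchServiceProxyName -SearchApplication $ServiceApplication
    }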
    The account mydomain\admin is a farm managed account and a domain admin account, is in the WG_ADMIN role, is in all SQL Server roles, and is dbo. I see the search DBs are already on the SQL Server. From the event viewer, I got the following errors in sequence:
    (1) Error from source: Crawler:Content Plugin
    Content Plugin can not be initialized - list of CSS addresses is not set.
    (2) Warning for SharePoint Server Search
    A database error occurred. Source: .Net SqlClient Data Provider Code: 8169 occurred 0 time(s) Description: Error ordinal: 1 Message: Conversion failed when converting from a character string to uniqueidentifier., Class: 16, Number: 8169, State: 2 at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
    (3) Error for SharePoint Server Search
    Could not access the Search database. A generic error occurred while trying to access the database to obtain the schema version info.
    Context: Application 'cbc5a055-996b-44a7-9cbc-404322f9cfdf'
    (4) Error for SharePoint Server
    Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (b7c72eb8-cbaf-435e-b4c9-963cb6e4e745).
    Reason: The gatherer application could not be mounted because the search administration database schema version does not match the expected backwards compatibility schema version. The database might not have been upgraded. 
    (5) Error Shared Services for SharePoint Server Search 
    Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (b7c72eb8-cbaf-435e-b4c9-963cb6e4e745).
    Reason: The object you are trying to create already exists. Try again using a different name. 
    Technical Support Details:
    System.Runtime.InteropServices.COMException (0x80040D02): The object you are trying to create already exists. Try again using a different name. 
       at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()
       at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)
    The errors above keep being generated, and the last step of the SSA creation stays there forever. Any clue what is really going on? Thanks.

  • To upload a file from client machine to server machine

    Hi everybody!
    Could anyone please help me? I am stuck on a problem.
    I want to transfer a file from the client's machine to the server, but I am not able to upload it;
    it is transferring the file only to the local machine.
    I am using the O'Reilly package (com.oreilly.servlet). It is transferring files only to my local machine but not to the server. Can anyone correct it? It's very urgent.
    How do I write the path for the server?
    I am using this path and it is not uploading:
    MultipartRequest multi = new MultipartRequest(request, "../<administrator>:<dev2daask>@dev2:C:/123data/", 5 * 1024 * 1024);
    Here is my code:
    <%@ page import="java.util.*" %>
    <%@ page import="javax.servlet.*" %>
    <%@ page import="javax.servlet.http.*" %>
    <%@ page import="java.io.*" %>
    <%@ page import="com.oreilly.servlet.MultipartRequest"%>
    <%
    try {
        // Blindly take it on faith this is a multipart/form-data request.
        // Construct a MultipartRequest to help read the information.
        // Pass in the request, a directory to save files to, and the
        // maximum POST size we should attempt to handle.
        // Here we (rudely) write to the server root and impose a 5 MB limit.
        // NOTE: MultipartRequest expects a plain directory path on the machine
        // where this JSP executes; the user/host syntax below is the problem
        // discussed in the follow-up.
        MultipartRequest multi = new MultipartRequest(request, "../<administrator>:<dev2daask>@dev2:C:/123data/", 5 * 1024 * 1024);
        out.println("<HTML>");
        out.println("<HEAD><TITLE>UploadTest</TITLE></HEAD>");
        out.println("<BODY>");
        out.println("<H1>UploadTest</H1>");

        // Print the parameters we received
        out.println("<H3>Params:</H3>");
        out.println("<PRE>");
        Enumeration params = multi.getParameterNames();
        while (params.hasMoreElements()) {
            String name = (String) params.nextElement();
            String value = multi.getParameter(name);
            out.println(name + " = " + value);
        }
        out.println("</PRE>");

        // Show which files we received
        out.println("<H3>Files:</H3>");
        out.println("<PRE>");
        Enumeration files = multi.getFileNames();
        while (files.hasMoreElements()) {
            String name = (String) files.nextElement();
            String filename = multi.getFilesystemName(name);
            String type = multi.getContentType(name);
            File f = multi.getFile(name);
            out.println("name: " + name);
            out.println("filename: " + filename);
            out.println("type: " + type);
            if (f != null) {
                out.println("length: " + f.length());
                out.println();
            }
        }
        out.println("</PRE>");
    } catch (Exception e) {
        // Surface the exception instead of swallowing it silently.
        out.println("<PRE>");
        out.println(e.getMessage());
        out.println("</PRE>");
    }
    out.println("</BODY></HTML>");
    %>

    You have not understood my point.
    How will this code run as a servlet when I want to upload a file from the client's
    machine to the server machine?
    What I am doing is giving the user an option to browse for and select any file, and finally the form's action posts to the JSP for which I have sent the code.
    All the computers are connected over a LAN.
    So how do I upload a file from the client's machine to the server's machine?
    Please give me a solution.

  • How to upload file from client machine to  server machine?

    I am developing a web application. I have one HTML file with a browse option. The client can browse any type of file, and whatever file the client browses should be stored on the server machine. For storing the file I want to use a servlet; my HTML form is of multipart type.
    Can anyone send me the servlet code? I am using Tomcat 5.5 as the web server.

    [http://commons.apache.org/fileupload]
    Start reading 'User Guide' and 'Frequently Asked Questions'.
    Good luck :)
