Manage Data Growth on a File Server

Hello Experts,
 We have a file server on which one of the data volumes is growing rapidly and the disk is filling up. We know there is some old data that is not used at all.
My question is:
How can we find out which data is not used, so we can have the customer delete it and free up some space?
Do we have any tool to achieve this?
Thanks,
-Prashant Girennavar.
MCSA|MCITP SA|Microsoft Exchange 2003 Blog - http://prashant1987.wordpress.com Disclaimer: This posting is provided AS-IS with no warranties/guarantees and confers no rights.

Hi Prashant, 
AFAIK, the third-party tool TreeSize Professional will give you detailed folder consumption. 
Please check each folder's consumption and cross-check with the customer before deleting anything.
If the customer does not allow you to delete any folder/content, go for a disk expansion or move folders to another share location, as suggested by
Shaon Shan.
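As a quick way to see what has not been touched in a long time before involving the customer, here is a hedged PowerShell sketch (PowerShell 3.0 or later; the path, age threshold and output file are placeholders, and since LastAccessTime updates are often disabled on NTFS, LastWriteTime is usually the safer property). FSRM's built-in "Least Recently Accessed Files" storage report covers similar ground.
  # List files on the data volume not written to in the last two years, largest first
  Get-ChildItem -Path 'D:\Data' -Recurse -File -ErrorAction SilentlyContinue |
      Where-Object { $_.LastWriteTime -lt (Get-Date).AddYears(-2) } |
      Sort-Object Length -Descending |
      Select-Object FullName, Length, LastWriteTime, LastAccessTime |
      Export-Csv 'C:\Temp\stale-files.csv' -NoTypeInformation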
Regards, Ravikumar P

Similar Messages

  • Unable to add File Server Resource Manager Tools on Windows Server 2012 - Errors on restart and roll back install

    Unable to install the Windows Server 2012 feature [Tools] File Server Resource Manager Tools.
    It installs, but when I restart the server an error message appears saying the feature was unable to install and Windows is reverting the changes.
    The Setup event log has the following information message: "Update FSRM-Infrastructure of package FSRM-All failed to be turned on. Status: 0x800f0922"
    Does anyone have any ideas on why this feature cannot be installed?
    Scott

    Hi Shaun
    Tried both of your suggestions, however neither strategy worked.
    Strategy 1
    Tried installing via powershell - "install-windowsfeature -name fs-resource-manager -includemanagementtools"   
    The feature uninstalled itself during the restart.
    Attempted to use the command "DISM /online /remove-feature /featurename:FSRM-Infrastructure-Services". However,
    this did not work, either because the service wasn't installed or because there is no "/remove-feature" command option:
    PACKAGE SERVICING COMMANDS:
      /Add-Package            - Adds packages to the image.
      /Remove-Package         - Removes packages from the image.
      /Enable-Feature         - Enables a specific feature in the image.
      /Disable-Feature        - Disables a specific feature in the image.
      /Get-Packages           - Displays information about all packages in the image.
      /Get-PackageInfo        - Displays information about a specific package.
      /Get-Features           - Displays information about all features in a package.
      /Get-FeatureInfo        - Displays information about a specific feature.
      /Cleanup-Image          - Performs cleanup and recovery operations on the image.
    Strategy 2
    Tried installing from the command line, using the following DISM command:
    DISM /online /enable-feature /featurename:FSRM-Infrastructure-Services
    However, I got the same result; the install backed out during the restart. So I'm back to where I started.
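    A hedged sketch of what could be tried next from an elevated command prompt (per the DISM help output above, the valid options are /Disable-Feature and /Enable-Feature, not /remove-feature); the log search at the end is just one way to dig out the detail behind the 0x800f0922 status:
      DISM /Online /Get-FeatureInfo /FeatureName:FSRM-Infrastructure-Services
      DISM /Online /Disable-Feature /FeatureName:FSRM-Infrastructure-Services
      DISM /Online /Enable-Feature /FeatureName:FSRM-Infrastructure-Services /All
      REM If the feature still rolls back on restart, CBS.log usually records why:
      findstr /i /c:"FSRM" %windir%\Logs\CBS\CBS.log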

  • File Server Resource Manager 2012 - Fails to generate storage report - Event ID: 8242 and 602

    Installed the File Server Resource Manager role on a new 2012 file server. When I attempt to run a duplicate-files report on the local volume, I receive an error message: "The report generation task failed with the following errors: Error generating report
    job with task name ''."
    Event ID 8242 and 602 are logged in the event viewer.
    Log Name:      Application
    Source:        SRMSVC
    Date:          6/24/2013 11:11:03 AM
    Event ID:      8242
    Task Category: None
    Level:         Error
    Keywords:      Classic
    User:          N/A
    Computer:      xxxxxxxxxxxxxxxxx
    Description:
    Reporting or classification consumer '' has failed.
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="SRMSVC" />
        <EventID Qualifiers="32772">8242</EventID>
        <Level>2</Level>
        <Task>0</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2013-06-24T16:11:03.000000000Z" />
        <EventRecordID>1276</EventRecordID>
        <Channel>Application</Channel>
        <Computer>xxxxxxxxxx</Computer>
        <Security />
      </System>
      <EventData>
        <Data>
        </Data>
        <Data>
    Error-specific details:
       Error: (0x80131501) Unknown error</Data>
        <Binary>2D20436F64653A20434E534D4D4F444330303030303234332D2043616C6C3A20434E534D4D4F444330303030303231322D205049443A202030303030333036302D205449443A202030303030333734382D20434D443A2020433A5C57696E646F77735C73797374656D33325C73726D686F73742E657865202D20557365723A204E616D653A204E5420415554484F524954595C53595354454D2C205349443A532D312D352D313820</Binary>
      </EventData>
    </Event>
    Log Name:      Application
    Source:        SRMREPORTS
    Date:          6/24/2013 11:11:03 AM
    Event ID:      602
    Task Category: None
    Level:         Error
    Keywords:      Classic
    User:          N/A
    Computer:      xxxxxxxxxxxxxxxxxxxx
    Description:
    Error generating report job with the task name ''.
    Context:
     - Exception encountered = System error.
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="SRMREPORTS" />
        <EventID Qualifiers="0">602</EventID>
        <Level>2</Level>
        <Task>0</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2013-06-24T16:11:03.000000000Z" />
        <EventRecordID>1277</EventRecordID>
        <Channel>Application</Channel>
        <Computer>xxxxxx</Computer>
        <Security />
      </System>
      <EventData>
        <Data>Error generating report job with the task name ''.
    Context:
     - Exception encountered = System error.
    </Data>
      </EventData>
    </Event>
    When I click on schedule a new report task, I get an error "Class not registered".
    nada

    Hi,
    When we schedule a new job, we add a scheduled task to the c:\windows\tasks folder.
    The scheduled task will contain the following command line
    "c:\WINDOWS\system32\storrept.exe reports generate /scheduled /Task:"FSRM_Report_Task{GUID.......}"
    There is also a folder on the system drive
    C:\StorageReports\Scheduled
    We also store information in the System Volume Information folder, in the following files:
    c:\System Volume Information\SRM\Settings\ReportSettings.xml (we use .old and .alt extensions)
    c:\System Volume Information\SRM\reports\reportX.xml (where X is an incrementing number; when writing to these files we also use .old and .alt extensions)
    When experiencing issues relating to scheduled report jobs, examine these files and also check for NTFS permissions issues on these locations.
    Make sure you check the volume that you will be running the report on.
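    A hedged sketch of commands for checking those locations (run from an elevated prompt on the file server):
      schtasks /Query /FO LIST /V | findstr /I "FSRM_Report_Task"
      icacls "C:\StorageReports\Scheduled"
      REM System Volume Information is normally accessible to SYSTEM only; expect access denied
      REM unless you take ownership or run as SYSTEM (e.g. Sysinternals psexec -s):
      icacls "C:\System Volume Information\SRM"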

  • File Server Resource Manager will not load WMI Objects on Windows 8.0/8.1 Preview with Hyper-V and Server Tools Loaded

    Hi Folks,
    I have a problem getting "File Server Resource Manager" to start properly because WMI objects are not loading. I don't understand this at all, because I am able to access them with other apps, etc. I have been over my Services list as well but
    have not yet discovered which service turns "File Server Resource Manager" on or off. I have looked in Windows\System32\ and I cannot find SrmSvc; I found other Srm files there, though. I also found the FSRM snap-in, which generates the figure below, but I
    cannot find it in the WMI Browser. I believe the firewall is okay, since I have been over that thoroughly. I do have the Server Tools installed for Windows 8.0 / Windows 8.1 clients, as well as the Windows Updates applied, which is where I got "File Server
    Resource Manager" in the first place. I have had WMI and the PowerShell 3.0 package installed since Windows 7, but I ran a repair on them anyway.
    I have read through "File Server Resource Manager could not load WMI objects on Windows Server 2012,
    Article ID: 2831687", but since I don't seem to have SrmSvc that solution doesn't help me. I haven't been able to find a similar article for the Windows client.
    If there is anything you could share on this problem I would be much obliged.
    Thanks again,
    Crysta
    PhotM Phantom of the Mobile

    Hello,
    The Windows Desktop Perfmon and Diagnostic tools forum is to discuss performance monitor (perfmon), resource monitor (resmon), and task manager, focusing on HOW-TO, Errors/Problems, and usage scenarios.
    As the question is off topic here, I am moving it to the
    Where is the Forum... forum.
    Karl

  • Allow help desk to manage open files on file server

    I am looking to delegate the ability to manage open files to our help desk users. They are getting an increasing number of calls from users asking about files and who has them open, or asking to force-close them, etc.
    The help desk users are not admins on our file server and therefore do not have access to RDP to the file server. I was hoping they could do it from the Computer Management / RSAT tools on their local machines; I just don't know how to allow them to do
    it.
    Thanks
    sb

    Hello,
    Since they are not able to RDP to the file server, they will need to access the files through shared folders.
    For that, you will need to share the root folder where your files are, and grant Full Control on the share. Then manage the effective permissions with NTFS permissions.
    Note that NTFS and share permissions are combined, and the user ends up with the more restrictive of the two when accessing the folder through the share. For that reason I recommend Full Control on the shared folder, so that NTFS permissions alone control access and you avoid additional management tasks.
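    If the goal is specifically to let the help desk list and force-close open files without RDP access, a hedged sketch using the SmbShare cmdlets (Windows Server 2012 and later) over a CIM session is below; the server name, user and file name are placeholders, and the help desk accounts would still need to be granted the right to connect remotely (for example through a delegated remoting endpoint):
      # Connect to the file server and list open files matching a name
      $cim = New-CimSession -ComputerName FILESERVER01
      Get-SmbOpenFile -CimSession $cim | Where-Object Path -like '*Budget.xlsx*'
      # Force-close the handles held by a specific user once the owner is confirmed
      Get-SmbOpenFile -CimSession $cim -ClientUserName 'CONTOSO\jdoe' | Close-SmbOpenFile -Force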

  • Data Protection Manager 2012 - Inconsistent when backing up Deduplicated File Server

    Protected Server
    Server 2012 File Server with Deduplication running on Data drive
    DPM Server
    Server 2012
    Data Protection Manager 2012 Service Pack 1
    We just recently upgraded our DPM server from DPM 2010 to DPM 2012, primarily because it is supposed to support Data Deduplication. Our primary file server, which holds our home directories etc., is limited on space and was quickly running low, so just after
    we got DPM 2012 in place we optimized the drive on the file server, which compressed the data by about 50%. Unfortunately, shortly after enabling deduplication, the protected shares on the deduplicated volume started getting a "Replica is inconsistent" error.
    I continually get "Replica is inconsistent" for the server that has deduplication running on it; all of the other protected servers are being protected as they should be. I have run a consistency check multiple times, probably about 10 times, and it keeps going
    back to "Replica is inconsistent". The replica volume shows that it is using 3.5 TB, and the actual protected volume is 4 TB in size with about 2.5 TB of data on it with deduplication enabled.
    This is the details of the error
    Affected area:   G:\
    Occurred since: 1/12/2015 4:55:14 PM
    Description:        The replica of Volume G:\ on E****.net is inconsistent with the protected data source. All protection activities for data source will fail until the replica is synchronized with
    consistency check. You can recover data from existing recovery points, but new recovery points cannot be created until the replica is consistent.
    For SharePoint farm, recovery points will continue getting created with the databases that are consistent. To backup inconsistent databases, run a consistency check on the farm. (ID 3106)
    More information
    Recommended action: 
    Synchronize with consistency check.
    Run a synchronization job with consistency check...
    Resolution:        
    To dismiss the alert, click below
    Inactivate
    Steps taken to resolve: I've spent some time searching and haven't found any solutions to what I am seeing. I have the Data Deduplication role installed on the DPM server, which has been the solution for many people seeing similar issues. I have
    also removed that role and then added it back. I have also removed the protected server and added it back to the protection group; it synchronizes and shows consistent, then after a few hours it goes back to inconsistent. When I go to recovery it shows that I
    have recovery points, and it appears that I can restore, but because the data is inconsistent I don't feel I can trust the data in the recovery points. Both the protected server's and the DPM server's updates are managed via a WSUS server on our network.
    You may suggest I just un-optimize the drive on the protected server; however, after a drive has been optimized it takes considerably more space to un-optimize it (anyone know why that is?), and in any case the drive isn't large enough to support un-optimization.
    If anyone has any suggestions I would appreciate any help. Thanks in advance.

    Ok I ran a consistency check and it completed successfully with the following message. However after a few minutes of it showing OK it now shows Replica is Inconsistent again.
    Type: Consistency check
    Status: Completed
    Description: The job completed successfully with the following warning:
     An unexpected error occurred while the job was running. (ID 104 Details: Cannot create a file when that file already exists (0x800700B7))
     More information
    End time: 2/3/2015 11:19:38 AM
    Start time: 2/3/2015 10:34:35 AM
    Time elapsed: 00:45:02
    Data transferred: 220.74 MB
    Cluster node -
    Source details: G:\
    Protection group members: 35
     Details
    Protection group: E*
    Items scanned: 2017709
    Items fixed: 653
    There was a log for a failed synchronization job from yesterday here are the details of that.
    Type: Synchronization
    Status: Failed
    Description: The replica of Volume G:\ on E*.net is not consistent with the protected data source. (ID 91)
     More information
    End time: 2/2/2015 10:04:01 PM
    Start time: 2/2/2015 10:04:01 PM
    Time elapsed: 00:00:00
    Data transferred: 0 MB
    Cluster node -
    Source details: G:\
    Protection group members: 35
     Details
    Protection group: E*
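    One hedged first check on the protected (deduplicated) server itself, assuming the Deduplication PowerShell module is present, is whether an optimization or garbage-collection job overlaps the DPM synchronization window, and what the volume's dedup settings look like:
      # Run on the protected file server, not the DPM server; G: is the volume from the alert above
      Get-DedupStatus -Volume G: | Format-List *
      Get-DedupJob                                    # jobs currently queued or running
      Get-DedupVolume -Volume G: | Format-List MinimumFileAgeDays, ExcludeFolder, SavedSpace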

  • File Server Resource Manager not sending emails

    Appears that the SMTP settings are not being set correctly via FSRM.  Using 2012 R2
    I get the below error in the event log when I try to configure them (via options/email notifications tab).  If I click the 'send test email' button I get the test email which tells me my SMTP server is fine.  Seems like I'm getting an access
    denied message whenever I try to save the SMTP settings.  I already tried restarting the server as well as running the FSRM console using the 'run as admin' command.
    Source:  SRMSVC
    EventID:  16401
    The following access-denied assistance error configuration was modified:
    Error: 5
    Enabled: FALSE
    Client Display Flags:
    Error Message: This can occur if you don't have permission to access the file or folder, or if your computer doesn't meet security policy requirements.
    Message from the administrator of the file server:
    - Ask your manager if you're in the right security groups
    - For troubleshooting information, go to <a href="http://support.microsoft.com">Microsoft Support</a>
    If you need more help, click Request assistance.
    Email Flags: Put data owner on TO line, Put administrator on TO line, Include device claims, Include user information, Generate an event log when sending mail
    Additional To Emails:
    Email Message: For general support, contact: [Provide email address]
    For share permissions support, contact: [Provide email address]

    Hi,
    Please try assigning the "Send As" permission to the user account via adsiedit.msc.
    Is the SMTP server hosted on an Exchange server? If so, I suggest you ask for help in the Exchange forums for a better and more accurate answer to the question.
    http://social.technet.microsoft.com/Forums/exchange/en-US/home?forum=exchangesvrdeploylegacy
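    If the console keeps returning access denied when saving, it may also be worth setting the SMTP options from an elevated PowerShell prompt; a minimal sketch using the FSRM cmdlets on 2012 R2 (server name and addresses are placeholders):
      Set-FsrmSetting -SmtpServer 'smtp.contoso.com' -FromEmailAddress 'fsrm@contoso.com' -AdminEmailAddress 'fsadmin@contoso.com'
      Send-FsrmTestEmail -ToEmailAddress 'fsadmin@contoso.com'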
    Regards,
    Mandy

  • File server resource manager. Block file types based on size

    I have a site where our users appear to be saving some pretty huge TV-quality video files (usually .MOV, and usually over 900 MB in size) to our file storage server. The
    concern I have is that such large files are going to fill up our server and we will run out of space.
    I don't want a blanket block on smaller (less than 30 MB) .MOV video files being saved
    to the drive, so I would like to know if there's any feature in File Server Resource Manager that can block a file type based on size.
    If it's not possible, then could someone please suggest this as a future feature of FSRM, as I'm sure there are plenty of companies out there who face the same problem as me.
    thanks,
    Mike Geileskey
    Infrastructure manager, PIAS UK Limited 

    Hi,
    FSRM cannot block file types based on size. As a workaround you could use disk quotas on NTFS volumes, which are tracked per user. 
    To apply disk quotas to existing volume users, add new quota entries in the Quota Entries window. Please understand that it will take you some time to add all users to the list. 
    Disk quotas allow administrators to control the amount of data that each user can store on an NTFS file system volume.
    For more detailed information about disk quotas on NTFS Volumes, you could refer to the article below:
    File Systems
    http://technet.microsoft.com/en-us/library/cc938945.aspx
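    Per-user quota entries can also be scripted instead of added one by one in the Quota Entries window; a hedged sketch from an elevated prompt (volume, sizes and account are placeholders, and the threshold/limit values are in bytes):
      fsutil quota enforce D:
      fsutil quota modify D: 4294967296 5368709120 CONTOSO\jsmith
      fsutil quota query D: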
    Best Regards,
    Mandy 

  • Simple file server accessible remotely with managed access. Do I need ML Server for this?

    Hello,
    I have a Mac Mini that will be dedicated to serving 15 folders of documents to 7 people. It would be great if each person had their own password, and I'd like to be able to decide which folders each user has access to. The people need to be able to access the files from home and on the office network.
    Do I NEED to run OS X Server for this, or can I accomplish this in OS X?
    I have to get this running quickly, and I may not have time for the ML Server learning curve (even though it has been simplified).
    I tried to get ML Server running on my machine a few weeks ago but got stuck. If setting up ML Server with JUST the file server is dramatically easier, I will try again. Can anyone please suggest a tutorial that takes me through simply setting up a remotely accessible file server with managed access with ML Server?
    V

    OS X client can serve files to remote clients via both SMB/CIFS and AFP, i.e. via the Windows and OS X file services.  That's cheap, uses hardware you already have, and works fine.
    Most NAS boxes don't do distributed authentication.  Typically, you have credentials for the box at most.  Some of the mid- and upper-end boxes do offer distributed authentication, but that means having that authentication around.  At the low end, an Apple Time Capsule is a reasonable NAS box, and you can add an external disk.   And can be used for backups via Time Machine, too.  The mid- and upper-end boxes from Synology have a reputation for capabilities and flexibility.  There are (many) other vendors.
    I'm not a huge fan of LogMeIn for various reasons that I won't get into here, but that service does work for accessing hosts.  I don't know if that allows access to NAS directly, but I'd tend to doubt it.  You'd need to check with both LogMeIn and with the specs for whatever NAS box you're using.  
    Given the choice, I'd use a VPN.
    Using a VPN does mean you can control — at the VPN level — who can access your private network, so that can provide a broad-brush form of access control to your NAS device or your OS X client or your OS X Server box, if you go that route.
    I don't prefer to openly serve files to the internet, as the underlying protocols have occasionally had security issues and vulnerabilities, and the internet gremlins will find and will poke at any open ports and any accessible file servers.  I prefer to configure these services via VPN.
    VPNs are also more involved to set up, where LogMeIn can be simple.
    As mentioned previously, I'm also not a huge fan of the host-based VPN servers in OS X, though those do work.  The gateway boxes I've been using in the last year or so are probably not a good choice for a user that isn't familiar with networking  — the boxes provide a user interface that very definitely expect the user to understand IP and routing and related, but is both self-consistent and quite powerful — and they're cheap for what they can do, and they do work nicely.  ZyXEL ZyWALL USG series.  If you are evaluating any of these firewall boxes, then I'd definitely encourage downloading the manuals and making sure you can understand the available information.  The server-grade firewall boxes are almost inherently flexible and thus complex devices.
    One of the easiest ways is to work with somebody that does this sort of thing to sort through the options and requirements and trade-offs available here, and potentially to set up your VPN or NAS or server configuration for you.  (Disclosure: I offer this.)

  • How to use quota on a desktop folder for all users with File Server Resource Manager.

    Hi there,
    I'd like to know if there is a possibility of using variables in the path with File Server Resource Manager when you set a quota.
    The path I'd like to use is d:\home\%username%\desktop.
    I'd like to hear from you all if there is a solution or workaround for this one :)
    Thanks for your time.
    Ben.
    Ben van der Meer

    That's a question about Automator, so I'd pop that one here
    http://discussions.apple.com/forum.jspa?forumID=1339
    Regards
    TD
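    On the FSRM side of the question: FSRM quota paths do not expand environment variables such as %username%, but an auto-apply quota places a quota on every existing and future subfolder of a given path. A hedged sketch with the FSRM cmdlets (Windows Server 2012 or later; the template name is one of the built-in defaults), noting that this quotas each user's whole home folder rather than only its desktop subfolder:
      New-FsrmAutoQuota -Path 'D:\home' -Template '200 MB Limit Reports to User'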

  • Configure EP6SP9 KM in a file server instead of KM Database

    Hi All,
    We are at EP6SP9 in Windows 2003.
    As a part of my user requirment, I need to set up all my KM folders in a file repository and not in KM Database. The users would view this folders as KM Folders and need to utilise all the facilities offered by KM like Versioning, subscription and notification.
    I need to also integrate the central ADS Server as the user management engine (Windows Authentication) so that all the users automatically logon to the portal when they log-on to the system.TREX also has to index through this documents in the file server.
    Need some advice on the same.
    1) I have seen a document entitled "Integration of Windows File Services into SAP KM Platform using SSO and WebDAV Repository Manager". Is this the one that I have to use for setting the system up or any other suggestions. Has anyone tried connecting KM to an file server Database.
    2) Can I utilise all the KM Functionalities with the documents and folders residing in a file server.
    Would love to interact with anyone who has worked on the same.
    Regards,
    Rajan.K

    There's already lots of information on the subject right here on SDN. Here are a few pointers to get you started:
    CM repository documentation in SAP Help:
    http://help.sap.com/saphelp_nw04/helpdata/en/62/468698a8e611d5993600508b6b8b11/content.htm
    Weblog with step-by-step instructions on creating an FSDB repository:
    Creating a CM Repository Manager - FSDB
    I basically just followed the weblog and it worked. In fact, some steps in the weblog are not necessary if you intend to use the default "documents" repository. In that case, just switch to FSDB persistence mode, add your network paths and it should work after restarting the engine.
    Note that the contents of your repository will be deleted once you switch, unless you back up your files to the FSDB root prior to that.
    Hope this helps.

  • Changed laptops, now iTunes can't find my music on my file server

    Ok, here's what happened when I recently switched laptops
    - I use iTunes to point to the music I store on my file server via a network share (WinXP)
    - I manually manage my iTunes folders
    - I took the \My Documents\My Music\iTunes directory from my old laptop and copied it to the same location to my new laptop
    - Installed iTunes 9 on the new laptop
    - When iTunes9 loaded on the new laptop ALMOST all music files in my library loaded with an exclamation point indicating that it couldn't locate the file (even though the file locations never changed)
    Ok... so here's the problem. I have over 20,000 songs with ratings information and if I just re-add all the files from the network share to the iTunes library again then it looks like I'd lose those ratings.
    The only way that I can see to fix things is to individually point iTunes to the network share location of the file for each file... YUCK!
    Is there a script or something I can write to tell it to find these files, or at least copy the ID3 tags to the duplicated songs that I re-added from the network share.
    Any help is GREATLY appreciated!

    Something is different in the paths to the files... hence iTunes cannot find them.
    Sounds like you know about the ITL file, which is why you copied it over.
    What you can do is look in its companion XML file to see the paths iTunes thinks it should be using.
    If your XML is really big, open it using WordPad instead of Notepad. About the 10th line down is the iTunes preference setting for the iTunes folder:
    <key>Music Folder</key><string>file://localhost/K:/iTunes%20Music/</string>
    On my system it's K:/iTunes Music
    The %20 just indicates a space.
    Then each song will have its own path
    <key>Location</key><string>file://localhost/K:/iTunes%20Music/Cowboy%20Junkies/The%20Trinity%20Session/07%20200%20More%20Miles.mp3</string>
    Now... can you see ANYTHING funny in one of those song paths? Is there an extra space, a different user name, anything different from what you could browse to in Windows Explorer?
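    If checking 20,000 paths by hand isn't practical, here is a hedged PowerShell sketch that reports library entries whose file no longer resolves (the library path is a placeholder, and it assumes local drive-letter paths like the K: example above; UNC paths would need a small tweak):
      Add-Type -AssemblyName System.Web
      $xmlPath = "$env:USERPROFILE\Music\iTunes\iTunes Music Library.xml"
      Select-String -Path $xmlPath -Pattern '<key>Location</key><string>file://localhost/([^<]+)</string>' |
          ForEach-Object {
              $p = [System.Web.HttpUtility]::UrlDecode($_.Matches[0].Groups[1].Value) -replace '/', '\'
              if (-not (Test-Path -LiteralPath $p)) { $p }   # print the paths iTunes can no longer find
          }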

  • Create a new web application, how shall I update the file server.xml

    Hi,
    I will create a new web application named newApp. Then I create a file structure as follows:
    - <server-root>/newApp
    - <server-root>/newApp/WEB-INF
    - <server-root>/newApp/WEB-INF/classes
    Then I must tell the server that I have created a new web application, which means updating my server.xml file. How shall I do this, and where in the file shall I type in the new information?
    I use Windows XP Pro and Tomcat 4.1.27.
    My server.xml file looks like below:
    <!-- Example Server Configuration File -->
    <!-- Note that component elements are nested corresponding to their
    parent-child relationships with each other -->
    <!-- A "Server" is a singleton element that represents the entire JVM,
    which may contain one or more "Service" instances. The Server
    listens for a shutdown command on the indicated port.
    Note: A "Server" is not itself a "Container", so you may not
    define subcomponents such as "Valves" or "Loggers" at this level.
    -->
    <Server port="8005" shutdown="SHUTDOWN" debug="0">
    <!-- Comment these entries out to disable JMX MBeans support -->
    <!-- You may also configure custom components (e.g. Valves/Realms) by
    including your own mbean-descriptor file(s), and setting the
    "descriptors" attribute to point to a ';' seperated list of paths
    (in the ClassLoader sense) of files to add to the default list.
    e.g. descriptors="/com/myfirm/mypackage/mbean-descriptor.xml"
    -->
    <Listener className="org.apache.catalina.mbeans.ServerLifecycleListener"
    debug="0"/>
    <Listener className="org.apache.catalina.mbeans.GlobalResourcesLifecycleListener"
    debug="0"/>
    <!-- Global JNDI resources -->
    <GlobalNamingResources>
    <!-- Test entry for demonstration purposes -->
    <Environment name="simpleValue" type="java.lang.Integer" value="30"/>
    <!-- Editable user database that can also be used by
    UserDatabaseRealm to authenticate users -->
    <Resource name="UserDatabase" auth="Container"
    type="org.apache.catalina.UserDatabase"
    description="User database that can be updated and saved">
    </Resource>
    <ResourceParams name="UserDatabase">
    <parameter>
    <name>factory</name>
    <value>org.apache.catalina.users.MemoryUserDatabaseFactory</value>
    </parameter>
    <parameter>
    <name>pathname</name>
    <value>conf/tomcat-users.xml</value>
    </parameter>
    </ResourceParams>
    </GlobalNamingResources>
    <!-- A "Service" is a collection of one or more "Connectors" that share
    a single "Container" (and therefore the web applications visible
    within that Container). Normally, that Container is an "Engine",
    but this is not required.
    Note: A "Service" is not itself a "Container", so you may not
    define subcomponents such as "Valves" or "Loggers" at this level.
    -->
    <!-- Define the Tomcat Stand-Alone Service -->
    <Service name="Tomcat-Standalone">
    <!-- A "Connector" represents an endpoint by which requests are received
    and responses are returned. Each Connector passes requests on to the
    associated "Container" (normally an Engine) for processing.
    By default, a non-SSL HTTP/1.1 Connector is established on port 8080.
    You can also enable an SSL HTTP/1.1 Connector on port 8443 by
    following the instructions below and uncommenting the second Connector
    entry. SSL support requires the following steps (see the SSL Config
    HOWTO in the Tomcat 4.0 documentation bundle for more detailed
    instructions):
    * Download and install JSSE 1.0.2 or later, and put the JAR files
    into "$JAVA_HOME/jre/lib/ext".
    * Execute:
    %JAVA_HOME%\bin\keytool -genkey -alias tomcat -keyalg RSA (Windows)
    $JAVA_HOME/bin/keytool -genkey -alias tomcat -keyalg RSA (Unix)
    with a password value of "changeit" for both the certificate and
    the keystore itself.
    By default, DNS lookups are enabled when a web application calls
    request.getRemoteHost(). This can have an adverse impact on
    performance, so you can disable it by setting the
    "enableLookups" attribute to "false". When DNS lookups are disabled,
    request.getRemoteHost() will return the String version of the
    IP address of the remote client.
    -->
    <!-- Define a non-SSL Coyote HTTP/1.1 Connector on port 8080 -->
    <Connector className="org.apache.coyote.tomcat4.CoyoteConnector"
    port="8080" minProcessors="5" maxProcessors="75"
    enableLookups="true" redirectPort="8443"
    acceptCount="100" debug="0" connectionTimeout="20000"
    useURIValidationHack="false" disableUploadTimeout="true" />
    <!-- Note : To disable connection timeouts, set connectionTimeout value
    to -1 -->
    <!-- Define a SSL Coyote HTTP/1.1 Connector on port 8443 -->
    <!--
    <Connector className="org.apache.coyote.tomcat4.CoyoteConnector"
    port="8443" minProcessors="5" maxProcessors="75"
    enableLookups="true"
    acceptCount="100" debug="0" scheme="https" secure="true"
    useURIValidationHack="false" disableUploadTimeout="true">
    <Factory className="org.apache.coyote.tomcat4.CoyoteServerSocketFactory"
    clientAuth="false" protocol="TLS" />
    </Connector>
    -->
    <!-- Define a Coyote/JK2 AJP 1.3 Connector on port 8009 -->
    <Connector className="org.apache.coyote.tomcat4.CoyoteConnector"
    port="8009" minProcessors="5" maxProcessors="75"
    enableLookups="true" redirectPort="8443"
    acceptCount="10" debug="0" connectionTimeout="0"
    useURIValidationHack="false"
    protocolHandlerClassName="org.apache.jk.server.JkCoyoteHandler"/>
    <!-- Define an AJP 1.3 Connector on port 8009 -->
    <!--
    <Connector className="org.apache.ajp.tomcat4.Ajp13Connector"
    port="8009" minProcessors="5" maxProcessors="75"
    acceptCount="10" debug="0"/>
    -->
    <!-- Define a Proxied HTTP/1.1 Connector on port 8082 -->
    <!-- See proxy documentation for more information about using this. -->
    <!--
    <Connector className="org.apache.coyote.tomcat4.CoyoteConnector"
    port="8082" minProcessors="5" maxProcessors="75"
    enableLookups="true"
    acceptCount="100" debug="0" connectionTimeout="20000"
    proxyPort="80" useURIValidationHack="false"
    disableUploadTimeout="true" />
    -->
    <!-- Define a non-SSL legacy HTTP/1.1 Test Connector on port 8083 -->
    <!--
    <Connector className="org.apache.catalina.connector.http.HttpConnector"
    port="8083" minProcessors="5" maxProcessors="75"
    enableLookups="true" redirectPort="8443"
    acceptCount="10" debug="0" />
    -->
    <!-- Define a non-SSL HTTP/1.0 Test Connector on port 8084 -->
    <!--
    <Connector className="org.apache.catalina.connector.http10.HttpConnector"
    port="8084" minProcessors="5" maxProcessors="75"
    enableLookups="true" redirectPort="8443"
    acceptCount="10" debug="0" />
    -->
    <!-- An Engine represents the entry point (within Catalina) that processes
    every request. The Engine implementation for Tomcat stand alone
    analyzes the HTTP headers included with the request, and passes them
    on to the appropriate Host (virtual host). -->
    <!-- You should set jvmRoute to support load-balancing via JK/JK2 ie :
    <Engine name="Standalone" defaultHost="localhost" debug="0" jmvRoute="jvm1">
    -->
    <!-- Define the top level container in our container hierarchy -->
    <Engine name="Standalone" defaultHost="localhost" debug="0">
    <!-- The request dumper valve dumps useful debugging information about
    the request headers and cookies that were received, and the response
    headers and cookies that were sent, for all requests received by
    this instance of Tomcat. If you care only about requests to a
    particular virtual host, or a particular application, nest this
    element inside the corresponding <Host> or <Context> entry instead.
    For a similar mechanism that is portable to all Servlet 2.3
    containers, check out the "RequestDumperFilter" Filter in the
    example application (the source for this filter may be found in
    "$CATALINA_HOME/webapps/examples/WEB-INF/classes/filters").
    Request dumping is disabled by default. Uncomment the following
    element to enable it. -->
    <!--
    <Valve className="org.apache.catalina.valves.RequestDumperValve"/>
    -->
    <!-- Global logger unless overridden at lower levels -->
    <Logger className="org.apache.catalina.logger.FileLogger"
    prefix="catalina_log." suffix=".txt"
    timestamp="true"/>
    <!-- Because this Realm is here, an instance will be shared globally -->
    <!-- This Realm uses the UserDatabase configured in the global JNDI
    resources under the key "UserDatabase". Any edits
    that are performed against this UserDatabase are immediately
    available for use by the Realm. -->
    <Realm className="org.apache.catalina.realm.UserDatabaseRealm"
    debug="0" resourceName="UserDatabase"/>
    <!-- Comment out the old realm but leave here for now in case we
    need to go back quickly -->
    <!--
    <Realm className="org.apache.catalina.realm.MemoryRealm" />
    -->
    <!-- Replace the above Realm with one of the following to get a Realm
    stored in a database and accessed via JDBC -->
    <!--
    <Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
    driverName="org.gjt.mm.mysql.Driver"
    connectionURL="jdbc:mysql://localhost/authority"
    connectionName="test" connectionPassword="test"
    userTable="users" userNameCol="user_name" userCredCol="user_pass"
    userRoleTable="user_roles" roleNameCol="role_name" />
    -->
    <!--
    <Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
    driverName="oracle.jdbc.driver.OracleDriver"
    connectionURL="jdbc:oracle:thin:@ntserver:1521:ORCL"
    connectionName="scott" connectionPassword="tiger"
    userTable="users" userNameCol="user_name" userCredCol="user_pass"
    userRoleTable="user_roles" roleNameCol="role_name" />
    -->
    <!--
    <Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
    driverName="sun.jdbc.odbc.JdbcOdbcDriver"
    connectionURL="jdbc:odbc:CATALINA"
    userTable="users" userNameCol="user_name" userCredCol="user_pass"
    userRoleTable="user_roles" roleNameCol="role_name" />
    -->
    <!-- Define the default virtual host -->
    <Host name="localhost" debug="0" appBase="webapps"
    unpackWARs="true" autoDeploy="true">
    <!-- Normally, users must authenticate themselves to each web app
    individually. Uncomment the following entry if you would like
    a user to be authenticated the first time they encounter a
    resource protected by a security constraint, and then have that
    user identity maintained across all web applications contained
    in this virtual host. -->
    <!--
    <Valve className="org.apache.catalina.authenticator.SingleSignOn"
    debug="0"/>
    -->
    <!-- Access log processes all requests for this virtual host. By
    default, log files are created in the "logs" directory relative to
    $CATALINA_HOME. If you wish, you can specify a different
    directory with the "directory" attribute. Specify either a relative
    (to $CATALINA_HOME) or absolute path to the desired directory.
    -->
    <!--
    <Valve className="org.apache.catalina.valves.AccessLogValve"
    directory="logs" prefix="localhost_access_log." suffix=".txt"
    pattern="common" resolveHosts="false"/>
    -->
    <!-- Logger shared by all Contexts related to this virtual host. By
    default (when using FileLogger), log files are created in the "logs"
    directory relative to $CATALINA_HOME. If you wish, you can specify
    a different directory with the "directory" attribute. Specify either a
    relative (to $CATALINA_HOME) or absolute path to the desired
    directory.-->
    <Logger className="org.apache.catalina.logger.FileLogger"
    directory="logs" prefix="localhost_log." suffix=".txt"
    timestamp="true"/>
    <!-- Define properties for each web application. This is only needed
    if you want to set non-default properties, or have web application
    document roots in places other than the virtual host's appBase
    directory. -->
         <DefaultContext reloadable="true"/>
    <!-- Tomcat Root Context -->
    <Context path="" docBase="ROOT" debug="0"/>
    <!-- Tomcat Examples Context -->
    <Context path="/examples" docBase="examples" debug="0"
    reloadable="true" crossContext="true">
    <Logger className="org.apache.catalina.logger.FileLogger"
    prefix="localhost_examples_log." suffix=".txt"
    timestamp="true"/>
    <Ejb name="ejb/EmplRecord" type="Entity"
    home="com.wombat.empl.EmployeeRecordHome"
    remote="com.wombat.empl.EmployeeRecord"/>
    <!-- If you wanted the examples app to be able to edit the
    user database, you would uncomment the following entry.
    Of course, you would want to enable security on the
    application as well, so this is not done by default!
    The database object could be accessed like this:
    Context initCtx = new InitialContext();
    Context envCtx = (Context) initCtx.lookup("java:comp/env");
    UserDatabase database =
    (UserDatabase) envCtx.lookup("userDatabase");
    -->
    <!--
    <ResourceLink name="userDatabase" global="UserDatabase"
    type="org.apache.catalina.UserDatabase"/>
    -->
    <!-- PersistentManager: Uncomment the section below to test Persistent
    Sessions.
    saveOnRestart: If true, all active sessions will be saved
    to the Store when Catalina is shutdown, regardless of
    other settings. All Sessions found in the Store will be
    loaded on startup. Sessions past their expiration are
    ignored in both cases.
    maxActiveSessions: If 0 or greater, having too many active
    sessions will result in some being swapped out. minIdleSwap
    limits this. -1 or 0 means unlimited sessions are allowed.
    If it is not possible to swap sessions new sessions will
    be rejected.
    This avoids thrashing when the site is highly active.
    minIdleSwap: Sessions must be idle for at least this long
    (in seconds) before they will be swapped out due to
    activity.
    0 means sessions will almost always be swapped out after
    use - this will be noticeably slow for your users.
    maxIdleSwap: Sessions will be swapped out if idle for this
    long (in seconds). If minIdleSwap is higher, then it will
    override this. This isn't exact: it is checked periodically.
    -1 means sessions won't be swapped out for this reason,
    although they may be swapped out for maxActiveSessions.
    If set to >= 0, guarantees that all sessions found in the
    Store will be loaded on startup.
    maxIdleBackup: Sessions will be backed up (saved to the Store,
    but left in active memory) if idle for this long (in seconds),
    and all sessions found in the Store will be loaded on startup.
    If set to -1 sessions will not be backed up, 0 means they
    should be backed up shortly after being used.
    To clear sessions from the Store, set maxActiveSessions, maxIdleSwap,
    and minIdleBackup all to -1, saveOnRestart to false, then restart
    Catalina.
    -->
    <!--
    <Manager className="org.apache.catalina.session.PersistentManager"
    debug="0"
    saveOnRestart="true"
    maxActiveSessions="-1"
    minIdleSwap="-1"
    maxIdleSwap="-1"
    maxIdleBackup="-1">
    <Store className="org.apache.catalina.session.FileStore"/>
    </Manager>
    -->
    <Environment name="maxExemptions" type="java.lang.Integer"
    value="15"/>
    <Parameter name="context.param.name" value="context.param.value"
    override="false"/>
    <Resource name="jdbc/EmployeeAppDb" auth="SERVLET"
    type="javax.sql.DataSource"/>
    <ResourceParams name="jdbc/EmployeeAppDb">
    <parameter><name>username</name><value>sa</value></parameter>
    <parameter><name>password</name><value></value></parameter>
    <parameter><name>driverClassName</name>
    <value>org.hsql.jdbcDriver</value></parameter>
    <parameter><name>url</name>
    <value>jdbc:HypersonicSQL:database</value></parameter>
    </ResourceParams>
    <Resource name="mail/Session" auth="Container"
    type="javax.mail.Session"/>
    <ResourceParams name="mail/Session">
    <parameter>
    <name>mail.smtp.host</name>
    <value>localhost</value>
    </parameter>
    </ResourceParams>
    <ResourceLink name="linkToGlobalResource"
    global="simpleValue"
    type="java.lang.Integer"/>
    </Context>
    </Host>
    </Engine>
    </Service>
    </Server>

    To use servlets you do indeed have to update your web.xml... well, I'm not sure this is relevant to your case anyway.
    You have to add a <servlet> element to this file.
    Something like this:
    <servlet>
    <servlet-name>blabla</servlet-name>
    <servlet-class>blablapackage.Blablaclass</servlet-class>
    <init-param>...</init-param>
    </servlet>
    Now this may not solve your problem. Make sure you refer to your servlets using their fully qualified names. BTW, just to be sure, what is your definition of "servlet"? (I mean: any Java class, or only javax.servlet.Servlet?)
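    For the server.xml part of the question, a minimal sketch (using the names from the question): inside the <Host name="localhost" ...> element, next to the existing ROOT and examples contexts, add a Context entry for the new application and restart Tomcat:
      <!-- New web application; docBase is relative to the Host's appBase ("webapps"),
           so either place newApp under webapps or give docBase an absolute path -->
      <Context path="/newApp" docBase="newApp" debug="0" reloadable="true"/>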

  • File Server Configuration on Windows Server 2012 R2

    Dear All,
    I need your support in my scenario.
    Work Environment: 
    I work in a company where we have multiple departments.
    My Department is elearning.
    All Departments get Internet and Network Access from IT Department.
    My Department Requirement:
    I need to install and configure a file server and assign storage space to 25 users for storing their respective work files.
    Availability:
    I have installed windows 2012 R2 on my physical server which is connected to my local network
    My Question:
    Do I have to make my server a DC to install and configure the file server for my requirement?
    If not, then how can I fulfill my requirement?
    Kindly revert.
    Best Regards,
    Ahmed

    Do you already have an AD domain in your environment? If yes, then simply add your file server as a member server and use your AD user accounts to grant the required access.
    If not, then you can:
    create local user accounts and provide access using them,
    or make your server a DC, create AD accounts and then grant the required access.
    Before proceeding, it would be better to check with your IT department what is already available and what they recommend to manage your needs. A basic setup for the "no domain" case is sketched below.
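    A hedged sketch of that setup from an elevated PowerShell prompt on the 2012 R2 server (folder, share, group and user names are placeholders):
      New-Item -Path 'D:\ElearningShare' -ItemType Directory
      net localgroup "Elearning Users" /add
      net user jsmith * /add                       # prompts for the new user's password
      net localgroup "Elearning Users" jsmith /add
      New-SmbShare -Name 'Elearning' -Path 'D:\ElearningShare' -FullAccess 'Elearning Users'
      icacls 'D:\ElearningShare' /grant 'Elearning Users:(OI)(CI)M'
    Each of the 25 users would get a local account and membership in the group; the NTFS permissions set by the icacls line then control what they can do under the share.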
    Ahmed MALEK

  • File Server/AFP gets slow after a week or so

    Hi all,
    I manage a file server at our office here, with around 20-30 users, running 10.3.9 server. Users connect via AFP for general office duties, but also remote-desktop into a windows server using Rdesktop via X11. They use SMB to access their files from the windows side.
    I find after a week or so that logins to the shares, and general file access gets incredibly slow, and grinds to a halt eventually, but comes good after the server is rebooted.
    It is usually quick to authenticate in any case, but takes an age to bring up the list of shares that the users can pick from (probably 10-15 at most depending on their permissions). Even after a fresh reboot it takes about 10 seconds to bring up the list of shares to pick from. Is there any way to fix this?
    I find that if someone has their computer set to calculate folder sizes on a server drive/share it can make the server grind to a standstill, but even with everyone set up correctly it still slows down all by itself after a few days/weeks of running. Logging in via Remote Desktop becomes slow at this point too.
    I've tried some of the tips in this thread below with only limited success.
    http://discussions.apple.com/thread.jspa?threadID=343979&tstart=15
    Any help is greatly appreciated

    As you upgraded the router's firmware, reset the router for 30 seconds and reconfigure it from scratch.
    On the router setup page, under the Wireless tab, click on Advanced Wireless Settings. Change the Beacon Interval to 75, change the Fragmentation Threshold to 2304, change the RTS Threshold to 2307, and click Save Settings.
