Best Practice to clean up WSUS content no longer needed

I have been researching this for days and probably have about 10 hours invested in trying to come to a conclusion, and now I want some feedback from this group. Below you will find everything I think you might need to know about my current configuration. I originally installed my first WSUS server in 2006. Over the years I added new products as needed, changed to express files (which seems to have been a bad decision), unselected products no longer needed, and, when version 3 came out, began running the cleanup wizard monthly. I have never declined any updates myself and have only approved those showing as needed. I watched my content grow to about 65 GB, then decided to migrate to a new server. I followed this procedure, which worked very well:
http://exchangeserverpro.com/how-to-move-wsus-30-to-a-new-server
After the migration I tested a round of updates and everything was working fine. Then I added the Windows Server 2008 (only one server with this OS) and SQL Server 2008 R2 (only one server with this product) products and was amazed when my content went up to over 100 GB. I decided to remove those two products, since only one server needs them and I could update it directly from Microsoft instead, thinking it would be easy to recover the space. Was I in for a surprise: there is no really easy way to do that. From my research, these seem to be the options for a good cleanup of all unneeded content. What do you suggest as the best way for me to proceed?
Option 1: uninstall and re-install fresh
Option 2: reset the content following this procedure:
http://blogs.technet.com/b/gborger/archive/2009/02/27/what-to-do-when-your-wsuscontent-folder-grows-too-large.aspx
Option 3: reset the content following this procedure: (I tested this on my old server and steps 1-4 took about 90 minutes)
1. Decline all previously approved updates. (Yes, all of them. Even the ones you want.)
2. Run the cleanup wizard -- this will remove files in the WsusContent folder.
3. Change the approval of all DECLINED updates from declined to "Not Approved" (and apply inheritance).
4. Run the cleanup wizard again -- this will decline all expired updates.
5. Re-approve needed updates. Content will be re-downloaded.
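For what it's worth, the reset cycle in steps 1-4 can also be scripted against the WSUS API instead of clicked through the console. The sketch below is only a toy Python model of the state transitions involved (the Update class and cleanup_wizard() function are hypothetical stand-ins, not the real WSUS API); it just illustrates why content is purged only while updates sit in the Declined state:

```python
# Toy model of the Option 3 reset cycle (steps 1-4 above). This is NOT the
# real WSUS API; the Update class and cleanup_wizard() are hypothetical
# stand-ins used only to show which state transitions free disk space.
from dataclasses import dataclass

@dataclass
class Update:
    title: str
    state: str            # "Approved", "NotApproved", or "Declined"
    has_content: bool     # files present under WsusContent

def cleanup_wizard(updates):
    """Model of the Server Cleanup Wizard: content files are purged only
    for updates that are currently Declined."""
    for u in updates:
        if u.state == "Declined":
            u.has_content = False

def reset_content(updates):
    # Step 1: decline all previously approved updates (yes, all of them).
    for u in updates:
        if u.state == "Approved":
            u.state = "Declined"
    # Step 2: run the cleanup wizard -- removes files in WsusContent.
    cleanup_wizard(updates)
    # Step 3: change all Declined updates back to NotApproved.
    for u in updates:
        if u.state == "Declined":
            u.state = "NotApproved"
    # Steps 4-5 (second wizard pass, re-approval) are omitted from this
    # toy model, which tracks no expiry or supersedence flags.
    return updates
```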
I do plan to remove the express files option first, to reduce the amount of storage space needed. If you see anything else in my configuration that should be tweaked, please let me know. Thanks for any and all feedback; I appreciate you taking the time to read this and respond.
My configuration follows:
1 Gbps switched network infrastructure with 10/10 fiber internet access
WSUS – current environment – 1 physical server also used as a file share server for user files
Windows Server 2008 Enterprise (x86)
Dual Intel Xeon 2.8 GHz
2 GB RAM
1 Gbps NIC
136 GB hard drive for DB and WSUS Content files (currently DB=1.5 GB, Content = 108 GB)
WSUS Server version: 3.2.7600.226
Using the Windows Internal Database (I do have a SQL Server 2008 R2 server available now)
128 Computers in 13 groups manually assigned but computers are directed to this server via GP
(all Servers and 90% of workstations are current on updates through last month)
3189 Updates installed/NA
139 Updates needed
919 Updates with no status
Configuration options:
Update source – Microsoft
Products – Office 2003, Office 2010, Windows 7, Windows Defender, Windows Server 2003, Windows Server 2008 R2, Windows XP (previously selected but no longer needed: Office 2002/XP, SQL Server 2008 R2, Windows 2000, Windows Server 2008)
Classifications – critical, definition, security, service packs, update rollups, updates
Update files and languages – store locally, download only when approved, download express installation files, English
Automatic approvals – above classifications for a test group, which includes one computer with each of the following OSes: Windows 7, Windows Server 2003, Windows Server 2008 R2, Windows XP
After those machines are tested, I approve and install needed updates for the server group (8 machines total), which includes Windows Server 2003, Windows Server 2003 R2, Windows Server 2008 R2, and Windows XP
After those machines are tested, I approve needed updates for all computers and let the users install them

Hi Lawrence,
How did you go with creating the FAQ?
Shortly after this thread was last active, I was acquired by SolarWinds, and shortly after that we created PatchZone.org. I've published a lot of this "FAQ" information as blog posts to that site. If you know of anything that's missing, please let me know.
It seems that definitive information on doing "proper" maintenance on WSUS is still hard to find, and what does exist is conflicting.
I don't know about "conflicting"; the process is pretty straightforward. But it has been hard to find. Truth is, until a couple of years ago it wasn't even required, but the sheer volume of updates now published in the catalog makes it necessary. I posted a series of blog articles at patchzone.org addressing this very issue.
Here in Australia the majority of businesses are on download limits with ISPs, so we have to be very careful about what we do with cleaning up WSUS, as it can lead to expensive excess data charges!
Cleaning up WSUS will **NEVER** lead to excessive data charges; failing to properly administer and manage the server absolutely will do that!
If you decline the updates that are no longer needed in your environment, isn't there a risk that, should somebody join or re-join an old computer to the network, the computer will never be identified as needing potentially critical updates? Would it be
a better option to change approved updates, that are no longer needed, to unapproved?
Absolutely! A critical observation, in fact. This is why only updates that are
superseded should ever be declined. Updates that are NOT superseded should not be declined, but merely left in a NotApproved state so that if a computer is introduced to the network that requires one of those updates, you would be able to readily determine
that fact.
Surely this would mean the update will still be cleared out after running the clean-up wizard, but, should somebody attach or reattach an old computer to the network it will still be identified and targeted for the correct updates?
Actually, no. Only if you explicitly decline an update, or if the Server Cleanup Wizard declines an update, will the files associated with that update be physically removed from the filesystem. If you merely remove the approvals, the files previously downloaded
will remain. This, then, brings attention to the conditions under which the Server Cleanup Wizard will DECLINE an update. In order for the Server Cleanup Wizard to decline an update it must either be expired or superseded. Furthermore, superseded updates must
be NotApproved, the replacement update must be Approved, and the superseded update must have been 100% Installed/NotApplicable for at least 30 days.
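Those decline conditions can be captured as a simple predicate. A minimal sketch in Python, assuming a plain dict stands in for the real WSUS update object (the field names here are made up for illustration, not part of any WSUS API):

```python
from datetime import date

def scw_will_decline(update, today=None):
    """Conditions under which the Server Cleanup Wizard declines an update,
    per the description above: it must be expired, OR superseded AND
    NotApproved AND its replacement Approved AND 100% Installed/NotApplicable
    for at least 30 days. `update` is a plain dict with illustrative keys."""
    today = today or date.today()
    if update["is_expired"]:
        return True
    return (update["is_superseded"]
            and update["approval"] == "NotApproved"
            and update["replacement_approval"] == "Approved"
            and update["installed_or_na_pct"] == 100
            and (today - update["fully_installed_since"]).days >= 30)
```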
Lawrence Garvin, M.S., MCSA, MCITP:EA, MCDBA
SolarWinds Head Geek
Microsoft MVP - Software Packaging, Deployment & Servicing (2005-2014)
My MVP Profile: http://mvp.microsoft.com/en-us/mvp/Lawrence%20R%20Garvin-32101
http://www.solarwinds.com/gotmicrosoft
The views expressed on this post are mine and do not necessarily reflect the views of SolarWinds.

Similar Messages

  • Need best practice when accessing UCM content after it has been transferred.

    Hi All,
    I have a business requirement where I need to auto-transfer content to another UCM when the content expires in the source UCM.
    This content needs to be deleted after it spends a certain duration in the target UCM.
    Can anybody advise me on the best practice to do this in Oracle UCM?
    I have set up an expiration date and am trying to auto-replicate the content to the target UCM once it reaches the expiration date.
    I am not sure of the best practice for accessing the content once it is in the target UCM.
    Any help in this case would be greatly appreciated.
    Regards,
    Ashwin

    SR,
    Unfortunately temp tables are the way to go. In Apex we call them collections (not the same as PL/SQL collections) and there's an API for working with them. In other words, the majority of the legwork has already been done for you. You don't have to create the tables or worry about tying data to different sessions. Start your learning here:
    http://download.oracle.com/docs/cd/E14373_01/appdev.32/e11838/advnc.htm#BABFFJJJ
    Regards,
    Dan
    http://danielmcghan.us
    http://sourceforge.net/projects/tapigen
    http://sourceforge.net/projects/plrecur
    You can reward this reply by marking it as either Helpful or Correct ;-)

  • Best Practice for SUP and WSUS Installation on Same Server

    Hi Folks,
    I have a question. I am in the process of deploying SCCM 2012 R2, and I was deploying a Software Update Point on SCCM with an existing WSUS server installed on a separate server from SCCM.
    A debate has started with a colleague, who says that using a remote WSUS server is recommended by Microsoft for scalability and security: WSUS would download the updates from Microsoft, and SCCM would work as a downstream server to fetch updates from the WSUS server.
    My understanding, however, is that it is recommended to install WSUS on the same server where SCCM is installed; that is, install WSUS on a site system, and you can use the same SCCM server to deploy WSUS.
    Please advise me on the best practices for deploying SCCM and WSUS. What does Microsoft say: should WSUS be installed on the same server as SCCM, or on a separate server?
    Awaiting your advice :)
    Regards, Owais

    Hi Don,
    Thanks for the information. Another quick one: is the configuration I described above correct in terms of planning and best practices?
    I agree with Jorgen, it's ok to have WSUS/SUP on the same server as your site server, or you can have WSUS/SUP on a dedicated server if you wish.
    The "best practice" is whatever suits your environment, and is a supported-by-MS way of doing it.
    One thing to note, is that if WSUS ever becomes "corrupt" it can be difficult to repair and sometimes it's simplest to rebuild the WSUS Windows OS. If this is on your site server, that's a big deal.
    Sometimes WSUS goes wrong (not because of ConfigMgr).
    Note that if you have a very large estate, or multiple primary site servers, you might have a CAS, and you would need a SUP on the CAS. (this is not a recommendation for a CAS, just to be aware)
    Don
    (Please take a moment to "Vote as Helpful" and/or "Mark as Answer", where applicable.
    This helps the community, keeps the forums tidy, and recognises useful contributions. Thanks!)

  • SharePoint 2007 - best practice to detach/remove a content database from a web app without any Farm impact

    Best practice to remove content databases from SharePoint 2007 without any Farm impact

    Hi,
    For removing a content database from a Web application, you can take these steps:
    1. On the Manage Content Databases page, click the content database that you want to remove.
    2. On the Manage Content Database Settings page, in the Remove Content Database section, select the Remove content database check box. If any sites are currently using this database, a message box appears; click OK to indicate that you want to proceed with the removal.
    3. Click OK.
    Reference:
    http://technet.microsoft.com/en-us/library/cc262440(v=office.12).aspx
    Best Regards,
    Eric
    Eric Tao
    TechNet Community Support

  • Best practices for cleaning up an old CSS?

    We have a legacy help system that we've put into RH 9.  Our CSS has a lot of old styles in it.  Is there a best practices document somewhere that we can use to learn how to clean up the CSS and remove old styles but still preserve the ones we want?
    thanks...

    I don't know of any such document but perhaps this will help.
    First archive a copy of the CSS as it is now so that you can go back and retrieve anything that you later find you shouldn't have deleted.
    Next you should be safe in deleting any styles you see with kadov in them. You will see they are duplicates of another style and were used for the way RoboHelp used to work.
    The rest are a bit more difficult. You need to use the multifile find and replace tool to see if they are used in any topic. Then either change them to what you want now or leave them in the CSS.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge
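    The multifile find step can also be scripted if the project is large. A rough sketch, assuming topics are plain .htm/.html files and styles are referenced only via class="..." attributes (real RoboHelp projects may reference styles in other ways, so treat the output as a candidate list to review, not a verdict):

```python
import os
import re

def find_unused_classes(css_path, topic_dir):
    """Return CSS class names defined in the stylesheet but used in no
    topic file's class="..." attribute. Heuristic only."""
    with open(css_path, encoding="utf-8") as f:
        css = f.read()
    # Simple class selectors like ".kadov-p" or ".Note"; compound selectors
    # are picked apart into their individual class names.
    defined = set(re.findall(r"\.([A-Za-z_][\w-]*)", css))
    used = set()
    for root, _, files in os.walk(topic_dir):
        for name in files:
            if name.lower().endswith((".htm", ".html")):
                with open(os.path.join(root, name), encoding="utf-8") as f:
                    text = f.read()
                for attr in re.findall(r'class="([^"]*)"', text):
                    used.update(attr.split())
    return sorted(defined - used)
```

    Anything the script reports should still be checked by hand (and against the archived copy of the CSS) before deletion, per the advice above.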

  • Best practice for cleaning up conversations after a recieve timeout

    We have a pattern where SQL Server sends a message through Service Broker and waits for a response (say ten seconds); if it doesn't get a response, it assumes a default response and carries on. The response may come later and also end the conversation on the target side. However, there is no one to receive the response (which is no longer needed). We have a cleanup script on a SQL Agent job, but we were wondering if there is a cleaner way to handle these situations.

    Can't you set up an activation procedure that receives the response message and ends the conversation? I would not expect the activation procedure to be fired if there already is someone waiting on the queue.
    But it is certainly nothing I have tested.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Best practices for cleaning up and upgrading

    Apologies in advance, as I know there is a wealth of info and ideas on this topic, but I'm looking for the best possible combination of simplicity and effectiveness and the community has done well by me in the past.
    Situation:  I have this old MacBook
    Model Name: MacBook
      Model Identifier: MacBook6,1
      Processor Name: Intel Core 2 Duo
      Processor Speed: 2.26 GHz
      Number of Processors: 1
      Total Number of Cores: 2
      L2 Cache: 3 MB
      Memory: 4 GB
      Bus Speed: 1.07 GHz
      Boot ROM Version: MB61.00C8.B00
      System Version: OS X 10.9.5 (13F34)
      Kernel Version: Darwin 13.4.0
    HD- Available: 6.22 GB
      Capacity: 249.2 GB
    Literally the only paid software that I have on here and need to keep is Microsoft Office.  I do not have the original install disk for this software.
    This machine is running very, very slowly, and there is not much hard disk space left. I'm not doing very well at cleaning up.
    I have brand new Time Machine backup and I have some pretty old Time Machine backups, pre Mavericks.
    I have a new 2TB external drive.
    I'm looking for the most effective way to speed this one up and make it as lean as possible and, if advisable, upgrade to Yosemite.
    Any recommendations?
    I'm willing to spend a little money if it can make a substantial difference.
    Thanks and apologies for length of post.
    RW

    Hi,
    First thanks for this.  This application has been really helpful.  I've deleted a lot of the easy stuff (music, pics, videos) and am now getting down to some of the stuff I'm less comfortable whacking without some guidance.  Here's an example.  I don't use the Mail application at all (use Gmail).  Here is what I see on OmniDisk Sweeper:
    https://www.dropbox.com/s/6iedkj46u4jaxbf/Screenshot%202014-11-08%2010.26.33.png ?dl=0
    I think the bulk of this is just messages from my Gmail account, from a failed attempt to use Mail a bit years ago. So can you help with what I can safely delete? Hope the question makes sense. You've helped a lot so far, and I've freed up about 45 GB.
    Rob

  • Best practices for cleaning up after a Bulk REST API v2 export

    I want to make sure that I am cleaning up after my exports (not leaving anything staged, etc.). So far I am:
    DELETEing $ENTITY/exports/$ID and $ENTITY/exports/$ID/data as described in the Bulk REST API documentation
    Using a dataRetentionDuration when I create an export (as a safety net in case my code crashes before deleting).
    Is there anything else I should do? Should I/can I DELETE the syncs I create (syncs are not listed in the "Delete an Entity" section of the documentation)? Or are those automatically deleted when I DELETE an export?
    Thanks!
    1086203
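    For reference, the cleanup described above amounts to two DELETE calls per export: the staged data, then the export definition. A minimal sketch using only the paths given in the post; the base URL, auth header, and helper names are placeholders, and whether syncs need (or even accept) an explicit DELETE is exactly the open question, so check the Bulk API documentation for that part:

```python
import urllib.request
from urllib.parse import urljoin

def export_cleanup_urls(base_url, entity, export_id):
    """Build the two DELETE targets described above, ordered so the staged
    data is removed before the export definition itself."""
    root = base_url.rstrip("/") + "/"
    return [
        urljoin(root, f"{entity}/exports/{export_id}/data"),
        urljoin(root, f"{entity}/exports/{export_id}"),
    ]

def delete_export(base_url, entity, export_id, auth_header):
    # Hypothetical helper: issues the DELETEs in order (not exercised here).
    for url in export_cleanup_urls(base_url, entity, export_id):
        req = urllib.request.Request(
            url, method="DELETE", headers={"Authorization": auth_header})
        urllib.request.urlopen(req)
```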

    Hi Chris,
    I ran into the same problem as pod.
    It happens when I try to load all historical activities; one sample is that the same activityId was given to 2 different types (one is EmailOpen, the other is FormSubmit), generated in 2013.
    Before the full load, I tested my job by extracting the activity records from Nov 2014, and there was no unique-ID issue.
    It seems Eloqua fixed this problem before Nov 2014, right?
    So if I start loading Activity data generated since 2015 there will be no PK problem; otherwise, I have to use ActivityId + ActivityType as a compound PK for the historical data.
    Please confirm and advise.
    Waiting for your feedback.
    Thanks~

  • Best Practices For Portal Content Objects Transport System

    Hi All,
    I am going to write some documentation on the Transport System for Portal content objects as part of Best Practices.
    Please help me out and send me some documents related to SAP Best Practices for transport of Portal Content Objects.
    Thanks,
    Iqbal Ahmad
    Edited by: Iqbal Ahmad on Sep 15, 2008 6:31 PM

    Hi Iqbal,
    Hope you are doing good
    Well, have a look at these links.
    http://help.sap.com/saphelp_nw04/helpdata/en/91/4931eca9ef05449bfe272289d20b37/frameset.htm
    This document, gives a detailed description.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f570c7ee-0901-0010-269b-f743aefad0db
    Hope this helps.
    Cheers,
    Sandeep Tudumu

  • Oracle Best Practices / Guidelines regarding Cleaning TEMP files

    Hi folks,
    Can anyone help me with a set of steps, guidelines, or best practices to clean TEMP files from OBIEE servers (our PROD environment)?
    Does OBIEE perhaps take care of this automatically, and if so, how does that process happen?
    Thanks a lot for your time and attention and hope to hear from you soon.

    TEMP files are deleted from the server once a user logs out. But there is a chance that the TEMP files do not get deleted automatically, e.g. when the user logs out of the system before the TEMP file has been generated completely. In this case the temp files remain on the server, and a bounce of the services cleans up the files.
    The best practice would be to create a script to empty out the temp directory during the start-up of the services.
    -Amith.
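    That start-up script can be as small as the sketch below (Python used for illustration; the actual temp path depends on the OBIEE install, so the directory argument is an assumption you would fill in per environment):

```python
import os
import shutil

def empty_directory(path):
    """Remove everything inside `path` without removing the directory itself,
    so the services can keep writing to the same location after start-up."""
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if os.path.isdir(full) and not os.path.islink(full):
            shutil.rmtree(full)
        else:
            os.remove(full)
```

    Run it before the services start, pointed at the temp directory for that environment.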

  • Best practice for migrating IDOCs?

    Hi,
    I need to migrate some IDOCs to another system for 'historical reference'.
    However, I don't want to move them using the regular setup, as I don't want the inbound processing to be triggered.
    The data that was created in the original system by the processed IDOCs will be migrated to the new system using the migration workbench. I only need to migrate the IDOCs as-is, due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download the IDOC table contents to a local file and upload them in the new system. A quick-and-dirty approach, but it might also be a bit risky.
    B) Use LSMW. However, I'm not sure whether this is feasible for IDOCs.
    C) Use ALE and set up a custom partner profile where inbound processing only writes the IDOCs to the database. Send the IDOCs from the legacy system to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDOCs, once migrated, get the same status they had in the old system.
    Any help/input will be appreciated
    Regards
    Karl Johan
    PS. For anyone interested in the business case: within the EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDOCs. I'm working on a merger between two utility companies, and for legal reasons we need to move the IDOCs. All other data is migrated using the migration workbench for IS-U.

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for governance, for example change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method? Server Manager backup and restore, etc  ?
    And
    Best Practice recommendations on how to upgrade to a different version of BPC, for example: Upgrading from BPC 7.0 to 7.5 or 10.0 ?
    Kind Regards
    Daniel

  • What are best practice for packaging and deploying j2EE apps to iAS?

    We've been running a set of J2EE applications on a pair of iAS SP1b for about a year and it has been quite stable.
    Recently, however, we have had a number of LDAP issues, particularly when registering and unregistering applications (registering ear files sometimes fails the 1st time but may work the 2nd time). Also, we've noticed very occasionally that old versions of classes sometimes find their way onto our machines.
    What is considered to be best practice in terms of packaging and deployment, specifically:
    1) Packaging - using the deployTool that comes with iAS6 SP1b to package is a big manual task, especially when you have 200+ jsp files. Are people out there using this or are they scripting it with a build tool such as Ant?
    2) Deploying an existing application to multiple iAS's. Are you guys unregistering old application then reregistering new application? Are you shutting down iAS whilst doing the deployment?
    3) Deploying ear files can take 5 to 10 mins, is this normal?
    4) In a clustered scenario where HTTPSession is shared what are the consequences of doing deployments to data stored in session?
    Thanks in advance for your replies
    Owen

    You may want to consider upgrading your application server environment to a newer service pack. There are numerous enhancements involving the deployment tool and run-time layout of your application that make clear where your application is loading its files from.
    If you have a long-running application server environment, with lots of deployments under your belt, you might start to notice slowdowns in deployment and KJS start time. Generally this is due to garbage collecting in your iAS registry.
    You can do several things to resolve this. The most complete solution is to reinstall the application server. This will guarantee a clean LDAP registry. Of course, you've got to re-establish your configurations and redeploy your applications. When done, back up your application server install space with the application server and directory server off. You can use this backup to return to a known configuration at some future time.
    For the second method: <B>BE CAREFUL - BACKUP FIRST</B>
    There is a more exhaustive solution that involves examining your deployed components to determine the active GUIDs. You then search the NameTrans section of the registry for Applogic Servlet * and Bean * entries that represent your previously deployed components but are no longer represented in the set of deployed GUIDs. Record these older GUIDs and remove them from ClassImp and ClassDef. Finally, remove the older entries from NameTrans.
    Best practices for deployment depend on your particular environmental needs. Many people utilize ANT as a build tool. In later versions of the application server, complete ANT scripts are included that address compiling, assembly and deployment. Ant 1.4 includes iAS specific targets and general J2EE targets. There are iAS specific targets that can be utilized with the 1.3 version. Specialized build targets are not required however to deploy to iAS.
    Newer versions of the deployment tool allow you to specify that JSPs are not to be registered automatically. This can be significant if deployment times lag. Registered JSP's however benefit more fully from the services that iAS offers.
    2) In general it is better to undeploy then redeploy. However, if you know that you're not changing GUIDs, recreating an existing application with new GUIDs, or removing registered components, you may avoid the undeploy phase.
    If you shut down the KJS processes during deployment you can eliminate some addition workload on the LDAP server which really gets pounded during deployment. This is because the KJS processes detect changes and do registry loads to repopulate their caches. This can happen many times during a deployment and does not provide any benefit.
    3) Deploying can be a lengthy process. There have been improvements in that performance from service pack to service pack but unfortunately you wont see dramatic drops in deployment times.
    One thing you can do to reduce deployment times is to understand the type of deployment. If you have not manipulated your deployment descriptors in any way, then there is no need to deploy. Simply drop your newer bits in to the run time space of the application server. In later service packs this means exploding the package (ear,war, or jar) in to the appropriate subdirectory of the APPS directory.
    4) If you've changed the classes of objects that have been placed in HTTPSession, you may find that you can no longer utilize those objects. For that reason, it is suggested that objects placed in session be kept as simple as possible in order to minimize this effect. In general, however, it is not a good idea to change a web application during the life span of a session.

  • Best Practice Document to Configure Quality Management Analytics

    Hi Experts,
    Can any one help in finding out the best practices document for installing BI content for Quality Management.
    Regards,
    Santhosh

    Hi Reddy,
    Thanks for the link, but I have a requirement to install the BI content for Quality Management to present it to the client, so I need to install it as a whole, not for a particular requirement.
    I want to know whether I need to install all the DataSources, like master data and transactional, that are available in RSA5.
    Regards,
    Santhosh.

  • Best Practices - Transports

    Hi there
    In my brief Portal experience, I have only ever transported stuff (iviews, roles etc.) using the manual approach provided in the portal under System Admin >> Transports.
    Now, one of the other portal implementations that our company has done has been scrutinized, and one of the suggestions that has come out of that investigation (outside contractors) was to implement a "Portal Transport System" as documented in the Best Practices document, which is then supposed to automate the content migration process.
    So:
    1. Can anybody tell me where documentation is available on this transport approach?
    2. Can anybody confirm / deny that this is the best approach to use for transports.
    Any information would be greatly appreciated.
    Cheers,
    Andrew

    Hi,
    You can find a best practice document in SDN.
    Please check the below links.
    <a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/ep/EP%206.0%20SP2.0/Best%20Practice%20for%20Transporting%20SAP%20NetWeaver%20Portal%20Content.pdf#search=%22How%20to%20transport%20portal%20packages%3F%20netweaver%22">Best Practice – Transporting SAP NetWeaver Portal Content</a>
    Also check..
    http://www.sapportalseminar.com/agenda/index.cfm?usergroup=regular&confCode=POR6-PHI
    Hope this helps you.
    Thanks,
    Ramakrishna

  • One-time import from external database - best practices/guidance

    Hi everyone,
    I was wondering if there was any sort of best practice or guideline on importing content into CQ5 from an external data source.  For example, I'm working on a site that will have a one-time import of existing content.  This content lives in an external database, in a custom schema from a home-grown CMS.  This importer will be run once - it'll connect to the external database, query for existing pages, and create new nodes in CQ5 - and it won't be needed again.
    I've been reading up a bit about connecting external databases to CQ (specifically this:http://dev.day.com/content/kb/home/cq5/Development/HowToConfigureSlingDatasource.html), as well as the Feed Importer and Site Importer tools in CQ, but none of it really seems to apply to what I'm doing.  I was wondering if there exists any sort of guidelines for this kind of process.  It seems like something like this would be fairly common, and a requirement in any basic site setup.  For example:
    Would I write this as a standalone application that gets executed from the command-line?  If so, how do I integrate that app with all of the OSGi services on the server?  Or,
    Do I write it as an OSGi module, or a servlet?  If so, how would you kick off the process? Do I create a jsp that posts to a servlet?
    Any docs or writeups that anyone has would be really helpful.
    Thanks,
    Matt

    Matt,
    the vault file format is just an XML representation of what's in the repository, and the same as the package format. In fact, if you work on your projects with Eclipse and Maven instead of CRXDE Lite to do your work, you will become quite used to that format throughout your project.
    Ruben

Maybe you are looking for

  • Itunes won't open i get message "

    i get this version of itunes has not been correctly localized for this language please run the english version. how???? and if i try and uninstall i get the message "the feature you are trying to use is on a networkresource that is unavailable??? so

  • AE url is not correct

    Hello, while monitoring AE I observed AE URL is not correct. My server name is ServerXX Then, I should see the URL for testing my message in RWB as below. http://ServerXX:50000/MessagingSystem/receive/AFW/ But it is showing ServerYY which i am not su

  • Macbook Pro screen has dim light beams

    On boot up, my Macook Pro's screen appears to have light beams shinning up from the bottom .. as if every other light beam is off causing the overall screen brightness to be dimmer than normal. This only happens about every third boot up so I doubt i

  • GP Design Time work set is missing in my portal

    Hi, I'm missing GP design time work set in my portal , how should i get it ,any body help me. santhosh

  • Scroll bars disappeared

    Hello everybody, (excuse my bad english, I'm french)... All is the title : Scroll bars disappeared ! ... I try to install again my "Flash prof", but the scroll bars don't be back. Can you help me, please ? I thank you very much, Fred.