Best Practice after moving office and upgrading network

Happy Friday all,
I wanted to know what the best practice is after some pretty dramatic changes on the network. We have just moved office, replaced switches, changed Internet service and got rid of lots of stuff we didn't need any more.
What is the best way forward to clean up the Spiceworks inventory, etc.?
Is it possible to just delete everything and start again?
Short of a brand new installation (which I am leaning towards), are there any alternatives?
Thanks in advance,
Craig
This topic first appeared in the Spiceworks Community


Similar Messages

  • Best practices for cleaning up and upgrading

    Apologies in advance, as I know there is a wealth of info and ideas on this topic, but I'm looking for the best possible combination of simplicity and effectiveness and the community has done well by me in the past.
    Situation:  I have this old MacBook
    Model Name: MacBook
      Model Identifier: MacBook6,1
      Processor Name: Intel Core 2 Duo
      Processor Speed: 2.26 GHz
      Number of Processors: 1
      Total Number of Cores: 2
      L2 Cache: 3 MB
      Memory: 4 GB
      Bus Speed: 1.07 GHz
      Boot ROM Version: MB61.00C8.B00
      System Version: OS X 10.9.5 (13F34)
      Kernel Version: Darwin 13.4.0
    HD- Available: 6.22 GB
      Capacity: 249.2 GB
    Literally the only paid software that I have on here and need to keep is Microsoft Office.  I do not have the original install disk for this software.
    This machine is running very, very slowly, and there is not much hard disk space left.  I'm not doing very well at cleaning up.
    I have a brand new Time Machine backup and I have some pretty old Time Machine backups, pre-Mavericks.
    I have a new 2TB external drive.
    I'm looking for the most effective way to speed this one up and make it as lean as possible and, if advisable, upgrade to Yosemite.
    Any recommendations?
    I'm willing to spend a little money if it can make a substantial difference.
    Thanks and apologies for length of post.
    RW

    Hi,
    First thanks for this.  This application has been really helpful.  I've deleted a lot of the easy stuff (music, pics, videos) and am now getting down to some of the stuff I'm less comfortable whacking without some guidance.  Here's an example.  I don't use the Mail application at all (use Gmail).  Here is what I see on OmniDisk Sweeper:
    https://www.dropbox.com/s/6iedkj46u4jaxbf/Screenshot%202014-11-08%2010.26.33.png?dl=0
    I think the bulk of this is just messages from my Gmail account, from a failed attempt to use Mail a bit years ago.  So can you help with figuring out at which point I can delete?  Hope the question makes sense.  You've helped a lot and I've freed up about 45 GB so far.
    Rob
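    For reference, a quick way to size the Mail store from Terminal before deleting anything; the paths below assume the standard OS X locations, so check yours first:
        du -sh ~/Library/Mail
        du -sh ~/Library/Mail\ Downloads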

  • Idoc processing best practices - use of RBDAPP01 and RBDMANI2

    We are having performance problems in the processing of inbound idocs.  The message type is SHPCON, and transaction volume is very high.  I am a functional consultant, not an ABAP developer, but will try my best to explain our current setup.
    1)     We have a number of message variants for the inbound SHPCON message, almost all of which are set to trigger immediately upon receipt under the Processing by Function Module setting.
    2)      For messages that fail to process on the first try, we have a batch job running frequently using RBDMANI2.
    We are having some instances of the RBDMANI2 almost every day which get stuck running for a very long period of time.  We frequently have multiple SHPCON idocs coming in containing the same material number, and frequently have idocs fail because the material in the idoc has become locked.  Once the stuck batch job is cancelled and the job starts running again normally, the materials unlock and the failed idocs begin processing.  The variant for the RBDMANI2 batch job is currently set with a packet size of 1 and without parallel processing enabled.
    I am trying to determine the best practice for processing inbound idocs such as this for maximum performance in a very high volume system.  I know that RBDAPP01 processes idocs in status 64 and 66, and RBDMANI2 is used to reprocess idocs in all statuses.  I have been told that setting the messages to trigger immediately in WE20 can result in poor performance.  So I am wondering if the best practice is to:
    1)     Set messages in WE20 to Trigger by background program
    2)     Have a batch job running RBDAPP01 to process inbound idocs waiting in status 64
    3)     Have a periodic batch job running RBDMANI2 to try and clean up any failed messages that can be processed
    I would be grateful if somebody more knowledgeable than myself on this can confirm the best practice for this process and comment on the correct packet size in the program variant and whether or not parallel processing is desirable.  Because of the material locking issue, I felt that parallel processing was not desirable and may actually increase the material locking problem.  I would welcome any comments.
    This appeared to be the correct area for this discussion based upon other discussions.  If this is not the correct area for this discussion, then I would be grateful if the moderator could re-assign this discussion to the correct area (if possible) or let me know the best place to post it.  Thank you for your help.

    Hi Bob,
    Not sure if there is an official best practice, but Note 1333417 (Performance problems when processing IDocs immediately) does state that for high volume, immediate processing is not a good option.
    I'm hoping that for SHPCON there is no dependency in the IDoc processing (i.e. it's not important if they're processed in the same sequence or not), otherwise it'd add another complexity level.
    In the past for the high volume IDoc processing we scheduled a background job with RBDAPP01 (with parallel processing) and RBDMANIN as a second step in the same job to re-process the IDocs with errors due to locking issues. RBDMANI2 has a parallel processing option, but it was not needed in our case (actually we specifically wouldn't want to parallel-process the errors to avoid running into a lock issue again). In short, your steps 1-3 are correct but 2 and 3 should rather be in the same job.
    Also I believe we had a designated server for the background jobs, which helped with the resource availability.
    As a side note, you might want to confirm that the performance issues are caused only by the high volume. An ABAPer or a Basis admin should be able to run a performance trace. There might be an inefficiency in the process that could be adding to the performance issue as well.
    Hope this helps.
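    For reference, a sketch of the job layout described above, with both steps in one background job; the program names come from the thread, but the job name, schedule, and variant contents are assumptions to adapt to your message types and volumes:
        Job Z_SHPCON_IDOC_PROCESSING (scheduled periodically, e.g. every 10 minutes):
          Step 1: RBDAPP01 - variant selecting SHPCON IDocs in status 64,
                             parallel processing enabled, packet size > 1
          Step 2: RBDMANIN - variant re-processing the failed SHPCON IDocs,
                             no parallel processing (avoids re-locking materials)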

  • MMO best practice. Download music and heavy files to users hard disk?

    I have just downloaded a Hello Kitty MMO app for research (for my kid of course).
    I am developing my English teaching app with LOADS of classical music, MP3 sentences and heavy backgrounds. Would the best idea be for the client to download these to their hard disk, i.e., I would not need to stream them and therefore save a fortune on bandwidth charges from my ISP?
    Cheers

    I see what you mean, i.e., they have to get the file to their computer one way or another, BUT
    a. If they are going to repeatedly use that file, i.e., a custom cursor or a classical piece of music, every week when they log on, then it would be better for them to have it on their hard disk, wouldn't it? If not, they would have to download it every time they log on. I take it that's why the Hello Kitty site makes you download 130 MB, so you have everything on your hard disk, i.e., you will be reusing all those assets MANY times in the future. The experience will be very FAST as you have it on your local disk and needn't wait for streaming.

  • What are the best practices for Database management and performance tuning?

    Hello,
    I want to ensure that I am using the best practices for managing and maintaining our Database.
    Is there any documentation out there that outlines how to maintain and ensure top performance out of our database?
    Thank you!
    John Sefton

    I appreciate the responses, however this is not the information I am looking for.
    I am specifically looking for best practices involving management and performance tuning.
    Example: are there tools that I can install that will monitor the size and response time of the database and alert me if there is degradation in performance?
    Are there specific periodic activities I should be doing to guarantee that my database will continue to function the way it is supposed to?
    Or is this a fire-and-forget solution that does not need this attention?

  • Networking issues after a move and upgrade

    I just moved, and did a massive system upgrade, moving over fully to systemd.
    Everything works fine, except my network seems to crap out after a random period of time (sometimes 2 hours, sometimes 10 minutes, sometimes it stays up for days).
    I've used different ports on my router and different Ethernet cables, so I'm pretty sure the issue is with my laptop.
    I used the ethernet-static profile and put in my own network info (using a 10.0.0.0/24 network), a correct default gateway, and working nameservers.
    On my main computer I have MPD going over an SMB share, and sometimes it just stops working. When I open up Wireshark I see I'm getting ICMP Destination Unreachable errors.
    What could be causing this? Any ideas?
    Last edited by ptchinster (2013-03-10 19:16:56)

    No. Something is wrong with modprobe, I guess. I noticed yesterday that the line that usually looked like this
    eth1      Link encap:Ethernet  HWaddr 00:10:A7:27:5D:A2
    was like this
    eth1-00      Link encap:UNSPEC  HWaddr 00-10-A7-27-5D-A2-00-00-00-00 or something like that
    So I did a manual modprobe 8139too and I got
    eth2      Link encap:Ethernet  HWaddr 00:10:A7:27:5D:A2
    I changed the settings in rc.conf from eth1 to eth2, restarted the networking script and the network was OK. But this doesn't answer the question of what was wrong with it in the first place...
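    For reference, a static profile of the kind mentioned above, in netctl style (a sketch only; the profile name, interface, addresses, gateway, and DNS values are placeholders to adapt):
        # /etc/netctl/my-static-profile
        Description='Static Ethernet connection'
        Interface=eth0
        Connection=ethernet
        IP=static
        Address=('10.0.0.2/24')
        Gateway='10.0.0.1'
        DNS=('10.0.0.1')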

  • Time Machine best practices after Lion to Mountain Lion upgrade

    I've made the upgrade from Lion to Mountain Lion and everything seems to be OK.
    I have been using Time Machine for backups since I deployed my first and, so far, only Mac (Mac Mini running Lion) in 2011.  I run my TM backups manually.  Since upgrading to Mountain Lion, I have not yet kicked off a TM backup, so my questions involve best practices with TM after an upgrade from Lion to Mountain Lion:
    Can I simply use the same drive as I use currently, do what I've always done, start the backup manually, and TM handles gracefully the new backup from the new OS?  
    At this point, since I have only backups of the Lion system, what I see when I double-click on the Time Machine drive is a folder called “Backups.backupdb”, then a subfolder called “My Mac mini”, and then all the backup events.   Nothing else.  What will I see once I do a backup now, after the Mountain Lion upgrade?
    If I for some reason needed to boot to my old Lion system (I cloned the startup disk prior to upgrading to ML) and access my old Lion backups with TM, would I be successful?  In other words does the system know that I'm booted to Lion, so give me access to the TM backups created under Lion?   Conversely when booted to the new Mountain Lion system, will I have access only to the backups created since the upgrade to Mountain Lion?
    Any other best practices steps I should take prior to my first ML backup?
    Time Machine is a great, straightforward system to use (although I have to say I’ve not (yet) needed to depend on it for recovery... I trust that will go well when needed) but I don't want to make any assumptions as to how it works after a major OS upgrade.
    Thank you for reading.

    1. Correct. If you want to downgrade to OS X Lion, your Mac will still keep backups created with OS X Lion, so just start into Internet Recovery and select one of the backups made with OS X Lion. If you don't want that Time Machine backs up automatically, you may need to use TimeMachineEditor.
    2. After making a backup with Mountain Lion, it will be the same, but with a new folder that belongs to the new backup you have created.
    3. See my first answer.
    4. One piece of advice: when your Time Machine drive gets full, Time Machine deletes old backups, so it may remove all your OS X Lion backups. However, I don't think that you will need to go back to OS X Lion.
    If you have any questions apart from those, see Pondini's website > http://pondini.org

  • Best practice to restart weblogic and managed  servers after changing pass

    Hi,
    I need to know, given an installation with the following structure, what the best practice is for restarting the servers after we change the WebLogic password. I know that I should change boot.properties with the new password; is there anything else to change? Please advise.
    Oracle Access Manager & OID (is it affected by the WebLogic password change?)
    Thank you
    Edited by: hsweiss on Jul 31, 2012 10:45 AM

    The order is to restart all the managed servers first, followed by the Admin server.
    If your WebLogic server is integrated with OID, you need to change the credentials in both places, i.e., the WebLogic server (boot.properties) and the IDM server.
    If your WebLogic server is using its internal LDAP for authentication, your IDM and OAM will not be affected by this change;
    in that case, if you change the boot.properties file, that is enough; you don't need to change anything anywhere else.
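    For reference, boot.properties is a two-line file, typically at DOMAIN_HOME/servers/AdminServer/security/boot.properties (and likewise under each managed server's security folder); the values below are placeholders, and WebLogic re-encrypts them on the next server start:
        username=weblogic
        password=your_new_password_here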

  • Configuring AD Sites and Services best practice for multiple office site ?

    Hi People,
    Can anyone here please suggest or share a link on what the best practice is for configuring AD Sites and Services for a single AD domain with multiple office sites?
    I'd like to know more about the number and the direction of the connection between Domain Controllers in one site to the Data Center and vice versa.
    Thanks.
    /* Server Support Specialist */

    This series can be useful:
    Active Directory Structure Guidelines – Part 1
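    As a sketch of the usual pattern for this (one AD site per office, each office subnet mapped to its site, and site links back to the data center), using the Windows Server 2012+ Active Directory PowerShell module; every name, subnet, cost, and interval below is a placeholder, not a recommendation:
        # Create a site for a branch office and map its subnet to it
        New-ADReplicationSite -Name "BranchOffice1"
        New-ADReplicationSubnet -Name "10.1.0.0/24" -Site "BranchOffice1"
        # Link the branch site to the data center site so replication flows between them
        New-ADReplicationSiteLink -Name "Datacenter-BranchOffice1" `
            -SitesIncluded "Datacenter","BranchOffice1" `
            -Cost 100 -ReplicationFrequencyInMinutes 15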

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, and for doing conscientious computing in general.  I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptable Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to have a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning, and will be there if/when you have a failure or loss of data.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel

  • Best practiceS for setting up Macs on Network

    Greetings.
    We have six Macs on our Windows Server network; three iMacs and three laptops. We have set up all the machines and they are joined to the Active Directory. In the past, we have always created local users on the machines and then "browsed" to the server shares and mounted them. We've learned things have improved/changed over the years and we're just now realizing we can probably have the machines set up to work better. So, I have a couple of questions for "best practices" when setting up each of the machines.
    1. Since we’re in a network environment, should we not set up “local logins/users” and instead have users log in using their AD login? It seems having a local account creates some conflicts with the server since upgrading to Lion.
    2. Should we set the computer to not ask for a “list of users” and instead ask for a username and password for logins?
    3. For the user that uses the machine most often, they can still customize their desktop when they use an AD login, correct?
    4. Should we set up Mobile User Accounts? What exactly does this do?
    Any other advice on how we should best be setting up the clients for our environment to make sure we are following best practices would be great!
    Thanks for any help!
    Jay


  • Best practice for moving from a G5 to a new Mac with SL

    I am receiving my new iMac today (27") and am very excited.
    However, I want to move over using the best practices to assure that I remain excited and not frustrated.
    My initial thoughts are to boot it up and do the initial setup, to move my iPhoto library over, and to use Migration Assistant to move the rest of my data files.
    Then to install all of the extra software that I can find the packages for from the original installation disks.
    And then finally to use Migration Assistant again to move over any software that I cannot find original disks for (I've moved from Mac to Mac to Mac over and over, and some of the software goes back to OS 9, and won't run anymore I guess).
    Is this a good way
    OR
    will I mess up doing it this way
    OR
    am I spending far too much time worrying about moving old problems over and would be better off to just turn MA loose and let it do its thing from the beginning?
    BTW - Mail crashes a lot on my existing system - pretty much everything else seems OK - except iPhoto is slow - hoping that the new Intel dual-core will help that.
    LN

    Migration Assistant is not a general file moving tool. MA will migrate your Applications and Home folders transferring only your third-party applications. MA will transfer any application support folders required by your applications, your preferences, and network setup. You do not have a choice of what will be migrated other than the above. MA cannot determine whether anything transferred is compatible with Snow Leopard. I recommend you look at the following:
    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
    Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD or DVD, to an external hard drive, or over the network
    will work better when moving from PowerPC to Intel Macs. The initial section below discusses Target Disk Mode. It is then followed by a section which discusses networking with Macs that lack FireWire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
    If you are not able to use a Firewire connection (for example you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
    /Home/Library/Calendars (copy the whole folder)
    /Home/Library/iTunes (copy the whole folder)
    /Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
    5. Once you have transferred what you need restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    In general you are better off reinstalling any third-party software that is PPC-only. Otherwise update your software so it's compatible with Snow Leopard.
    Do not transfer any OS 9 software because it's unsupported. You can transfer documents you want to keep.
    Buy an external hard drive to use for backup.
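    As a sketch of the copy step for the folder lists above, assuming the destination Mac's drive is mounted at /Volumes/NewMac and the short user name matches (the paths and volume name are placeholders; Apple's bundled rsync takes -E to preserve extended attributes):
        # Top-level Home folders from the list above
        rsync -aE ~/Documents ~/Movies ~/Music ~/Pictures ~/Sites \
            /Volumes/NewMac/Users/yourname/
        # A few of the Library folders from the list above
        rsync -aE ~/Library/Keychains ~/Library/Mail ~/Library/Preferences \
            /Volumes/NewMac/Users/yourname/Library/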

  • Best Practice to Atomic Read and Write a Field In Database

    I am from a Java desktop application background. May I know what the best practice is in J2EE to atomically read and write a field in a database? Currently, here is what I do:
    // In Servlet.
    synchronized (private_static_final_object) {
        int counter = read_counter_from_database();
        counter++;
        write_counter_back_to_database(counter);
    }
    However, I suspect the above method will not work all the time.
    My observation is that, if I have several web requests at the same time, I am executing code within a single instance of the servlet, using different threads. The above method should work, as the different web request threads are all referring to the same "private_static_final_object".
    However, my guess is that a "single instance of servlet" is not guaranteed. After some time span, the previous instance of the servlet may be destroyed, with another new instance of the servlet being created.
    I also came across http://code.google.com/appengine/docs/java/datastore/transactions.html in JDO. I am not sure whether it is going to solve the problem.
    // In Servlet.
    Transaction tx = pm.currentTransaction();
    tx.begin();
        int counter = read_counter_from_database();    // Line 1
        counter++;                                     // Line 2
        write_counter_back_to_database(counter);       // Line 3
    tx.commit();
    Does the code guarantee that only when Thread A finishes executing Line 1 through Line 3 atomically can Thread B continue to execute Line 1 through Line 3 atomically?
    I do not wish the following situation to happen:
    (1) Thread A read counter from Database as 0
    (2) Thread A increment counter to 1
    (3) Thread B read counter from Database as 0
    (4) Thread A write counter as 1 to database
    (5) Thread B increment counter to 1
    (6) Thread B write counter as 1 to database
    What I wish is
    (1) Thread A read counter from Database as 0
    (2) Thread A increment counter to 1
    (4) Thread A write counter as 1 to database
    (3) Thread B read counter from Database as 1
    (5) Thread B increment counter to 2
    (6) Thread B write counter as 2 to database
    Thank you.
    Edited by: yccheok on Oct 27, 2009 8:39 PM

    This is my understanding of the issue (you should research it further on your own to get a better understanding):
    I suggest you use local variables (i.e., defined within a function), and keep away from static variables. Those local variables are thread safe. If you call functions within functions, it's still thread safe. If you read or write one record in a database using SQL, it's thread safe (you don't need a transaction). If you read/write multiple tables and/or records, you probably need a transaction. Servlets are thread safe. You usually don't need the 'synchronized' keyword anywhere unless you have a function updating/reading a static variable and therefore want to ensure only one user is accessing the static variable at a time. Also do so if you are writing to some resource that everyone uses (such as a file, a variable in application scope or session scope, or an email server); you don't want more than one person at a time to write to it. Note the database is one of those resources that is handled by transactions rather than the synchronized keyword (the synchronized keyword applies to your application only, not other applications that someone is running, whereas the transaction ensures all applications are locked out while you update those records in the database).
    By the way, if you have a static variable, you should have one and only one function (synchronized) that updates it that everyone uses. If you have more than one synchronized function, that updates it, its probably not thread safe.
    An example of a static variable you would use is a Datasource object (to obtain your database connections). You only need one connection pool in your application and you access it via the datasource variable.
    If you're unsure your code is thread safe, you can create two separate threads that call the same block of functions repeatedly to ensure it works as expected.
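    To make the transaction point concrete, here is a minimal sketch of pushing the increment into the database itself, assuming a plain JDBC DataSource and a hypothetical counters(name, value) table rather than the App Engine datastore. A single UPDATE statement is atomic on its own, and wrapping it in a transaction keeps the following SELECT consistent, so no servlet-side synchronized block is needed:
        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import javax.sql.DataSource;

        public class CounterDao {
            private final DataSource ds;   // one shared, thread-safe connection pool

            public CounterDao(DataSource ds) {
                this.ds = ds;
            }

            // Atomically increment the counter and return the new value.
            // Assumes the counters row already exists.
            public int increment(String counterName) throws SQLException {
                try (Connection con = ds.getConnection()) {
                    con.setAutoCommit(false);
                    try (PreparedStatement upd = con.prepareStatement(
                            "UPDATE counters SET value = value + 1 WHERE name = ?")) {
                        upd.setString(1, counterName);
                        upd.executeUpdate();    // the row stays locked until commit
                    }
                    int newValue;
                    try (PreparedStatement sel = con.prepareStatement(
                            "SELECT value FROM counters WHERE name = ?")) {
                        sel.setString(1, counterName);
                        try (ResultSet rs = sel.executeQuery()) {
                            rs.next();
                            newValue = rs.getInt(1);
                        }
                    }
                    con.commit();               // releases the row lock
                    return newValue;
                }
            }
        }
    With this, the interleaving the original poster wished for falls out of the database's row lock: Thread B's UPDATE simply waits until Thread A commits.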

  • Unity -- Best practice for Attendant Console and Operator

    I am fairly new to Voice Over IP. I am in a new position for managing our phones, e.g., Moves, Adds, Changes and Call Handler management. We use version 8.6.2.
    Our system is comprised of 19 different buildings, with 1 to 2 Attendant Consoles per building. When our phone network was set up, people who have Attendant Consoles did not want callers to have the option to leave a message if someone is away from their Attendant Console momentarily. Is this best practice? What are most companies doing with their setups?
    In addition, in what capacity is the Operator account of value, or can it be deleted? We have discovered that if people choose the System Directory Handler, they can "find" their way to the Operator voicemail by pressing "0" and leave a message. However, this voicemail box has not been monitored, thus there are over 300 messages in it.
    I know I will have other questions, but if someone has a a good way of setting this up, I would be very thankful.
    Laurel

    I'll give it a go!
    "In addition, in what capacity is the Operator account of value or can it be deleted? We have discovered that if people choose the System Directory Handler, they can "find" their way to the Operator voicemail by pressing "0" and leave a message. However, this voicemail box has not been monitored, thus their are over 300 messages in it.'
    You can't actually delete the Operator (it's a default user account). Just go into the System Directory Handler, click Edit > Caller Input; if you're feeling bloody-minded, choose "hang up" for "If Caller Presses 0", otherwise either create a new call handler or point it to an existing one.
    "Our system is comprised of 19 different buildings, with 1 to 2 Attendant Consoles per building. When our phone network was set up, people who have Attendant Consoles did not want callers to have the option to leave a message if someone is away from their Attendant Console momentarily. Is this best practice? What are most companies doing with their setups?"
    I have to deal with both situations. Some locations refuse a voice mail, some want it. It's normally at the wish of the department head. Just make sure that call does not ring forever or otherwise make the caller upset.
    Gareth

  • Best practices for using .load() and .unload() in regards to memory usage...

    Hi,
    I'm struggling to understand this, so I'm hoping someone can explain how to further enhance the functionality of my simple unload function, or maybe just point out some best practices in unloading external content.
    The scenario is that I'm loading and unloading external SWFs into my movie (many, many times over). In order to load my external content, I am doing the following:
    Declare global loader:
    var assetLdr:Loader = new Loader();
    Load the content using this function:
    function loadAsset(evt:String):void{
         var assetName:String = evt;
         if (assetName != null){
              assetLdr = new Loader();
              var assetURL:String = assetName;
              var assetURLReq:URLRequest = new URLRequest(assetURL);
              assetLdr.load(assetURLReq);
              assetLdr.contentLoaderInfo.addEventListener(Event.INIT, loaded);
              assetLdr.contentLoaderInfo.addEventListener(ProgressEvent.PROGRESS, displayAssetLoaderProgress);
              function loaded(event:Event):void {
                   var targetLoader:Loader = Loader(event.target.loader);
                   assetWindow.addChild(targetLoader);
              }
         }
    }
    Unload the content using this function:
    function unloadAsset(evt:Loader) {
         trace("UNLOADED!");
         evt.unload();
    }
    Do the unload by calling the function via:
    unloadAsset(assetLdr)
    This all seems to work pretty well, but at the same time I am suspicious that the content is not truly unloaded, and some remnants of my previously loaded content are still consuming memory. Per my load and unload functions, can anyone suggest any tips, tricks or pointers on what to add to my unload function to reclaim the consumed memory better than how I'm doing it right now, or how to make this function more efficient at clearing the memory?
    Thanks,
    ~Chipleh

    Since you use a single variable for the loader, from a GC standpoint the only thing you can add is unloadAndStop().
    Besides that, your code has several inefficiencies.
    First, you add listeners AFTER you call the load() method. Given the asynchronous character of the loading process, especially on the web, you should always call load() AFTER all the listeners are added, otherwise you subject yourself to unpredictable results and bugs that are difficult to find.
    Second, nested functions are evil. Try to NEVER use nested functions. Nested functions can easily be the cause of memory management problems.
    Third, you should strive to name variables in a manner that makes your code readable. For whatever reason you name function parameters evt, although a better way would be to name them something descriptive of the parameter.
    And, please, when you post the code, indent it so that other people have easier time to go through it.
    With that said, your code should look something like that:
    function loadAsset(assetName:String):void {
         if (assetName) {
              assetLdr = new Loader();
              assetLdr.contentLoaderInfo.addEventListener(Event.INIT, loaded);
              assetLdr.contentLoaderInfo.addEventListener(ProgressEvent.PROGRESS, displayAssetLoaderProgress);
              // load() method MUST BE CALLED AFTER listeners are added
              assetLdr.load(new URLRequest(assetName));
         }
    }
    // function should be outside of the other function.
    function loaded(e:Event):void {
         var targetLoader:Loader = Loader(e.target.loader);
         assetWindow.addChild(targetLoader);
    }
    function unloadAsset(loader:Loader):void {
         trace("UNLOADED!");
         loader.unload();
         loader.unloadAndStop();
    }
