Best practice for external drive?

A really basic set of questions, for which I ought to know the answers:
with four external firewire drives connected, three of them daisy-chained to one FW800 port, the fourth to a second FW800 port, do I:
a) when shutting down the computer, eject them, or can I just shut down the computer then shut down the drives?
b) when ejecting the drives, is there a particular order (e.g., of the three that are daisy-chained, should the drive that is actually connected to the computer be the last to eject)?
c) when sleeping the computer, is it all right to leave the drives spinning and let them sleep when they realize the computer is sleeping?
d) when starting up, start the drives before turning on the computer? I've actually forgotten this, and started the drives after the computer was up and running, and it doesn't SEEM to have any ill effect, but am I doing something dangerous?

OS X is pretty tolerant of whatever you do. Just don't turn them off or unplug them unless you either turn off the computer or eject the volume by dragging it to the Trash.
So
a) shut down the computer then shut down the drives: yes
b) doesn't matter with my FW drives, but I suppose there could be some that will not pass the FW signals when powered off. I believe that if they are built correctly they should work like mine, with the FW communication passing through the devices that are powered down.
c) OS X should be in control of drive sleep. Some HD manufacturers have firmware that conflicts with OS X and insist upon sleeping the drives when they ought not to, or vice versa. When sleeping the computer, just ignore the drives. Do not shut them off though or else OS X will scold you for having removed them unexpectedly. They should sleep on their own, but if their firmware insists otherwise, there is nothing you can do about it.
d) This doesn't matter at all, unless (obviously) you need to boot the computer from one of the external volumes.
Read about OS X file system journaling - it should ease your mind about potential corruption: http://support.apple.com/kb/HT2355
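As a quick sanity check before powering external drives off, you can verify that their volumes are actually unmounted. This is a minimal Python sketch; the `/Volumes/...` names are hypothetical examples, and the actual ejecting is still done in the Finder (or with `diskutil unmount`).

```python
import os

# Hypothetical volume names for illustration -- substitute your own drives.
EXTERNAL_VOLUMES = ["/Volumes/Media", "/Volumes/Backup"]

def safe_to_power_off(volumes):
    """Return the subset of volumes that are no longer mounted.

    A volume that is still mounted should be ejected first (Finder or
    `diskutil unmount`) before its drive is switched off, or OS X will
    complain about an unexpected removal.
    """
    return [v for v in volumes if not os.path.ismount(v)]
```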

Similar Messages

  • Best Practice for External Libraries Shared Libraries and Web Dynpro

    Two blogs have been written on sharing libraries with Web Dynpro DC, but I would
    like to know the best practice for doing this.
    External libraries seem to work great at compile time, but when deploying there is often an error related to the external library not being a deployed component. 
    Is there a workaround for this besides creating a shared J2EE library, which I have been able to get working? I am not interested merely in something that works; I really
    want to know the best practice for this. What is the best way to limit the number of jars that need to be kept in a shared library/external library? When is a sharing reference to a service/etc. a valid approach vs. hunting down the jars in the portal libraries and storing them in an external library?


  • Best practice for external but secure access to internal data?

    We need external customers/vendors/partners to access some of our company data (view/add/edit). It's not as easy as segmenting those databases/tables/records out from existing ones (and putting separate database(s) in the DMZ where our server is). Our current solution is to have a 1433 hole from the web server into our database server. The user credentials are not in any sort of web.config but rather compiled into our DLLs, and that SQL login has read/write access to a very limited number of databases.
    Our security group says this is still not secure, but how else are we to do it? Even with a web service, there still has to be a hole somewhere. Is there any standard best practice for this?
    Thanks.

    Security is mainly about mitigation rather than 100% secure, "We have unknown unknowns". The component needs to talk to SQL Server. You could continue to use http to talk to SQL Server, perhaps even get SOAP Transactions working but personally
    I'd have more worries about using such a 'less trodden' path since that is exactly the areas where more security problems are discovered. I don't know about your specific design issues so there might be even more ways to mitigate the risk but in general you're
    using a DMZ as a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com
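One concrete mitigation the question touches on: rather than compiling SQL credentials into DLLs, load them from the environment or a protected configuration store so they can be rotated without a rebuild. A minimal Python sketch of the idea (the variable name is an assumption; the .NET equivalent would be protected configuration sections or DPAPI):

```python
import os

def get_connection_string():
    """Load the SQL connection string from the environment instead of
    compiling it into a binary, so it can be rotated without a rebuild
    and never lands in source control."""
    conn = os.environ.get("DB_CONNECTION_STRING")
    if conn is None:
        raise RuntimeError("DB_CONNECTION_STRING is not set")
    return conn
```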

  • Best practices with external drives connected to Time Capsule?

    I'm enjoying my new MBP but the storage limitations are a challenge with twenty+ years of data. I know it's not technically sanctioned by Apple but in another forum I found a discussion about successfully connecting an external drive to the Time Capsule. I tried this for a few months and had some problems, so I wonder if anyone here might have some insight into which of my drives would be best to use in this way and what, if anything, I must do to bless it for the task.
    The drive most highly recommended in the forums at the time was the WD My Book. After buying it I learned that Windows users install a package and update the driver for use with Time Machine, which Mac users are not able to do. The driver is available on the web but it's said to cause problems with Mountain Lion, which I can confirm. It's USB 3.0 and powered, though, which my two alternatives are not. It's also 2 TB and presently in two partitions. I also have a Seagate FreeAgent GoFlex 500, which gets extremely hot with any use, and an Iomega eGo Helium 500 I've been using for backups. Both are USB 2.0.
    My goal is to put my iTunes Library and Time Capsule on the network (running 2.4 GHz and 5 GHz, averaging 50 Mbps). When I did this for a few months with the WD, both of its shares frequently dropped off my Desktop, as if powered down or something. Also there were the issues with Time Machine many others are having. Right now our video collection sits on the Time Capsule. Would it make sense to move our video to the WD and run iTunes and Time Machine on it? If so, do you have any tips for formatting, partitioning, or drivers (or anything else) to help this run smoothly? The WD works fine when hard-wired, and the TC, too, is fine without the WD at the party.
    With apologies for the tome.

    kevlee64 wrote:
    Thanks, but the drive is connected directly to the back of the Time Capsule by an ethernet cable.
    Ah, that's a horse of a different color.
    You have a NAS (Network Attached Storage) drive, not a USB external drive connected to the USB port.
    Many of those need software/firmware updates to work with Lion.  Check with Iomega.

  • Best practice for # of drives for Oracle on a Windows 2003 server

    I need to know the best practice concerning the number of drives that should be configured on a Windows 2003 server to ensure the best performance and the most effective backup and recovery, for both the application itself and the database.
    I'm not certain, but it may be only a 32 bit machine. I'll update this once I know for sure.

    We are in the process of migrating our Oracle 10 database (20G) to a new machine.
    How should we configure our disks (8 in total)?
    1. SAME: "Stripe and mirror everything"?
    2. Doc 30286.1 "I/O tuning with different RAID configurations" and 148342.1 "Avoiding I/O disk contention" say:
    database files on RAID01
    redo and archive logs on RAID1
    temp on RAID1
    So, what is the best practice?

  • Best setup for external drives and backup

    I'm using Aperture to organize several thousand photos. I've gotten good advice here before about how to get started on this. I'm not a professional and I'm not an experienced user of Aperture, so don't assume a lot of knowledge when you answer. I'm wondering what the best way to set up the file storage would be. I use a MacBook Pro, so obviously that's out for storage, and financially, purchasing a more powerful desktop is out for the time being. I was thinking of purchasing a mirror drive system, like this: http://www.newertech.com/products/gmax.php
    But then, there still remains the problem of backing up the photos in case of a fire or theft, etc. I have many of them on DVD, but not with all the metadata that I've added. Can I back the external drives up to a cloud-based storage system through the wireless on the MacBook?
    Or, is the answer none of the above? What recommendations do you folks have for managing this?

    more:
    Paula-
    Mirror drives are very much less than ideal for image backup. Mirroring occurs in real time, so errors, breaks, etc. simply get copied. With images work (unlike financial work, for instance) we do not need real-time backup; we just need regular, accurate backup. Just have 2-3 external drives and rotate them regularly, always having one drive off-site (could be in your car or whatever). Back up manually rather than automatically so that you can be reasonably certain that the backup is not backing up something that just broke.
    I suggest the below workflow. Note that most important is that original images from the camera card are copied to two locations before reformatting the card and before importing into Aperture or any other images management application.
    Original images never change, so I prefer to manually copy them to two locations asap after capture: one location is the computer drive and the other is an external backup HD that itself gets backed up off site regularly. That assures me that "the pic is in the can." Until two digital files exist on different media I do not consider the pic in the can.
    Then reformat the card in-camera, not before.
    The Masters then get imported into Aperture from the Mac internal drive by reference (i.e. "Storing Files: in their current location" on the Mac internal drive). After editing is complete (may take weeks or months), from within Aperture I relocate the referenced Masters to an external hard drive for long-term storage.
    I do use Time Machine routinely on the MBP, but for the daily volatile activity on the MBP. I prefer not to have TM be my backup-of-originals protocol. Instead TM backs up the Mac internal drive on the TM schedule and I back up original images asap based on my shooting schedule. Also, the TM drive is a different drive than the drives used for long-term archiving of original image files.
    TM does back up my Library because the Library lives on the Mac internal drive but I do not assume that TM works for Library backup. I back the Library up to Vaults (on the same drives I put archives of Masters on) independent of TM. IMO one should back up image files and back up Vaults manually after verifying that what is being backed up is not broken, because automatic backup will just back up a broken Library or whatever.
    Note that Masters need only be backed up once (but to multiple backup locations) and that backup should happen immediately after copying to the hard drive from the camera card, before involving Aperture or any other images management app.
    Sorry for the redundant verbosity above but some was copied from previous posts. Also, I reinforce what Léonie said about DVDs. DVDs are way too slow, unreliable, etc. Instead rotate multiple hard drives to achieve redundancy.
    HTH
    -Allen
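The "pic is in the can" rule described above — two verified copies on different media before the camera card is reformatted — can be sketched in a few lines of Python. This is an illustrative sketch only; the checksum choice and directory layout are assumptions, not part of the workflow above.

```python
import hashlib
import shutil
from pathlib import Path

def copy_and_verify(card_dir, dest_dirs):
    """Copy every file from the camera card to each destination, then
    confirm by checksum that each copy matches the original. Only after
    this succeeds for all destinations is the card safe to reformat."""
    for src in Path(card_dir).iterdir():
        if not src.is_file():
            continue
        src_hash = hashlib.sha256(src.read_bytes()).hexdigest()
        for dest_dir in dest_dirs:
            dest = Path(dest_dir) / src.name
            shutil.copy2(src, dest)  # copy2 preserves timestamps
            if hashlib.sha256(dest.read_bytes()).hexdigest() != src_hash:
                raise IOError(f"copy of {src.name} to {dest_dir} does not match")
```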

  • What is the best practice for External Repairing

    Hi Experts,
    When I went through the thread search I found there are so many ways to repair equipment externally. But could you please tell me which is the best one (subcontracting / through a refurbishment order)?
    AR
    Edited by: Amit  Rana on Mar 17, 2010 5:46 PM

    I had a question recently that sounded something like yours. I did some research and here is what I gathered.
    You have to look at what is needed. Do you want to monitor the status and location of the faulty equipment while it is out of your company? If all you need is to track the repair cost, you can do it directly from the original maintenance order: just create a PR, convert it to a PO, receive the service, and it's done. This is a pretty simple process to begin with.
    If you want to monitor the equipment while it is out of your company, then you can do it the refurbishment (subcontracting) way. In this method, you treat the refurbishment work like a subcontracting job where you buy the repair service while providing the faulty equipment as material. If you are doing this, there are some prerequisites:
    1. Ideally your material should have split valuation to reflect the different valuations at different conditions.
    2. Your material is serialized.
    3. I am no MM expert, but your material may need to be marked as a subcontracting material (if there is any MM expert here, please help to explain this).
    4. You use a normal work order and not a refurbishment order.
    The process would be:
    1. Create a normal work order with external activity (of type PM02).  PR will be created.
    2. Create PO from the PR.
    3. GI the faulty part to the vendor against the PO.
    4. GR the refurbished part.
    5. Close the order (the usual TECO, settlement + Biz close).
    With this subcontracting method, you can monitor the faulty eq. status via the subcon monitor (Tcode ADSUBCON).
    The reason why you don't use a refurbishment order is that with a refurbishment order you GI the faulty parts to the refurbishment order itself, so you can't monitor the status via ADSUBCON.
    There is also a simpler way to go about this: you can use a refurbishment order. Fellow forumer Pete suggests updating the user status of the equipment master - pick an unused field to enter the vendor who is doing the repair.
    Hope this helps.

  • Best Practices for Service Entry Sheet Approval

    Hi All
    I'd just like to get some opinions on best practices for external service management - particularly the approval process for Service Entry Sheets.
    We have a 2-step approval process using workflow:
    1. Entry Sheet created (blocked)
    2. Workflow to the requisition creator to verify/unblock the Entry Sheet
    3. Workflow to the Cost Object owner to approve the Entry Sheet
    For high-volume users (e.g. capital projects) this is a cumbersome process - we are looking to streamline it but still maintain control.
    What do other leaders do in this area? To me mass release seems to lack control, but perhaps by using a good release strategy we could find a middle ground?
    Any ideas or experiences would be greatly appreciated.
    thanks
    AC.

    Hi,
    You can define the purchasing group (OME4) as the department and link the cost center to the department (KS02). Use the user exit for service entry sheet release, with two characteristics for the release: one for value (CESSR-LWERT) and one for department (CESSR-USRC1). Create one release class for service entry sheet release and add the value characteristic (CESSR-LWERT) and the department characteristic (CESSR-USRC1). You can then design release strategies for the service entry sheet based on department and value, so that the SES is created and then released by users whose release codes are assigned to them by department and value.
    Regards,
    Biju K

  • Best practice for integrating oracle atg with external web service

    Hi All
    What is the best practice for integrating oracle atg with external web service? Is it using integration repository or calling the web service directly from the java class using a WS client?
    With Thanks & Regards
    Abhishek

    Using the Integration Repository might cause performance overhead depending on the operation you are doing. I have never used the Integration Repository for 3rd-party integration, therefore I am not able to comment on it.
    Calling the web service directly as a Java client is an easy approach, and you can use the ATG component framework to support that by making the endpoint, security credentials, etc. configurable properties.
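The "configurable properties" idea — injecting the endpoint and credentials rather than hard-coding them in the client — can be illustrated with a small sketch. This is Python rather than ATG's Nucleus components, and the class and property names are hypothetical; the shape of the design is the point.

```python
class ServiceClient:
    """Thin web-service client whose endpoint and credentials are injected
    as configuration (mirroring ATG component properties) instead of being
    hard-coded, so environments can differ without code changes."""

    def __init__(self, endpoint, api_user, timeout_seconds=10):
        self.endpoint = endpoint
        self.api_user = api_user
        self.timeout_seconds = timeout_seconds

    def request_url(self, operation):
        """Build the full URL for a given operation name."""
        return f"{self.endpoint.rstrip('/')}/{operation}"
```

In ATG the equivalent would be a `.properties` file setting these values on a Nucleus component; here they are just constructor arguments.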
    Cheers
    R
    Edited by: Rajeev_R on Apr 29, 2013 3:49 AM

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to setup/manage the admin, user logins and sharing privileges for the below setup:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video Editing Suite: Want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines are to be able to connect to the network, peripherals and external hard drive. Also, I would like to setup drop boxes as well to easily share files between the computers (I was thinking of using the external harddrive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more and detail information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, and for conscientious computing in general.  I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptible Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to have a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning, and it will be there if/when you have a failure or loss of data.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel
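A rough way to sanity-check a UPS run-time figure like the one above is to divide the battery's usable energy by the load. The battery capacity below is an assumption for illustration (a 1300 VA consumer UPS is often in this ballpark); check your model's spec sheet for the real number.

```python
def runtime_minutes(battery_watt_hours, load_watts, inverter_efficiency=0.85):
    """Rough UPS runtime estimate: usable battery energy divided by load.

    Real runtime is shorter at high loads (batteries deliver less energy
    when discharged quickly), so treat this as an upper-bound sketch.
    """
    return battery_watt_hours * inverter_efficiency / load_watts * 60

# Assumed ~93 Wh battery at the 171 W load mentioned above.
print(round(runtime_minutes(93, 171)))  # about 28 minutes
```

That lands in the same range as the 20-25 minutes quoted above once discharge losses are accounted for.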

  • Best Practices for new iMac

    I posted a few days ago re failing HDD on mid-2007 iMac. Long story short, took it into Apple store, Genius worked on it for 45 mins before decreeing it in need of new HDD. After considering the expenses of adding memory, new drive, hardware and installation costs, I got a brand new iMac entry level (21.5" screen,
    2.7 GHz Intel Core i5, 8 GB 1600 MHz DDR3 memory, 1TB HDD running Mavericks). Also got a Superdrive. I am not needing to migrate anything from the old iMac.
    I was surprised that a physical disc for the OS was not included. So I am looking for any Best Practices for setting up this iMac, specifically in the area of backup and recovery. Do I need to make a boot DVD? Would that be in addition to making a Time Machine full backup (using external G-drive)? I have searched this community and the Help topics on Apple Support and have not found any "checklist" of recommended actions. I realize the value of everyone's time, so any feedback is very appreciated.

    OS X has not been officially issued on physical media since OS X 10.6 (arguably 10.7 was issued on some USB drives, but this was a non-standard approach for purchasing and installing it).
    To reinstall the OS, your system comes with a recovery partition that can be booted to by holding the Command-R keys immediately after hearing the boot chimes sound. This partition boots to the OS X tools window, where you can select options to restore from backup or reinstall the OS. If you choose the option to reinstall, then the OS installation files will be downloaded from Apple's servers.
    If for some reason your entire hard drive is damaged and even the recovery partition is not accessible, then your system supports the ability to use Internet Recovery, which is the same thing except instead of accessing the recovery boot drive from your hard drive, the system will download it as a disk image (again from Apple's servers) and then boot from that image.
    Both of these options will require you have broadband internet access, as you will ultimately need to download several gigabytes of installation data to proceed with the reinstallation.
    There are some options available for creating your own boot and installation DVD or external hard drive, but for most intents and purposes this is not necessary.
    The only "checklist" option I would recommend for anyone with a new Mac system, is to get a 1TB external drive (or a drive that is at least as big as your internal boot drive) and set it up as a Time Machine backup. This will ensure you have a fully restorable backup of your entire system, which you can access via the recovery partition for restoring if needed, or for migrating data to a fresh OS installation.
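The rule of thumb above — a backup drive at least as big as the internal boot drive — is easy to check programmatically. A small sketch (the mount point of the external drive is a hypothetical example; actually designating it for Time Machine is done in System Preferences or with `tmutil setdestination`):

```python
import shutil

def big_enough_for_time_machine(backup_mount, boot_mount="/"):
    """Return True if the backup volume's total capacity is at least as
    large as the boot volume's, per the rule of thumb for Time Machine."""
    backup = shutil.disk_usage(backup_mount).total
    boot = shutil.disk_usage(boot_mount).total
    return backup >= boot

# e.g. big_enough_for_time_machine("/Volumes/G-DRIVE")
```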

  • Best practice for iTunes' music folder

    I keep my music on an external drive, but want iTunes to be able to play the songs.
    Currently, the iTunes Music folder is set to its default location. I changed the preference to prevent iTunes from copying music to this location, and I added music to iTunes using the File | 'Add Folder to Library' menu.
    My friend, who also has his music on an external drive, set his iTunes Music folder to the Music folder on his external drive.
    What are the differences between these two approaches? What are the issues?
    Is there a best practice for using iTunes with music stored on an external drive?
    Thanks for your time.
    Craig

    Thanks, Paul, for helping.
    I am getting the symbol and can locate the song, but it is very time consuming and I can't do whole albums.
    I tried dragging the entire music folder into iTunes. Is this it, iTunes Music Library.xml? These are all the files and folders I found:
    iTunes 3 Music Library Data file
    iTunes 4 Music Library Data File
    iTunes 4 Music Library (Old) Data File
    iTunes Music folder
    iTunes Music Library.xml document
    Temp File Document
    I unchecked "Copy files to iTunes Music folder"
    before I dragged the .xml document onto the iTunes symbol in the Dock.
    This seems to have made matters worse. Now I can't find the file at all except through the Finder.
    Remember this is 10.3.9 with v4.7
    Powerbook   Mac OS X (10.4.6)   Panther eMac

  • What are best practices for managing my iphone from both work and home computers?

    What are best practices for managing my iphone from both work and home computers?

    Sync iPod/iPad/iPhone with two computers
    Although it isn't possible to sync an Apple device with two different libraries it is possible to sync with the same logical library from multiple computers. Each library has an internal ID and when iTunes connects to your iPod/iPad/iPhone it compares the local ID with the one the device normally syncs with. If they are the same you can go ahead and sync...
    I have my library cloned to a small 1 Tb USB drive which I can take between home & work. At either location I use SyncToy 2.1 to update the local copy with the external drive. Mac users should be able to find similar tools. I can open either of the local libraries or the one on the external drive and update the media content of my iPhone. The slight exception is Photos, which normally connects to a specific folder on a specific machine, although that can easily be remapped to the current library if you create a "Photos" folder inside the iTunes Media folder so that syncing the iTunes folders keeps this up to date as well. I periodically sweep my library for new files & orphans with iTunes Folder Watch just in case I make changes at one location but then overwrite the library with a newer copy from the other. Again, Mac users should be able to find similar tools.
    As long as your media is organised within an iTunes Music or iTunes Media folder, in turn held inside the main iTunes folder that has your library files (whether or not you let iTunes keep the media folder organised), each library can access items at the same relative path from the library folder, so the library can be at different drives/paths on different machines. This solution ensures I always have adequate backups of my library and I can update my devices whenever I can connect to the same build of iTunes.
    When working with an iPhone, earlier builds of iTunes would remove any file not physically present in the local library, even if there was an entry for it, making manual management practically redundant on the iPhone. This behaviour has been changed, but it will still only permit manual management with a library that has the correct internal ID. If you don't want to sync your library between machines on a regular basis, just copy the iTunes Library.itl file from the current "home" machine to any other you want to use, then clean out the library entries and import the local content you have on that box.
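The SyncToy-style one-way update described above (copy anything missing or newer from the travel drive to the local copy, without propagating deletions) can be sketched in Python for Mac users looking for "similar tools". The paths and the mtime-based comparison are illustrative assumptions, not how SyncToy itself works internally.

```python
import shutil
from pathlib import Path

def mirror(source, target):
    """One-way sync sketch: copy files that are missing from the target
    or newer on the source. Deletions are NOT propagated -- sweep for
    orphans separately, as described above."""
    source, target = Path(source), Path(target)
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dest = target / src.relative_to(source)
        if not dest.exists() or src.stat().st_mtime > dest.stat().st_mtime:
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves the timestamp
```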
    tt2

  • (Request for:) Best practices for setting up a new Windows Server 2012 r2 Hyper-V Virtualized AD DC

    Could you please share your best practices for setting up a new Windows Server 2012 r2 Hyper-V Virtualized AD DC, that will be running on a new WinSrv 2012 r2 host server.   (This
    will be for a brand new network setup, new forest, domain, etc.)
    Specifically, your best practices regarding:
    the sizing of non virtual and virtual volumes/partitions/drives,  
    the use of sysvol, logs, & data volumes/drives on hosts & guests,
    RAID levels for the host and the guest(s),  
    IDE vs SCSI drives and drivers, both non-virtual and virtual, and the booting thereof,  
    disk caching settings on both host and guests.  
    Thanks so much for any information you can share.

    A bit of non-essential additional info:
    We are a small-to-midrange school district who, after close to 20 years on Novell networks, have decided to design and create a new Microsoft network and migrate all of our data and services
    over to the new infrastructure.  We are planning on rolling out 2012 r2 servers with as much Hyper-V virtualization as possible.
    During the last few weeks we have been able to find most of the information we need to undertake this project, and most of the information was pretty solid with little ambiguity, except for
    the information regarding virtualizing the DCs, which has been a bit inconsistent.
    Yes, we have read all the documents that most of these posts tend to point to, but found that some, if not most, are still referring to performing this under Srvr 2008 r2, and we haven't really
    seen all that much on Srvr 2012 r2.
    We have read these and others:
    Introduction to Active Directory Domain Services (AD DS) Virtualization (Level 100), 
    Virtualized Domain Controller Technical Reference (Level 300),
    Virtualized Domain Controller Cloning Test Guidance for Application Vendors,
    Support for using Hyper-V Replica for virtualized domain controllers.
    Again, thanks for any information, best practices, cookie cutter or otherwise that you can share.
    Chas.
