What's a good offsite backup?

What is a good on-line backup service for 2+ TB?

I would make physical backups using this information (https://discussions.apple.com/docs/DOC-3107) and keep a third copy at a friend's house that you rotate through.  2 TB is slow to transfer online even for simple files, and I do not like relying on the network for my personal files anyway.
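To put the "2 TB is slow" point in rough numbers, here is a quick back-of-the-envelope calculation; the 20 Mbit/s uplink speed is an assumed example, not a figure from this thread:

    # Rough estimate of how long an initial 2 TB upload takes at a given link speed.
    # The 20 Mbit/s uplink is an assumed example, not a measured value.
    data_tb = 2                                  # terabytes to transfer
    uplink_mbps = 20                             # assumed sustained upload speed, Mbit/s

    data_bits = data_tb * 10**12 * 8             # decimal TB -> bits
    seconds = data_bits / (uplink_mbps * 10**6)  # transfer time in seconds
    print(f"{seconds / 86400:.1f} days")         # about 9.3 days at 20 Mbit/s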

Similar Messages

  • HT201250 if my drive with my time machine backup becomes corrupt, what is a good tool to recover it?

    if my drive with my time machine backup becomes corrupt, what is a good tool to recover it?

     Data recovery efforts explained
     Most commonly used backup methods explained

  • What to do for getting a good battery backup for E...

    I bought the E6 after a convincing conversation at the Nokia Priority Partner shop, where the executive told me that the E6 is a smartphone and also gives good battery backup. Only on this condition did I buy the phone, but to my misfortune it does not have good battery backup. Every morning I need to charge the battery or the phone is dead, which is annoying as well as disappointing. I have also suffered software issues with my E6: three times I have needed its software reinstalled, and I am still having issues with it. The priority centre only tells me each time to get the software reinstalled and my E6 will work fine. I'm surprised that the E6 only borrows my time to get it checked again and again.
    No doubt this phone is one of the best Nokia has ever made, but if this issue could be sorted out I would rank the E6 the best in its class.

    Which country did you buy your phone in?
    There are many posts dealing with this issue and steps to be taken to increase battery life.
    Some important ones are:
    1. Switch off 3G if you do not need it.
    2. Reduce the mail retrieval frequency if you have configured email accounts.
    3. Switch off automatic WiFi scanning.
    4. Reduce number of homescreens.
    5. Close any background application that you do not need.
    6. Switch to power saving mode when you do not actively use the phone for longer periods.

  • What's a good way to do a thread dump into a separate file

    What is a good way to do a thread dump automatically into a separate file?
    For example, I run a script to do the thread dump, but unfortunately it goes into my stdout log file with the rest of my WebLogic errors.
    Any ideas? I want it in a separate file when I run my script.
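    One way to capture the dump in its own file rather than the stdout log is to drive it with a small script (a sketch only; it assumes a HotSpot JDK with jstack on the PATH, and the PID and output path below are placeholders):

        # Write a JVM thread dump to its own timestamped file instead of stdout.
        # Assumes a HotSpot JDK with `jstack` on the PATH; PID and path are placeholders.
        import datetime
        import subprocess

        pid = 12345                                    # hypothetical WebLogic server PID
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        outfile = f"/var/log/weblogic/threaddump-{stamp}.txt"

        with open(outfile, "w") as f:
            subprocess.run(["jstack", str(pid)], stdout=f, check=True)
        print(f"Thread dump written to {outfile}")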

    Do a Google search on "Drobo S" "benchmark."  I don't have a Drobo S, only the regular Drobo.  But here's a guy who tested one on Windows:
    http://mansurovs.com/drobo-s-review-usb-3-0-2nd-generation
    This one has it a bit faster:
    http://the-gadgeteer.com/2011/12/31/drobo-s-storage-array-review/
    Do read up on a few reviews of it, and be absolutely clear that interface speed (e.g. eSATA versus FireWire versus Thunderbolt) is NOT the same as the performance of the system.  The Drobo cannot saturate any of those interfaces... at least the Drobo and the Drobo S cannot.
    I am not using the FS model which is a NAS.  I am using the plain old "Drobo" which is slower than the Drobo S, but that's not to say that the Drobo S is fast, because it is not.
    The Drobo in theory is really attractive: Dead simple to manage, can mix and match drive sizes, offers you some data protection, etc.  However do note that protected storage is not, in and of itself, a backup.  You need other backups besides just the data on the Drobo.  And, because it's so slow, it's really not a great fit for photo storage.  See this review from a guy who used to think the Drobo was great for that and then appended his review:
    http://www.stuckincustoms.com/drobo-review/
    To be as clear as possible, IMO the BEST backup strategy with something like Aperture (so long as your managed Aperture library is of a manageable size, like < 800 GB), is to get a few small portable Firewire 800 drives and keep vaults on each one.  They are great because they are easy to use, to have with you, are bus powered, and you WILL offsite them.

  • Best method for incremental offsite backups?

    I currently have multiple Macs and a Windows PC running XP.  I also have a 1TB Time Capsule, so all the Macs are backed up via Time Machine.  However, I want to have offsite backups as well, so I purchased 2 matching Seagate GoFlex 3TB drives.  My plan was to back up everything, then keep one drive offsite and the other at home, away from the computers.
    However, it takes a very long time to move the drives from machine to machine, backing up all the files.  I would like to find a good solution to do an initial backup, then do incremental backups after that.  It would also be great if I could hook the drives into the network, so I didn't have to move them around to the various computers.
    So, I have a few questions:
    Can I hook up the external drive to the Time Capsule via the USB port, point Time Machine to the external drive and do a backup that way?  If so, will it screw up my regular Time Machine backup to the Time Capsule?  Will it just do an incremental the next time I hook up the external drive to the Time Capsule?
    If Time Machine is not the answer, is there third party software to do it?
    How do I handle the Windows machine backup?
    Is there a better option than doing incrementals?
    Keep in mind--I've tried cloud solutions like Carbonite and Mozy, and neither worked for me based on the large multimedia files I use.  I could never upload fast enough to make it practical.  That's why I went to external drives for offsite backup.   I also don't want to leave the external drives hooked up--just hook them up long enough to do the backups.
    Any suggestions would be appreciated.

    I would suggest CrashPlan.  They offer a mode that allows you to use your own computers (or even your own local devices) as the backup destination.  CrashPlan runs on Windows and Macs.
    I currently use CrashPlan to back up my laptop to an in-house Mac mini.  I think that as long as your Macs and PCs can see the Seagate GoFlex 3TB drives, whether attached to the Time Capsule or to any Mac or PC in the house, then CrashPlan on any device at home should be able to incrementally back up each system.  And if the GoFlex is attached to a system in the house that is running a CrashPlan server, you could even use CrashPlan to back up a laptop while you are away on vacation or business trips.
    You will be using your own local (hopefully faster) home network, which should be much better than the Carbonite and Mozy options, which require pushing your data out of the house over your generally much slower up-link broadband connection.
    NOTE:  CrashPlan (again, the free option) lets you specify a computer that you control (yours, or a family member's, or a friend's) as the backup target.
    An alternative for Macs ONLY would be SuperDuper (shareware) and/or Carbon Copy Cloner (donationware), which do very good incremental backups.  Generally both are used with direct-attached external disks, and they make bootable clones.  HOWEVER, both of them can be used to back up over the network.  In the case of SuperDuper, you would mount the remote volume and then back up into a .dmg container file.  With Carbon Copy Cloner you can back up over an ssh connection to a directory subtree on a remote system.  I think it should also be possible for Carbon Copy Cloner to use a container file (similar to SuperDuper).
    Sorry, but I do not know what backup utilities are good on Windows.
    Finally, you might be able to roll your own using rsync.  In fact Carbon Copy Cloner uses rsync as its copy engine, and the rsync buried in CCC is generally the most up-to-date rsync for Mac OS X.  Not sure if rsync is available for Windows, but if it is, then that might be something to think about.
    Message was edited by: BobHarris
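    If you do try the roll-your-own rsync route mentioned above, a minimal sketch could look like this (the paths are placeholders; -a preserves metadata and --delete mirrors deletions so the copy stays an exact incremental mirror):

        # Minimal incremental mirror with rsync, as a roll-your-own alternative.
        # Source and destination paths are placeholders for your own layout.
        import subprocess

        src = "/Users/me/Pictures/"                   # hypothetical source folder
        dest = "/Volumes/GoFlex-Offsite/Pictures/"    # hypothetical offsite drive mount

        subprocess.run(
            ["rsync", "-a", "--delete", "--exclude", ".Trashes", src, dest],
            check=True,
        )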

  • Offsite Backup

    I recently switched to a MacBook Pro from a Windows laptop running XP. A year ago I set up an offsite backup process using an account with IBackup and their IBackup for Windows software, and I am now struggling to replace it. Backup in Tiger is limited to 90 MB. I have about 12 GB of data that I want backed up offsite and accessible from anywhere (through the web), as I am travelling regularly.
    I also set up .Mac with 20 GB but am looking for adequate software to back up to it.
    Is there any third-party software you know of that would do the job, or any other idea? I tried syncing, but it does not really work for that.
    Help would be appreciated.

    Consider buying a larger portable HD, so you can make a "bootable clone" on it, via Carbon Copy Cloner, SuperDuper, or the like.
    They aren't very expensive any more, and you won't have to worry about what you missed, reinstalling apps, your "older" drive failing, etc.
    Several folks on these forums, including me, have one of the LaCie "Rugged" models, just as an example, and they get very good reviews: http://www.lacie.com/us/products/range.htm?id=10036
    CCC is donationware; SD has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling. Either is easily found via Google.

  • Informal poll: breaking raid as offsite backup

    This isn't specific to OS X Server, but I was wondering what other people think about breaking a mirrored RAID (external or internal hot-swap) and swapping in a spare as a means of providing easy offsite backup.
    Thanks for any feedback. I'll post my thoughts in a bit, but am looking for others' take on this.

    The following discussion has a lot of good information you may want to read:
    http://discussions.apple.com/thread.jspa?messageID=6266460
    I've seen this method mess things up a few times, so I wouldn't rely on it. RAID1 was intended for redundancy in case of drive failures, not for backups. There are several things that can go wrong when you break the mirror, especially if the system is live.

  • How do I create an Oracle 10g offsite backup

    Total size of the datafiles that make up the tablespaces of the DB = 1,532,121.6875 MB.
    This size will eventually grow to about 6-7 terabytes.
    I used the following query to determine the above size:
    SELECT sum(bytes/1048576) FROM dba_extents;
    I saw this question posted on another site.
    QUESTION POSED ON: 15 August 2007
    I recently handled a project whereby we have many site servers at various places in our country and there is a central server in Delhi (India). My project is to take a backup of the local server and post that backup to the central server via Internet. I chose the RMAN (Recovery Manager) approach. Using that approach I can back up the database and connect to the remote database. But I can't find how to post the backup to the remote server. Please give me guidance regarding this project.
    This was the answer:
    EXPERT RESPONSE
    Basically, there are two options to have RMAN back up to a remote server. One, back up locally and then FTP the backup to the remote server. Two, create a network file share from the database server to the remote server. Have RMAN back up to the network file share. It will think the file system is local, but in actuality, the file system is in another location.
    Currently a level 1 incremental backup is done at 10:00PM every night to the array (this takes approximately 1 hour to be completed). I am interested in doing a full hot offsite backup of the DB to a disk on another server.
    Can anyone tell me how I would be able to do this using RMAN and/or backing up locally and then FTP'ing the backup to a remote server?
    Can anyone also confirm if the calculation for the size of the DB is accurate?
    ( I did the calculations based on what I saw at this site and this site)
    I would like to do this in preparation for a firmware upgrade that our Sun support would like to do on the array; I would like to implement a script (or maybe use RMAN) to do a nightly offsite backup. I would still need to do the DB backup to the array...but what files do I copy off the array (to the remote server) for an offsite backup of the DB?
    I would really appreciate any help.
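    As a quick sanity check on the size figure quoted above (a rough sketch; note that summing bytes over dba_extents counts allocated extents only, so it can come out smaller than the total datafile size shown by dba_data_files):

        # Convert the dba_extents total quoted above (1,532,121.6875 MB) to GB and TB.
        allocated_mb = 1532121.6875
        print(f"{allocated_mb / 1024:.1f} GB")            # ~1496.2 GB
        print(f"{allocated_mb / 1024 / 1024:.2f} TB")     # ~1.46 TB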
    MY CURRENT RMAN CONFIGS
    using target database control file instead of recovery catalog                
    RMAN configuration parameters are:                                            
    CONFIGURE RETENTION POLICY TO REDUNDANCY 1;                                   
    CONFIGURE BACKUP OPTIMIZATION OFF; # default                                  
    CONFIGURE DEFAULT DEVICE TYPE TO DISK; # default                              
    CONFIGURE CONTROLFILE AUTOBACKUP ON;                                          
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '%F'; # default
    CONFIGURE DEVICE TYPE DISK PARALLELISM 2 BACKUP TYPE TO BACKUPSET;            
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default         
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default       
    CONFIGURE MAXSETSIZE TO UNLIMITED; # default                                  
    CONFIGURE ENCRYPTION FOR DATABASE OFF; # default                              
    CONFIGURE ENCRYPTION ALGORITHM 'AES128'; # default                            
    CONFIGURE ARCHIVELOG DELETION POLICY TO NONE; # default                       
    CONFIGURE SNAPSHOT CONTROLFILE NAME TO '/opt/oracle/product/10.2.0/dbs/snapcf_MYSID.f'; # default
    I am using the following software/hardware :
    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit
    PL/SQL Release 10.2.0.3.0 - Production
    CORE 10.2.0.3.0 Production
    TNS for Solaris: Version 10.2.0.3.0 - Production
    NLSRTL Version 10.2.0.3.0 - Production
    Recovery Manager: Release 10.2.0.3.0 - Production
    SunOS myserver 5.9 Generic_118558-34 sun4u sparc SUNW,Netra-T12
    SunFire E2900 Server with Sun Storage Tek 6140

    Thanks for the help, I took a look at the script. Suppose, however, that I didn't or couldn't make any modification to the Oracle RMAN settings (without voiding my support contract). Could I simply copy the necessary DB files to a remote server, and would I be able to do a restore (if necessary) from those copied files?
    As I have pointed out before, currently an incremental level 1 backup (to the array) is done nightly at 10:00 PM (and it takes approximately 1 hour).
    [....I am also working on updating the support contract ]
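    For the "back up locally, then FTP it to the remote server" option from the expert response quoted above, a minimal sketch might look like the following; the host, credentials, directory, and backup piece name are all placeholders, not values from this thread:

        # Push a locally written RMAN backup piece to a remote server over FTP.
        # Host, credentials, and file names are placeholders; in practice you would
        # also verify the transfer (sizes/checksums) before relying on it.
        import os
        from ftplib import FTP

        backup_piece = "/backup/MYSID/ora_df123456789_s42_p1.bkp"  # hypothetical file

        with FTP("remote-backup-host.example.com") as ftp:
            ftp.login("backupuser", "secret")
            ftp.cwd("/offsite/MYSID")
            with open(backup_piece, "rb") as f:
                ftp.storbinary(f"STOR {os.path.basename(backup_piece)}", f)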

  • What is the best online backup for OS 10.5.8? And if I Reinstall OS will I have problems?

    My old Mac has had the flashing question mark file on a blue screen. I waited a couple days, turned it back on and it worked. However, in between that time I went and bought a new Macbook Air. I am relieved to have all of my pics and docs back. I need a good online backup. I downloaded BackBlaze Backup but it is taking forever and I am not sure if it's the best choice.
    My plan is to back up all of my files and then wipe my computer clean with the install disks and give my computer to my little sister. So, my question is twofold: which is the best online backup, and will I have any issues wiping the computer clean and doing a reinstall? For example, will all the updates still be available? What is the best thing for me to do?
    Thank you : )

    Just understand it gets pretty expensive after you exceed the free space; 2-5 GB isn't much space.
    Basic Backup
    For some people Time Machine will be more than adequate. Time Machine is part of OS X. There are two components:
    1. A Time Machine preferences panel as part of System Preferences;
    2. A Time Machine application located in the Applications folder, used to manage backups and to restore backups.
    Time Machine requires a backup drive that is at least twice the capacity of the drive(s) it backs up.
    Alternatively, get an external drive at least equal in size to the internal hard drive and make (and maintain) a bootable clone/backup. You can make a bootable clone using the Restore option of Disk Utility. You can also make and maintain clones with good backup software. My personal recommendations are (order is not significant):
      1. Carbon Copy Cloner
      2. Get Backup
      3. Deja Vu
      4. SuperDuper!
      5. Synk Pro
      6. Tri-Backup
    Visit The XLab FAQs and read the FAQ on backup and restore.  Also read How to Back Up and Restore Your Files.  For help with all things Time Machine, visit Pondini's Time Machine FAQ.
    Although you can buy a complete external drive system, you can also put one together if you are so inclined.  It's relatively easy and only requires a Phillips head screwdriver (typically.)  You can purchase hard drives separately.  This gives you an opportunity to shop for the best prices on a hard drive of your choice.  Reliable brands include Seagate, Hitachi, Western Digital, Toshiba, and Fujitsu.  You can find reviews and benchmarks on many drives at Storage Review.
    Enclosures for FireWire and USB are readily available.  You can find only FireWire enclosures, only USB enclosures, and enclosures that feature multiple ports.  I would stress getting enclosures that use the Oxford chipsets especially for Firewire drives (911, 921, 922, for example.)  You can find enclosures at places such as;
      1. Cool Drives
      2. OWC
      3. WiebeTech
      4. Firewire Direct
      5. California Drives
      6. NewEgg
    All you need do is remove a case cover, mount the hard drive in the enclosure and connect the cables, then re-attach the case cover.  Usually the only tool required is a small or medium Phillips screwdriver.

  • What is a good strategy for keeping the OEM GRID up and running?

    Hi,
    What is a good strategy for keeping OEM Grid up and running? Currently we have Grid installed on one 64-bit Linux box with 4 CPUs and 8 GB of RAM (OMS 10.2.0.4, repository DB 10.2.0.4, agent 10.2.0.4). In addition to using this OEM Grid for notifications and performance monitoring, we are also using it to schedule over 700 jobs running across 30 targets.
    What is a good strategy for backing up the Grid so that when this box goes down we can restore and then recover it? Please point me to white papers or documentation. Thank you.

    Take a look at Oracle Maximum Availability Architecture, a set of architectural recommendations on setting up Oracle software for high availability.
    The information is available on this page:
    http://www.oracle.com/technology/deploy/availability/htdocs/maa.htm
    Look under HA Best Practices for Grid Control.
    Chung Wu
    Application Management Blog: http://www.appmanagementblog.com/

  • What is a good storage size when purchasing an iPad?

    what is a good storage size when purchasing an iPad?

    Keep in mind >  How Mac OS X and iOS report storage capacity
    Actual capacity on a 16GB iPad is approximately 13.82GB.
    Video files can take up space fast, not to mention other data such as Mail, Contacts, Calendars, Photos, Music, Notes, Apps, and Bookmarks, so you may want to keep that in mind before you select the 16GB model.
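    As a rough illustration of where part of the gap between the advertised 16GB and the usable ~13.82GB comes from (decimal-versus-binary reporting is one common source; the rest is iOS and formatting overhead, and the exact split depends on the iOS version):

        # Advertised flash capacity is decimal gigabytes; a binary (GiB) report shrinks it.
        advertised_gb = 16
        bytes_total = advertised_gb * 10**9
        print(f"{bytes_total / 2**30:.2f} GiB")   # ~14.90 GiB before iOS and formatting
        # The remaining difference down to ~13.82 is OS and formatting overhead.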
    Your Mac must be running at the minimum v10.5.8 in order to sync, restore, and backup an iPad.
    Mac system requirements
    Mac computer with USB 2.0 port
    Mac OS X v10.5.8 or later
    iTunes 9.1 or later (free download from www.itunes.com/download)
    iTunes Store account

  • Best way to do offsite backups of iMac?

    I have a new iMac -- the first Mac for a longtime Windows user...
    One of my normal practices with the PC was to create an offsite backup each month, which is stored in my desk at work. In the past, I've always used DVDs for the backup, but now that my iTunes and digital photo libraries are growing, that's getting to be too much of a chore and requires too many disks to do easily. So, I'd like to start doing my offsite on a portable hard drive I can store in the office and just take home once a month for the backups.
    I'm already using Time Machine to back up my iMac to a G-Tech external hard drive, but for the offsite backup, I'd prefer to only back up my iTunes library, photos, and documents. I don't see a way to do that with Time Machine.
    What's the best way to have only these folders backed up to an external hard drive?

    ecd1211, Welcome to the discussion area!
    I'd prefer to only back up my iTunes library, photos, and documents. I don't see a way to do that with Time Machine.
    You can customize Time Machine to only do the folders you want.
    In Time Machine preferences, choose Backup Disk, then click Options. A list of locations that will not be backed up appears.
    To add a new "do not back up" location to the listing, click the "+" button below the list, navigate to the location you want to exclude and click Exclude.
    To delete a listing, select that listing and click the "-" button.
    Click Done to return to Time Machine preferences.
    Drag the Time Machine OFF-ON slider to turn Time Machine on or off.
    KB 306681, Mac 101: Using Time Machine in Mac OS X 10.5 Leopard
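    On later versions of OS X (10.7 and newer) the same exclusions can also be scripted with tmutil; this does not apply to the Leopard-era steps above, and the folder paths below are placeholders:

        # Add Time Machine exclusions from a script using `tmutil` (OS X 10.7+ only).
        # Excluding everything you don't want is how you approximate "back up only
        # these folders"; the paths below are placeholders.
        import subprocess

        exclude_paths = [
            "/Users/me/Downloads",
            "/Users/me/VirtualMachines",
        ]
        for path in exclude_paths:
            subprocess.run(["tmutil", "addexclusion", path], check=True)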

  • Development System Backup - Best Practice / Policy for offsite backups

    Hi, I have not found any recommendations from SAP on best practices for backing up Development systems offsite, so I would appreciate some input on what policies other companies have. We continuously make enhancements to our SAP systems and perform daily backups; however, we do not send any Development system backups offsite, which I feel is a risk (losing development work, losing transport & change logs...).
    Does anyone know whether SAP has any recommendations on backing up Development systems offsite? What policies does your company have?
    Thanks,
    Thomas

    Thomas,
    Your question does not mention consideration of both sides of the equation - you have mentioned the risk only.  What about the incremental cost of frequent backups stored offsite?  Shouldn't the question be how the 'frequent backup' cost matches up with the risk cost?
    I have never worked on an SAP system where the developers had so much unique work in progress that they could not reproduce their efforts in an acceptable amount of time, at an acceptable cost.  There is typically nothing in dev that is so valuable as to be irreplaceable (unlike production, where the loss of 'yesterday's' data is extremely costly).  Given the frequency with which an offsite dev backup is actually required for a restore (seldom), and given that the value of the daily backed-up data is already so low, the actual risk cost is virtually zero.
    I have never seen SAP publish a 'best practice' in this area.  Every business is different, and I don't see how SAP could possibly make a meaningful recommendation that would fit yours.  In your business, the risk (the pro-rata cost of infrequently needing to use offsite storage to replace or rebuild 'lost' low-cost development work) may in fact outweigh the ongoing incremental costs of creating and maintaining offsite daily recovery media. Your company will have to perform that calculation to make the business decision.  I personally have never seen a situation where daily offsite backup storage of dev was even close to making any kind of economic sense.
    Best Regards,
    DB49

  • What is a good way to load 80 million documents in DocumentDB?

    I am having problems loading a large set of data.  We want to load 80 million documents.  We are trying to do testing for an IoT solution that will end up having a lot of data in it.  I followed the instructions to use a stored procedure to do a bulk load provided by Ryan CrawCour on the Microsoft site:
    https://code.msdn.microsoft.com/windowsazure/Azure-DocumentDB-NET-Code-6b3da8af
    But it throws exceptions when we load 2,000 - 5,000 documents.  Our documents are only about 80 characters.
    Error: One or more errors occurred. Message: Exception: Microsoft.Azure.Documents.RequestRateTooLargeException, message: {"Errors":["Request rate is large"]}
    What is a good way to load large datasets?  ( Load backups, migrate data, ... )  Or is DocumentDB just a wrong choice when you have millions of rows to load?
    Thanks

    Hi,
    I have been working on this for around one month and I am happy to say that my code works. I was able to upload around 0.5 million documents in 20 minutes.
    The configuration was DocumentDB at the S3 tier, sharded across 16 collections; I think you need to shard out more. How much more will depend on how much space each document takes.
    The calculation goes like this: if each document is, say, 2 KB on average and you have 80 million documents, that comes out to about 160 GB.
    That means you will need more than 16 collections, since one collection can hold at most 10 GB of data. To be on the safe side I would go for three times that, so 48 collections. If all of them are at S3 then you have 2,500 RUs spread across the 48 collections, and I am sure that if you do the insertion now you will not get the "Request rate is large" exception.
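    Restating that sizing arithmetic as a quick calculation (the 2 KB average document size is the assumption made above, and the 10 GB per-collection cap reflects the DocumentDB limits of that era):

        # Back-of-the-envelope collection sizing for the 80 million document load.
        doc_count = 80_000_000
        avg_doc_bytes = 2_000                   # assumed ~2 KB average document
        collection_limit_gb = 10                # per-collection storage cap at the time

        total_gb = doc_count * avg_doc_bytes / 10**9            # 160 GB
        min_collections = -(-total_gb // collection_limit_gb)   # ceiling division -> 16
        print(total_gb, int(min_collections), int(min_collections) * 3)   # 160.0 16 48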
    I have come up with this code hopefully it will help you as well.
    https://social.msdn.microsoft.com/Forums/azure/en-US/d036afe2-78ec-45ee-8b0d-297f0f5320fe/azure-documentdb-bulk-insert-using-stored-procedure.
    For Sharding you can have look at
    https://msdn.microsoft.com/en-us/library/dn589797.aspx?f=255&MSPPError=-2147217396.

  • Special RAID 1 with offsite backup?

    In order to ditch tape backups, here's what we're trying to accomplish, but I am a bit confused on how to do it most easily: I want to set up drives 1 and 2 for onsite data redundancy, and use drive 3 with a pair of hard drive trays for offsite backup (one tray always in bay 3, the other always offsite, with a weekly switch back and forth).
    Would this be the appropriate way to accomplish this?
    1. Set up Xserve drive bays 1 and 2 as mirrors (RAID 1 - this part I can do just fine).
    2. Use a drive cloning program to create a snapshot of drive 1 onto drive 3, then take drive 3 out and offsite, and every week just redo the cloning process?

    That will work. Just remember to unmount drive3 before removing it.
    I do something like this on a daily basis. I run my root drive as just a drive. In the morning, after I've checked logs, I clone it to the drive in the middle bay and the drive in the right bay. I send the right drive offsite. In the evening, after the root drive has had its important changes, I clone it to the other drives again. BTW, I do these clones while the machine is being used.
    A RAID 1, while giving you an up-to-the-millisecond backup, will also have up-to-the-millisecond mistakes that may need to be restored from backup. Cloning them like I do means the clone is behind, but I have the version from before the mistake was made. It's an individual choice.
    Roger
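    The "unmount before removing it" step is easy to script as part of the weekly swap; a one-line sketch (the volume name is a placeholder for however the bay 3 drive is mounted):

        # Unmount the offsite clone cleanly before pulling the tray from bay 3.
        import subprocess
        subprocess.run(["diskutil", "unmount", "/Volumes/Drive3"], check=True)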
