[SOLVED] File backup strategy

Hi,
I have a cron job running a backup script every 8 hours to make a timestamped tar archive of important stuff.
Basically, I just wanted to know if there's some more elegant way of achieving the same or a better effect as, for some reason, I don't feel this is the right way of doing it.
0 */8 * * * sh /home/pentago/.backup/cron-backup > /dev/null 2>&1
Backup script:
#!/bin/zsh
cd /home/pentago/.backup && \
tar cvpf backup-$(date +%d.%m.%y--%I.%M%p).tar $(cat /home/pentago/.backup/files) \
--exclude="/home/pentago/Public" \
--exclude="/home/pentago/.irssi/logs" \
--exclude="/home/pentago/.config/google-chrome" \
--exclude="/home/pentago/.config/sublime-text-3/Cache" \
--exclude="/home/pentago/.config/sublime-text-3/Backup" \
--exclude="/home/pentago/.config/sublime-text-3/Index" \
--exclude="/home/pentago/.Skype/shared_dynco" \
--exclude="/home/pentago/.Skype/shared_httpfe" \
--exclude="/home/pentago/.Skype/DbTemp" \
--exclude="/home/pentago/.Skype/shared.lck" \
--exclude="/home/pentago/.Skype/user.name/chatsync" \
--exclude="/home/pentago/.Skype/user.name/httpfe" \
--exclude="/home/pentago/.Skype/user.name/voicemail" \
--exclude="/home/pentago/.Skype/user.name/*.db" \
--exclude="/home/pentago/.Skype/user.name/*.db-journal" \
--exclude="/home/pentago/.Skype/user.name/*.lock" \
--exclude="/home/pentago/.Skype/user.name/*.lck" \
--exclude="/home/pentago/.Skype/user.name2/chatsync" \
--exclude="/home/pentago/.Skype/user.name2/httpfe" \
--exclude="/home/pentago/.Skype/user.name2/voicemail" \
--exclude="/home/pentago/.Skype/user.name2/*.db" \
--exclude="/home/pentago/.Skype/user.name2/*.db-journal" \
--exclude="/home/pentago/.Skype/user.name2/*.lock" \
--exclude="/home/pentago/.Skype/user.name2/*.lck"
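A tidier variant of the script (just a sketch, not a drop-in replacement; the "excludes" filename is an assumption) keeps the exclude patterns in a text file next to the existing "files" list and hands both to GNU tar with -T and -X, avoiding the wall of --exclude flags:

```shell
#!/bin/sh
# Sketch: the paths to archive live in "$dir/files" (as in the original
# script) and the exclude patterns in "$dir/excludes", one per line.
backup() {
    dir=${1:-/home/pentago/.backup}
    cd "$dir" || return 1
    # stderr silenced because tar warns about stripping the leading /
    # from absolute member names; the warning does not affect exit status.
    tar cpzf "backup-$(date +%F--%H.%M.%S).tar.gz" \
        -X "$dir/excludes" -T "$dir/files" 2>/dev/null
}
```

The cron line stays the same, pointing at this script; %F gives an ISO date so the archives also sort chronologically in a directory listing.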
Backed up files:
/home/pentago/Downloads
/home/pentago/Music
/home/pentago/Pictures
/home/pentago/Public
/home/pentago/Videos
/home/pentago/.config
/home/pentago/.asoundrc
/home/pentago/.elinks
/home/pentago/.filezilla
/home/pentago/.fonts
/home/pentago/.gimp-2.8
/home/pentago/.irssi
/home/pentago/.mpd
/home/pentago/.mpdconf
/home/pentago/.packages
/home/pentago/.share-credentials
/home/pentago/.Skype
/home/pentago/.ssh
/home/pentago/.vim
/home/pentago/.gtk-bookmarks
/home/pentago/.tmux.conf
/home/pentago/.vimrc
/home/pentago/.xinitrc
/home/pentago/.Xresources
/home/pentago/.zshrc
/etc/fstab
/etc/hostname
/etc/locale.conf
/etc/locale.gen
/etc/locale.nopurge
/etc/localtime
/etc/ntp.conf
/etc/ssh/sshd_config
/etc/pacman.conf
/etc/makepkg.conf
/etc/netctl/wlp3s0
/etc/netctl/enp4s0
/etc/pacman.d/mirrorlist
/etc/samba/smb.conf
/etc/ufw/ufw.conf
/etc/vconsole.conf
/etc/X11/xorg.conf.d/50-synaptics.conf
/usr/lib/ufw/user.rules
/usr/lib/ufw/user6.rules
Thanks.
Last edited by pentago (2013-11-22 18:29:52)

As graysky said, this isn't exactly a backup, it's more a history of your files. If you're willing to use BTRFS (I am on many computers without a problem) you might be better off using subvolumes and snapshots. I set up a script which takes snapshots of my home directory and is called through cron like:
*/10 * * * * snap 10m 12
0 * * * * snap 1h 24
0 0,12 * * * snap hd 14
0 13 * * * snap 1d 30
The first argument to snap is an arbitrary label and the second is the number of copies of that label to keep (first in, first out). So I keep 2 hours of 10-minute snapshots and a week of half-day snapshots of my home directory. Also, I have ~/Local and ~/.cache as separate subvolumes so that they are not included.
I also take snapshots in two other places:
- Whenever I log in I run "snap profile 5" so that I have a snapshot created for my previous 5 logins (tty, X11, ssh, tmux, etc).
- Whenever I run my personal rsync wrapper to another host.
Elaborating on the second item: I have 5 machines which are configured in the same way. I manually run my rsync wrapper to sync among them (so I usually need to keep track of my most recently used machine to keep my files from diverging). The rsync wrapper calls "snap <source-hostname> 3" so that I also have snapshots created before I sync files between two hostnames.
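For reference, a minimal sketch of what such a snap helper could look like (my guess at the shape, not dgbaley27's actual script; paths and naming are assumptions). To keep it runnable anywhere, the btrfs calls are shown in comments and plain mkdir/rm stand in for them:

```shell
#!/bin/sh
# Hypothetical "snap LABEL KEEP" helper. The snapshot directory and the
# live subvolume path are assumptions.
snap() {
    label=$1 keep=$2
    snapdir=${SNAPDIR:-/home/.snapshots}
    mkdir -p "$snapdir"
    # Real version: btrfs subvolume snapshot -r /home/@head "$snapdir/..."
    mkdir "$snapdir/@$(date '+%F %H-%M-%S') - $label"
    # FIFO prune: lexical sort of the timestamp prefix is chronological,
    # so everything except the newest $keep entries gets deleted.
    ls -d "$snapdir"/*" - $label" | head -n "-$keep" |
    while IFS= read -r old; do
        # Real version: btrfs subvolume delete "$old"
        rm -r "$old"
    done
}
```

The real version would replace the mkdir/rm pair with btrfs subvolume snapshot -r and btrfs subvolume delete. Note that head -n -K (print all but the last K lines) is a GNU extension.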
So you get the idea, I currently have all of these snapshots:
@2013-11-17 00:00:01 MST - hd @2013-11-25 18:58:13 MST - profile @2013-12-03 17:00:01 MST - 1h @2013-12-04 20:00:01 MST - 1h
@2013-11-17 12:00:01 MST - hd @2013-11-25 19:01:41 MST - spruce @2013-12-03 18:00:01 MST - 1h @2013-12-04 20:30:01 MST - 10m
@2013-11-17 13:00:01 MST - 1d @2013-11-26 10:32:28 MST - profile @2013-12-03 19:00:01 MST - 1h @2013-12-04 20:40:01 MST - 10m
@2013-11-18 00:00:01 MST - hd @2013-11-26 10:33:19 MST - profile @2013-12-03 20:00:01 MST - 1h @2013-12-04 20:50:01 MST - 10m
@2013-11-19 00:00:01 MST - hd @2013-11-26 12:26:14 MST - profile @2013-12-03 22:00:01 MST - 1h @2013-12-04 21:00:01 MST - 10m
@2013-11-22 00:00:01 MST - hd @2013-11-26 13:00:01 MST - 1d @2013-12-03 23:00:01 MST - 1h @2013-12-04 21:00:01 MST - 1h
@2013-11-22 09:53:13 MST - sunflower @2013-11-26 22:52:20 MST - profile @2013-12-04 07:00:01 MST - 1h @2013-12-04 21:10:01 MST - 10m
@2013-11-22 09:59:24 MST - sunflower @2013-12-02 14:02:55 MST - profile @2013-12-04 08:00:01 MST - 1h @2013-12-04 21:20:01 MST - 10m
@2013-11-22 12:00:01 MST - hd @2013-12-02 14:16:07 MST - profile @2013-12-04 09:00:01 MST - 1h @2013-12-04 21:30:01 MST - 10m
@2013-11-23 00:00:01 MST - hd @2013-12-03 09:00:01 MST - 1h @2013-12-04 10:00:01 MST - 1h @2013-12-04 21:40:01 MST - 10m
@2013-11-24 13:00:26 MST - 1d @2013-12-03 10:00:01 MST - 1h @2013-12-04 11:00:01 MST - 1h @2013-12-04 21:50:01 MST - 10m
@2013-11-24 20:30:27 MST - sunflower @2013-12-03 11:00:01 MST - 1h @2013-12-04 12:00:01 MST - 1h @2013-12-04 22:00:01 MST - 10m
@2013-11-25 09:34:19 MST - profile @2013-12-03 12:00:01 MST - 1h @2013-12-04 12:00:01 MST - hd @2013-12-04 22:00:01 MST - 1h
@2013-11-25 09:43:12 MST - old mutt config @2013-12-03 12:00:01 MST - hd @2013-12-04 15:53:35 MST - spruce @2013-12-04 22:10:01 MST - 10m
@2013-11-25 09:43:51 MST - old mutt config @2013-12-03 13:00:01 MST - 1d @2013-12-04 16:00:01 MST - 1h @2013-12-04 22:20:01 MST - 10m
@2013-11-25 12:00:01 MST - hd @2013-12-03 13:00:01 MST - 1h @2013-12-04 17:00:01 MST - 1h @head
@2013-11-25 13:00:01 MST - 1d @2013-12-03 16:31:21 MST - profile @2013-12-04 18:00:01 MST - 1h
@2013-11-25 14:54:05 MST - profile @2013-12-03 16:31:32 MST - spruce @2013-12-04 19:00:01 MST - 1h
It seems like a lot, but they barely take up any space.
@head is my actual home directory and is the only read/write subvolume; all of the other snapshots are immutable.
Last edited by dgbaley27 (2013-12-05 05:30:27)

Similar Messages

  • Large Aperture file backup strategy

    Hi All,
    My Aperture library is just a little north of 800GB in size on a 1TB LaCie FW800 HD (primary).  My previous backup strategy was to copy the library via a spare Mac server over 1Gb Ethernet to a 36TB Isilon NAS, and then copy this back to an identical 1TB LaCie (backup).  This gave me three backups in three locations.
    This worked fine and only took about 4 to 5 hours to back up the primary disk to the Isilon, and I didn't care how long it took to make the LaCie backup.
    Once I hit 800GB, the time to back up the drive to the Isilon started to get into the range of 12 to 14 hours, with more than half a million files being copied.
    Obviously this isn't working from an efficiency standpoint, and I need to come up with a better backup plan: one that's reasonably easy to use and faster to get done without sacrificing the security of multiple backups in multiple locations.
    Thanks
    Mike

    First, make and confirm two back-ups of your Library and of any Referenced Masters.  Your originals and each of your back-up sets should be on separate spindles.  Move one of the back-up sets off-site.  End of terror: it is now virtually impossible to lose any work you've done up to the point you made the back-ups.
    A good set-up to end up with is this:
    - Library on SSD
    - Masters older than 60 days on external FW800 drive (these will be Referenced Masters; let's call this drive your Referenced Masters Drive).
    - - - These constitute a complete Aperture "set".
    - Back-up Library _and_ Referenced Masters to _another_ external drive (1st Back-up Set)
    - Back-up Library _and_ Referenced Masters to a third external drive (2nd Back-up Set)
    - Keep one Back-up Set off-site.  Never have all three Sets in the same location.
    - Import new files into Library as Managed Masters.  Set Previews to be the size of your laptop screen.
    - Every 60 days, using a Smart Folder to identify Images with Managed Masters older than 60 days, relocate those Managed Masters to your Referenced Masters external drive
    - Back-up regularly
    The above is just a template:  adjust to suit.  (Added:) How you get there will depend on where you are.  I didn't untangle your post.
    When Images' Masters are off-line (your laptop is not connected to your Referenced Masters drive), you will be able to do all metadata administration, including moving Images, etc.  You will not be able to make Adjustments, Export, or Print.  (But you can drag the Preview out of Aperture and use it.)
    Message was edited by: Kirby Krieger

  • IPhoto file and backup strategy

    I am new to iPhoto in last 9 months, but before I get in too deep, would love some advice about how I am structuring the library and about my backup strategy.
    Spent many years on Windows where I would use a typical Windows file structure. Broke my pics down into years, then events, then pics. Some pictures I would rename "Ella Bday 2006" and others are simply "jpeg-10006". When I moved to the Mac, I started to use events like "Ella Bday 2008" or else "Jan-Mar 2009". I've read that many prefer to create albums and not rely so much on the events. I am not there yet, but that's my next goal.
    My questions are multiple. First, what is the best way to backup all this data so that if I crash, I can easily recreate? I backup the windows files in the same file structure as they existed on my HD, but with photos, events, albums, etc I am not sure the best way on the Mac.
    I did import all the Windows pics into iPhoto -- so now I have a single app for all photos dating back to 2004. I am not using Time Machine (remember I am a newbie) at the moment, and my initial plan is to back up each year to a separate DVD for off-site storage. Once this is done, I will look for a more permanent and recurring backup strategy - Time Machine I suppose - that will use an external HD.
    Sorry if my post is convoluted, or if my questions are ignorant, but I was a lifetime Windows guy who is just exploring all the features/functions of Mac -- so any advice or redirection would be fantastic!
    Thanks in advance

    First, what is the best way to backup all this data so that if I crash, I can easily recreate?
    Make a copy of the iPhoto Library in your Pictures Folder. That gets all the photos and database files which contain your organisation.
    If you want to burn a copy of the Library to DVD use the Share => Burn command.
    FWIW here's my back up system. I used to use DVDs but it was too tedious and a bit expensive too. Remember, if you burn, say, 2005 to DVD this year, you will be burning it again in a couple of years, as DVDs degrade over time.
    My Library lives on my iMac. It's backed up to two external hard disks every day. These disks are permanently attached to the iMac. These back ups run automatically. One is done by Time Machine, one is a bootable back up done by SuperDuper.
    It’s also backed up to a portable hard disk whenever new photos are added. This hard disk lives in my car. For security, this disk is password protected. For this job I use DejaVu because it makes a simple back up that is clear and can be tested easily without doing a full restore.
    I have a second off-site back up at a relative’s house across town. That’s updated every 3 or 4 months.
    My Photos are backed up online. Personally I use SmugMug but there are many options including flickr. However, check the terms of your account carefully. While most sites have free uploading, you will often find that these uploads are limited in terms of the file size, or the bandwidth you can use per month. For access that allows you to upload full size pics with no restrictions you will need to pay.
    Every couple of months I test the back ups to make sure they are working correctly. It’s very easy to mis-configure a back up application, and the only way to protect against that is to do a restore.
    Regards
    TD

  • LR5 multi-platform backup strategy

    > PC (2010):
    SSD: Intel SSD 120Gb (Installed apps: Windows 7, CS6 Master Collection, Lightroom 5)
    HDD1: 1TB (Lightroom gallery 1)
    HDD2: 1TB  (Lightroom gallery 2)
    HDD3: Backup 1 (3TB SDD + HDD1 + HDD3 + OS X)
    > rMBP (2013): i5 2.6 GHz/16GB/512GB (Lightroom temporary Gallery)
    OS X: (200GB): LR5, Windows 7 (300GB): CS6, LR5
    > External Western Digital HDDs (current backup strategy):
    WD1 2TB – Backup 2 (copy of Backup 1)
    WD2 1TB – Backup 3 (copy of Lightroom gallery 1, 2)
    Hi guys, I have recently purchased Lightroom 5 and would appreciate some advice on how to set up a backup strategy prior to syncing all my images with it. Currently, the above PC is my primary machine (HDD1 and HDD2 hold image galleries and HDD3 is the primary backup). I also have a Mac where OS X and Windows 7 (Boot Camp) both have LR5 installed. Images stored on the laptop are temporary while I am out and about and eventually get moved back to HDD1/HDD2 (with backups on HDD3, WD1). I also have a spare 1TB external HDD that I would like to utilize exclusively to store Lightroom files.
    My queries are as follows:
    1) What's the best way to move my laptop's (OS X and bootcamp) temporary LR5 files to my primary PC (HDD1, HDD2)? Can I do this over home network easily?
    2) Best way to sync all Lightroom files (from HDD1 and HDD2) with 1TB external HDD? This is so that I have a copy of my images on the go and can sync the laptop to external drive on longer trips.
    3) As my images (along with other data) gets backed up to HDD3 automatically, should I exclude this drive from sycing with LR5?
    4) Is there any backup software I can use to simplify this backup process?
    Cheers!

    It is an interesting topic!
    How do you solve this?
    A user-friendly and multi-platform-friendly
    backup strategy for LR and PS is probably interesting for everybody.
    (I am looking for help for about 5(?) main LR install and backup configurations ...)

  • Best backup strategy

    Time Machine is brilliant. It's saved me many times.
    But recently, the backup drive with a year's worth of backups, died.
    I therefore lost ALL my backups.
    Not a major problem as I still have my current data and having re-formatted the Time Machine drive it's merrily backing it all up again.  (I just hope I don't need to recover to last week's backup ... as I no longer have it.)
    But until that's finished I have NO backups!  Eeek!
    So what is the best backup strategy, bearing in mind drives can fail, houses can burn down, etc.  Should I have two or three Time Machine backup discs and keep swapping them every day so if one dies I've still got a one-day-old version?
    Making DVD backups and storing them elsewhere is very time consuming, and while my data is really important to me, it defeats the object if I can't get on with any work on that data because I'm constantly burning lots of DVDs!
    Your views would be appreciated!

    I pretty much do a similar thing, but my offsite backup goes to a locked cabinet in my office (easier to get to weekly to bring home and update than using a bank - I honestly cannot remember when I last physically went to my bank, it's been years).
    TM makes a great source for restoring single files or being able to go back to an earlier version of a file.  And the clones are easier for full restoration, or even for booting from temporarily if the machine's boot drive dies and you just want to get in to deauthorize it before sending it in for repairs or such.  Always good to have a bootable clone around, for many reasons.
    My external clones are on firewire 800 bus powered portable drives, again for simplicity (no power cables or bricks to go with them).
    I also still burn to optical disc for specific things like filed tax returns, and other financial documents I need to keep around in the fire safe for some period of time.

  • Backup strategy

    Hi,
    short story:
    Due to the structure of the library I have the "originals" and "modified" images when I have edited some. When importing one library into another without copying the images, I have some/many thumbnails twice. How do I prevent iPhoto from using the original thumb when a modified one exists?
    Long story:
    I'm looking for a working backup strategy for iPhoto. I'd like to collect photos over time, and when the library reaches some GBytes I'd like to copy it to two external drives in a software RAID set (mirror). I know that I can merge the two libraries (one on the RAID, one on the MacBook) with some piece of software. So far so good.
    Now I'd like to have the thumbnails of the photos on the ext. drive on my MacBook as well. I can import them to a library with the option not to copy the images but leave them where they are (on the ext. drive). When I'm on the go I can access the thumbnails. When viewing them I am asked to insert the drive.
    But now comes my problem: Due to the structure of the library I have the "originals" and "modified" images when I have edited some. So I have some thumbnails twice. How do I prevent iPhoto from using the original thumb when a modified one exists?
    Thanks a lot.

    What you really need to use is Expression Media. It creates catalogs containing thumbnails of the cataloged photos, and the catalog can be used without the source file being available.
    You can add keywords and other identifiers to the photos while just using the catalog and then when you get back and have the source files available the new metadata can be applied to the actual file.
    Expression Media appears to be available for the upgrade price of $99 if you are using iPhoto as shown here:
    $99 (Full Version $199)
    For qualifying owners of:
    Licensed copy of an earlier version of Expression Media or any iView Multimedia product.
    OR
    Any licensed photographic software, including Windows Photo Gallery or iPhoto
    That info is from this site: http://www.microsoft.com/expression/products/Upgrade.aspx and by clicking on the "here" link on that page. I've emailed the developers to see if I've interpreted that page correctly.
    In the meantime you can download and run the demo. You can also catalog your iPhoto library with EM and write all metadata entered in iPhoto back to the original files. Steps 1-7 of Old Toad's Tutorial #1 describe how. That page is in revision since iView is no longer available and Step #2 is slightly different with the EM demo. It should be changed by the end of the day today.
    I use it in conjunction with iPhoto. EM is my primary DAM (digital asset management) application and I use iPhoto for projects like books, calendars, slideshows, etc.
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto (iPhoto.Library for iPhoto 5 and earlier) database file and keeping it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 6 and 7 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to backup the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
    Note: There's now an Automator backup application for iPhoto 5 that will work with Tiger or Leopard.

  • Backup strategy in FRA

    Hi Experts,
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE     11.1.0.6.0     Production
    TNS for HPUX: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - ProductionI would like to ask some advice to place my backup in FRA.
    I read in a book that it is advised not to put the archivelog backup in the FRA, since if something happens to the disk that stores the FRA, everything will be gone. However, the concept of the FRA is logically a centralized backup.
    so based on your experiences, what is the best way to utilize FRA for backup strategy?
    is it safe to put everything in FRA or we should still split the backup for archivelog to different location?
    thanks

    The idea is that you should never have a single copy of your backup anyway, whatever the file type. It's true that losing the FRA would lead to the non-availability of the backups, but then you should have pushed your backups from it to the tape drive eventually to safeguard them. So there won't be any harm in putting the backup in the FRA as long as you multiplex the backup and keep a copy in some other location as well.
    HTH
    Aman....

  • Backup strategy for ebs R12?

    Hi,
    I have a couple of questions regarding to backup strategy for ebs R12.
    Let's say that we have one application server and one database server running, and we would like to have a backup periodically for both servers.
    This means that we want to have the exactly same backup of configurations and other files for the application server and the database server in case of any fatal crashes in the servers.
    I went through some discussion threads and it seems common way is to use RMAN for the data of database and file system copy for application and database servers. But, I would like to know what's the best way or recommended way for the backup strategy for data and server files of Oracle e-business suite.
    Do you guys have any suggestion for this?
    Thanks in advance.
    Cheers,
    SH

    Hi SH;
    I suggest checking our previous discussions first:
    backup and recovery strategy for EBS
    Re: Backup System
    Backup Policy for EBS can
    If you search you can find many other threads which mention the same question.
    For your issue, basically what we do:
    1. For our prod DB, if possible we take a weekly cold backup (Sunday) and also take an RMAN backup daily (a full backup, but depending on the system you may prefer incremental)
    2. For our system we take a file system backup weekly (OS level)
    3. For the application we take a backup every Sunday (for some systems on a 15- or 30-day period). The most important point here: if your system is prod and you need to apply a patch, you need to have a valid backup, and also a backup after patching
    One other important point is that we run the preclone script before taking the backup of the apps and DB tiers
    Regards
    Helios

  • What is the backup strategy for your rpool?

    Hi all,
    May I know what your backup strategy for rpool is? For example, how do you integrate it with NetBackup or any other backup software?
    Do you use zfs send to write a data stream to a file and back up that file with backup software?
    Aside from the above method, can we also use the UFS-style method?
         Use backup software to back up the entire / /var..........  etc....
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Any more ideas?
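The "zfs send a data stream to a file" idea mentioned above can be sketched like this (pool, snapshot, and file names are assumptions; the stream file can then be handed to NetBackup or any other backup product as an ordinary file):

```shell
#!/bin/sh
# Sketch only: serialize the whole root pool into one compressed file.
rpool_to_file() {
    zfs snapshot -r rpool@backup            # recursive snapshot of rpool
    zfs send -R rpool@backup | gzip > "$1"  # replication stream to a file
}
# Restore sketch: gunzip -c FILE | zfs receive -F rpool-restore
```

The -R flag sends the whole subtree of datasets with their properties, which is what makes the rpool-restore procedure in the question possible from a single stream.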

    Hi Willy,
    According to the Flash Archive limitations stated on the Oracle website, if the OS has a child zone it will not work, right?
    http://docs.oracle.com/cd/E19253-01/821-0436/6nlg6fi8u/index.html
    I am thinking of doing it the traditional way.
    Use a backup software, for example, networker, TSM, netbackup.. to back up the following:
    /rpool
    /export/
    /export/home
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Will it still work?

  • Any new backup strategy with Oracle VM 3.1 ?

    Now that Oracle VM 3.1 has been released, could we get an update on backup strategy?
    Backing up OCFS2 repositories allows us to protect OVF files, right? But what happens with apps running on VMs - is that data also backed up? What about open files, databases, etc.?
    I will appreciate any comment.
    Thanks in advance.

    try running this on each server:
    /usr/bin/rescan-scsi-bus.sh

  • Synchronous multimaster replication Backup strategy using RMAN

    Hi all,
    I am using synchronous multimaster replication. My question is: does the backup have to be performed at all the master sites, or only at one master site? What problems will I encounter with RMAN backup in my scenario? Please help me out with your suggestions.
    nagaraj.K
    [email protected]

    You ask: "I want to configure backup strategy using RMAN. any one can help me that"
    And the answer is clearly no we can not.
    An RMAN backup strategy depends on your SLA (Service Level Agreement) with your customers that you didn't post. What is your down-time window? What is your skill set?
    You ask: "How to configure for RMAN Incremental backup?"
    Read the docs at http://tahiti.oracle.com for your version (which you didn't mention).
    You ask: "What will be backup space and there size ?"
    We have no idea. Are you going to create an image copy or a backup set? Read the docs and learn the difference. Are you going to turn on compression? Are you going to back up only changed blocks? We don't know.
    You ask: "how to manage growing online archiving files?"
    Again we can't help you without knowing your SLA. How many MB/GB are they? What period of time do you need to retain them to meet your SLA? When you figure this out, back up to tape those you don't need to keep on the server.
    You ask: "how to manage growing data and there disk space?"
    This is one we can answer: BUY MORE DISK! That was easy.
    You ask: "How we can give good performance our CPU and memory?"
    Do you really expect that we can with zero knowledge of your environment, your version, your application, etc. distill into a few short paragraphs the collective wisdom of Cary Millsap, Jonathan Lewis, and the rest of the Oak Table Network membership?
    Of course we can not. So what you need to do is buy Cary's book, Jonathan's book, and take their classes.
    You ask: "we need keep all archive log in backup files or we need to remove old archive files?"
    Remove the old ones after being absolutely sure you have a good backup.
    You ask: "where we can take backup tape drive,SAN,disk removable hard disk? which one is better?"
    No one solution is better than the other. They are all adequate.

  • Questions on Oracle EBS backup strategy

    Hi all,
    Currently I am planning the backup strategy for the EBS. May I know apart from backup the database using RMAN, is there anything (e.g. log files) that I need to backup such that I can fully restore the system?
    Thanks a lot in advance.
    Alex

    I see that this is your first post, and you have not searched this forum or Google for the numerous responses to this question that various people have already received.
    You didn't mention the application version you are using.
    Anyway, you need to take a complete backup of the application tier directories, i.e., APPL_TOP, COMMON_TOP, INST_TOP, etc.

  • Hyperion backup strategy

    Due to limited resources on my client's side, we have only 2 environments to work with: Development and Production.
    We are trying to come up with the fittest strategy for disaster recovery and backup purposes.
    As of now, we have our backup process for Hyperion data files separate from another backup process for the Hyperion outline, calc scripts, rule files, and reports. The backup process for data files takes place once a week, whereas the backup process for the outline/calc scripts/rule files/reports runs every night.
    The greatest reason for needing a backup strategy is data loss caused by human error. For instance, someone has accidentally deleted members from the outline in Production, so we need to restore the outline in Production from the backup. With only the 2 environments we have on hand, we would like to use the Development server as our backup server and try to have everything in Development in sync with Production with a lag of no more than 2 days. So for example, if someone deleted the members on Thursday, we want to have a backup copy in Development ready to use and restore from Tuesday.
    We still want to use our Development environment for developing and testing purposes. However, because of this making room for backup, we seem to have no choice but to also use Development as our backup repository. I was wondering if we could create multiple applications in Development for both purposes: backup and testing.
    Last but not least, even if the backup in Development is no more than 2 days behind Production, the changes we have made during those 2 days are still nowhere recorded. How does the industry usually handle this issue for the days when data is lost? Should users document all the changes they make during those 2 days so that when a disaster happens and the restore takes place, they would still have references as a reminder in case they don't remember all the details and things they've done in those 2 days?
    I know it's long, but thanks for taking time reading this.

    Thursday level 1 backup - will this use the level 0 backup as its base, or the previous level 1 backup?
    As per my understanding, a level 1 incremental backup uses the level 0 backup as its base. Right?

  • Backup strategy and hardware usage

    I am trying to establish a backup policy for Small Business Suite 6.6
    (NetWare 6.5). On Friday evening I do a daily backup scheduled at 10 PM, but
    on Saturday I also do a weekly backup of system files etc. Since these
    are unattended backups, there is no one to change the tape for the weekly
    backup. I have another server online with identical hardware, but I cannot
    see the tape drive of the second server from the first. Is there a
    way to use the tape drive (server1) from server0 so I can put the two
    tapes to be used in their respective drives before I leave on Friday
    night? I am using Backup Exec 9.2. Any other suggestions on backup
    strategy would also be appreciated.
    Thanks, Roger

    Roger -
    I do the following 12-tape backup:
    Mon - Thur, week 1 (4 tapes)
    Mon - Thur, Week 2 (4 tapes)
    Fri 1, Fri 2, etc (4 tapes).
    The rare 5th Friday can be a problem in the first year, but after the
    first year you can reuse year-old tapes as month-end tapes.
    The Thursday tapes go out the door on Friday afternoon, so there's always
    at least one tape offsite. Month-end tapes go to a long-term storage
    spot, either offsite or in a fireproof safe.
    I make a calendar at the beginning of the year for all my clients to help
    them keep the tapes straight.
    In addition, I use rsync to do an onsite backup to a NAS box (free software,
    old workstation) as a second backup. It doesn't back up eDir, and of course
    it doesn't go offsite, but it's quick and easy when you need to restore a
    file or two!
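    The rsync-to-NAS step might look like the sketch below; the /tmp directories stand in for the real data directory and the NAS mount point, which aren't given in the post:

    ```shell
    #!/bin/sh
    # Mirror a source tree to a second location with rsync.
    # -a preserves permissions and timestamps; --delete keeps the mirror exact.
    # /tmp paths are stand-ins for the real data directory and NAS mount.
    rm -rf /tmp/rsync-src /tmp/rsync-dst
    mkdir -p /tmp/rsync-src /tmp/rsync-dst
    echo "hello" > /tmp/rsync-src/file.txt
    rsync -a --delete /tmp/rsync-src/ /tmp/rsync-dst/
    ```

    Note the trailing slash on the source: it tells rsync to copy the directory's contents rather than the directory itself.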

  • Time Capsule backup strategy for multiple drives and OS

    I'd like a backup strategy based on Time Capsule but have a few twists from a straightforward situation and would appreciate any guidance with setting it up. My hardware situation is:
    - Two MacBooks on 10.5, one with an 80 GB drive, the other with a 120 GB drive
    - One G4 iMac on 10.4, with a 60 GB drive
    - One LaCie NAS drive, used primarily as storage for music and video, and currently not backed up anywhere (making me nervous!)
    Ideally I'd like each of these four units to be backed up on Time Capsule; and my configuration questions are in two areas:
    1) Backup software: I'll use Time Machine to back up the MacBooks, but I'm not sure about the G4 and the NAS drive. These could be backed up on a more ad hoc basis (I don't need hourly updates). Is this best handled with a third-party app like SuperDuper that allows for incremental backups?
    2) Partition/disk image: is it wise (or indeed necessary) to create partitions for each of the above sources? (I gather from other posts that the best way to achieve a partition is to create a disk image.) I'm not sure whether doing so would have any unintended consequences, and if so whether a sparse disk image or a sparse bundle disk image is the better choice here, so any guidance to help make sure it's set up as flexibly as possible from the outset would be greatly appreciated.
    Thanks in advance for any advice.

    G'day Michael,
    Sounds like it is time to buy OS X Server!!!!
    I think you are on the right track. As a "left field" suggestion, why not get two big drives, and use one of them as a file server. Get everyone to use the file server for any critical documents that they want backed up centrally/continuously. Then back the file server system up onto the other big drive using TM.
    This reduces administration somewhat, and privacy can be retained with careful username management across the network.
    Assuming that the MacBooks are similarly configured from a system perspective, there is not a lot of point in having a full bootable clone of each one. Perhaps create a bootable clone from one of them each week, and clone the others into disk images (which are not bootable, but are restorable) as frequently as needed.
    Cheers,
    Rodney
