Backup Mail Procedure: Best Practices

We are a mixed-environment company, 5 Windows & 4 Apple servers. We have a dedicated backup server (Windows 2003) where we run Symantec Backup Exec to go out and back up data across our network during the night.
I would like to incorporate our new Mail Server (10.5) into this Backup Plan, but need to know:
1) Which directories should I back up completely?
2) Do I need to stop and restart the mail service (postfix / cyrus) during the backup of these directories or can I have Backup Exec copy them 'on the fly'?
Thanks to all in advance.

I am not familiar with Symantec Backup Exec, but I do know that you have to stop mail services to get a reliable backup of your mail store. I use mailbfr to back up the mail store and settings each night to an external hard drive. I haven't bothered to further incorporate the mailbfr backups into our regular Retrospect backup, but I suppose it could be done. It could work with your Symantec Backup Exec too, but you'd have to try it. Perhaps someone else on these forums has a bit more experience with that.
Just so you know, I have used a mailbfr backup to do a full and successful restore of a crashed mail server and have used it to migrate from 10.4 server to 10.5 server. I highly recommend its use.
It is available here:
http://osx.topicdesk.com/
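
For reference, on a default 10.5 Server setup the pieces a backup needs to capture are the Cyrus mail store and databases plus the Postfix queue and configuration. Here is a minimal stop-copy-restart sketch (run as root; the staging path is illustrative, and Backup Exec would then pick up the staged copy):

#!/bin/sh
# Quiesce the mail system, stage a consistent copy, restart.
DEST=/Volumes/Backup/mailstage        # illustrative staging area
mkdir -p "$DEST"

serveradmin stop mail                 # stops postfix and cyrus cleanly

rsync -a /var/spool/imap    "$DEST/"  # Cyrus mail store
rsync -a /var/imap          "$DEST/"  # Cyrus mailbox databases
rsync -a /var/spool/postfix "$DEST/"  # Postfix queue
rsync -a /etc/postfix /etc/imapd.conf /etc/cyrus.conf "$DEST/etc-mail/"

serveradmin start mail

mailbfr automates essentially this (plus Cyrus database housekeeping), so pointing Backup Exec at mailbfr's destination folder may be the path of least resistance.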

Similar Messages

  • Apple Mail Archive - Best Practices

    Dear Fellow Apple Heads:
    Does anyone have any best practices for archiving mail, so that I can back up an mbox file and remove the archived items from current use?

    Greetings,
    There are two ways to do this, though since I don't really use 10.5.x I can't vouch for the first: in Mail, there should be an Archive Mailbox option somewhere, but I just don't know because I don't use it. The other way is simply to copy the folder of the mailbox you want to archive to your Desktop or wherever you want to save it.
    The drawback is that if you want to open these messages again, you can't do it from Mail; you'd have to double-click each one to open it, and the file name tells you nothing about who it's from or what it's about. There's at least one third-party mail-archiving solution on the market, but I don't remember the name offhand.
    The best option is to always have a current, bootable backup of your entire drive, so you can restore your Mac to the way it was before the hard drive failed or an unexpected power outage corrupted a bunch of files or made them disappear.
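    If you go the copy-the-folder route, the mailboxes live under ~/Library/Mail as .mbox folders. A tiny sketch (the mailbox name is hypothetical; quit Mail first):

    cp -R ~/Library/Mail/Mailboxes/Archive-2008.mbox /Volumes/Backup/
    ls /Volumes/Backup/Archive-2008.mbox/   # verify before deleting in Mail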

  • Procedure/Best Practices - For providing SAP BASIS SUPPORT

    Hi All,
    I need information on SAP BEST PRACTICES on providing BASIS SUPPORT for SAP4.7.
    If you can give me any link or document regarding this that would be of great help to me.
    Expecting the replies...
    Thanks
    Cheenu

    Hi Cheenu,
    I'm not quite sure what you mean by "providing SAP BASIS SUPPORT".
    Run SAP roadmaps describe the recommended methodology for end-to-end operations of an SAP solution, including technical operations (Basis). You can check them at service.sap.com/roadmaps (go to "Run SAP Roadmap"). If you need a list of tasks you should perform in the scope of solution operations, you'll find it there as well.
    SAP solution management is based on ITIL methodology. You can find more about ITIL at www.best-management-practice.com/IT-Service-Management-ITIL/.
    Hope this helps.
    KR,
    Grigor

  • BO4 - backup .unx locally - best practice?

    Hi,
    In addition to our standard CMS + filestore backups, we wish to back up the new .unx files locally, as we do with our .unv files.
    What is the best way to do this?
    I understand a .unx is composed of the data foundation and business layer, and these, together with the secured connection, comprise the .unx file in the repository.
    How can we back up a single .unx file so that we could use it at a later date if one is deleted from the repository?
    Thanks

    I would use the Process Explorer / File Monitor utilities (download from the MS Sysinternals website), and maybe Dependency Explorer...
    Then spend a little time determining when, where, and which files & folders see I/O during the local save/open workflow from IDT.
    Regards,
    H

  • SCM / APO shutdown/startup (restore) activities, procedures, best practice

    Dear ALL,
    What are the generic steps that need to be followed from a functional consultant's perspective?
    I am not sure whether this should be part of a Basis thread.
    Can anyone share and enlighten the forum on this?
    Some of the steps, module-wise, are:
    FUNCTIONAL steps
    1.    DP
    1.1  Downloading of CVC's 
    1.2  Backup of all infocubes / remote query cube
    1.3  Download of Planning book data
    2.     SNP PPDS
    2.1   Download Product Master, PPM, Resource data of '000'
    2.2   Download 'all order data' from the product overview
    TECHNICAL steps
    0. Complete offline backup of the system
    1. Lock users except admin
    2. Stop CIF
    3. Stop live cache
    What are the restore steps?
    Anything missed out functionally / technically is welcome.
    Points would be awarded.

    Hi Sridhar,
    During shutdown & startup of an APO system, functional involvement is mandatory; it is not an isolated Basis task.
    In addition to the steps that you have mentioned, the following are some steps you can follow:
    1) DP:
    Before shutdown: copy data from active versions to other simulation versions if required; lock users; postpone or reschedule jobs accordingly.
    After startup: time-series consistency checks, validation of data, scheduling of jobs, unlocking of users.
    2) SNP / PPDS:
    Before shutdown: postpone or reschedule jobs; inform users to take the necessary deliveries from STOs from SNP for execution.
    After startup: download of planning data and validation, scheduling of jobs, order conversions.
    3) Technical steps (CIF / liveCache):
    Before shutdown: reschedule integration model jobs, if any.
    After startup: scheduling of integration model jobs, releasing locked users, liveCache consistency checks, delta report, monitoring and clearing of inbound/outbound queues, etc.
    Regards
    R. Senthil Mareeswaran.

  • Server backups, Time Machine, best practices

    Hey guys, I've got a Mac Mini running Lion Server that provides VPN, Profile Manager, and some file sharing to the Macs and i-devices in my household. I'm also using portable home folders for the kids' accounts (and soon I may do the same for my account and my wife's). Connected to the Mini is a Drobo.
    Currently I'm using CrashPlan to back up the data on all of our client machines (an iMac and a couple of MacBooks). However, I would like to add Time Machine to the mix. Here are my thoughts:
    client machines:
    - use TM to backup to the mac mini server.  (backups stored on drobo)
    - EXCLUDE the 'local' (sync'd) home folders for network users, since their home folders are actually stored on the server. 
    server:
    - use TM locally on the server to backup itself to a separate external disk.
    - this backup should INCLUDE the users home folders since they're not getting backed-up on the client side. 
    - optionally install crashplan on the server and use it to backup users' folders as well??
    The whole goal here is convenience.  I trust Crashplan to backup my important stuff offsite (pictures, videos, etc), but in the event of a disk failure I want an easy, no-hassle way to fully restore a machine - either client or server machine - back to its original state before the disk failure.  The only thing that has me scratching my head a bit is the user folder stuff.  If everyone had local accounts it would be easy - just use TM to backup everything.  But since the home folders (for the network users) are actually stored on the server, with a syncronized version on the client, it complicates things a bit.
    Love to hear anyone's feedback on how to proceed.  Thanks!
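    Since this is Lion, the client-side exclusions mentioned above can be scripted with tmutil. A minimal sketch (the account paths are illustrative; run on each client Mac):

    # Exclude the locally-synced network home folders from this
    # machine's Time Machine backups (fixed-path exclusions).
    sudo tmutil addexclusion -p /Users/kid1
    sudo tmutil addexclusion -p /Users/kid2

    # Check whether a given path is excluded:
    tmutil isexcluded /Users/kid1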

    It certainly sounds like a solid plan. You're correct that it doesn't make sense to back up the home folders for network users, since those live on the server; in fact, you might be able to get by without TM at all by considering the local Macs themselves the backups of the home folders. The odds of losing both are slim, barring some sort of proximity event like a burglary or house fire (knock on wood). In that respect cloud storage would actually be better, but again, I would be satisfied with the local machines themselves as the "backups", and you're going the extra mile by backing up said home folders via TM (I'm assuming to a Time Capsule or external hard drive?).
    Since the goal is convenience, you may also consider adding a second drive to your Mac Mini, assuming it doesn't already have one.  Currently, I'm using RAID1 (mirroring) with two hard drives in my Mac Mini, so should a hard drive fail, it will automatically boot from the second hard drive and let you know the first drive failed.  It's certainly cheaper than a Time Capsule, especially if you have the tech prowess to take apart your Mac Mini and install a second hard drive.  I purchased iFixit's kit for adding a second hard drive, and while the $50 cost wasn't ideal, it beat several hundred dollars for a Time Capsule, especially since I don't need Terabytes of storage and I happened to have a spare 500GB HDD laying around... That could be you, too!
    As for the Home folders, again I would go the simple route and just use a RAID1 array in the Mac Mini, that way there's no need to backup the Home folders, they're already automatically backed up to the second hard drive with RAID1.
    As you can tell, I like RAID. Also, if you go the RAID route, read speeds reportedly increase (there seems to be some disagreement on the Internet, but I believe it... I see around a 20 MB/s boost; not a huge deal, but it's something).

  • Backup validation best practice - 11gR2 on Windows

    Hi all
    I am just reading through some guides on checking for various types of corruption in my database. It seems that having DB_BLOCK_CHECKSUM set to TYPICAL takes care of much of the physical corruption and will alert you to the fact that any has occurred. Furthermore, RMAN by default does its own physical block checking. Logical corruption, on the other hand, does not seem to be checked automatically unless CHECK LOGICAL is added to the RMAN command. There are also various VALIDATE commands that could be run against various objects.
    My question is really: what is best practice for checking for block corruption? Do people even bother checking this regularly and just allow Oracle to manage itself, or is it best practice to have the CHECK LOGICAL clause in RMAN (even though it's not added by default when configuring backup jobs through OEM), or do people schedule jobs and output reports from a VALIDATE command on a regular basis?
    Many thanks

    Using the CHECK LOGICAL clause is considered best practice, at least by Oracle Support, according to
    NOTE:388422.1  Top 10 Backup and Recovery best practices
    (referenced in http://blogs.oracle.com/db/entry/master_note_for_oracle_recovery_manager_rman_doc_id_11164841).
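    In practice that means adding CHECK LOGICAL to the backup job itself, and/or scheduling a standalone validate. A minimal sketch of both (written as a Unix-style shell wrapper for brevity; on Windows the same RMAN commands would go in a script file run via rman @script):

    # Nightly backup with logical block checking:
    rman target / <<'EOF'
    BACKUP CHECK LOGICAL DATABASE PLUS ARCHIVELOG;
    EXIT;
    EOF

    # Periodic (e.g. weekly) standalone check, no backup taken:
    rman target / <<'EOF'
    VALIDATE CHECK LOGICAL DATABASE;
    EXIT;
    EOF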

  • Database Administration - Best Practices

    Hello Gurus,
    I would like to know various best practices for managing and administering Oracle databases. To give you all an example of what I am thinking about: if you joined a new company and wanted to see whether all the databases conform to some kind of standard/best practices, what would you look for? For instance: are the control files multiplexed, is there more than one member in each redo log group, is the temp tablespace using TEMPFILEs or otherwise... something of that nature.
    Do you guys have some thing in place which you use on a regular basis. If yes, I would like to get your thoughts and insights on this.
    Appreciate your time and help with this.
    Thanks
    SS

    I have a template that I use to gather preliminary information so that I can at least get a glimmer of what is going on. I have posted the text below... it looks better as a spreadsheet.
    System Name:
    System Description:

                                  Name           Phone          Pager
    System Administrator:
    Security Administrator:
    Backup Administrator:

    --- Below this line, filled out for each server in the system ---

    Server Name:
    Description (Application, Database, Infrastructure, ...):
    ORACLE version/patch level:                CSI:

                                  Next Pwd Exp
    Server Login:
    Application Schema Owner:
    SYS:
    SYSTEM:

                                  Location
    ORACLE_HOME:
    ORACLE_BASE:
    Oracle User Home:
    Oracle SQL scripts:
    Oracle RMAN/backup scripts:
    Oracle BIN scripts:
    Oracle backup logs:
    Oracle audit logs:
    Oracle backup storage:
    Control File 1:
    Control File 2:
    Control File 3:
    Archive Log Destination 1:
    Archive Log Destination 2:
    Datafiles Base Directory:

    Backup Type          Day      Time     Est. Time to Comp.     Approx. Size
    archive log
    full backup
    incremental backup
    As for "Best" practices, well I think that you know the basics from your posting but a lot of it will also depend on the individual system and how it is integrated overall.
    Some thoughts I have for best practices:
    Backups ---
    1) Nightly if possible
    2) Tapes stored off site
    3) Archives backed up throughout the day
    4) To disk, then to tape; leave the backup on disk until the next backup
    Datafiles ---
    1) Depending on hardware used.
    a) separate datafiles from indexes
    b) separate high-I/O datafiles/indexes onto dedicated disks/LUNs/trays
    2) file names representative of usage (similar to the tablespace name)
    3) Keep them a reasonable size, < 2 GB (again, system architecture dependent)
    Security ---
    At least meet DOD - DISA standards where/when possible
    http://iase.disa.mil/stigs/stig/database-stig-v7r2.pdf
    Hope that gives you a start
    Regards
    tim
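    To script the specific checks mentioned in the original post (multiplexed control files, redo members per group, tempfiles), a minimal sqlplus sketch; connection and formatting details are left out:

    sqlplus -s "/ as sysdba" <<'EOF'
    -- Control files: expect more than one, ideally on separate disks
    SELECT name FROM v$controlfile;
    -- Redo logs: expect two or more members per group
    SELECT group#, COUNT(*) AS members FROM v$logfile GROUP BY group#;
    -- Temp tablespace: expect TEMPFILEs to be listed here
    SELECT tablespace_name, file_name FROM dba_temp_files;
    EXIT;
    EOF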

  • Best practice for E-business suite 11i or R12 Application backup

    Hi,
    I'm taking RMAN backup of database. What would be "Best practice for E-business suite 11i or R12 Application backup" procedure?
    Right now I'm taking a file-level backup. Please suggest improvements, if any.
    Thanks

    Please review the following thread, it should be helpful.
    Recommended backup and recovery strategy for EBS
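    For the application tier, a file-level backup is the usual approach, but the services should be down while the files are copied so they are consistent. A rough sketch for 11i (adstpall.sh/adstrtal.sh are the standard admin scripts; the paths and APPSPWD variable are illustrative):

    # Run as the applmgr user: stop the apps tier, archive it, restart.
    $COMMON_TOP/admin/scripts/$CONTEXT_NAME/adstpall.sh apps/$APPSPWD
    tar czf /backup/apps_tier_$(date +%Y%m%d).tgz "$APPL_TOP" "$COMMON_TOP"
    $COMMON_TOP/admin/scripts/$CONTEXT_NAME/adstrtal.sh apps/$APPSPWD

    The tech-stack ORACLE_HOMEs would need the same treatment, and the RMAN database backup should be coordinated with this so both tiers restore to a consistent point.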

  • Portal backup best practice!!

    Hi ,
    Can anyone let me know where I can find Portal backup best practices? We are using SAP EP7.
    Thanks,
    Raghavendra Pothula

    Hi Tim: Here's my basic approach for this -- I create either a portal dynamic page or a stored procedure that renders an HTML parameter form. You can connect to the database and render whatever sort of drop-downs, check boxes, etc. you desire. To tie everything together, just make sure when you create the form that the names of the fields match those of the page parameters created on the page. This way, when the form posts to the same page, it appends the values for the page parameters to the URL.
    By coding the entire form yourself, you avoid the inherent limitations of the simple parameter form. You can also use advanced JavaScript to dynamically update the drop downs based on the values selected or can cause the form to be submitted and update the other drop downs from the database if desired.
    Unfortunately, it is beyond the scope of this forum to give you full technical details, but that is the approach I have used on a number of portal sites. Hope it helps!
    Rgds/Mark M.

  • Best practices for speeding up Mail with large numbers of messages?

    I have over 100,000 mails going back about 7 years in multiple accounts in dozens of folders using up nearly 3GB of disk space.
    Things are starting to drag - particularly when it comes to opening folders.
    I suspect the main problem is having large numbers of mails in those folders that are the slowest - like maybe a few thousand at a time or more.
    What are some best practices for dealing with very large amounts of mail?
    Are smart mailboxes faster to deal with? I would think they'd be slower, because the original emails would tend not to get filed as often, leading to even larger mailboxes. And searching takes a long time, doesn't it?
    Are there utilities for auto-filing messages in large mailboxes to, say, divide them up by month to make the mailboxes smaller? Would that speed things up?
    Or what about moving older messages out of mail to a database where they are still searchable but not weighing down on Mail itself?
    Suggestions are welcome!
    Thanks!
    doug

    Smart mailboxes obviously cannot be any faster than real mailboxes, and storing large amounts of mail in a single mailbox is asking for trouble. Rather than organizing mail in mailboxes by month, however, what I like to do is organize it by year, with subfolders by topic for each year. You may also want to take a look at the following article:
    http://www.hawkwings.net/2006/08/21/can-mailapp-cope-with-heavy-loads/
    That said, it could be that you need to re-create the index, which you can do as follows:
    1. Quit Mail if it’s running.
    2. In the Finder, go to ~/Library/Mail/. Make a backup copy of this folder, just in case something goes wrong, e.g. by dragging it to the Desktop while holding the Option (Alt) key down. This is where all your mail is stored.
    3. Locate Envelope Index and move it to the Trash. If you see an Envelope Index-journal file there, delete it as well.
    4. Move any “IMAP-”, “Mac-”, or “Exchange-” account folders to the Trash. Note that you can do this with IMAP-type accounts because they store mail on the server and Mail can easily re-create them. DON’T trash any “POP-” account folders, as that would cause all mail stored there to be lost.
    5. Open Mail. It will tell you that your mail needs to be “imported”. Click Continue and Mail will proceed to re-create Envelope Index -- Mail says it’s “importing”, but it just re-creates the index if the mailboxes are already in Mail 2.x format.
    6. As a side effect of having removed the IMAP account folders, those accounts may be in an “offline” state now. Do Mailbox > Go Online to bring them back online.
    Note: For those not familiar with the ~/ notation, it refers to the user's home folder, i.e. ~/Library is the Library folder within the user's home folder.
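    The same procedure as a shell sketch (Mail 2.x paths as above; step 4, removing the IMAP account folders, is deliberately left manual given the POP caveat):

    osascript -e 'quit app "Mail"'
    cp -R ~/Library/Mail ~/Desktop/Mail-backup    # safety copy first
    rm -f ~/Library/Mail/"Envelope Index" \
          ~/Library/Mail/"Envelope Index-journal"
    open -a Mail                                  # triggers the "import"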

  • Backup "best practices" scenarios?

    I'd be grateful for a discussion of different "best practices" regarding backups, taking into consideration that a Time Machine backup is only as good as the external disk it is on.
    My husband & I currently back up our two macs, each to its own 500 GB hard disk using TM. We have a 1TB disk and I was going to make that a periodic repository for backups for each of our macs, in case one of the 500GB disks fails, but was advised by a freelance mac tech not to do that. This was in the middle of talking about other things & now I cannot remember if he said why.
    We need the general backup, and also (perhaps a separate issue) I am particularly interested in safeguarding our iPhoto libraries. Other forms of our data can be redundantly backed up to DVDs, but with 50GB iPhoto libraries that doesn't seem feasible, nor does backing up to some online storage facility. My own iPhoto library is too large for my internal disk; keeping the working library on an external disk is discouraged by the experts in the iPhoto forum (slow access, possible loss if the connection between computer & disk is interrupted during transfer), so my options seem to be getting a larger internal disk, or using the internal disk just for the most recent photos and keeping the entire library on an external disk, independent of incremental TM backups.

    It's probably more than you ever wanted to know about backups, but some of this might be useful:
    There are three basic types of backup applications: *Bootable Clone, Archive, and Time Machine.*
    This is a general explanation and comparison of the three types. Many variations exist, of course, and some combine features of others.
    |
    _*BOOTABLE "CLONE"*_
    |
    These make a complete, "bootable" copy of your entire system on an external disk/partition, a second internal disk/partition, or a partition of your internal disk.
    Advantages
    If your internal HD fails, you can boot and run from the clone immediately. Your Mac may run a bit slower, but it will run, and contain everything that was on your internal HD at the time the clone was made or last updated. (But of course, if something else critical fails, this won't work.)
    You can test whether it will run, just by booting-up from it (but of course you can't be positive that everything is ok without actually running everything).
    If it's on an external drive, you can easily take it off-site.
    Disadvantages
    Making an entire clone takes quite a while. Most of the cloning apps have an update feature, but even that takes a long time, as they must examine everything on your system to see what's changed and needs to be backed-up. Since this takes lots of time and CPU, it's usually not practical to do this more than once a day.
    Normally, it only contains a copy of what was on your internal HD when the clone was made or last updated.
    Some do have a feature that allows it to retain the previous copy of items that have been changed or deleted, in the fashion of an archive, but of course that has the same disadvantages as an archive.
    |
    _*TRADITIONAL "ARCHIVE" BACKUPS*_
    |
    These copy specific files and folders, or in some cases, your entire system. Usually, the first backup is a full copy of everything; subsequently, they're "incremental," copying only what's changed.
    Most of these will copy to an external disk; some can go to network locations, some to CDs/DVDs, or even tape.
    Advantages
    They're usually fairly simple and reliable. If the increments are on separate media, they can be taken off-site easily.
    Disadvantages
    Most have to examine everything to determine what's changed and needs to be backed-up. This takes considerable time and lots of CPU. If an entire system is being backed-up, it's usually not practical to do this more than once, or perhaps twice, a day.
    Restoring an individual item means you have to find the media and/or file it's on. You may have to dig through many incremental backups to find what you're looking for.
    Restoring an entire system (or large folder) usually means you have to restore the most recent Full backup, then each of the increments, in the proper order. This can get very tedious and error-prone.
    You have to manage the backups yourself. If they're on an external disk, sooner or later it will get full, and you have to do something, like figure out what to delete. If they're on removable media, you have to store them somewhere appropriate and keep track of them. In some cases, if you lose one in the "string" (or it can't be read), you've lost most of the backup.
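    As a concrete illustration of the incremental-archive idea (a generic sketch, not any particular product's method), an rsync archive with hard links might look like this; all paths are hypothetical:

    # Each run creates a dated snapshot; unchanged files are hard-linked
    # to the previous snapshot, so every snapshot looks complete but
    # only changed files consume new space.
    TODAY=$(date +%Y-%m-%d)
    rsync -a --link-dest=/Volumes/Backup/last /Users/ "/Volumes/Backup/$TODAY/"
    rm -f /Volumes/Backup/last
    ln -s "/Volumes/Backup/$TODAY" /Volumes/Backup/last

    Time Machine builds on the same hard-link trick, which is why each TM backup described below appears to be a full one.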
    |
    _*TIME MACHINE*_
    |
    Similar to an archive, TM keeps copies of everything currently on your system, plus changed/deleted items, on an external disk, Time Capsule (or USB drive connected to one), internal disk, or shared drive on another Mac on the same local network.
    Advantages
    Like many Archive apps, it first copies everything on your system, then does incremental backups of additions and changes. But TM's magic is, each backup appears to be a full one: a complete copy of everything on your system at the time of the backup.
    It uses an internal OSX log of what's changed to quickly determine what to copy, so most users can let it do its hourly incremental backups without much effect on system performance. This means you have a much better chance to recover an item that was changed or deleted in error, or corrupted.
    Recovery of individual items is quite easy, via the TM interface. You can browse your backups just as your current data, and see "snapshots" of the entire contents at the time of each backup. You don't have to find and mount media, or dig through many files to find what you're looking for.
    You can also recover your entire system (OSX, apps, settings, users, data, etc.) to the exact state it was in at the time of any backup, even if that's a previous version of OSX.
    TM manages its space for you, automatically. When your backup disk gets near full, TM will delete your oldest backup(s) to make room for new ones. But it will never delete its copy of anything that's still on your internal HD, or was there at the time of any remaining backup. So all that's actually deleted are copies of items that were changed or deleted long ago.
    TM examines each file it's backing-up; if it's incomplete or corrupted, TM may detect that and fail, with a message telling you what file it is. That way, you can fix it immediately, rather than days, weeks, or months later when you try to use it.
    Disadvantages
    It's not bootable. If your internal HD fails, you can't boot directly from your TM backups. You must restore them, either to your repaired/replaced internal HD or an external disk. This is a fairly simple, but of course lengthy, procedure.
    TM doesn't keep its copies of changed/deleted items forever, and you're usually not notified when it deletes them.
    It is fairly complex, and somewhat new, so may be a bit less reliable than some others.
    |
    RECOMMENDATION
    |
    For most non-professional users, TM is simple, workable, and maintenance-free. But it does have its disadvantages.
    That's why many folks use both Time Machine and a bootable clone, to have two, independent backups, with the advantages of both. If one fails, the other remains. If there's room, these can be in separate partitions of the same external drive, but it's safer to have them on separate drives, so if either app or drive fails, you still have the other one.
    |
    _*OFF-SITE BACKUPS*_
    |
    As great as external drives are, they may not protect you from fire, flood, theft, or direct lightning strike on your power lines. So it's an excellent idea to get something off-site, to your safe deposit box, workplace, relative's house, etc.
    There are many ways to do that, depending on how much data you have, how often it changes, how valuable it is, and your level of paranoia.
    One of the best strategies is to follow the above recommendation, but with a pair of portable externals, each 4 or more times the size of your data. Each has one partition the same size as your internal HD for a "bootable clone" and another with the remainder for TM.
    Use one drive for a week or so, then take it off-site and swap with the other. You do have to tell TM when you swap drives, via TM Preferences > Change Disk; and you shouldn't go more than about 10 days between swaps.
    There are other options, instead of the dual drives, or in addition to them. Your off-site backups don't necessarily have to be full backups, but can be just copies of critical information.
    If you have a MobileMe account, you can use Apple's Backup app to get relatively-small amounts of data (such as Address book, preferences, settings, etc.) off to iDisk daily. If not, you can use a 3rd-party service such as Mozy or Carbonite.
    You can also copy data to CDs or DVDs and take them off-site. Re-copy them every year or two, as their longevity is questionable.
    Backup strategies are not a "One Size Fits All" sort of thing. What's best varies by situation and preference.
    Just as an example, I keep full Time Machine backups; plus a CarbonCopyCloner clone (updated daily, while I'm snoozing) locally; plus small daily Backups to iDisk; plus some other things on CDs/DVDs in my safe deposit box. Probably overkill, but as many of us have learned over the years, backups are one area where +Paranoia is Prudent!+

  • What are the best practices with Mail for Macs and any email client for PC?

    I have some Macs at the office, and I want to make a best-practices manual for my users -- for example: please use the paperclip to attach the file instead of dragging and dropping, because the PC users otherwise see the photo embedded in the message body.
    Could someone help me?

    Why not print out Mail's Help files?

  • Best practice for calling stored procedures as target

    The scenario is this:
    1) Source is from a file or oracle table
    2) Target will always be Oracle PL/SQL stored procedures which do the insert or update (APIs).
    3) Each failure from the stored procedure must log an error so the user can re-submit the corrected file for those error records
    There is no option to create an E$ table, since there is no control option for the flow around procedures.
    Is there a best practice around moving data into Oracle via procedures? In Oracle EBS, many of the interfaces are pure stored procs and not batch interface tables. I am concerned that I must build dozens of custom error tables around these apis. Then it feels like it would be easier to just write pl/sql batch jobs and schedule with concurrent manager in EBS (skip ODI completely). In that case, one could write to the concurrent manager log and the user could view the errors and correct.
    I can get a simple procedure to work in ODI where the source is the SQL, and the target is the pl/sql call to the stored proc in the database. It loops through every row in the sql source and calls the pl/sql code.
    But I can not see how to set which rows have failed and which table would log errors to begin with.
    Thank you,
    Erik

    Hi Erik,
    Please take a look at these posts:
    http://odiexperts.com/?p=666
    http://odiexperts.com/?p=742
    They could help you in a way to solve your problem.
    I already used it to call Oracle EBS API's and worked pretty well.
    I believe an IKM could be built to automate all the work, but I never stopped to try...
    Does it help you?
    Cezar Santos
    http://odiexperts.com
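    For reference, however it gets wired into ODI, the per-row API call with its own error logging boils down to a block like this; the staging table, API package, error table, and $CONN connect string are all hypothetical:

    sqlplus -s "$CONN" <<'EOF'
    BEGIN
      FOR r IN (SELECT t.rowid rid, t.* FROM stg_input t) LOOP
        BEGIN
          xx_api_pkg.process_row(r.col1, r.col2);   -- the target API
        EXCEPTION
          WHEN OTHERS THEN
            -- log the failure and keep going, so good rows still load
            INSERT INTO stg_errors (src_rowid, err_msg, err_time)
            VALUES (r.rid, SQLERRM, SYSDATE);
        END;
      END LOOP;
      COMMIT;
    END;
    /
    EXIT;
    EOF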

  • OVM Repository and VM Guest Backups - Best Practice?

    Hey all,
    Does anybody out there have any tips/best practices on backing up the OVM Repository as well as (of course) the VMs? We are using NFS exclusively and have the ability to take snapshots at the storage level.
    Some of the main points we'd like to do ( without using a backup agent within each VM ):
    backup/recovery of the entire VM Guest
    single file restore of a file within a VM Guest
    backup/recovery of the entire repository.
    The single file restore is probably the most difficult/manual. The rest can be done manually from the .snapshot directories, but when we're talking about having hundreds and hundreds of guests within OVM...this isn't overly appealing to me.
    OVM has this lovely manner of naming its underlying VM directories after some ambiguous number which has nothing to do with the name of the VM (I've been told this is changing in an upcoming release).
    Brent

    Please find below the response from the Oracle support on that.
    In short :
    - First, "manual" copies of files into the repository is not recommend nor supported.
    - Second we have to go back and forth through templates and http (or ftp) server.
    Note that when creating a template, or creating a new VM from a template, we're talking about full copies. No "fast-clone" (snapshots) is involved.
    This is ridiculous.
    How to back up a VM:
    1) Create a template from the OVM Manager console.
    Note: creating a template requires the VM to be stopped (copying the virtual disk while the VM is running would corrupt the data), and the process of creating the template makes changes to vm.cfg.
    2) Enable Storage Repository Back Ups using the steps here:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-storage-repo-config.html#vmusg-repo-backup
    3) Mount the NFS export created above on another server.
    4) Then create a compressed file (tgz) from the relevant files (cfg + img) in the Repository NFS mount.
    Here is an example of the template:
    $ tar tf OVM_EL5U2_X86_64_PVHVM_4GB.tgz
    OVM_EL5U2_X86_64_PVHVM_4GB/
    OVM_EL5U2_X86_64_PVHVM_4GB/vm.cfg
    OVM_EL5U2_X86_64_PVHVM_4GB/System.img
    OVM_EL5U2_X86_64_PVHVM_4GB/README
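    For completeness, the create side of that listing might look like this (the mount point and backup path are illustrative):

    cd /mnt/ovs_repo/Templates
    tar czf /backup/OVM_EL5U2_X86_64_PVHVM_4GB.tgz OVM_EL5U2_X86_64_PVHVM_4GB/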
    How to restore a VM:
    1) Upload the compressed file (tgz) to an HTTP, HTTPS or FTP server.
    2) Import to the OVM manager using the following instructions:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-repo.html#vmusg-repo-template-import
    3) Clone the Virtual machine from the template imported above using the following instructions:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-vm-clone.html#vmusg-vm-clone-image
