Best Practices for Backing Up Large (10TB+) Servers?

As we migrate to OS X Lion Server, I need to revisit our backup scenarios. I'm interested in researching best practices, which may include Time Machine for incrementals but also need to cover some sort of off-site option (such as backing up to tape and storing the tapes elsewhere).
(Under Snow Leopard, we've had real trouble with both BRU (unintelligible) and Roxio's Retrospect for our DLT tape backups.)
I would think this would be a great discussion for many to have.


Similar Messages

  • Best practice for backing bean population? (also, ActionListener RANT)

    Hello,
    I am about 3/4 of the way through development of a small to medium size JSF application. Sometimes I really like JSF, but much of the time I am left puzzled or frustrated for hours trying to find workarounds to JSF's bugs/glitches and design flaws.
    For example, early on, I was impressed with how easy it was to invoke a method from a page using an ActionListener. Now that I'm actually building things with JSF, the ActionListener functionality still seems cool, but incredibly half-baked. I find myself using request parameters LIKE CRAZY to work around the fact that JSF doesn't support passing parameters directly to backing bean methods. This feels awkward and wrong considering that JSF is intended to abstract away the HTTP underpinnings. To add insult to injury, I often have to iterate through ALL of the request parameters looking for one whose id ends with my desired property name (since JSF prepends its own crap). I don't like doing things in a hacky way. This seems very hacky, and I feel dirty doing it.
    So, my first question is: what is the best practice for populating backing beans? How do others accomplish this? I can think of several other approaches, but none feel less hacky.
    Second, are there plans in the next spec (please say there are) to allow parameters to be passed to backing bean methods? If not, WHY THE HECK NOT?
    Even though JSF expert group people have been conspicuously absent from this forum of late, I'd really appreciate responses from you as well.
    Thank you for your thoughts.

    Hi BrownBear,
    I've been using JSF for about 6 months now and I'd be glad to help as much as I can.
    Concerning parameters, I'm not sure what your issue is, but I use the f:param tag to pass them (see the sketch at the end of this reply). If you could post an example of what you are trying to do, I could see exactly what your issue is. Maybe f:param can't help you.
    As for best practice for populating backing beans, I personally try to let JSF do as much as possible. For example, if I have a backing bean with five properties, I make sure that they are all on the JSP page the bean serves. If one of the properties is just there as an ID, let's say a Person ID (DB row key), then I put it on my JSP page as a hidden input field. I do the same with the properties that are only for display, if I want them back in my bean when the request comes back.
    Hope this helps somehow. Please feel free to ask specific questions about your particular problem; I'll monitor this post and transfer to you the little JSF experience I have.
    I'm pretty happy with JSF as it is but it sure needs improvements. :) What the heck, it's version 1.01 after all, and the next release should be a great one with the integration of JSTL.
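    For concreteness, here is a minimal sketch of the f:param pattern using only the standard JSF 1.x API; the bean, method, and parameter names are hypothetical. On the page you would write something like <h:commandLink actionListener="#{personBean.select}"><f:param name="personId" value="123"/></h:commandLink>, and the listener reads the value back directly:

        // Hypothetical JSF 1.x backing bean: values passed with <f:param>
        // arrive in the request parameter map under the exact names given
        // on the page, so no prefix-scanning is needed.
        import javax.faces.context.FacesContext;
        import javax.faces.event.ActionEvent;
        import java.util.Map;

        public class PersonBean {
            private String personId;

            // Wired on the page as actionListener="#{personBean.select}",
            // with <f:param name="personId" value="123"/> inside the link.
            public void select(ActionEvent event) {
                Map<String, String> params = FacesContext.getCurrentInstance()
                        .getExternalContext().getRequestParameterMap();
                this.personId = params.get("personId"); // set by <f:param>
            }

            public String getPersonId() { return personId; }
            public void setPersonId(String id) { this.personId = id; }
        }

    The hidden-field approach described above needs no code at all: JSF pushes the submitted value back into the bean through the component's value binding.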
    Cheers

  • Best practice for highly available management / publishing servers

    I am testing a highly available App-V 5.0 environment, which will deploy App-V packages to a XenApp farm. I have two SQL 2012 servers configured as an availability group for the back end, and two publishing/management servers for the front end.
    What is the best practice for configuring the publishing/management servers for high availability? Should I configure them as an NLB cluster, which I have tested and does seem to work, or should I just use the GPO to configure the clients to use both publishing servers, which I have also tested and appears to work?
    Thanks,
    Patrick Sullivan

    In App-V 5.0 the Management and Publishing Servers are hosted in IIS, so use the same approach for HA as you would any web application.
    If NLB is all that's available to you, then use that; otherwise I would recommend a proper load-balancing solution such as Citrix NetScaler or KEMP LoadMaster.
    Twitter: @stealthpuppy | Blog: stealthpuppy.com | The Definitive Guide to Delivering Microsoft Office with App-V

  • What is a best practice for managing a large amount of ever-changing hyperlinks?

    I am moving an 80+ page printed catalog online. We need to add hyperlinks to our Learning Management System courses at each reference to a class - there are hundreds of them. I'm having difficulty understanding what the best practice is for getting consistent results when I need to go back and edit (which we will have to do regularly).
    These seem like my options:
    Link the actual text - sometimes when I go back to edit the link I can't find it in InDesign but can see it's there when I open up the PDF in Acrobat
    Draw an invisible box over the text and link it - this seems to work better but seems like an extra step
    Do all of the linking in Acrobat
    Am I missing anything?
    Here is the document in case anyone wants to see it so far. For the links that are in there, I used a combination of adding the links in InDesign then perfecting them using Acrobat (removing additional links or correcting others that I couldn't see in InDesign). This part of the process gives me anxiety each month we have to make edits. Nothing seems consistent. Maybe I'm missing something obvious?

    What exactly needs to be edited, the hyperlink or the content?

  • Best practice for backing up and restoring forms

    Greetings, I would like to pose a question to the forum and understand how many of you, if any, "back up" your interactive forms so that in the event one or multiple become corrupt, you have a method to recover them.
    We recently experienced such a scenario, in which the forms we developed in ABAP, and access through our portal, had become corrupt. When we attempted to access a form via the portal, we would get a SOAP error. SOAP errors, I understand, can happen for various reasons, but prior to the incident our forms were working just fine. We attempted to retrace our steps to identify the cause of the problem but found we could not replicate the issue. Through analysis of the forms, we identified the corruption in the master page and found that if we copied the sections of the form that were not corrupt to a new master page, the form would work properly again. Our thought is that this cannot be the only method to recover from an incident like this, and we would like to know if others have experienced it or have practices in place that would minimize the impact.
    We asked SAP support whether there was a method to back up Interactive Forms, and the simple answer we received was to download the XML file from transaction SFP. Can others relate to this strategy as a proper "backup" method, or do other best practices exist that would be more ideal?
    Thank you in advance!

    I had many difficulties with this kind of error about two years ago. Of course it got better with every patch level, and with LCD 8, LCD 8.1, etc. I don't have any problems with the newest solution, but I remember the feeling. I once had to throw away three days' work because of these "errors". But the solution is easy:
    - use versioning like you do in your ABAP development; that works fine
    - if you would like to have an extra backup, copy the form into a backup copy, I mean with a name like Z_YOURNAME_BCK_1
    - if you still feel that is not enough, you can always set your forms to be dynamic, which makes the ADS web service change the way the forms are constructed (the form will then have an internal structure; you can check this out: if you open a dynamic form in LCD outside SAP, it works like a charm, whereas if you open a plain print form, it asks for an import, which has nothing in common with the earlier template). And you can back up every form you generate with these settings. If a problem appears, you can always open the form outside SAP in LCD, copy the hierarchy, and paste it into your SAP window with LCD development.
    Hope that helps.
    Regards Otto

  • Best practices for working with large placed bitmap images?

    Hey all,
    I need some advice on the best way to approach building these files. I've been working on some banners that are very large: 3 x 7 feet.
    Each banner has a simple vector graphic treatment at the top and bottom (rectangle with a different colored rule on top, and vector logo) and a small amount of text, just a URL and a headline. The headline is type (not converted to outlines) and usually has some other effect applied to it, say a drop shadow or outer glow. Under these graphics is a full-bleed image. The placed images need to be 150ppi at actual size, so they're honking big, sometimes up to 2GB. Once the layouts are approved, they have to go to a vendor for output.
    The Illustrator docs are really large, and I've read in other threads how to combat that (PDF compatibility, raster settings). But even still, does anyone have any insight into the best way to deal with these things? The dimensions are large, and then the images are large, and it just makes for lots of looking at the spinning ball of death...
    If it were me, I'd build them in InDe, but the vector graphics need to be edited for each one, and I so don't like to do that in InDe unless forced. To me, it's still ultimately a page layout app, not a drawing app. (Old school here.)
    FYI, our machines are all MBPs with 8G ram and the latest Intel Core 2 Duo chips, 2.66 and 2.8GHz. If we keep the files local (as opposed to working on the server) it should be fairly zippy... No?
    Any advice is appreciated, thanks!

    You can get into memory trouble with very large placed pdf files. Tiffs too.
    This has to do with the preview, which contains much more information than you need for working with.
    On the other hand if you place EPSs and take care not to turn on overprint preview you can get away with huge files.
    If you do turn on overprint preview your machine will slow down a lot and the file may become totally unmanageable.
    Compare this to InDesign, where you can control the quality of the preview. A hi-res preview will slow you down, and most often you don't need it anyway.
    I was working (in Illie) the other day on much larger files than you mention – displays for whole walls – and had considerable trouble until I reverted to the old EPS format. They say it's dying, but it ain't dead yet.

  • What is current best practice for backing up a hard disc?

    With the availability of various cloud applications, I would like to know if backing up a hard disc is now best done to the cloud. For example, should you buy space on Amazon.com, or somewhere else on the Internet, and then back up your hard disc to that space? Could you back up an entire hard disc to iCloud? Your knowledge and experience are greatly appreciated. Thank you.

    iCloud doesn't provide storage except for pictures and your music, and only under limited circumstances. Merchant online backup services such as Carbonite and others exist for this reason. And you can sync your files with/to Dropbox. Many users keep a local copy of their files on Time Machine or other hard drives in addition to remote backups (in case of a fire or other event compromising the local solution). Backup software or cloning programs are useful in all of the above. You have to work at backup a bit - it's like insurance: a bit of expense now to save a lot of heartache later. Good to see you're determined to forestall potential disaster - too many lack that determination.
    Good luck, Tom

  • Best practice for backing up iPhone??

    I have authorized two computers in iTunes. I would like to know if it is a good idea to back up the iPhone on more than one computer. If so, will a normal sync create a backup on the second computer? For those of you who are already doing this, please let me know of any issues.

    Authorizing iTunes on both doesn't mean you can sync all data to both.
    Since the iPhone won't sync the "same" data on two machines, in order to sync data you must make sure one machine syncs music etc. and the other syncs calendar, email, etc. With that you should be able to get two backups, but again, to me that's not needed at all.
    In theory the backup should happen on both, although honestly it's not really necessary at all in my opinion.

  • Best practice for backing up Keywords in Elements 8?

    I have had multiple issues with losing my keywords in Elements 8 (a new computer, then later a new hard drive). Every time, I've lost all my keywords. What's the best way to avoid that in the future?

    Jellis96 wrote:
    I have had multiple issues with losing my keywords in Elements 8 (a new computer, then later a new hard drive). Every time, I've lost all my keywords. What's the best way to avoid that in the future?
    The best way is to do regular backups from the Organizer. The whole catalog is saved as well as all your media files. This is the best way when you want to move your pictures to a new drive.

  • What are best practices for back up drives when they become full?

    I've got an external hard drive that is now full. What do I do? Delete the oldest files on it, right? If so, do I delete the full backup and ALL the incremental backups since then? Do I have to do this manually now; isn't there a way to automate this?

    The "usual" procedure is to move the oldest incremental backups from the hard drive to permanent storage media such as DVD.
    However, if your backup drive is the same size as your main drive, then you could do incremental backups that delete files on the backup druve that have been deleted from the source drive. This prevents the drive from filling up with old and out of date files. However, if you need to maintain archives of even the incremental backups, then move the oldest to permanent storage media and delete them from the backup drive to free space.
    Visit The XLab FAQs and read the FAQ on backup and restore for more information.
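    If you want to automate the pruning described above, here is a minimal sketch under stated assumptions: each backup set (a full backup plus its incrementals) lives in its own folder under a hypothetical /Volumes/Backup/sets directory, and whole sets are deleted oldest-first until a free-space target is met:

        // Illustrative sketch only (hypothetical layout): each backup set is
        // assumed to live in its own folder, and a whole set (full backup
        // plus its incrementals) is deleted at once, never just the base.
        import java.io.File;
        import java.util.Arrays;
        import java.util.Comparator;

        public class PruneOldBackups {
            public static void main(String[] args) {
                File root = new File("/Volumes/Backup/sets");
                File[] sets = root.listFiles(File::isDirectory);
                if (sets == null) return;
                // Oldest first, judged by last-modified time.
                Arrays.sort(sets, Comparator.comparingLong(File::lastModified));
                for (File set : sets) {
                    // Stop once at least 10% of the drive is free again.
                    if (root.getUsableSpace() * 10 >= root.getTotalSpace()) break;
                    deleteRecursively(set);
                }
            }

            static void deleteRecursively(File f) {
                File[] children = f.listFiles();
                if (children != null)
                    for (File c : children) deleteRecursively(c);
                f.delete();
            }
        }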

  • What are the best practices for backing up SAP B1?

    Right now, we are using a nightly tape backup. Is this sufficient? Or are there other, better options?
    Thank you!
    John Sefton

    It is sufficient, as you have the default SBO-Backup utility, which can be automated.
    In addition, you can even take a manual backup from the back end.

  • Best practices for organizing a large "project" with multiple programmers

    Should I put everyone in one application? Should everyone get their own application? My understanding (limited) is that the level of granularity for CVS is the application. It sounds to me like multiple developers checking the same application in and out would be a disaster. If every developer has their own application, what problems will I have deploying? How do I handle my configuration (navigation) diagram? Any thoughts would be appreciated.

    Have a read through these:
    http://download.oracle.com/docs/html/B25947_01/team_productivity.htm#BABBEFFF
    http://brendenanstey.blogspot.com/2006/11/tips-for-using-cvs-with-jdeveloper.html

  • Kernel: PANIC! -- best practice for backup and recovery when modifying system?

    I installed NVidia drivers on my OL6.6 system at home and something went bad with one of the libraries. On reboot, the kernel would panic and I couldn't get back into the system to fix anything. I ended up re-installing the OS to recover my system.
    What would be some best practices for backing up the system when making a change, and then recovering if this happens again?
    Would LVM snapshots be a good option? Can I restore a snapshot from a rescue boot?
    EX: File system snapshots with LVM | Ars Technica -- scroll down to the section discussing LVM.
    Any pointers to documentation would be welcome as well. I'm just not sure what to do to revert the kernel or the system when an install goes bad like this.
    Thanks for your attention.

    There is often a common misconception: a snapshot is not a backup. A snapshot and the original it was taken from initially share the same data blocks. An LVM snapshot is a general-purpose solution which can be used, for example, to quickly create a snapshot prior to a system upgrade; then, if you are satisfied with the result, you delete the snapshot.
    The advantage of a snapshot is that it can be taken on a live filesystem or volume while changes are written to the snapshot volume. Hence it's called "copy on write" (COW), or copy on change if you want. This is necessary for system integrity: it gives a consistent status of all data at a certain point in time while allowing changes to happen, for example to perform a filesystem backup. A snapshot is no substitute for disaster recovery in case you lose your storage media. A snapshot takes only seconds to create and initially does not copy or back up any data unless data changes. It is therefore important to delete the snapshot when it is no longer required, in order to prevent duplication of data and restore filesystem performance.
    LVM was never a great thing under Linux and can cause serious I/O performance bottlenecks. If snapshot or COW technology suits your purpose, I suggest you look into Btrfs, which is a modern filesystem built into the latest Oracle UEK kernel. Btrfs employs the idea of subvolumes and is much more efficient than LVM because it can operate on files or directories, while LVM operates on the whole logical volume.
    Keep in mind, however, that you cannot use LVM or Btrfs for the boot partition, because the GRUB boot loader, which loads the Linux kernel, cannot deal with LVM or Btrfs before the Linux kernel is loaded (a catch-22).
    I think the following is an interesting and fun to read introduction explaining basic concepts:
    http://events.linuxfoundation.org/sites/events/files/slides/Btrfs_1.pdf
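    To make the snapshot-before-upgrade workflow concrete, here is a sketch driving the standard lvcreate/lvremove/lvconvert commands; the volume group, volume name, and snapshot size are hypothetical, and on a real system you would simply run these commands from a root shell or rescue boot:

        // Sketch of the snapshot-before-upgrade workflow, assuming a
        // hypothetical volume group "vg0" with logical volume "root".
        // Must run as root; normally you'd type these commands in a shell.
        import java.io.IOException;

        public class LvmSnapshotDemo {
            static void run(String... cmd) throws IOException, InterruptedException {
                Process p = new ProcessBuilder(cmd).inheritIO().start();
                if (p.waitFor() != 0)
                    throw new IOException("failed: " + String.join(" ", cmd));
            }

            public static void main(String[] args) throws Exception {
                // 1. Snapshot before the risky change. The size only needs to
                //    hold writes made while the snapshot exists.
                run("lvcreate", "--size", "5G", "--snapshot",
                    "--name", "pre_upgrade", "/dev/vg0/root");
                // 2. ...install the driver / run the upgrade here...
                // 3a. Happy? Drop the snapshot to reclaim space:
                run("lvremove", "-f", "/dev/vg0/pre_upgrade");
                // 3b. Or roll back instead (the merge completes when the origin
                //     is next activated, e.g. from a rescue boot):
                // run("lvconvert", "--merge", "/dev/vg0/pre_upgrade");
            }
        }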

  • Best practices for speeding up Mail with large numbers of mail?

    I have over 100,000 mails going back about 7 years in multiple accounts in dozens of folders using up nearly 3GB of disk space.
    Things are starting to drag - particularly when it comes to opening folders.
    I suspect the main problem is having large numbers of mails in the folders that are slowest - maybe a few thousand at a time or more.
    What are some best practices for dealing with very large amounts of mails?
    Are smart mailboxes faster to deal with? I would think they would be slower, because the original emails would tend not to get filed as often, leading to even larger mailboxes. And searching takes a long time, doesn't it?
    Are there utilities for auto-filing messages in large mailboxes to, say, divide them up by month to make the mailboxes smaller? Would that speed things up?
    Or what about moving older messages out of mail to a database where they are still searchable but not weighing down on Mail itself?
    Suggestions are welcome!
    Thanks!
    doug

    Smart mailboxes obviously cannot be any faster than real mailboxes, and storing large amounts of mail in a single mailbox is asking for trouble. Rather than organizing mail in mailboxes by month, however, what I like to do is organize it by year, with subfolders by topic for each year. You may also want to take a look at the following article:
    http://www.hawkwings.net/2006/08/21/can-mailapp-cope-with-heavy-loads/
    That said, it could be that you need to re-create the index, which you can do as follows:
    1. Quit Mail if it’s running.
    2. In the Finder, go to ~/Library/Mail/. Make a backup copy of this folder, just in case something goes wrong, e.g. by dragging it to the Desktop while holding the Option (Alt) key down. This is where all your mail is stored.
    3. Locate Envelope Index and move it to the Trash. If you see an Envelope Index-journal file there, delete it as well.
    4. Move any “IMAP-”, “Mac-”, or “Exchange-” account folders to the Trash. Note that you can do this with IMAP-type accounts because they store mail on the server and Mail can easily re-create them. DON’T trash any “POP-” account folders, as that would cause all mail stored there to be lost.
    5. Open Mail. It will tell you that your mail needs to be “imported”. Click Continue and Mail will proceed to re-create Envelope Index -- Mail says it’s “importing”, but it just re-creates the index if the mailboxes are already in Mail 2.x format.
    6. As a side effect of having removed the IMAP account folders, those accounts may be in an “offline” state now. Do Mailbox > Go Online to bring them back online.
    Note: For those not familiarized with the ~/ notation, it refers to the user’s home folder, i.e. ~/Library is the Library folder within the user’s home folder.

  • Best practice for using messaging in medium to large cluster

    What is the best practice for using messaging in a medium to large cluster, in a system where all the clients need to receive all the messages and some of the messages can be really big (a few megabytes, maybe more)?
    I will be glad to hear any suggestions or to learn from others' experience.
    Shimi

    publish/subscribe, right?
    lots of subscribers, big messages == lots of network traffic.
    it's a wide open question, no?
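    To make the pub/sub suggestion concrete, here is a minimal JMS sketch (the JNDI names and topic are hypothetical placeholders, not anything from the original thread). With a topic, every subscriber in the cluster receives every message; for multi-megabyte payloads, check whether your provider supports streaming or chunked delivery rather than pushing whole files through the topic:

        // Minimal JMS publish/subscribe sketch; the JNDI names are
        // hypothetical and depend on the provider's configuration.
        import javax.jms.BytesMessage;
        import javax.jms.Connection;
        import javax.jms.ConnectionFactory;
        import javax.jms.MessageProducer;
        import javax.jms.Session;
        import javax.jms.Topic;
        import javax.naming.InitialContext;

        public class ClusterPublisher {
            public static void main(String[] args) throws Exception {
                InitialContext jndi = new InitialContext();
                ConnectionFactory factory =
                        (ConnectionFactory) jndi.lookup("ConnectionFactory");
                Connection conn = factory.createConnection();
                try {
                    Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                    Topic topic = (Topic) jndi.lookup("topic/clusterUpdates");
                    MessageProducer producer = session.createProducer(topic);
                    // Every subscriber receives every message; for multi-MB
                    // payloads consider compressing or chunking the bytes.
                    BytesMessage msg = session.createBytesMessage();
                    msg.writeBytes("payload".getBytes());
                    producer.send(msg);
                } finally {
                    conn.close();
                }
            }
        }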
    %
