Archive Libraries - Best Practices?

Trying to figure out how I want to archive Libraries that I am not currently working on to save HD space.
I currently copy the Events in a Library to a new Library titled "XXX Library Archive", and elect not to include render files or optimized media to cut down on file size.
I'm wondering if it's best to simply store that Library on an external hard drive as is, or maybe to create a Disk Image and copy it to that for storage. Are there any benefits to this in terms of file size or stability of storage?
I suppose another option would be to burn that Library (or Disk Image) to DVD instead of having it on an external drive that could crash/become corrupted in the future.
What do others do with Libraries they are no longer working with, but want to be able to access later?

What you're doing sounds good to me. I'd just add Consolidate Event Files to round up any strays.
Speaking only for myself, I typically put my archived projects and/or libraries on hard drives. They don't need to be expensive drives. I don't use optical media to archive these days.
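If you do go the disk-image route, a rough sketch with hdiutil looks like this (library name and paths are placeholders; a compressed image also gives you a checksum you can verify later):
hdiutil create -srcfolder "/Volumes/Media/XXX Library Archive.fcpbundle" -volname "XXX Library Archive" -format UDZO "XXX Library Archive.dmg"
# later, to confirm the image hasn't degraded on the archive drive:
hdiutil verify "XXX Library Archive.dmg"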
Russ

Similar Messages

  • Multiple Libraries - Best practice

    I have been using iPhoto for the last 5 years. At the end of the year I move the library to an external drive and use iPhoto Buddy to point to it. Now I have 6-7 libraries and I would rather have everything together.
    What have people found to be the best way to combine libraries so that all files end up in one directory on the FireWire drive, and then the best way to move this year's photos from the hard drive to the external drive later? Or is having multiple libraries best?
    Seems like I am always moving files to and from libraries and I lose any structure, and backing it up is a pain. Am I missing something here? Thanks in advance
    jf

    Jeff:
    Welcome to the Apple Discussions. The only way at present to merge multiple libraries into one library and retain keywords, rolls, etc., is to use the paid version of iPhoto Library Manager. That will get you one library. If that library gets very large then you might have a storage problem on your boot drive, where you want to keep adequate free space for optimal performance. In that case multiple libraries might be best for you.
    FWIW here's my photo strategy. I keep my photos in folders based on the shoot (each upload from the camera after an event, usually family get-togethers), in folders dated for the event with a brief description. I then import that folder into iPhoto to get a roll with the same name. Since I take the step to upload before importing to iPhoto, I also batch rename the files with the international date format: YYYY-MM-DD-01.jpg and so forth.
    With iPhoto's new capability of using only aliases for the original/source files (I kept my original folders as a backup on a second drive) I used those original source folders for my combined library by importing those folders into a new V6 library using the alias method. So, for 17,400+ photos that take up 27G on my second drive, my iPhoto library on my boot drive only takes up 8G. This is very valuable for PB owners who can't afford that much library space on the boot drive. The one downside to the alias method is that if you delete a photo from the library the source/original file does not get deleted. Hopefully that will be in the next update.
    If the external is dismounted the library is still usable, in that you can view thumbnails; create, delete and move albums; and add comments and keywords (but with some effort). You can't do anything that requires moving the thumbnails, or edit. I've created a workflow to convert from a conventional iPhoto library structure to an alias-based one.
    The downside to converting as I mentioned is that you will lose keywords, albums, etc. due to the conversion. So it's not for everyone.
    Bottom line: for combining/merging libraries, iPhoto Library Manager is the only option at the present time. Hope this has been of some help to you.

  • Redo / Archive Log Best Practices?

    I am a newb when it comes to Oracle administration. The problem is that our "DBA" knows even less about it.
    I'd like to get some advice/recommendations on redo and archive logs.
    We are currently running:
    Windows 2000 Server
    Oracle 8.1.7
    Oracle DB is ~50gb
    ~250 users
    Database is under fairly heavy load, as it is used to run the company's primary accounting software.
    Recently when reviewing back up procedures, I realized that our "DBA" did not have archive logging turned on. This obviously is a problem. Our DBA was relying solely on a data dump every night that was then backed up to tape. I was forced to take care of this, as the "DBA" didn't have any knowledge on this subject. I got archive logging turned on, changed the init file, etc. etc.
    Where the problem comes in, and where my questions come from... The database was writing archive logs every ~2-3 minutes, sometimes more often depending on the database load. Oracle was configured to use 3 redo logs @ ~105mb each. The server was getting "Archive process error: Oracle instance xxxx - Cannot allocate log, archival required." I changed the redo logs to run 5 logs at ~200mb each. I also added a SCSI drive to the server for the sole purpose of storing the archive logs. Log Buffer was set at 64k; I upped this to 1mb.
    My specific questions are:
    - How frequently should logs be written?
    - Should I up the number of redo logfiles, or up the size of each?
    - Should I be writing the redo logs to multiple destinations?
    - Should I archive to multiple destinations? If so, would archiving to a network drive lag the archive process, and kill the bandwidth to the server/database since it would be writing 200mb+ files to the network every few minutes?
    - What are some recommended file size minimums / maximums under the current environment listed above?
    - Other tips/suggestions?
    Any help would be appreciated.
    Thanks.

    Hi,
    Have you configured LOG_ARCHIVE_START = TRUE?
    How frequently should logs be written? Should I up the number of redo logfiles, or up the size of each?
    - Should I be writing the redo logs to multiple destinations?
    - Should I archive to multiple destinations? If so, would archiving to a network drive lag the archive process, and kill the bandwidth to the server/database since it would be writing 200mb+ files to the network every few minutes?
    - What are some recommended file size minimums / maximums under the current environment listed above?
    If you want to keep the time to recover from a failure to a minimum, keep your redo log files smaller; but as a general rule you should make them reasonably large. In your situation I think it should be:
    LOG_BUFFER = 104857600 -- in init.ora (100MB)
    5 redo log files, multiplexed across multiple locations, each 400 MB.
    I would recommend not writing your archive logs to a network location, as that will definitely add network traffic as well as slow down archival speed.
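    For illustration only, a rough sketch of the statements involved (8i-era syntax; paths, group numbers and sizes are placeholders, not recommendations):
    -- How often are log switches happening? (one row per hour)
    SELECT TO_CHAR(first_time, 'YYYY-MM-DD HH24') AS hour, COUNT(*) AS switches
      FROM v$log_history
     GROUP BY TO_CHAR(first_time, 'YYYY-MM-DD HH24')
     ORDER BY 1;
    -- Add a larger redo log group; repeat per group, then drop the old groups once
    -- they go INACTIVE. (And in init.ora for 8i: log_archive_start = true.)
    ALTER DATABASE ADD LOGFILE GROUP 6 ('E:\ORADATA\REDO06A.LOG') SIZE 400M;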
    Regards
    Muhammad Umar Liaquat

  • Terabyte Plus Libraries - What is the Best Practice?

    In my current Aperture library I have 150 gigs and I have only been using it for about a year. Previous to that I used iPhoto for a short time, which still has a 60 gig library, and previous to that I stored my images in file folders on Windows servers, where I have around 500+ gigs. Then there are those non-electronic images which should be scanned in one day.....
    My hope was to import everything into one tool and eventually to have some better organization and management of my images. At the rate I am shooting now it won't be long before I break the terabyte mark, and I am wondering, as I try to pull all these sources together, what is the best practice?
    I know I can have more than one library now with Aperture; do folks manage their libraries by themes? Weddings, Family, Commercial, etc.? I just picked up a 2 terabyte drive to start moving stuff off my MacBook but am not sure if I should use the Archive tool to do this, or break apart my images into libraries, store them there, and just keep a working library on my Mac?
    Within Aperture I am using the Project -> Album hierarchy to manage my shoots now as well.
    Also, I don't have a ton of video yet, but have started shooting a little, plus have been making slideshows and books now, so I need to start planning for that as well. Just wondering what is the best, most efficient way of large data management with Aperture.
    Thanks!

    The solution is to avoid the (unfortunately default) Managed Masters and instead use a Referenced Masters Library kept on an internal drive. Back up originals prior to importing and keep Masters off the Library drive. That way the Aperture Library will remain small enough to live on a standard internal drive without overfilling it.
    Note that working drives (as opposed to backup-only drives) should not be allowed to exceed ~70% full for ideal speed and stability.
    Multiple Libraries are almost always poor image management in a digital world, unless all rights to the images, including the right to simply view any image (such as security work), belong exclusively to the client. Usage of multiple Libraries is a big backward step into film-think and very significantly limits the power of digital image management.
    -Allen

  • Best Practice for External Libraries, Shared Libraries and Web Dynpro

    Two blogs have been written on sharing libraries with Web Dynpro DCs, but I would like to know the best practice for doing this.
    External libraries seem to work great at compile time, but when deploying there is often an error related to the external library not being a deployed component.
    Is there a workaround for this besides creating a shared J2EE library, which I have been able to get working? I am not interested just in something that works; I really want to know the best practice for this. What is the best way to limit the number of jars that need to be kept in a shared library/external library? When is sharing a ref service/etc. a valid approach vs. hunting down the jars in the portal libraries etc. and storing them in an external library?

    Security is mainly about mitigation rather than being 100% secure, "We have unknown unknowns". The component needs to talk to SQL Server. You could continue to use http to talk to SQL Server, perhaps even get SOAP Transactions working, but personally I'd have more worries about using such a 'less trodden' path, since that is exactly the area where more security problems are discovered. I don't know about your specific design issues, so there might be even more ways to mitigate the risk, but in general you're using a DMZ as a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com

  • The best practice for data archiving

    Hi
    My client has been using OnDemand for almost 2 years; there are around 2M records in the system (Activities). I just want to know what the best practice for data archiving is; we don't care much about data older than 6 months.

    Hi Erik,
    Archival is nothing but deletion.
    Create a backup cube in BW. Copy the data from your planning cube to the backup cube, and then delete that data region from your planning cube.
    Archival will definitely improve the performance of your templates, scripts, etc; since the system will now search from a smaller dataset.
    Hope this helps.

  • Request info on Archive log mode Best Practices

    Hi,
    Could anyone, from their personal experience, share with me the best practices for maintaining archiving on any version of Oracle? Please tell me:
    1) Whether to place archive and log files on the same disks?
    2) How many LGWR processes to use.
    3) Checkpoint frequency.
    4) How to maintain the speed of a server running in archivelog mode.
    5) Errors to look for.
    Thanks,

    1. Use a separate mount point for the archive logs, like /archv.
    2. Start with 1 and check the performance.
    3. This depends on the redo log file size. Size your redo log files so that at most 5-8 log switches happen per hour; try to keep it under 5 log switches per hour.
    4. Check the redo log file size.
    5. Watch the space allocation on the archive log mount point. Take backups of the archive logs with RMAN and delete the backed-up archive logs from the archive destination (see the sketch below).
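    For point 5, a minimal RMAN sketch (assumes RMAN is already configured with a backup destination; adjust to your backup policy):
    # back up all archived logs, then delete them from the archive destination
    rman target /
    RMAN> BACKUP ARCHIVELOG ALL DELETE INPUT;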
    Regards
    Asif Kabir

  • FCPX 10.1: Best practice moving libraries (files)?

    In the past, FCPX encouraged us to move files WITHIN the application, through the UI.
    Does 10.1 now encourage us to move/delete files (libraries) "under the hood," in the Finder? Given a choice, what is the best practice?

    Hi dostoelk,
    Welcome to the Support Communities!
    Clips, projects, and events should still be managed within Final Cut Pro X.  For more information, read our Managing Media with Final Cut Pro X Libraries white paper (PDF) at: www.apple.com/final-cut-pro/docs/Media_Management.pdf
    I hope this information helps ....
    Happy Holidays!
    - Judy

  • Apple Mail Archive - Best Practices

    Dear Fellow Apple Heads:
    Does anyone have any best practices to archive mail so that I can back up an mbox file and remove the archived items from current use?

    Greetings,
    There are two ways to do this, but since I don't really use 10.5.x I can't be sure of one of them. In Mail, there should be an option to Archive Mailbox somewhere, but I just don't know because I don't use it. The other way is just to copy the folder of the mailbox you're looking to archive onto your Desktop or wherever you want to save it.
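    If you go the copy-the-folder route, in Terminal it amounts to something like this (a sketch only; the path assumes 10.5-era Mail, and the mailbox name and destination are placeholders):
    mkdir -p ~/Desktop/Mail\ Archive
    cp -R ~/Library/Mail/Mailboxes/Old\ Projects.mbox ~/Desktop/Mail\ Archive/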
    The drawback is that if you want to open these messages again, you can't do it from Mail; you would have to double-click on each one to open it, and the message name doesn't tell you anything about who it's from or what it's about. There's at least one third-party mail archiving solution on the market, but I don't remember the name offhand.
    The best option is to always have a current, bootable backup of your entire drive so you can restore your Mac to the way it was before the hard drive failed, or before an unexpected power outage caused a bunch of files to get corrupted or just disappear.

  • JDeveloper 10.1.3 Libraries Export/Best Practices

    Is there a way to export the User Libraries created in JDeveloper? I would like to be able to set up a standard user library in JDeveloper and then export the configuration, allowing other developers on a team to import the User Library setup without each individual having to set up their own User Library.
    This relates to us having problems with user library names and sharing projects in a group. Does anybody have any suggestions on best practices?
    Thanks

    I'm a little unsure how this works. I was under the impression (maybe a wrong one) that the Default Project Properties were for the creation of new projects in your own JDeveloper workspace, and that it would include the specific libraries found in the Default Project Properties in your newly created project. When another person starts trying to modify an existing project and their User Libraries are named differently from your own setup, how do the Default Project Properties reconcile the different libraries?

  • Archiving Best Practices / How To Guide for Oracle 10g - need urgently

    Hi,
    I apologize if this is a silly question, but I need a step-by-step archiving guide for Oracle 10g and cannot find any reference document. I am in a rather remote part of S.E. Asia and can't seem to find DBAs with the requisite experience to do the job properly. I have had 1 database lock up this week at a big telecoms provider and another one at a major bank is about to go. I can easily add LUNs and restructure mirrors etc. at the Unix level [I am a Unix engineer], but I know that is not the long-run solution. I am sure the 2 databases I am concerned about have never been archived properly.
    This is the sort of thing DBAs must do all the time. Can someone point me to the proper documentation so I can do a proper job and archive a few years' data out of these databases? I do not want to do a hack job. At least I can clone the databases and practise on the clones first before I actually do production.
    -thanks very much
    -gregoire
    [email protected]

    I'm not so sure this is a general database question, as it would be specific to an application and implementation, and as the technology has changed, the database options to support it have too.
    So for example, if you have bought the partitioning option, there may be some sensible procedure for partitioning off older data.
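    As a rough sketch of what that can look like (purely illustrative; the table, column, and partition names are made up, and partitioning is a separately licensed option):
    CREATE TABLE orders_hist (
      order_id   NUMBER,
      order_date DATE,
      amount     NUMBER
    )
    PARTITION BY RANGE (order_date) (
      PARTITION p2003 VALUES LESS THAN (TO_DATE('2004-01-01','YYYY-MM-DD')),
      PARTITION p2004 VALUES LESS THAN (TO_DATE('2005-01-01','YYYY-MM-DD')),
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );
    -- "Archiving" a year then becomes exporting that partition and dropping it:
    ALTER TABLE orders_hist DROP PARTITION p2003;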
    Things may depend on whether you are talking about an OLTP, a DW, a DSS, or mixed systems.
    DBA's do it all the time because the requirements are different everywhere. Simply deleting a lot of data after copying the old data to another table (as some older systems do) may just wind up giving you performance problems scanning swiss-cheesed data.
    Some places may not archive at all, if they've separated out OLTP from reporting. If all the OLTP stuff is accessed through indices, all the older stuff just sits there. The reporting DB may only have what is needed to be reported on, or be on a standby db where range scans are sufficient to ignore old data. Then there's Exadata, which has its own strengths.
    Best Practices have to be on similar enough systems, otherwise they are a self-contradiction.
    Get yourself someone who understands your requirements and can evaluate the actual problem. No apology needed, it is not a silly question. But what is silly is assuming what the problem is with no evidence.

  • Best practices to include client libraries used at component level

    How to include component level resources, while following the best practices.
    Ex:
    I am looking at the Geometrixx Media site in CQ 5.6. In some of the components, e.g. 2-col-article-summary, we have a client library defined under the component.
    /apps/geometrixx-media/components/2-col-article-summary
    -2-col-article-summary.jsp
    -clientlibs
      -css
      -css.txt
    If I look at the categories of the clientlib, they are defined as follows
    categories String[] apps.geometrixx-media, apps.geometrixx-media.2-col-article-summary
    The only place this client library is included is in the head.jsp of the main page-level component.
        <cq:includeClientLib categories="apps.geometrixx-media.all"/> - this in turn embeds the apps.geometrixx-media
    My questions are as follows:
    1) Why do we have two categories for the clientlib, if the second category name (apps.geometrixx-media.2-col-article-summary) is not being used? Is there some other usage for this that I am missing?
    2) Also, this set of CSS is always included no matter whether a specific component is added to the page or not.
    3) I could use the following to include the clientlib at the component level, but this will cause unnecessary <script> and <link> elements in the component-level markup.
       <cq:includeClientLib categories="apps.geometrixx-media.2-col-article-summary"/>
    Essentially I am trying to understand how to include specific component-level resources while following the best practices.

    Hi,
    I don't have a CQ 5.6 setup, so I could not check the example you refer to, but based on your description I can say:
    Ans 1. The client library can be invoked directly through the tag lib as <cq:includeClientLib categories="category name"/>, but it can also be invoked when you add a "dependencies" property to a client library folder, in which case all the dependencies are resolved first. So, to answer your question: by looking at the client library folder configuration alone you cannot say that a specific category is unused or never invoked.
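    For reference, a sketch of what that "dependencies" property looks like in a client library folder's .content.xml (the dependency value here is a placeholder, not taken from Geometrixx):
    <?xml version="1.0" encoding="UTF-8"?>
    <jcr:root xmlns:cq="http://www.day.com/jcr/cq/1.0" xmlns:jcr="http://www.jcp.org/jcr/1.0"
        jcr:primaryType="cq:ClientLibraryFolder"
        categories="[apps.geometrixx-media.2-col-article-summary]"
        dependencies="[apps.geometrixx-media.base]"/>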
    Ans 2. Whether a client library folder gets invoked depends entirely on where your code places calls to it with the <cq:includeClientLib> tag and on the dependencies configuration. So you have to dig further to trace out all the calls (including the default css/js loads).
    Ans 3. Correct. To accomplish that, the best way is to manage the client library at the component level and give it a unique category name that is not invoked anywhere else, neither through a <cq:includeClientLib> call nor through a dependencies configuration. This way you avoid overriding the same library files. (It is better to manage a proper hierarchy of libraries.)
    Hope this gives you some idea.
    Thanks,
    Pawan

  • Customization approach as per best practice for SharePoint Online

    Hi All,
    I am working for a customer on customizations for SharePoint Online. I need to create the following customizations.
    For each department one site collection is required to be created. There will be 15 site collections.
    Each site collection will have a couple of team sites.
    Each team site will have a couple of document libraries and custom lists.
    Custom lists and document libraries will have custom views.
    Master page and layout will be customized to apply the UI branding.
    The customer wants configuration management to follow Microsoft best practice. I am wondering what approach I should use.
    Should I create a Visual Studio solution? Since 15 different site collections are required, I believe a sandboxed solution will not be feasible, because sandboxed solutions are scoped to a single site collection.
    I also believe that if I do create a Visual Studio solution, the development effort will be extensive.
    I am not sure whether it is feasible to use SharePoint Designer to apply this customization. If it is possible, how will I promote the customization to production?
    I am also unsure how, in the case of SharePoint Online, I will keep the production and development environments separate. What is the best practice around this?
    Regards 
    Unrest Spirit

    Hi,
    You can create a custom master page using SharePoint Designer. For the first four points, from creating the site collections to creating the views, you can describe the hierarchy of objects in a CSV file and then write a PowerShell script that reads the CSV and creates the site collections, team sites, lists/libraries, and views.
    http://blogs.technet.com/b/fromthefield/archive/2013/08/22/create-a-site-structure-using-powershell.aspx
    http://blog.falchionconsulting.com/index.php/2009/12/creating-a-sharepoint-2010-site-structure-using-powershell/
    Details about the SharePoint Online Management Shell can be found at the links below:
    http://technet.microsoft.com/en-us/library/fp161362%28v=office.15%29.aspx
    https://support.office.com/en-GB/article/Introduction-to-the-SharePoint-Online-Management-Shell-c16941c3-19b4-4710-8056-34c034493429
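    A minimal sketch of such a script (assumes the SharePoint Online Management Shell is installed; the admin URL, CSV columns, quota and template are placeholders):
    # departments.csv is assumed to have Title,Url,Owner columns
    Connect-SPOService -Url https://contoso-admin.sharepoint.com
    Import-Csv .\departments.csv | ForEach-Object {
        New-SPOSite -Url $_.Url -Title $_.Title -Owner $_.Owner -StorageQuota 1024 -Template "STS#0"
    }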
    Best Regards,
    Brij K

  • SQL Server installation paths best practices

    In my company we're planning to set up a new (consolidated) SQL Server 2012 server (on Windows 2012 R2, VMware). The current situation is that there is a SQL Server 2000 instance, a few SQL Server 2008 Express instances and a lot of Access databases. For the installation I'm wondering what the best selections for the various installation paths are. Our infra colleagues (offshore) have the following standard partition setup for SQL Server servers:
    C:\ OS
    E:\ Application
    L:\ Logs
    S:\ DB
    T:\ TEMPDB
    And during the installation I have to make a choice for the following
    Shared feature directory: x:\Program Files\Microsoft SQL Server\
    Shared feature directory (x86): x:\Program Files\Microsoft SQL Server\
    Instance root directory (SQL Server, Analysis Services, Reporting Services): x:\Program Files\Microsoft SQL Server\
    Database Engine Configuration Data Directories:
    Data root directory: x:\Program Files\Microsoft SQL Server\
    User database directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    User database log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Backup directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Analysis Services Configuration Data Directories:
    User database directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    User database log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Backup directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Distributed Replay Client:
    Working Directory: x:\Program Files (x86)\Microsoft SQL Server\DReplayClient\WorkingDir\
    Result Directory: x:\Program Files (86)\Microsoft SQL Server\DReplayClient\ResultDir\
    So I'd like some assistance on filling in the x drive letters. I understand it's best practice to separate the data files and the log files. But should that also be the case for tempdb? And should both the database and tempdb log files go to the same log partition then? What about the backup directories? Any input is very much appreciated!
    Btw, I followed the http://www.sqlservercentral.com/blogs/basits-sql-server-tips/2012/06/23/sql-server-2012-installation-guide/ guide for the installation (test server for now).

    You can place all the installation libraries (binaries) on the E:\ drive.
    >>So I'd like some assistance on filling in the x drive letters. I understand it's best practice to separate the data files and the log files. But should that also be the case for tempdb? And should both the database and tempdb log files go to the same log partition then? What about the backup directories? Any input is very much appreciated!
    You can place the tempdb data files on the T:\ drive, and I prefer to place the tempdb log and user database log files on the same drive, i.e. the L:\ drive.
    >>Backup directories
    If you are not using any third-party tool, then I would prefer to create a separate drive for backups.
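    Putting that together, one possible mapping for the x's above (a sketch based on your partition layout, not a rule):
    E:\  - shared features and instance root (binaries)
    S:\  - user database data files
    L:\  - user database log files and tempdb log file
    T:\  - tempdb data files
    separate drive or share - backup directory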
    Refer the below link for further reading
    http://www.brentozar.com/archive/2009/02/when-should-you-put-data-and-logs-on-the-same-drive/
    --Prashanth

  • Backup "best practices" scenarios?

    I'd be grateful for a discussion of different "best practices" regarding backups, taking into consideration that a Time Machine backup is only as good as the external disk it is on.
    My husband & I currently back up our two macs, each to its own 500 GB hard disk using TM. We have a 1TB disk and I was going to make that a periodic repository for backups for each of our macs, in case one of the 500GB disks fails, but was advised by a freelance mac tech not to do that. This was in the middle of talking about other things & now I cannot remember if he said why.
    We need the general backup, and also (perhaps a separate issue) I am particularly interested in safeguarding our iPhoto libraries. Other forms of our data can be redundantly backed up to DVDs but with 50GB iPhoto libraries that doesn't seem feasible, nor is backing up to some online storage facility. My own iP library is too large for my internal disk; putting the working library on an external disk is discouraged by the experts in the iPhoto forum (slow access, possible loss if connexion between computer & disk interrupted during transfer) so my options seem to be getting a larger internal disk or using the internal disk just for the most recent photos and keeping the entire library on an external disk, independent of incremental TM backups.

    It's probably more than you ever wanted to know about backups, but some of this might be useful:
    There are three basic types of backup applications: *Bootable Clone, Archive, and Time Machine.*
    This is a general explanation and comparison of the three types. Many variations exist, of course, and some combine features of others.
    |
    _*BOOTABLE "CLONE"*_
    |
    These make a complete, "bootable" copy of your entire system on an external disk/partition, a second internal disk/partition, or a partition of your internal disk.
    Advantages
    If your internal HD fails, you can boot and run from the clone immediately. Your Mac may run a bit slower, but it will run, and contain everything that was on your internal HD at the time the clone was made or last updated. (But of course, if something else critical fails, this won't work.)
    You can test whether it will run, just by booting-up from it (but of course you can't be positive that everything is ok without actually running everything).
    If it's on an external drive, you can easily take it off-site.
    Disadvantages
    Making an entire clone takes quite a while. Most of the cloning apps have an update feature, but even that takes a long time, as they must examine everything on your system to see what's changed and needs to be backed-up. Since this takes lots of time and CPU, it's usually not practical to do this more than once a day.
    Normally, it only contains a copy of what was on your internal HD when the clone was made or last updated.
    Some do have a feature that allows it to retain the previous copy of items that have been changed or deleted, in the fashion of an archive, but of course that has the same disadvantages as an archive.
    |
    _*TRADITIONAL "ARCHIVE" BACKUPS*_
    |
    These copy specific files and folders, or in some cases, your entire system. Usually, the first backup is a full copy of everything; subsequently, they're "incremental," copying only what's changed.
    Most of these will copy to an external disk; some can go to a network locations, some to CDs/DVDs, or even tape.
    Advantages
    They're usually fairly simple and reliable. If the increments are on separate media, they can be taken off-site easily.
    Disadvantages
    Most have to examine everything to determine what's changed and needs to be backed-up. This takes considerable time and lots of CPU. If an entire system is being backed-up, it's usually not practical to do this more than once, or perhaps twice, a day.
    Restoring an individual item means you have to find the media and/or file it's on. You may have to dig through many incremental backups to find what you're looking for.
    Restoring an entire system (or large folder) usually means you have to restore the most recent Full backup, then each of the increments, in the proper order. This can get very tedious and error-prone.
    You have to manage the backups yourself. If they're on an external disk, sooner or later it will get full, and you have to do something, like figure out what to delete. If they're on removable media, you have to store them somewhere appropriate and keep track of them. In some cases, if you lose one in the "string" (or it can't be read), you've lost most of the backup.
    |
    _*TIME MACHINE*_
    |
    Similar to an archive, TM keeps copies of everything currently on your system, plus changed/deleted items, on an external disk, Time Capsule (or USB drive connected to one), internal disk, or shared drive on another Mac on the same local network.
    Advantages
    Like many Archive apps, it first copies everything on your system, then does incremental backups of additions and changes. But TM's magic is, each backup appears to be a full one: a complete copy of everything on your system at the time of the backup.
    It uses an internal OSX log of what's changed to quickly determine what to copy, so most users can let it do its hourly incremental backups without much effect on system performance. This means you have a much better chance to recover an item that was changed or deleted in error, or corrupted.
    Recovery of individual items is quite easy, via the TM interface. You can browse your backups just as you do your current data, and see "snapshots" of the entire contents at the time of each backup. You don't have to find and mount media, or dig through many files to find what you're looking for.
    You can also recover your entire system (OSX, apps, settings, users, data, etc.) to the exact state it was in at the time of any backup, even if that's a previous version of OSX.
    TM manages its space for you, automatically. When your backup disk gets near full, TM will delete your oldest backup(s) to make room for new ones. But it will never delete its copy of anything that's still on your internal HD, or was there at the time of any remaining backup. So all that's actually deleted are copies of items that were changed or deleted long ago.
    TM examines each file it's backing-up; if it's incomplete or corrupted, TM may detect that and fail, with a message telling you what file it is. That way, you can fix it immediately, rather than days, weeks, or months later when you try to use it.
    Disadvantages
    It's not bootable. If your internal HD fails, you can't boot directly from your TM backups. You must restore them, either to your repaired/replaced internal HD or an external disk. This is a fairly simple, but of course lengthy, procedure.
    TM doesn't keep its copies of changed/deleted items forever, and you're usually not notified when it deletes them.
    It is fairly complex, and somewhat new, so may be a bit less reliable than some others.
    |
    RECOMMENDATION
    |
    For most non-professional users, TM is simple, workable, and maintenance-free. But it does have its disadvantages.
    That's why many folks use both Time Machine and a bootable clone, to have two, independent backups, with the advantages of both. If one fails, the other remains. If there's room, these can be in separate partitions of the same external drive, but it's safer to have them on separate drives, so if either app or drive fails, you still have the other one.
    |
    _*OFF-SITE BACKUPS*_
    |
    As great as external drives are, they may not protect you from fire, flood, theft, or direct lightning strike on your power lines. So it's an excellent idea to get something off-site, to your safe deposit box, workplace, relative's house, etc.
    There are many ways to do that, depending on how much data you have, how often it changes, how valuable it is, and your level of paranoia.
    One of the best strategies is to follow the above recommendation, but with a pair of portable externals, each 4 or more times the size of your data. Each has one partition the same size as your internal HD for a "bootable clone" and another with the remainder for TM.
    Use one drive for a week or so, then take it off-site and swap with the other. You do have to tell TM when you swap drives, via TM Preferences > Change Disk; and you shouldn't go more than about 10 days between swaps.
    There are other options, instead of the dual drives, or in addition to them. Your off-site backups don't necessarily have to be full backups, but can be just copies of critical information.
    If you have a MobileMe account, you can use Apple's Backup app to get relatively-small amounts of data (such as Address book, preferences, settings, etc.) off to iDisk daily. If not, you can use a 3rd-party service such as Mozy or Carbonite.
    You can also copy data to CDs or DVDs and take them off-site. Re-copy them every year or two, as their longevity is questionable.
    Backup strategies are not a "One Size Fits All" sort of thing. What's best varies by situation and preference.
    Just as an example, I keep full Time Machine backups; plus a CarbonCopyCloner clone (updated daily, while I'm snoozing) locally; plus small daily Backups to iDisk; plus some other things on CDs/DVDs in my safe deposit box. Probably overkill, but as many of us have learned over the years, backups are one area where +Paranoia is Prudent!+
