Best practice to back up our most important asset: iPhoto!

Hi everyone!
I consider my 30k photos, which I've carefully tagged, marked, modified, red-eye-corrected, etc., to be my most important asset. I've spent days and days in iPhoto to eventually build this 50 GB library.
I bought my 500 GB Time Capsule a few months ago and, guess what, it's already completely full!
Calculating modifications, deletions and updates of my library now takes forever. I understand from this board that Time Machine/Time Capsule backs up the 50 GB library as a whole instead of only the changes... what a nightmare in this otherwise almost ideal Apple world!
Any recommendations to share? Any free apps that could do this work for me?
So far my personal brainstorming has come down to this process:
1) download "Time Machine Editor"
2) schedule backups to run daily or weekly (a command-line sketch of this idea follows below)
3) be patient
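One hedged way to approximate step 2 without extra software is to turn off the automatic hourly runs and trigger one backup per day from a scheduled job instead; tmutil ships with OS X 10.7 and later, and the cron line below is only an illustration (launchd would be the more typical mechanism on a Mac):
sudo tmutil disable                        # stop the automatic hourly backups
# then run one backup per day from a scheduled job, e.g. a crontab entry such as:
# 0 20 * * * /usr/bin/tmutil startbackup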
Thanks for sharing your best practices!
dofre

I would get a program such as SuperDuper (http://www.versiontracker.com/dyn/moreinfo/macosx/22126) to CLONE your hard drive onto the external drive (as an image, if you don't have a free partition), and then, when your new HD comes back, clone it back.
SuperDuper is well worth the $20 shareware fee.
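If you only need the iPhoto library mirrored rather than a full bootable clone, the same idea can be sketched with rsync; the paths below are examples only, and the exact flags for preserving Mac metadata depend on your rsync build:
SRC="$HOME/Pictures/iPhoto Library"
DEST="/Volumes/Backup/iPhoto Library"      # hypothetical external volume
# -a preserves timestamps and permissions; --delete keeps the copy an exact mirror
rsync -a --delete "$SRC/" "$DEST/"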

Similar Messages

  • What is the best way to back up my photos in iPhoto?

    What is the best way to back up my photos in iPhoto?
    I have used an external hard disk which doesn't work very well.
    Can anybody explain to me how I can use Time Machine? Or is there another simple way?
    Thanks,
    Marja Meijer

    Tri-Backup 6 to sync volumes and folders (and they have a Time Machine utility)
    http://www.tri-edre.com/english/tribackup.html
    Back-In-Time 2: unleash the power of Time Machine.
    Restore Time Machine data: Time Machine is a great basic backup tool, but the options for locating and restoring data are quite limited. Back-In-Time gives you total flexibility in discovering and recovering your data to any location on your Mac.
    A file-history backup archive, whether Time Machine or something else, is fine. I agree with using CCC as the primary backup, but I also like to have something else as well, or a combination: one backup for the system, one for the data, and multiple sets. Even then I have needed to go back and find an old version of a file, and I was glad I had an online backup in addition to a once-a-week clone, which would not have helped me.
    Time Machine 101
    https://support.apple.com/kb/HT1427
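    If you go the "one backup for the system, one for the data" route, Time Machine's command-line tool can keep a bulky library out of the system set; a hedged sketch, requiring OS X 10.7 or later, with an example path:
    tmutil addexclusion "$HOME/Pictures/iPhoto Library"   # keep the library out of Time Machine runs
    tmutil latestbackup                                   # confirm when the last snapshot completed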

  • Best practice for backing up and restoring forms

    Greetings, I would like to pose a question to the forum and understand how many of you, if any, "back up" your interactive forms so that, in the event one or multiple become corrupt, you have a method to recover them.
    We recently experienced such a scenario, in which the forms we developed in ABAP and access through our portal had become corrupt. When we attempted to access a form via the portal, we would get a SOAP error. SOAP errors, I understand, can happen for various reasons, but prior to the incident our forms were working just fine. We attempted to retrace our steps to identify the cause of the problem but found we could not replicate the issue. Through analysis of the forms, we identified the corruption in the master page and found that if we copied the sections of the form that were not corrupt to a new master page, the form would work properly again. Our thought is that this cannot be the only method to recover from an incident like this, and we would like to know if others have experienced it or have practices in place that would minimize the impact.
    We asked SAP support whether or not there was a method to back up Interactive Forms, and the simple answer we received was to download the XML file from transaction SFP. Can others relate to this strategy as a proper "backup" method, or do other, more suitable best practices exist?
    Thank you in advance!

    I had many difficulties with this kind of error about two years ago. Of course it got better with every patch level, and with LCD 8, LCD 8.1, etc. I don't have any problems with the newest solution, but I remember the feeling. I once had to throw away three days of work because of these "errors". But the solution is easy:
    - use versioning like you do in your ABAP development; that works fine
    - if you would like to have an extra backup, copy the form into a backup copy with a name like Z_YOURNAME_BCK_1
    - if you still feel that is not enough, you can always set your forms to be dynamic, which makes the ADS web service change the way the forms are constructed (they then carry an internal structure; you can check this yourself: if you open a dynamic form in LCD outside SAP it works like a charm, while a plain print form asks for an import which has nothing in common with the earlier template). You can then back up every form you generate with these settings. If a problem appears, you can always open the backup outside SAP in LCD, copy the hierarchy and paste it into your SAP window with LCD development.
    Hope that helps.
    Regards Otto
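    On top of that, if you also download the form templates as XML from transaction SFP (as SAP support suggested above), a dated archive of those exports is a cheap extra safety net; the folder name below is only an example:
    STAMP=$(date +%Y%m%d)
    tar czf "forms_backup_$STAMP.tgz" forms_export/   # bundle the exported XML templates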

  • Best practice for backing bean population? (also, ActionListener RANT)

    Hello,
    I am about 3/4 of the way through development of a small to medium size JSF application. Sometimes I really like JSF, but much of the time I am left puzzled or frustrated for hours trying to find workarounds to JSF's bugs/glitches and design flaws.
    For example, early on, I was impressed with how easy it was to invoke a method from a page using an ActionListener. Now that I'm actually building things with JSF, the ActionListener functionality still seems cool, but incredibly half-baked. I find myself using request parameters LIKE CRAZY to work around the fact that JSF doesn't support passing parameters directly to backing bean methods. This feels awkward and wrong, considering that JSF is intended to abstract the HTTP underpinnings. To add insult to injury, I often have to iterate through ALL of the request parameters looking for one whose id ends with my desired property name (since JSF prepends its own crap to the beginning). I don't like doing things in a hacky way. This seems very hacky, and I feel dirty doing it.
    So, my first question is: what is the best practice for populating backing beans? How do others accomplish this? I can think of several other approaches, but none feel less hacky.
    Second, are there plans in the next spec (please say there are) to allow parameters to be passed to backing bean methods? If not, WHY THE HECK NOT?
    Even though JSF expert group people have been conspicuously absent from this forum of late, I'd really appreciate responses from you as well.
    Thank you for your thoughts.

    Hi BrownBear,
    I've been using JSF for about 6 months now and I'd be glad to help as much as I can.
    Concerning parameters, I'm not sure what your issue is, but I use the f:param tag to pass them. If you could post an example of what you are trying to do, I could see exactly what your issue is. Maybe f:param can't help you.
    As for best practice for populating backing beans, I personally try to let JSF do as much as possible. For example, if I have a backing bean with five properties, I make sure that they are all on the JSP page the bean serves. If one of the properties is just there as an ID, let's say a Person ID (DB row key), then I put it on my JSP page as a hidden input field. I do the same with properties that are only for display, if I want them back in my bean when the request comes back.
    Hope this helps somehow. Please feel free to ask specific questions related to your specific problem; I will monitor this post and transfer to you the little JSF experience I have.
    I'm pretty happy with JSF as it is, but it sure needs improvements. :) What the heck, it's version 1.01 after all, and the next release should be a great one with the integration of JSTL.
    Cheers

  • Best Practices for Backing Up Large (10+ TB) Servers?

    As we migrate to OS X Lion Server, I need to revisit backup scenarios. I'm interested in researching best practices, which may include Time Machine for incrementals but also need to include some sort of off-site option (such as tape that is then stored somewhere else).
    (In Snow Leopard, we're having real trouble with BRU (unintelligible) and Roxio's Retrospect for our DLT tape backups.)
    I would think this would be a great discussion for many to have.

    Mark as answered!

  • The best practice when backing up your files

    Hi,
    I recently started using Carbon Copy Cloner after using only Time Machine as my backup solution. I understand the purpose of each application, TM and a cloning utility such as SuperDuper or CCC, but I was wondering what the best process is when using these two methods to back up your files.
    For instance, I use TM to back up my files as frequently as possible to keep all my recent changes updated, but I don't see how I would keep my clone updated and make sure that when something happens I will have a workable boot disk and not something that contains corrupted files. In other words, these cloning utilities have some sort of feature to keep your clone drive updated every time something changes, but that got me wondering: what if you update your clone drive and one of the updated files was corrupted, without you knowing it of course? Now you have a backup that contains bad files which may affect your system (corrupted font files, etc.), and when you realize that something is not working right in your system, you may want to recover from your clone, but you will basically end up with the same problem because the bad files were also backed up.
    How do you ensure that your clone will always be ready and that it will not contain bad files?
    What is your backup process?  Be so kind and share your method.
    Again, I'm OK with TM; I just need to know how you guys are managing your clone drives using either SuperDuper or Carbon Copy Cloner.
    Thanks a lot!

    I use CCC exclusively and update my clones every couple of days or after installing updates. I have no use for TM, since I've never had to go back and get something I deleted or changed. I do, however, boot into those clones on a routine basis to ensure that they look and act like the originals. This has worked for over seven years, but YMMV.
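    Between those boot tests, a couple of hedged command-line spot checks can catch obvious problems early; the volume names below are examples only:
    bless --info /Volumes/Clone                                                  # confirm the clone still has a blessed (bootable) system folder
    rsync -ani --delete "/Volumes/Macintosh HD/Users/" /Volumes/Clone/Users/     # dry run: list anything the clone no longer matches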

  • What is current best practice for backing up a hard disc?

    With the availability of various cloud applications, I would like to know whether backing up a hard disc is now best done to the cloud. For example, should you buy space on Amazon.com, or somewhere else on the Internet, and then back up your hard disc to that space? Could you back up an entire hard disc to iCloud? Your knowledge and experience are greatly appreciated. Thank you.

    iCloud doesn't provide storage except for pictures and your music, and only under limited circumstances. Merchant online backup services such as Carbonite and others exist for this reason. And you can sync your files with/to Dropbox. Many users keep a local copy of their files on Time Machine or other hard drives in addition to remote backups (in case of a fire or other event compromising the local solution). Backup software or cloning programs are useful in all of the above. You have to work at backup a bit - it's like insurance: a bit of expense now to save a lot of heartache later. Good to see you're determined to forestall potential disaster - too many people aren't.
    Good luck, Tom

  • Best practice for backing up Keywords in Elements 8?

    I have had multiple issues with losing my keywords in Elements 8 (a new computer, then later a new hard drive). Every time, I've lost all my keywords. What's the best way to avoid that in the future?

    Jellis96 wrote:
    I have had multiple issues with losing my keywords in Elements 8 (a new computer, then later a new hard drive). Every time, I've lost all my keywords. What's the best way to avoid that in the future?
    The best way is to do regular backups from the Organizer. The whole catalog is saved, as well as all your media files. This is also the best way when you want to move your pictures to a new drive.

  • Best practice for backing up iPhone??

    I have authorized two computers in iTunes. I would like to know if it is a good idea to back up the iPhone on more than one computer. If so, will a normal sync create a backup on the second computer? For those of you who are already doing this, please let me know of any issues.

    Authorizing iTunes on both computers doesn't mean you can sync all data to both...
    Since the iPhone won't sync the "same" data with two machines, in order to sync data you must make sure one machine syncs music etc. and the other syncs things like calendar and email... With that you should be able to get two backups, but again, to me that's not needed at all...
    In theory the backup should happen on both, although honestly it's not really necessary at all in my opinion...

  • What are the best practices for backing up SAP B1?

    Right now, we are using a nightly tape back up. Is this sufficient? Or are there other, better options?
    Thank you!
    John Sefton

    It is sufficient, as you have the default SBO-Backup utility, which can be automated.
    In addition, you can even take a manual backup from the back end.
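    For the manual back-end backup, one hedged sketch, assuming SAP B1 is running on Microsoft SQL Server and the sqlcmd client is installed (server name, database name and path are placeholders):
    sqlcmd -S localhost -Q "BACKUP DATABASE [SBO_COMPANYDB] TO DISK = N'D:\Backups\SBO_COMPANYDB.bak'"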

  • What are best practices for backup drives when they become full?

    I've got an external hard drive that is now full. What do I do? Delete the oldest files on it, right? If so, do I delete the full backup and ALL the incremental backups since then? Do I have to do this manually now; isn't there a way to automate this?

    The "usual" procedure is to move the oldest incremental backups from the hard drive to permanent storage media such as DVD.
    However, if your backup drive is the same size as your main drive, then you could do incremental backups that delete files on the backup drive that have been deleted from the source drive. This prevents the drive from filling up with old and out-of-date files. However, if you need to maintain archives of even the incremental backups, then move the oldest to permanent storage media and delete them from the backup drive to free space.
    Visit The XLab FAQs and read the FAQ on backup and restore for more information.
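    For the "incremental backups that also drop deleted files" approach, here is one hedged command-line sketch using dated folders and hard links so unchanged files take no extra space; the volume name and source path are examples only:
    TODAY=$(date +%Y-%m-%d)
    rsync -a --delete --link-dest=/Volumes/Backup/latest "$HOME/" "/Volumes/Backup/$TODAY/"
    ln -sfn "/Volumes/Backup/$TODAY" /Volumes/Backup/latest   # point "latest" at the newest set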

  • OVM Repository and VM Guest Backups - Best Practice?

    Hey all,
    Does anybody out there have any tips/best practices on backing up the OVM repository as well as (of course) the VMs? We are using NFS exclusively and have the ability to take snapshots at the storage level.
    Some of the main points we'd like to do ( without using a backup agent within each VM ):
    backup/recovery of the entire VM Guest
    single file restore of a file within a VM Guest
    backup/recovery of the entire repository.
    The single-file restore is probably the most difficult/manual part. The rest can be done manually from the .snapshot directories, but when we're talking about having hundreds and hundreds of guests within OVM... this isn't overly appealing to me.
    OVM has this lovely manner of naming its underlying VM directories after some ambiguous number which has nothing to do with the name of the VM (I've been told this is changing in an upcoming release).
    Brent

    Please find below the response from Oracle support on that.
    In short:
    - First, "manual" copies of files into the repository are neither recommended nor supported.
    - Second, we have to go back and forth through templates and an HTTP (or FTP) server.
    Note that when creating a template or creating a new VM from a template, we're talking about full copies. No "fast clones" (snapshots) are involved.
    This is ridiculous.
    How to back up a VM:
    1) Create a template from the OVM Manager console.
    Note: Creating a template requires the VM to be stopped (this is required because copying the virtual disk while the VM is running would corrupt data), and the process of creating the template makes changes to the vm.cfg.
    2) Enable storage repository backups using the steps here:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-storage-repo-config.html#vmusg-repo-backup
    3) Mount the NFS export created above on another server.
    4) Then create a compressed file (tgz) from the relevant files (cfg + img) on the repository NFS mount.
    Here is an example listing of such a template archive:
    $ tar tf OVM_EL5U2_X86_64_PVHVM_4GB.tgz
    OVM_EL5U2_X86_64_PVHVM_4GB/
    OVM_EL5U2_X86_64_PVHVM_4GB/vm.cfg
    OVM_EL5U2_X86_64_PVHVM_4GB/System.img
    OVM_EL5U2_X86_64_PVHVM_4GB/README
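    For reference, a hedged sketch of how such an archive might be created in step 4; the mount point, guest name and file locations are placeholders, and the real layout depends on your repository:
    cd /mnt/repo_backup
    tar czf /backups/MYGUEST.tgz MYGUEST/vm.cfg MYGUEST/System.img   # bundle the guest's cfg + disk image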
    How to restore a VM:
    1) Upload the compressed file (tgz) to an HTTP, HTTPS or FTP server.
    2) Import it into OVM Manager using the following instructions:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-repo.html#vmusg-repo-template-import
    3) Clone the virtual machine from the template imported above using the following instructions:
    http://docs.oracle.com/cd/E27300_01/E27309/html/vmusg-vm-clone.html#vmusg-vm-clone-image

  • Kernel: PANIC! -- best practice for backup and recovery when modifying system?

    I installed NVIDIA drivers on my OL6.6 system at home and something went bad with one of the libraries. On reboot, the kernel would panic and I couldn't get back into the system to fix anything. I ended up re-installing the OS to recover my system.
    What would be some best practices for backing up the system when making a change and then recovering if this happens again?
    Would LVM snapshots be a good option? Can I recover a snapshot from a rescue boot?
    EX: File system snapshots with LVM | Ars Technica -- scroll down to the section discussing LVM.
    Any pointers to documentation would be welcome as well. I'm just not sure what to do to revert the kernel or the system when an install goes bad like this.
    Thanks for your attention.

    There is often a common misconception: a snapshot is not a backup. A snapshot and the original it was taken from initially share the same data blocks. An LVM snapshot is a general-purpose solution which can be used, for example, to quickly create a snapshot prior to a system upgrade; then, if you are satisfied with the result, you delete the snapshot.
    The advantage of a snapshot is that it can be used on a live filesystem or volume while changes are written to the snapshot volume. Hence it's called "copy on write" (COW), or copy on change if you want. This is necessary for system integrity, so that you have a consistent state of all data at a certain point in time while still allowing changes to happen, for example to perform a filesystem backup. A snapshot is no substitute for disaster recovery in case you lose your storage media. A snapshot takes only seconds and initially does not copy or back up any data unless data changes. It is therefore important to delete the snapshot when it is no longer required, in order to prevent duplication of data and restore filesystem performance.
    LVM was never a great thing under Linux and can cause serious I/O performance bottlenecks. If snapshot or COW technology suits your purpose, I suggest you look into Btrfs, which is a modern filesystem built into the latest Oracle UEK kernel. Btrfs employs the idea of subvolumes and is much more efficient than LVM because it can operate on files or directories, while LVM works on the whole logical volume.
    Keep in mind, however, that you cannot use LVM or Btrfs for the boot partition, because the GRUB boot loader, which loads the Linux kernel, cannot deal with LVM or Btrfs before loading the Linux kernel (catch-22).
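    A minimal sketch of the "snapshot before a risky change" idea with LVM, assuming a volume group named vg0 with a logical volume named root (names and sizes are examples only):
    lvcreate --size 5G --snapshot --name root_pre_change /dev/vg0/root
    # ... install the driver / run the upgrade ...
    lvremove /dev/vg0/root_pre_change            # happy with the result: drop the snapshot
    lvconvert --merge /dev/vg0/root_pre_change   # or, instead, roll the change back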
    I think the following is an interesting and fun to read introduction explaining basic concepts:
    http://events.linuxfoundation.org/sites/events/files/slides/Btrfs_1.pdf

  • GUI Design Best Practices

    Still learning Java and OOP. The question is, what is the best-practice method for designing the user interface? For instance, I have a main_GUI that has all the basic stuff, and then I need to have 5 or so different displays. A couple of them need to be modal, so I'm creating those as JDialogs. Is this correct, or should you not use those?
    Actually, I want all of the additional screens to be modal, and I don't see another way of creating a normal JFrame that is modal, which is why I'm asking before creating every screen as a JDialog.
    Also, I have one screen that could have 3 different functions. Is it acceptable to create the screen with all of the components and just hide the ones that I don't need at any given time, or should I actually create 3 different screens that all look roughly the same?
    Thanks for any input/advice.

    ShosMeister wrote:
    "So what's the difference, or more importantly, what's wrong with using JFrames?"
    If you are creating a stand-alone, non-web app, then you will of course create a JFrame and place your app in it, but I'm suggesting that the app not extend JFrame; rather, you simply create a JFrame when you need it, place a JPanel in the JFrame's contentPane, pack it and display it. The reasons for not extending JFrame are multiple but mostly boil down to a general preference for avoiding extending classes with inheritance and using composition instead. There are many blogs dedicated to discussing this paradigm which can be found with Google, and here are two decent articles from the first page of my search:
    [JavaWorld: Inheritance versus composition: Which one should you choose?|http://www.javaworld.com/javaworld/jw-11-1998/jw-11-techniques.html]
    [Object Composition vs. Inheritance|http://brighton.ncsa.uiuc.edu/~prajlich/T/node14.html]
    One error caused by extending a Swing class via inheritance that I saw in a recent thread in the Swing forum involved a class that extended JLabel and held x and y int variables. The class had setX(int i), setY(int i), getX() and getY() methods and thereby unknowingly overrode JComponent's own similar methods, completely messing up the JLabel's ability to be positioned correctly.
    "Are you saying I should only have one 'window' and swap the data constantly in that window?"
    Nope. I am saying that you should emulate other windowed programs that you use. Most use a combination of panel swapping, modal and non-modal dialogs... whatever works best for the situation. But most importantly, you should write your code so that it is easy to change from one approach to the other with a minimal change in your code. Aim for flexibility of use.
    "Would that require that all the JPanels were the same size to be able to display correctly?"
    If you swapped with a CardLayout and created your JPanels to be flexible with sizing, the CardLayout would take care of this mostly for you.
    "So if I create a main JPanel, set it up with all of the components that I want, then when I run the program and main() is called, it would create the JFrame and drop the JPanel into it? Not sure I've seen any examples of that, so I'll have to look that one up."
    Yes, you'd drop the JPanel into the JFrame's contentPane. There are plenty of examples here, but it may take some digging to find them.
    "Unless of course I've completely misunderstood you, which is possible since, as I've mentioned, I'm just learning Java."
    I think you are understanding what we suggest here. You are asking the right questions, so I predict that you will learn Java quickly.
    "Thanks!!!!"
    Welcome!

  • MDM MDIS best practices!!!

    Hello Experts,
    I want to know what the best practice for MDIS is when MDM connects to / imports data from two different ECC instances.
    Below is the issue we were facing in our project:
    MDM imports data from two ECC instances, and we maintained two inbound ports: 1. ECCCH and 2. ECCCW. When MDM receives bulk files from ECCCW (1,000 files, each containing the details of one material record), MDIS scans and processes only the CW port files, and when MDM receives files from ECCCH at the same time, MDIS won't process them because the MDIS resource is already busy scanning and processing the CW port.
    We monitor the MDM Ready folder through a BMC tool; when files are stuck in the Ready folder for a longer period (30 minutes), the tool generates a high-severity incident, so because of the above issue we are now getting more high-severity incidents.
    We also referred to the SAP MDIS guide, which likewise says MDIS processes one port at a time:
    "If there are multiple files waiting in a port, MDIS imports the files in a
    first in, first out order, meaning the oldest file in the port is imported first,
    then the next oldest, and so on. MDIS imports all of the files that were
    waiting in the port before it imports files from any other port",
    Is there any workaround for this issue?
    Regards
    Ajay

    Dear Pramod,
    Please go through these links.
    My best practices are:
    1. A step-by-step approach
    2. Data governance
    [Top 10 CDI-MDM Best Practices|http://www.dmreview.com/specialreports/20061019/1064839-1.html]
    [Seven master data management best practices|http://searchsap.techtarget.com/news/article/0,289142,sid21_gci1219185_tax305408,00.html]
    [Technical Best Practices for Master Data Management|http://www.tdwi.org/publications/display.aspx?id=8148]
    Hope this helps,
    + An
