LR5 multi-platform backup strategy

> PC (2010):
SSD: Intel SSD 120 GB (installed: Windows 7, CS6 Master Collection, Lightroom 5)
HDD1: 1TB (Lightroom gallery 1)
HDD2: 1TB (Lightroom gallery 2)
HDD3: 3TB – Backup 1 (copy of SSD + HDD1 + HDD2 + OS X)
> rMBP (2013): i5 2.6 GHz/16GB/512GB (Lightroom temporary gallery)
OS X (200GB): LR5; Windows 7 (300GB): CS6, LR5
> External Western Digital HDDs (current backup strategy):
WD1 2TB – Backup 2 (copy of Backup 1)
WD2 1TB – Backup 3 (copy of Lightroom gallery 1, 2)
Hi guys, I recently purchased Lightroom 5 and would appreciate some advice on setting up a backup strategy before syncing all my images with it. Currently, the PC above is my primary machine (HDD1 and HDD2 hold the image galleries; HDD3 is the primary backup). I also have a Mac where both OS X and Windows 7 (Boot Camp) have LR5 installed. Images stored on the laptop are temporary while I am out and about, and eventually get moved back to HDD1/HDD2 (with backups on HDD3 and WD1). I also have a spare 1TB external HDD that I would like to use exclusively for Lightroom files.
My queries are as follows:
1) What's the best way to move my laptop's (OS X and Boot Camp) temporary LR5 files to my primary PC (HDD1, HDD2)? Can I do this easily over the home network?
2) What's the best way to sync all Lightroom files (from HDD1 and HDD2) with the 1TB external HDD? This is so that I have a copy of my images on the go and can sync the laptop to the external drive on longer trips.
3) As my images (along with other data) get backed up to HDD3 automatically, should I exclude this drive from syncing with LR5?
4) Is there any backup software I can use to simplify this backup process?
Cheers!

This is an interesting topic! How do you solve it?
A user-friendly and multi-platform-friendly
backup strategy for LR and PS would probably interest everybody.
(I am looking for help with about five main LR install and backup configurations ...)
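For questions 1 and 2, the usual tools are rsync (OS X) or robocopy (Windows). Purely as an illustration of the idea, here is a minimal Python sketch of an incremental mirror that copies only new or changed files; the function name and paths are hypothetical, not part of Lightroom or any particular tool:

```python
import os
import shutil

def mirror(src, dst):
    """Copy files from src to dst, skipping files whose size and
    modification time already match (a crude incremental sync)."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            st = os.stat(s)
            if os.path.exists(d):
                dt = os.stat(d)
                if dt.st_size == st.st_size and int(dt.st_mtime) == int(st.st_mtime):
                    continue  # unchanged since last run, skip it
            shutil.copy2(s, d)  # copy2 preserves the modification time
            copied.append(os.path.join(rel, name))
    return copied
```

You would call it with something like `mirror("/Volumes/LaptopLR", "/Volumes/HDD1/Gallery1")` (hypothetical paths). Unchanged files are skipped, so repeated runs are cheap, which is the same property that makes rsync/robocopy a good fit for the external-drive sync in question 2.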

Similar Messages

  • Cut-to-case backup strategy

    I am setting up a backup strategy concept for MacBook Pros that are planned for use in our environment.
    Until now I have had no experience with Mac clients at all.
    We will do it stepwise. At the beginning we will use Mavericks' on-board backup tool, Time Machine.
    As soon as we are a bit comfortable with the platform, disk cloning will possibly be included in the process,
    but only if Time Machine is not sufficient for our needs.
    Finally, if one day it turns out that none of this suits our needs, we will switch to a third-party backup solution.
    However, I can already see some signals that Time Machine will not
    be suitable due to functional limitations, so a bit more information is needed for a final decision.
    We plan to use a NAS attached to the router as the backup destination storage.
    The NAS mass storage will be formatted as NTFS. The Mac client will access the NAS for
    backup purposes on a regular cycle but not continuously, say once a week - this corresponds to our current
    backup strategy and fulfills our needs completely.
    As far as I understand, the TM drive (backup target) removes backed-up items in the following situations:
    - As soon as the remaining storage space falls below some limit, the oldest increments
      get removed; the number of increments removed depends on how quickly the minimum free-space
      requirement is met.
    - There is no guarantee a file is still backed up on the TM drive once the drive is connected to the Mac client
      and the original item no longer exists on the client's (source) drive.
    A. Is the above understanding correct? Should any other situation be added to the list?
    B. Local snapshots: Is this the foundation on which the Mac OS Trash is built?
    C. Or is the Mac OS Trash a function and implementation independent of TM local snapshots?
    So a NAS is planned to store the backup files. Additionally, a USB 3.0 drive is planned to back up the backup NAS.
    In an emergency where Mac OS fails, the following procedure would need to be carried out:
    Step 1: By other means, copy the backup files from the NTFS-formatted NAS storage to an
    external drive - USB or Thunderbolt - formatted with a Mac OS-compatible file system.
    Step 2: Connect the external drive with the TM backups prepared as above locally to the Mac client.
    Step 3: Carry out recovery using Recovery HD.
    Is the above understanding correct?

    Yes, you are right.
    In the meantime I have read more about TM.
    As the well-known article claims, it is indeed a complicated
    and mysterious machine at first.
    I guess for the same reason one needs more than one read-through
    to get an overview.
    Now I am aware that TM is proprietary and rather a closed-platform system.
    Its components are not standardized. If any degree of flexibility is shown,
    it is thanks to the capabilities of that proprietary system.
    I see some functional limitations in TM that immediately disqualify this
    solution in our environment. We will need to take a third-party solution.
    What a pity; the costs continue to explode.
    A cloud-based solution will not be acceptable due to the connection rates
    in my region. It is also quite common here to be left without a broadband connection
    for several weeks after changing flats.
    There are also my concerns regarding data security these days.

  • Long-term retention backup strategy for Oracle 10g

    Hello,
    I need to design an archivelog, level 0, and level 1 long-term retention backup strategy for Oracle 10g. It must be based on RMAN with tapes.
    Could somebody tell me the best option for configuring long-term retention backups in Oracle? Is "CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 1825 DAYS;" possible?
    Regards and thanks in advance

    Hello;
    RECOVERY WINDOW takes an integer, so yes, it's possible. Does it make sense?
    Later: my bad, that (the gavinsoorma link) is Oracle 11 only. It's getting harder and harder to think about 10. I don't think I would use RECOVERY WINDOW.
    I would let the tape backup system handle the retention of your RMAN backups.
    Using the CROSSCHECK and DELETE EXPIRED commands, you can keep the catalog current with the backups available on tape.
    Oracle 10
    Keeping a Long-Term Backup: Example
    http://docs.oracle.com/cd/B19306_01/backup.102/b14191/rcmbackp.htm#i1006840
    http://web.njit.edu/info/oracle/DOC/backup.102/b14191/advmaint005.htm
    Oracle 11
    If you want to keep 5 years' worth of backups, I might just use KEEP with a specific date via the UNTIL TIME clause:
    keep until time 'sysdate+1825'
    More info:
    RMAN KEEP FOREVER, KEEP UNTIL TIME and FORCE commands
    http://gavinsoorma.com/2010/04/rman-keep-forever-keep-until-time-and-force-commands/
    Best Regards
    mseberg
    Edited by: mseberg on Dec 4, 2012 7:20 AM
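    As a sanity check on the 1825-day figure (plain date arithmetic, not RMAN syntax): 1825 = 5 × 365, so it approximates five years, drifting one calendar day for each leap day that falls inside the window:

```python
from datetime import date, timedelta

def keep_until(start, years=5):
    """Approximate 'sysdate+1825' as used in the KEEP UNTIL TIME
    clause: 1825 = 5 * 365 days, so leap days are not counted."""
    return start + timedelta(days=years * 365)

# One leap day (29 Feb 2016) falls inside this window, so the result
# lands one calendar day short of five full years.
print(keep_until(date(2012, 12, 4)))  # -> 2017-12-03
```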

  • Need Advice on Backup strategy

    Hi,
    Database version: 10.2.0.
    Database size: 60 GB.
    Daily updates: approximately 2 GB.
    I need some advice on my backup strategy.
    My backup strategy:
    Monthly, on date 1: full backup (incremental level 0).
    Daily incremental level 2 backups from dates 2 to 6.
    Weekly incremental level 1 backup on date 7.
    Daily incremental level 2 backups from dates 8 to 13.
    Weekly incremental level 1 backup on date 14.
    Daily incremental level 2 backups from dates 15 to 21.
    Weekly incremental level 1 backup on date 22.
    Daily incremental level 2 backups from dates 23 to 28.
    Weekly incremental level 1 backup on date 29.
    My constraint is that I have only a 70 GB disk to store the backups, which can hold only the full backup. So what I am doing is storing the full backup on disk, transferring the incremental level 1 (weekly) backup to tape, and deleting the daily level 2 backups after taking that weekly backup.
    Is my strategy correct, or should I do anything differently? Please give me some advice.
    TIA,

    This backup strategy doesn't look bad. However, the 70 GB of free space will be a pain in the neck; I suggest you increase storage. Have you considered archived logs too? This average 2 GB daily transaction rate will increase storage requirements sooner or later, and hence backup storage requirements.
    ~ Madrid
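    The calendar in the question can be written down as a small helper, e.g. for driving the nightly job; the function is just a sketch of the posted schedule (note the level 1 days in the post are 7, 14, 22, 29, which is slightly irregular), not any RMAN feature:

```python
def backup_level(day_of_month):
    """Incremental backup level for a given day of the month, per the
    posted schedule: level 0 on the 1st, level 1 on the 7th/14th/22nd/29th,
    level 2 on every other day."""
    if day_of_month == 1:
        return 0           # monthly full backup (incremental level 0)
    if day_of_month in (7, 14, 22, 29):
        return 1           # weekly incremental level 1, sent to tape
    return 2               # daily incremental level 2, deleted weekly
```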

  • Need help with backup strategy

    So my 2 TB backup drive failed this morning.  I'm using a Seagate drive in an OWC external enclosure.  I plugged it in and the drive didn't pop up on the desktop, nor is it visible in Disk Utility.
    I know that all drives eventually fail, but it seems like I've had more than my fair share of problems lately.  This is of course making me think hard about my backup strategy.  Here's what I'm doing now - I'd appreciate your thoughts on how to make it even more "bulletproof".
    > TimeMachine back-up of boot drive and media drive (with photos, documents, movies, etc.)
    > External clone of boot drive and media drive (on two partitions) - this is the one that failed
    I suppose I could add a third external clone for redundancy, but an offsite backup would probably be even better.  Not quite sure what the best option is there, though.  Any ideas?

    I too love automation. However, an automated backup requires the backup drive to always be connected (and possibly powered) in order to perform the backup. Also, the computer you speak of must remain on 24/7 (by the sounds of it) in order to do a nightly backup at midnight.
    First of all, it's probably not best to leave your computer on 24/7. I won't go into all the OS reasons why, but here are two other reasons to think about:
    1) Your internal HD will always be powered, which is bad for drives.
    2) You are constantly using power to run your system. Not good for your power bill or the greater environment.
    As for the always-connected backup HD: I only connect and power on my backup drive when I go to do a backup. This leaves it disconnected in case of virus infection, and powered down/disconnected removes some of the risk of damage from power surges/spikes (yes, I use a UPS, but these can and have failed).
    So to sum up: I back up every day. After I am done working in LR for the day, I shut it down and then start it back up (ugh) so that I can back up my library with all the work I just did (I wish Adobe would do a backup upon closing!!!).
    Then I close down LR AGAIN..... Then I connect my backup HD via USB; once connected, it automatically powers up and fires up the backup software. All I do is hit start. Since this is an incremental backup, it only takes a short while. I go use the "john", grab a drink, come back, unplug the backup HD, and turn off the system for the day.

  • InfoProviders backup Strategy in SAP BW

    Hello;
        Has anyone experienced a backup issue with BW, especially when you would like to perform a risky modification? Is there any backup strategy supported by SAP for any modification scenario, whether it is a small modification or a big one affecting one or more InfoProviders? If there is any document, please share it with us.
    Regards
    Anis

    Hi,
    After the modification, if you want the original cube structure back, you will need to roll back your transports with the help of the Basis team. Cross-check the rollback procedure with them.
    Before making the changes, first get exact information from your client or users.
    If you think you will need the original cube structure, you can copy the cube structure in production and move it to development through a transport.
    When you need the same (old) cube structure again, you can move it from development to production, or use the rollback option.
    Thanks

  • How to setup multi-platform export ?

    I have a Flash Builder Burrito project created as a Mobile Flex App, and wondered if there's any quick, easy way to export it as a mobile, web, and desktop app?  One of the biggest draws to learning Flex and playing with Burrito was the supposed multi-platform capability... although I'm not seeing it in this case.
    Is it that I have to create a new Flex project, cross-merge my source code, and reapply differing user interfaces?  Or is this kind of feature not available in the Beta release?
    Apologies if this is sounding like a dumb 'fundamentals' question, but as I say, I'm new to Flex and I've not come across any answer here or Google...
    Any help appreciated. Thanks
    Stu

    Thanks, I knew about the export release build part - but as you mention, it says it will only build for Android... not web or desktop apps as well.
    (I know iOS - phone/pad - is not available until release.)
    I wanted to know if there was any way to publish to multiple platforms from my existing code at the same time now, but it's looking like there isn't because Burrito is still in beta development (?)

  • Multi-platform File Adapter

    We are currently looking at the requirements for several new processes involving the File Adapter functionality. Our production BPM environment is Red Hat Linux 4, while this particular legacy system runs on Windows 2000. The legacy system is already configured to read and write XML files in Windows file shares. Some integration processes should kick off when XML files are written to the file shares; other processes will write XML files back into the file shares. I am looking for recommendations on the best architectural approach to this multi-platform problem within BPEL PM. As I see it, we have a few options:
    1.     Utilize FTP server functionality rather than direct file access to read and write the files in a platform-independent manner.
    2.     Use some other technology to bridge between the platform-specific file directories and something less dependent on platform. For example, pick up the files from the Windows directory and write them to an AQ queue. Then feed the BPEL process from the queue.
    3.     Run BPM on multiple platforms and allow the Windows instance to handle Windows file drops while the Linux instance handles Linux file drops. Obviously there is a cost penalty here as well as complexity during deployment.
    Any thoughts or experiences are welcome.
    Thank you.

    Have you looked into relative paths? That would solve the issue of different OS path names.
    Also, you might want to consider the deployment descriptors you can use when compiling; you can set environment-specific variables, like paths, in there.
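    Option 2 above could be prototyped with a simple poller that bridges the file share to a queue. In this sketch, Python's in-process queue.Queue stands in for AQ, and all names and paths are hypothetical:

```python
import os
import queue

def poll_directory(path, q, seen, suffix=".xml"):
    """One polling pass over a drop directory: enqueue every XML file
    not seen before. A repeating scheduler would call this on a timer,
    and a downstream worker would feed each queued path to the BPEL
    process (here the in-memory queue stands in for AQ)."""
    for name in sorted(os.listdir(path)):
        if name.endswith(suffix) and name not in seen:
            seen.add(name)
            q.put(os.path.join(path, name))
```

    In a real deployment the `seen` set would need to be persistent (or the files moved to an archive directory after enqueueing), and you would also want to guard against picking up files that are still being written.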

  • Best backup strategy

    Time Machine is brilliant. It's saved me many times.
    But recently, the backup drive with a year's worth of backups, died.
    I therefore lost ALL my backups.
    Not a major problem as I still have my current data and having re-formatted the Time Machine drive it's merrily backing it all up again.  (I just hope I don't need to recover to last week's backup ... as I no longer have it.)
    But until that's finished I have NO backups!  Eeek!
    So what is the best backup strategy, bearing in mind drives can fail, houses can burn down, etc.  Should I have two or three Time Machine backup discs and keep swapping them every day so if one dies I've still got a one-day-old version?
    Making DVD backups and storing them elsewhere is very time consuming, and while my data is really important to me, it defeats the object if I can't get on with any work on that data because I'm constantly burning lots of DVDs!
    Your views would be appreciated!

    I pretty much do a similar thing, but my offsite backup goes to a locked cabinet in my office (easier to get to weekly to bring home and update than using a bank - I honestly cannot remember when I last physically went to my bank; it's been years).
    TM makes a great source for restoring single files or being able to go back to an earlier version of a file.  And the clones are easier for full restoration, or even for booting from temporarily if the machines boot drive dies and you just want to get in to deauthorize it before sending in for repairs or such.  Always good to have a bootable clone around, for many reasons.
    My external clones are on firewire 800 bus powered portable drives, again for simplicity (no power cables or bricks to go with them).
    I also still burn to optical disc for specific things like filed tax returns, and other financial documents I need to keep around in the fire safe for some period of time.

  • Backup strategy

    Hi,
    short story:
    Due to the structure of the library I have both "original" and "modified" images when some have been edited. When importing one library into another without copying the files, I get some/many thumbnails twice. How do I prevent iPhoto from using the original thumbnail when a modified one exists?
    Long story:
    I'm looking for a working backup strategy for iPhoto. I'd like to collect photos over time, and when the library reaches some number of gigabytes I'd like to copy it to two external drives in a software RAID set (mirror). I know that I can merge the two libraries (one on the RAID, one on the MacBook) with some piece of software. So far so good.
    Now I'd like to have the thumbnails of the photos on the external drive on my MacBook as well. I can import them into a library with the option not to copy the images but leave them where they are (on the external drive). When I'm on the go I can access the thumbnails; when viewing them I am asked to insert the drive.
    But now comes my problem: due to the structure of the library I have both "original" and "modified" images when some have been edited. So I have some thumbnails twice. How do I prevent iPhoto from using the original thumbnail when a modified one exists?
    Thanks a lot.

    What you really need to use is Expression Media. It creates catalogs containing thumbnails of the cataloged photos, and the catalog can be used without the source files being available.
    You can add keywords and other identifiers to the photos while just using the catalog and then when you get back and have the source files available the new metadata can be applied to the actual file.
    Expression Media appears to be available for the upgrade price of $99 if you are using iPhoto as shown here:
    $99 (Full Version $199)
    For qualifying owners of:
    Licensed copy of an earlier version of Expression Media or any iView Multimedia product.
    OR
    Any licensed photographic software, including Windows Photo Gallery or iPhoto
    That info is from this site: http://www.microsoft.com/expression/products/Upgrade.aspx and by clicking on the "here" link on that page. I've emailed the developers to see if I've interpreted that page correctly.
    In the meantime you can download and run the demo. You can also catalog your iPhoto library with EM and write all metadata entered in iPhoto back to the original files. Steps 1-7 of Old Toad's Tutorial #1 describe how. That page is in revision since iView is no longer available and Step #2 is slightly different with the EM demo. It should be changed by the end of the day today.
    I use it in conjunction with iPhoto. EM is my primary DAM (digital asset management) application and I use iPhoto for projects like books, calendars, slideshows, etc.
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto (iPhoto.Library for iPhoto 5 and earlier) database file and keep it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That insures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 6 and 7 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to backup the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
    Note: There's now an Automator backup application for iPhoto 5 that will work with Tiger or Leopard.
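    The database-file backup described above is essentially a single copy operation. As a minimal sketch of what that Automator workflow does (the function name and paths here are hypothetical):

```python
import os
import shutil

def backup_iphoto_db(library_folder, dest_folder,
                     db_name="Library6.iPhoto"):
    """Copy the iPhoto database file to a backup folder, replacing
    any previous backup copy, and return the backup's path."""
    src = os.path.join(library_folder, db_name)
    dst = os.path.join(dest_folder, db_name)
    shutil.copy2(src, dst)  # overwrites an existing backup copy
    return dst
```

    Running it after each import or editing session gives the same "keep it current" protection the tip describes; iPhoto only needs to be idle, not closed, since this touches a single file.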

  • Backup strategy in FRA

    Hi Experts,
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE     11.1.0.6.0     Production
    TNS for HPUX: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    I would like to ask for some advice on placing my backup in the FRA.
    I read in a book that it is advised not to put the archivelog backup in the FRA, since if something happens to the disk that stores the FRA, everything will be gone. However, the concept of the FRA is logically a centralized backup location.
    So, based on your experience, what is the best way to utilize the FRA in a backup strategy?
    Is it safe to put everything in the FRA, or should we still put the archivelog backup in a different location?
    thanks

    The idea is that you should never have a single copy of your backup anyway, whatever the file type. It's true that losing the FRA would make the backups unavailable, but you should have pushed your backups from it to the tape drive eventually to safeguard them. So there won't be any harm in putting the backup in the FRA, as long as you multiplex the backup and keep a copy in some other location as well.
    HTH
    Aman....

  • Best practice PDW database backup strategy/plan

    Hello All,
    Our PDW infrastructure is ready and the appliance is almost set up; we are planning the implementation.
    But before that, I have to document a best-practice PDW database backup strategy/plan.
    Since the PDW environment is pretty new, please help me with the best backup strategy/plan that can
    be followed in my proposed PDW solution.
    your suggestions will be highly appreciated.
    Regards,
    Anish.S
    Asandeen

    Hi Anish.S,
    According to your description, you want to backup SQL Server Parallel Data Warehouse (PDW) database.
    Before we get to the backup and restore syntax, it’s worth noting that the Parallel Data Warehouse (PDW) appliance architecture offers an environment that greatly enhances backup times (due to dedicated storage and network interfaces, see the following post
    for more information -
    https://saldeloera.wordpress.com/2012/07/09/lesson-1-of-parallel-data-warehouse-basic-architecture-overview/).
    For more details how to backup and restore database on PDW, please refer to the following blog:
    http://www.sqlservercentral.com/blogs/useful-information-and-case-studies-covering-data-warehousing-data-modeling-and-business-intelligence/2012/10/04/parallel-data-warehouse-pdw-how-to-using-backup-and-restore-database-on-pdw/
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Backup strategy for ebs R12?

    Hi,
    I have a couple of questions regarding the backup strategy for EBS R12.
    Let's say that we have one application server and one database server running, and we would like to back up both servers periodically.
    This means we want exactly the same backup of configurations and other files for the application server and the database server, in case of any fatal crash on the servers.
    I went through some discussion threads, and it seems the common way is to use RMAN for the database data and file-system copies for the application and database servers. But I would like to know the best or recommended backup strategy for the data and server files of Oracle E-Business Suite.
    Do you guys have any suggestion for this?
    Thanks in advance.
    Cheers,
    SH

    Hi SH;
    I suggest you first check our previous discussions:
    backup and recovery startegy for EBS
    Re: Backup System
    Backup Policy for EBS can
    If you search, you can find many other threads that mention the same question.
    For your issue, basically what we do is:
    1. For our prod DB, if possible we take a weekly cold backup (Sunday) and also take a daily RMAN backup (full backup, but depending on your system you may prefer incremental).
    2. For our system we take a file-system backup weekly (OS-based).
    3. For the application we take a backup every Sunday (for some systems on a 15- or 30-day cycle). The most important point here: if your system is prod and you need to apply a patch, you must have a valid backup, and you also need a backup after patching.
    One other important point is that we run the preclone script before taking the backup of the apps and DB tiers.
    Regards
    Helios

  • What is the backup strategy for your rpool?

    Hi all,
    May I know what your backup strategy for rpool is? For example, how do you integrate it with NetBackup or any other backup software?
    Do you use zfs send to write a data stream to a file and back up that file with backup software?
    Aside from the above method, can we also use the UFS-style method?
         Use backup software to back up the entire / and /var, etc., then:
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Any more idea?

    Hi Willy,
    According to the Flash Archive limitations stated on the Oracle website, if the OS has a child zone it will not work, right?
    http://docs.oracle.com/cd/E19253-01/821-0436/6nlg6fi8u/index.html
    I am thinking of doing it the traditional way:
    use backup software (for example, Networker, TSM, or NetBackup) to back up the following:
    /rpool
    /export/
    /export/home
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Will it still work?

  • Any new backup strategy with Oracle VM 3.1 ?

    Now that Oracle VM 3.1 has been released, could we get an update on backup strategy?
    Backing up OCFS2 repositories lets us protect the OVF files, right? But what happens with the apps running on the VMs - is that data also backed up? What about open files, databases, etc.?
    I will appreciate any comment.
    Thanks in advance.

    try running this on each server:
    /usr/bin/rescan-scsi-bus.sh
