Maintaining stability within Arch/Backup Planning

Hello Forum,
I've been using Arch for around 5 months or so. I truly do love this operating system and the amount of control it offers, and to be fair, in the short time I've been using it I've only had trouble with one update, and even that was a relatively minor fix.
My issue is that I have recently assembled a new computer, and I'm debating what to run as its primary OS. On this system I plan on having a couple of virtualized servers, which I will need regular access to. My worry is that Arch may ship some major bug in a future update and I will not be able to use these servers.
I have tried going back to a more stable yet extensible OS, Fedora for example (primarily because I will be administering Red Hat boxes in my next job), but I am now completely dissatisfied with an overly bloated and opaque operating system that would require hours of streamlining to get it down to a level that feels comfortable.
I've thought of dual-booting an LTS distribution as a backup to ensure I can always get into the system, but this would still be pretty resource intensive.
If you were going to run a system that required stability, what would you do to ensure it? What are some other minimal-base but highly extendable options?
Last edited by cynicalpsycho (2014-11-01 22:37:11)

progandy wrote:
cynicalpsycho wrote:Alpine seems interesting, but it also seems to be pretty under-supported and under-documented. I'm not sure I'm ready to go on that adventure just yet. But the Xen hypervisor does seem interesting. Why would you recommend it over, say, VirtualBox?
I recommend KVM or Xen if you want to run virtual servers. For the reasons, please read about Paravirtualization (Xen and KVM) vs Emulation (Virtualbox).
Edit: VirtualBox also has some paravirtualization features; Xen and KVM seem to be more efficient.
theinternets wrote:
Native virtualization (or full virtualization) is where a type-2 hypervisor is used to partially allow access to the hardware and partially to simulate hardware in order to allow you to load a full operating system. This is used by emulation packages like VMware Server, Workstation, Virtual PC, and Virtual Server.
Paravirtualization is where the guest operating systems run on the hypervisor, allowing for higher performance and efficiency.  For more technical information and videos on this topic, visit VMware’s Technology Preview for Transparent Virtualization. Examples of paravirtualization are Microsoft Hyper-V and VMware ESX Server.
I just learned something new... Oh, and emulation is the process of simulating hardware, i.e. videogame emulation; VirtualBox would be considered native virtualization. Thanks for the tip, Progandy. That is interesting indeed!

Similar Messages

  • Best Practices: What is a best backup plan on BO 4.0

    Hi Experts!
    I have worked with BO since 2007: a lot with BO XI 3.1 and now with BI 4.0. I always have the same question about backups: what is the best backup plan on BO 4.0?
    I know many ways to do this, but as a consultant developer backup is usually not my responsibility; still, I always have to advise my clients and users.
    The little summary of ways I know on BI 4.0:
    - Stop the services and take a backup of the repository database and the FileStore folder (optionally including the Tomcat folder and other BO installation folders)
    - Create a job on LCM and a schedule to export a LCMBIAR file
    - Use the Upgrade Management Tool to generate a BIAR file by the command line
    I found an interesting post by Raphael Branger, but his preferred option is 360View, software I don't know; clients usually want SAP solutions, so the preference is to use BO's own way to make backups.
    Backup & Recovery in BO 4.0
    Note: I agree with Raphael about the old Import Wizard; I don't know why the Upgrade Management Tool doesn't allow importing a BIAR file of the same version as the target. It is terrible.
    Let me ask the big question: what is the best backup plan on BO 4.0?
    I know that this depends on the environment and many variables, but let us consider a general environment with a standard installation.
    Thanks everybody!

    Thanks Mrinal and Ajay,
    In my experience I always use the full backup: repository database backup + FileStore folder backup (I usually recommend including the BO installation folder and the Tomcat folder too, because of custom configurations). That backup is essential; it is the baseline.
    But this backup is not flexible. The usual problem in a BO production environment is accidental deletion of some reports or objects. Since BO XI R2 I have used the Import Wizard to generate BIAR files from the command line, usually via a BAT file. In BO 4 the Import Wizard is gone and the Upgrade Management Tool replaces it, but I can still create BIAR files from the command line. Suppose a BO user deleted a report and only reported the deletion a month later. We don't need to restore all objects from the month-old full backup; with BIAR files we can restore only that report. That is the advantage of using BIAR files.
    So, my strategy is to use the full backup (repository database + BO installation folder) and also create BIAR files.
    What do you think about the backup by generating BIAR files?
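    The selective-restore argument can be sketched in a few lines. This is a hypothetical illustration (the object names and version labels are invented, and plain dictionaries stand in for the CMS repository and a BIAR export): a full-backup restore rolls every object back to the backup date, while a per-object export restores just the deleted report.

    ```python
    # Hypothetical illustration: the dicts stand in for repository content
    # at two points in time (names and version labels are invented).
    full_backup = {"report_a": "v_oct", "report_b": "v_oct", "universe_x": "v_oct"}
    live_system = {"report_a": "v_nov", "universe_x": "v_nov"}  # report_b deleted

    def restore_from_full(full):
        """Restoring the month-old full backup rolls EVERY object back."""
        return dict(full)

    def restore_one(live, export, name):
        """Restoring from a per-object export (a BIAR file) touches one
        object and leaves everything else at its current version."""
        restored = dict(live)
        restored[name] = export[name]
        return restored

    recovered = restore_one(live_system, full_backup, "report_b")
    print(recovered)
    ```

    Here report_b comes back at its October version while report_a keeps its November edits, which is exactly what a full-backup restore cannot offer.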

  • SPA3000 F/W 3.1.20(GW) last digit repetition within the dial plan is not working for gateway 0-4

    H/W: SPA3000
    F/W: 3.1.20(GW)
    Problem: The last digit repetition within the dial plan is not working for gateway 0-4.
    Line 1 Dial Plan: (****|<#,xx.<:@gw1>|1747xxxxxxxx.<:@gw1>)
    Dial "17474743246#" which works for gw1.
    Dial "#17474743246" which also works for gw1.
    Dial "17474743246" which doesn't work and actually goes through VoIP 1 but not gw1.
    As a result I have to specify a fixed number of digits for gw0-gw4 in the dial plan or it won't work. This could be very difficult for international calls.
    Is there any way to specify Interdigit Short Timer with <:@gw0>?
    BTW, it's strange that I have to include (****) in the dial plan to enter the IVR. Without it I can't even hear any sound coming from gw1 (SIPphone). Maybe the SPA needs a factory reset. The test dial plan might look like:
    S:4,(****|*xxx.|[2-9]|xxxxx|*0<:@gw1>|1747xxxxxxx<:@gw1>|1408xxxxxxx<:@gw0>|xxxxxxxxxxxxxxxxxxxx.!)
    The last barring sequence "xxxxxxxxxxxxxxxxxxxx.!" is to make sure the Interdigit Short Timer runs after the last digit dialed; otherwise the number is transmitted immediately, which is not preferred.
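    The digit-map syntax above can be made concrete with a rough translation to regular expressions. This is a simplification, not the SPA's actual dial-plan grammar (it handles only literal digits, `x`, and a trailing `.`), but it shows why "1747xxxxxxxx." accepts any number of extra digits and therefore needs the interdigit timer to decide when a number is complete:

    ```python
    import re

    def digitmap_to_regex(pattern: str) -> str:
        """Crude translation of a simple digit-map fragment to a regex.
        'x' matches any single digit; a '.' repeats the previous element
        zero or more times; anything else is a literal."""
        out = []
        for ch in pattern:
            if ch == "x":
                out.append(r"\d")
            elif ch == ".":
                out.append("*")  # repeat the previous element
            else:
                out.append(re.escape(ch))
        return "^" + "".join(out) + "$"

    # '1747xxxxxxxx.' means 1747 followed by at least seven more digits,
    # with no upper bound -- so the SPA cannot know when dialing is done.
    rx = re.compile(digitmap_to_regex("1747xxxxxxxx."))
    print(bool(rx.match("17474743246")))
    ```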

    IMHO, if you need to include **** in the dial plan to enter the IVR menu, the unit probably needs a reset.
    I believe you can set the interdigit timer by including it in the dial plan, i.e. xxxxS0, just like on a hotline. I never had a chance to try that syntax when invoking gw0 in the dial plan, but you may give it a try.

  • Changing the datasource of LO cockpit inR/3..need a backup plan

    Hi gurus,
    I am going to change the datasource in the production system. I have done everything in Dev and QA, but my client is asking for a backup plan in case something goes wrong:
    if the transport fails, he wants everything back to normal without affecting the delta setup or the data in BW.
    Can somebody suggest what I can do as a backup plan if something goes wrong?
    thanks and regards
    neelu

    Hi Neel
    If something goes wrong and it messes up your deltas in the production system, what is the backup plan?
    This is what I think:
    If your datasource is an LO cockpit extractor, ensure the following:
    Before your R/3 transports, stop the V3 batch job.
    Lock the users (transaction lock).
    Stop the batch jobs that update the base tables of the extractor.
    Drain the delta queue into BW until you get 0 LUWs in RSA7 for that datasource in the R/3 system.
    Activate the last request in the ODS and also send the data into the InfoCube. Now you have all data up to that point in BW.
    Import your transports, then execute the V3 job in R/3 and run the InfoPackage from BW.
    If something is messed up, immediately push another transport that repairs it, then do an init without data transfer and resume the deltas.
    For other extractors, just ensure you drain the delta queue into BW until you get 0 LUWs in RSA7 for that datasource in the R/3 system. (I assume you will take care of batch jobs and transactional updates; they should be stopped for any extractor.)
    Hope this plan helps you .
    Please let me know if you still have questions.
    Best of luck
    Pradip

  • How to find the existing sql server backup plan/schedule is there a script for that?

    Friends,
    Is there an easy way to find out in SQL Server (for all DBs) what the current backup plan/schedule is? Is there a script for that?
    Thanks,
    Karthikeyan Jothi

    To check the last database backup:
    Select
    SERVERPROPERTY('ServerName'),
    db.name,
    CONVERT(VARCHAR(10), b.backup_start_date, 103) + ' ' + CONVERT(VARCHAR(8), b.backup_start_date, 14) backup_start_date,
    CONVERT(VARCHAR(10), b.backup_finish_date, 103) + ' ' + CONVERT(VARCHAR(8), b.backup_finish_date, 14) backup_finish_date,
    case
    when (DATEDIFF(hour, b.backup_start_date, getdate())<24)then 'Success'
    when (DATEDIFF(hour, b.backup_start_date, getdate())>=24)then 'Failed'
    end Status,
    DATEDIFF(hh, b.backup_finish_date, GETDATE())BackupAgeInHours,
    (b.backup_size/1024/1024/1024 )BackupSize,
    case b.[type]
    WHEN 'D' THEN 'Full'
    WHEN 'I' THEN 'Differential'
    WHEN 'L' THEN 'Transaction Log'
    END Type,
    ISNULL(STR(ABS(DATEDIFF(day, GetDate(),(Backup_finish_date)))), 'NEVER')DaysSinceLastBackup
    FROM sys.sysdatabases db
    Left OUTER JOIN (SELECT * , ROW_NUMBER() OVER(PARTITION BY database_name ORDER BY backup_finish_date DESC) AS RNUM
    FROM msdb.dbo.backupset) b ON b.database_name = db.name AND RNUM = 1
    where dbid<>2
    Or, alternatively:
    SELECT
    DISTINCT
    a.Name AS DatabaseName ,
    CONVERT(SYSNAME, DATABASEPROPERTYEX(a.name, 'Recovery')) RecoveryModel ,
    COALESCE(( SELECT CONVERT(VARCHAR(12), MAX(backup_finish_date), 101)
    FROM msdb.dbo.backupset
    WHERE database_name = a.name
    AND type = 'd'
    AND is_copy_only = '0'
    ), 'No Full') AS 'Full' ,
    COALESCE(( SELECT CONVERT(VARCHAR(12), MAX(backup_finish_date), 101)
    FROM msdb.dbo.backupset
    WHERE database_name = a.name
    AND type = 'i'
    AND is_copy_only = '0'
    ), 'No Diff') AS 'Diff' ,
    COALESCE(( SELECT CONVERT(VARCHAR(20), MAX(backup_finish_date), 120)
    FROM msdb.dbo.backupset
    WHERE database_name = a.name
    AND type = 'l'
    ), 'No Log') AS 'LastLog' ,
    COALESCE(( SELECT CONVERT(VARCHAR(20), backup_finish_date, 120)
    FROM ( SELECT ROW_NUMBER() OVER ( ORDER BY backup_finish_date DESC ) AS 'rownum' ,
    backup_finish_date
    FROM msdb.dbo.backupset
    WHERE database_name = a.name
    AND type = 'l'
    ) withrownum
    WHERE rownum = 2
    ), 'No Log') AS 'LastLog2'
    FROM sys.databases a
    LEFT OUTER JOIN msdb.dbo.backupset b ON b.database_name = a.name
    WHERE a.name <> 'tempdb'
    AND a.state_desc = 'online'
    GROUP BY a.Name ,
    a.compatibility_level
    ORDER BY a.name
    To check the schedule you can try the below script
    https://gallery.technet.microsoft.com/SQL-Jobs-Complete-eabe0050
    --Prashanth

  • Maintain attribute dimensions in Hyperion Planning, via ODI

    How can I move/maintain attribute members in Hyperion Planning, preferably via ODI?
    I have been loading an attribute dimension into Hyperion Planning, via ODI. The problem is that some children have been previously loaded to incorrect parents. However, a reload does not seem to move the children to their rightful place in the hierarchy.
    There are no errors being generated.
    Cheers

    Are you sure the issue doesn't lie on the Planning side? I would create a file and then use the outline load utility to compare loads and see where the issue is first.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Automatic Offline Full Backup Plan: CQ5 Installed As a Service

    I have designed an Automatic Offline Full Backup Plan for CQ5 Installed As a Service. I welcome your comments.
    Do you have any better way (scripts)?
    Offline Full Backup Scripts for Author
    :: offlineAuthorBackUp_CQ5AsService.bat
    :: Stop CQ5 service
    chdir /D %~dp0
    net stop cq5
    :: Copy all files and folders to the destination
    mkdir h:\cq5backup\offline\author\backup
    :: robocopy only copies updated files, so this step is very quick
    c:
    robocopy /MIR c:\cq5\author h:\cq5backup\offline\author\backup
    :: Start CQ5 service
    net start cq5
    :: Zip the backup with 7z.exe
    :: (note: the %date:~...% substring offsets depend on the locale's date format)
    h:
    cd h:\cq5backup\offline\author
    7z a -y -tzip offlineAuthorBackup_%date:~4,2%%date:~7,2%.zip h:\cq5backup\offline\author\backup
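    For comparison, the same flow can be sketched in Python. This is a hedged sketch, not CQ5 tooling: the service name, paths, and helper names are assumptions, and the date stamp is made unambiguous (a full ISO date) instead of the locale-dependent `%date:~4,2%%date:~7,2%` substrings in the BAT file.

    ```python
    import shutil
    import subprocess
    from datetime import date
    from pathlib import Path

    def dated_zip_name(prefix: str, day: date) -> str:
        """Mirror the BAT file's naming, but with an unambiguous full date."""
        return f"{prefix}_{day.strftime('%Y%m%d')}.zip"

    def offline_backup(instance_dir: Path, backup_dir: Path, service="cq5"):
        """Stop the service, mirror the instance files, restart the service
        (even if the copy fails), then zip the mirrored copy."""
        subprocess.run(["net", "stop", service], check=True)   # Windows service
        try:
            shutil.copytree(instance_dir, backup_dir, dirs_exist_ok=True)
        finally:
            subprocess.run(["net", "start", service], check=True)
        archive = backup_dir.parent / dated_zip_name("offlineAuthorBackup",
                                                     date.today())
        shutil.make_archive(str(archive.with_suffix("")), "zip", backup_dir)

    print(dated_zip_name("offlineAuthorBackup", date.today()))
    ```

    Restarting the service in a `finally` block keeps the downtime window as small as possible even when the copy step fails, which the BAT version does not guarantee.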


  • Backup Planning

    Hi, Guys. Just need your comments on my backup plan changes.
    Since my database is now too big, I would like to make a daily tablespace backup instead of a weekend whole-database backup. Suppose I back up like this:
    Monday: SYSTEM + SYSAUX tablespaces
    Tuesday: APP_1
    Wednesday: APP_2
    Thursday: APP_3
    Friday: APP_4
    Saturday: APP_5
    Sunday: APP_6
    I configure the control file record keep time to 8 days, back up the archivelogs with every backup, and delete obsolete backups on every run. A few concerns here:
    1 - I'm not able to do PITR to a time before the last tablespace backup, right?
    2 - I need to restore every data file from backups to do PITR, right?
    3 - The UNDO tablespace (I configured an undo tablespace under automatic undo management) is not backed up, since I assume it can be re-created in mount state before recovering the datafiles; am I right?
    Thanks for any inputs, again.
    G.B.
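    A quick way to reason about concerns 1 and 2 above: for any recovery target, the most recent backup of every tablespace on or before that point must be restored, and archivelogs must reach back to the oldest of those backups. A small sketch of that reasoning (the weekday schedule mirrors the plan above; this is a reasoning aid, not RMAN):

    ```python
    from datetime import date, timedelta

    # Weekday (0 = Monday) -> tablespaces backed up that day, as in the plan
    schedule = {0: ["SYSTEM", "SYSAUX"], 1: ["APP_1"], 2: ["APP_2"],
                3: ["APP_3"], 4: ["APP_4"], 5: ["APP_5"], 6: ["APP_6"]}

    def last_backup_dates(target: date) -> dict:
        """For a PITR to `target`, find each tablespace's most recent backup
        date on or before the target. ALL of them must be restored, and
        archivelogs from the OLDEST of these dates onward must be applied."""
        latest = {}
        for back in range(7):          # one full rotation covers everything
            day = target - timedelta(days=back)
            for ts in schedule[day.weekday()]:
                latest.setdefault(ts, day)   # first hit is the most recent
        return latest

    dates = last_backup_dates(date(2014, 11, 5))  # recover to a Wednesday
    print(min(dates.values()))  # archivelogs must reach back to this date
    ```

    With a seven-day rotation, the oldest required backup is always almost a week old, so nearly a week of archivelogs has to be applied during recovery; that is the trade-off of spreading the backup across the week.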

    Hi
    How big is your database? How many tablespace do you have? How big are they? Do you have read only tablespaces?
    I would like to make daily tablespace backup instead of weekend whole database backup.
    What kind of backup: incremental level 0, incremental level 1, cumulative, differential, full?
    1 - I'm not able to do PITR to time before last tablespace backup, right?
    You must have at least one tablespace backup at the level 0. After that you can do all what you want/need.
    2 - I need to restore evevery data file from backups to do PITR, right?
    Yes
    3 - UNDO tablespace (I configured a undo tablespace under auto-undo management option) is not backed-up since I assume it can be re-configure in mount state before doing recovery of datafiles, am I right?
    How big is your UNDO tablespace, that you don't want to back it up? Usually you include this tablespace in the backup too.
    Bye, Aron

  • Is my backup plan ok?

    Hello, I've had some good advice about backing up (I use Lightroom and import as DNG - I'll be
    getting Photoshop in due course). I wonder if you'd be kind enough to advise if my backup process
    is ok - I'm thinking of this:
    1) Backup unedited DNGs to External HD (x 2, with one offsite). I'll do this regularly, and from
    pictures in 'home' folder, as these are unedited even if edited in LR4, and this way even if I have
    edited them in LR4 I will always have backup of unedited DNGs
    2) Use a separate 1TB external HD (my iMac internal HD is also 1TB but has plenty of room, as I currently
    have 900 photos, not thousands), and start using Time Machine. Again I'll have a 2nd HD off site which
    I'll also use with Time Machine, and rotate them every few days.
    3) At the same time I backup unedited photos I'll also backup my catalogue file to the same Ext HD. That way
    in the event of a hard drive failure I'll have unedited DNGs, and also the edited photos, as the catalogue will
    presumably have all the edit steps etc. I'm not sure I see the point in backing up cat file more often than I
    back up photos?
    4) When I begin using Photoshop I presume that, to save the edited photos, I will need to save them
    separately, or will the Photoshop edits also be included in the LR4 cat file?
    Instead of Time Machine I've heard of Carbon Copy Cloner, but am trying to simplify things, and wonder
    if Time Machine will do the job - I am also wondering whether Time Machine will negate the need to backup
    DNGs separately, but I can't get my head around the fact that TM does delete old backups, and why this would
    mean I could lose photos or info.
    I'm finding this a bit complicated, and would appreciate any advice. Thanks, Roy

    1) Lightroom never modifies the photo portion of your DNGs (it might modify the metadata portion), and so you ALWAYS have an unedited copy the photo portion of your DNGs.
    2) So Time Machine creates two backups, plus in step 1 you also create 2 backups? That's four backups, which is a lot. Only you can determine if it's worth the time and money (extra hard disks) to do this. I find one backup, plus a 2nd one offsite (I use Carbonite, but I used to just take an EHD and store it at work), is sufficient.
    3) As stated, you don't need to make backups of both edited and unedited DNGs. There's no such thing as an edited DNG. Using Lightroom, storing backups of the original DNGs is sufficient. Backups of the catalog file are mandatory. As for the frequency of backups: if you are making significant changes to your catalog daily, then daily (or more frequent) backups of your catalog are a good idea. If you modify your catalog less frequently, then less frequent backups are probably appropriate.
    4) I think it is wise to send the Photoshop edits back to LR — you are planning to use LR as your cataloging software, aren't you? I believe in this case it is mandatory to include the photos saved by Photoshop in your backup plans. I don't know what you mean by "save them separately", but if you are using LR, your only real choice (without doing extra manual work) is to save these Photoshop edits in the same folder as the DNG.
    I'm finding this a bit complicated, and would appreciate any advice.
    Yes, it seems unnecessarily complicated to me.

  • WebCenter Portal Backup Plan

    Hi all,
    Please tell me whether there is any recommended backup plan for WebCenter Portal.
    If yes, please describe it.
    Edited by: 985327 on Feb 6, 2013 1:48 AM

    Since WCP is multi-tier and really just a framework for letting you build and deploy just about anything, there is no concise backup-and-recovery plan or single tool the way there is with the Oracle RDBMS.
    Do you use WCC, Oracle IAM?, SOA? So many questions.
    You may want to review
    http://docs.oracle.com/cd/E25178_01/core.1111/e10105/part6.htm#ASADM10827
    Specifically
    http://docs.oracle.com/cd/E25178_01/core.1111/e10105/br_intro.htm#CHDJGHAI
    Chapter 16, "Introducing Backup and Recovery"
    http://docs.oracle.com/cd/E25178_01/core.1111/e10105/br_bkp.htm#BHAGEDDH
    Chapter 17, "Backing Up Your Environment"
    http://docs.oracle.com/cd/E25178_01/core.1111/e10105/br_rec.htm#BCGCBGDJ
    Chapter 18, "Recovering Your Environment"
    in order to understand the various components that can be backed up and some of the practices around that.
    With that information you can build / document and TEST your backup and recovery plan.

  • Adjusting my backup plan with Lion and TM Plus Clone

    Hi all, I am rethinking my backup plan for when I roll out Lion in my business. Currently on SL we have clones scheduled to run every night doing incremental backups, and we rotate the disks out periodically for offsite storage. Now I am thinking that, since my testing shows TM works really well in Lion, we should use that as well. I noted this week that, on average, TM eats up about 10 GB per day on the backup drive. Our installs / data per machine amount to (let's round up to reduce the math) 250 GB, and our drives are all 500 GB total. That means that TM will cycle back every 25 days or so, yes? So if I reduce the clone schedule down to once every three weeks, things should be covered - right? Thanks for any comments!
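    The arithmetic in the post checks out and can be written down in a couple of lines (the numbers are the ones given above; actual Time Machine consumption varies with how much data changes each day):

    ```python
    disk_gb = 500            # size of each backup drive
    data_gb = 250            # initial full copy Time Machine must keep
    churn_gb_per_day = 10    # observed average daily TM consumption

    # Free space on the backup drive divided by daily consumption gives a
    # rough number of days before TM starts pruning its oldest snapshots.
    retention_days = (disk_gb - data_gb) / churn_gb_per_day
    print(retention_days)
    ```

    So with roughly 25 days of Time Machine history, a clone refreshed every three weeks stays inside the window TM can still cover.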

    Sounds right.
    Make a bootable clone with Carbon Copy Cloner as well; then you can really sleep well at night in case of catastrophic failure.
  • Monthly Backup Plan using Incremental Backups

    I would like to have a monthly backup plan as explained in the oracle documentation:
    First Friday of the month: Full Backup (Incremental Level 0)
    Monday, Tuesday, Wednesday, Thursday: Incremental Level 1
    2nd, 3rd, 4th Friday: Cumulative Incremental Backups Level 1
    As I understand it, the 4th cumulative incremental backup will contain all the changes since the full backup.
    Is there any way to make the 4th Friday backup contain only the changes since the 3rd Friday, and so on?
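    What the poster is after is a differential rather than a cumulative level 1 incremental. In RMAN, `BACKUP INCREMENTAL LEVEL 1 DATABASE` (differential, the default) copies blocks changed since the most recent level 1 or level 0 backup, while adding the `CUMULATIVE` keyword copies everything changed since the last level 0. So omitting `CUMULATIVE` on the Friday backups makes each one cover only the changes since the previous incremental (note that with the daily Mon-Thu level 1s in place, that means "since Thursday", not "since last Friday"). A toy simulation of the difference, using weekly granularity for simplicity (weeks and block IDs are invented):

    ```python
    # Blocks changed during each week after the level-0 full on the 1st Friday
    changes = {"week1": {1, 2}, "week2": {3}, "week3": {2, 4}}
    WEEKS = ["week1", "week2", "week3"]

    def differential(week: str) -> set:
        """Level 1 differential: blocks changed since the PREVIOUS incremental."""
        return set(changes[week])

    def cumulative(week: str) -> set:
        """Level 1 cumulative: blocks changed since the level-0 full backup."""
        out = set()
        for w in WEEKS[: WEEKS.index(week) + 1]:
            out |= changes[w]
        return out

    print(differential("week3"))  # only the most recent week's changes
    print(cumulative("week3"))    # everything since the full backup
    ```

    The trade-off: differentials are smaller and faster to take, but a restore must apply every differential since the level 0, whereas a cumulative restore needs only the full backup plus the latest cumulative.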

    user12075536123 wrote:
    Thank you for the advice.
    I understood Oracle to have "backup levels 0-4".
    I checked the documentation:
    http://docs.oracle.com/cd/E11882_01/backup.112/e10643/rcmsynta007.htm#i78895
    INCREMENTAL LEVEL integer
    "Copies only those data blocks that have changed since the last incremental integer backup, where integer is 0 or 1."
    So I understand Oracle has only levels 0 and 1.
    I tested:
    ==================================================
    RMAN> backup incremental level 0 database;
    Starting backup at 18-SEP-13
    allocated channel: ORA_DISK_1
    channel ORA_DISK_1: SID=64 instance=orcl device type=DISK
    channel ORA_DISK_1: starting incremental level 0 datafile backup set
    channel ORA_DISK_1: specifying datafile(s) in backup set
    Finished backup at 18-SEP-13
    RMAN> backup incremental level 1 database;
    Starting backup at 18-SEP-13
    using channel ORA_DISK_1
    channel ORA_DISK_1: starting incremental level 1 datafile backup set
    channel ORA_DISK_1: specifying datafile(s) in backup set
    Finished backup at 18-SEP-13
    RMAN> backup incremental level 2 database;
    Starting backup at 18-SEP-13
    using channel ORA_DISK_1
    channel ORA_DISK_1: starting incremental level 2 datafile backup set
    channel ORA_DISK_1: specifying datafile(s) in backup set
    Finished backup at 18-SEP-13
    RMAN> backup incremental level 3 database;
    Starting backup at 18-SEP-13
    using channel ORA_DISK_1
    channel ORA_DISK_1: starting incremental level 3 datafile backup set
    channel ORA_DISK_1: specifying datafile(s) in backup set
    Finished backup at 18-SEP-13
    RMAN> backup incremental level 4 database;
    Starting backup at 18-SEP-13
    using channel ORA_DISK_1
    channel ORA_DISK_1: starting incremental level 4 datafile backup set
    channel ORA_DISK_1: specifying datafile(s) in backup set
    Finished backup at 18-SEP-13
    RMAN> backup incremental level 5 database;
    Starting backup at 18-SEP-13
    using channel ORA_DISK_1
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of backup command at 09/18/2013 21:32:58
    RMAN-06011: invalid level specified: 5
    ==================================================
    why!!
    Since there is no documentation of any level beyond 1, I can only guess that it is for backward compatibility.

  • Backup Plan

    Hi All,
    can anyone help me out.
    My client is asking me what our backup plan is;
    please guide me in this regard:
    1. database runs in archive mode
    2. every day scheduled hot backup
    3. wkly cold backup
    4. data guard against natural disasters
    Is this okay?
    Thanks in advance for your help

    The backup strategy / standby (Data Guard) depends upon:
    1) How critical the DB is for your company.
    2) How much downtime they can afford.
    3) The size of the DB.
    4) Activity on the DB.
    5) Monthly or yearly growth of the DB.
    You need to consider these factors and then decide on a backup strategy.
    If you need high availability then I would suggest a standby (depending on management

  • IPhoto backup plans?

    When will Apple have iPhoto backup plans? I want to be able to back up photos in iCloud that I don't take with my iPhone.

    1. Apple are not "trying to remove the user from managing files and directories". Don't you understand that with a lossless processor there is no processed file? None. It has to be created by exporting from the database. So, how would surfing the stored files help there? For more on this see Adobe's Lightroom, not made by Apple but also a lossless processor. You're still confusing your file with your Photo. You still don't know the difference between an editor and a lossless processor.
    2. If you can't see that searching by filename, title, any number of EXIF metadata entries, keywords, date, processing status, camera model, location where the photo was taken, faces in the photo, text in the captions etc, in both iPhoto and in every single Open... or Attach... dialogue, is more flexible than rooting around in files and folders, there is little to say to you. If you're happy with storing 100k files in a system of folders and think you can access the preferred one based on your memory of the filename, then my hat's off to you. iPhoto offers more ways to search your photos, and makes them available in more apps on your Mac, than files-in-folders. That's indisputable. Yes, there is a learning curve. Most folks read that article once.
    Sorry but you make enormous assumptions, and they are incorrect.
    So the bottom line is that there is no current way to back up your iPhoto library online.
    This is not true. What I said was that there is none that I would trust, and none that deal with the speed issue. But there are lots of services out there if these things are not a concern for you.
    The thing about them though is that it is always way easier to use them when you are immersed in the Apple ecosystem
    Well, that's also true of Windows, Linux, Unix et al. Every OS has an ecosystem that it works best with. Ever try to write to a Mac OS Extended (Journaled) disk with Windows?
    In this case though when you commit to a Mac computing platform you are committing your pictures into iPhoto
    No, you most certainly are not. This is a huge assumption and not at all true. iPhoto is an application that you can choose to use, or not. You can use Picasa for Mac, Aperture, Lightroom, Bridge or just the base file system, or any of 50 other apps to manage your photos. You are not forced to use iPhoto. Using a Mac does not mean you are committed to using iPhoto, just like using Windows does not mean you're committed to using Internet Explorer. It's a choice you make.
    Once you do that you are now limited on what your back-up options are.
    No you are not.
    So, notwithstanding your disappointment, you really need to look at what is in front of you. You have half a grasp of the possibilities and are then leaping to editorialising, and frankly, quite incorrectly.

  • Carbon Clone and Time Machine: developing a backup plan

    Howdy all!
    This is a second post that sort of flows on from another I have written today
    https://discussions.apple.com/thread/4649740
    I initially put them all together, but they were too rambling and disconnected, so it seemed better to separate them. The question I have here is how best to organise my backup plan? I have a few ideas but, basically, I want to make sure I get the whole setup right the first time, and I would appreciate any advice from others who have been down this path before. As I am still waiting for some parts to arrive in the mail, I have a little time to think about how to go about setting up my Mac.
    Basically the setup is:
    Mac Mini 2012; the boot drive is a Samsung 256GB 830 series SSD, and the secondary drive for data is a 1TB mechanical disk. I plan on having all my data on the secondary mechanical disk (photos, movies, music etc) and only the OS and applications on the SSD. To this end, I understand I only have to move /Users to the mechanical disk. I then also have 2x 2TB Western Digital MyBook Essential USB 3 disks for Time Machine backups. I plan on rotating them on a weekly basis (storing the disk not in use in a safe or offsite), and then, depending on costs, a cloud backup service for some data (music, photos etc) which I might want to access when I'm not at home.
    So I have been thinking for a few days now about the benefit of having a Carbon Copy Cloner bootable recovery drive. The thinking goes along these lines: as my data is on a separate drive and is backed up to Time Machine, in the event of an OS disk failure I can replace the disk, point /Users to the new drive, and be up and running once I have reinstalled the apps I need. Now, I understand the idea of the clone backup is that it speeds up the time to rebuild the OS disk, but I have to question how useful this is in reality.
    Consider: I can sit down now and write down all the apps I have needed in the past, install Mac OS, set it up (possibly with a generic admin password), install the apps I need from the App Store and DVDs etc, and then take a clone at this point before any of the apps are configured. If the app configuration is backed up in the Time Machine backup (i.e. the config files exist under /Users) then this is almost workable: in a recovery situation, the clone is used to rebuild the OS drive, the config files are pulled from the TM backups, and we're back up and running. Where this fails is if I have installed (or removed) apps since the clone was made. At that point, is it best to (a) make a new clone whenever an app is added/removed, or (b) make a note of apps added/removed, which will then have to be reinstalled if a recovery is required? I tend to think method (b) is best here, as it preserves the integrity of the clone. If the machine has been compromised (malware etc), remaking the clone causes the clone to be compromised, and hence the reinstalled machine as well. Though this method could be a pain if the machine state has changed somewhat over time. Also, it means that the reinstalled system will be missing updates etc, which could be time-consuming to apply anyway, so the usefulness of a clone is slightly reduced.
    Does anyone have any thoughts on this? Some days I think having a clone will be useful, especially as most of my software was delivered on CD (Adobe Creative Suite, Office) or is a large install (Xcode), but other days I think, "it's not a mission-critical machine", I can survive a day without it while I rebuild the install, so I don't achieve much by having a clone which is likely out of date by the time I go to use it.
    Also, in this backup plan, is it best to rely on TM for things like email backup, or on a dedicated mail backup utility? Can a clone exist on the same disk as Time Machine uses, or do I need to invest in a new disk or two for the clones?
    As I say, I want to make sure I have this machine setup right from the start, and would really appreciate any pointers, tips or advice.

    There is one big advantage of a clone: you can immediately reboot
    to it, continue working, and deal with the regular boot drive failure,
    whatever it may be, later. Especially since all your data and such
    is on another drive. If you use your computer for work and time-critical
    projects, this is a major plus!
    In the case of a hard drive failure/replacement, copying the clone
    to the drive is the fastest way to get the system and all your settings
    back.
    Time Machine and incremental backups have a place as well. They are best
    suited for "incremental" problems. Examples are installing an upgrade to
    software that doesn't work or that you just plain don't like. With Time Machine it
    is easy to restore back to the point before the install.
    Something else I do is back up current project files to USB memory sticks.
    If you are using your computer for business, you can never have too many
    backups. Corollary 456 of Murphy's Law is that "the number of backups that
    you need will be one more than what you have!"
