Disaster Plan advice.

Hello all,
I need some suggestions on how to implement the strategy described below.
We have an ERP application running on Oracle.
There are 3 servers, one for
Application Server, and two for Oracle.
Oracle is running on a 2-node cluster (MSCS & OFS).
In case of a power outage, OS-level corruption, or any other scenario where production is unavailable, I want other servers to fail over to, for both the Application Server and Oracle.
The option I'm aware of is to have a standby database, and another Application Server instance running, pointing to the standby database.
The backup policy followed is a daily data pump export via EXPDP, plus a daily incrementally updated backup and a weekly online full backup via RMAN.
My questions are:
1 - Is the standby database option enough to plan my DRP around, or are there other ways to handle this?
2 - If the standby database becomes primary and users start using it, how do I switch the data back to the original primary database?
3 - Is it worth creating a dummy database and having a bat script that will DROP the user and re-import the schema (unattended) from the backup dump extracted every day?
4 - Is it worth considering the duplicate database on another host option, and if so, how will it synchronize with the production database?
Please advise on the above approach; am I headed in the right direction?
regards,
Luckys

There are a number of business questions to answer before a decent disaster plan can be implemented.
1 - Is the standby database option enough to plan my DRP around, or are there other ways to handle this?
A standby database can definitely be a component of a disaster recovery plan. Whether it is the only solution, whether you use DataGuard or write your own scripts, and whether there are other options really depends on your environment (Oracle version and edition), the downtime window that is acceptable to the business, the amount of data loss that is acceptable in a disaster, and the business's willingness to invest time and money into developing a solution.
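If you do end up rolling your own scripts rather than using DataGuard, the standby side is conceptually small: an OS job copies the archived logs from the primary to the standby host, and the standby just sits mounted, applying them. A minimal sketch in SQL*Plus, assuming a physical standby that has already been created from a backup of the primary (instance names and the log-shipping job are omitted):
    CONNECT / AS SYSDBA
    STARTUP NOMOUNT
    ALTER DATABASE MOUNT STANDBY DATABASE;
    -- Apply shipped redo in the background as archived logs arrive.
    -- (Managed recovery is an Enterprise Edition / Data Guard feature; on
    -- Standard Edition you would apply logs with RECOVER STANDBY DATABASE instead.)
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    -- In a disaster you would stop recovery and ACTIVATE STANDBY DATABASE to open it.
In either case the real design work is the log shipping, monitoring, and gap handling around those few commands, which is exactly what DataGuard gives you out of the box.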
2 - If the standby database becomes primary and users start using it, how do I switch the data back to the original primary database?
You would take a backup of the standby database (which is now the primary) and recover that backup onto the primary machine, making it a standby, then do a switch over to make the primary machine the primary again. This is relatively automated with DataGuard, but is something you would roll yourself if you wrote your own scripts.
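Once the original primary machine has been rebuilt as a standby and has caught up on redo, the switchover itself is short. A hedged sketch for a physical standby managed without the broker (run as SYSDBA; the RMAN restore that recreates the standby, and checks of V$DATABASE.SWITCHOVER_STATUS before each step, are omitted here):
    -- On the current primary (the machine that took over in the disaster):
    ALTER DATABASE COMMIT TO SWITCHOVER TO PHYSICAL STANDBY WITH SESSION SHUTDOWN;
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    -- On the rebuilt original primary (currently the standby):
    ALTER DATABASE COMMIT TO SWITCHOVER TO PRIMARY;
    ALTER DATABASE OPEN;   -- older releases may require a shutdown/startup here
    -- Back on the new standby, restart redo apply:
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
With the DataGuard broker this collapses to a single SWITCHOVER command in DGMGRL.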
3 - Is it worth creating a dummy database and having a bat script that will DROP the user and re-import the schema (unattended) from the backup dump extracted every day?
That depends on the window you have to recover the database in a disaster and the business's willingness to lose data in a disaster. My guess is that this would probably not be beneficial.
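If you do want to prototype option 3 anyway, if only to compare its restore time and data loss against a standby, the nightly refresh is only a couple of commands. A rough sketch; the schema name, directory object, and dump file name below are made up, so substitute whatever your daily EXPDP job actually produces:
    CONNECT / AS SYSDBA
    -- Throw away yesterday's copy of the schema on the dummy database.
    DROP USER erpuser CASCADE;
    -- Re-import today's schema-mode dump; impdp recreates the user when run
    -- by a privileged account. (Credentials/connect string are placeholders.)
    HOST impdp system/password@dummydb SCHEMAS=erpuser DIRECTORY=DATA_PUMP_DIR DUMPFILE=erpuser_daily.dmp LOGFILE=erpuser_refresh.log
Even automated, this still only gets you back to the previous night's export, which is why the recovery window and acceptable data loss questions above matter so much.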
4 - Is it worth considering the duplicate database on another host option, and if so, how will it synchronize with the production database?
I'm not sure what "duplicate database" means here and how (or whether) that differs from a standby database.
Justin

Similar Messages

  • PI Disaster Plan

    Hi Experts,
    Currently we have 150+ interfaces in our production system, and a few are very critical and time sensitive for the business. Recently we encountered a production issue where it took 8 hours to bring the system back up and running. How is everyone planning for outages of a production PI system?
    Options I can think of are:
    - Route time-critical interfaces via the quality region during the outage period and revert afterwards
    - Maintain another production-like PI system and keep your interfaces ready in it
    - Other sender/receiver applications could develop interfaces with another middleware
    Please suggest how your organisation handles this.
    Thanks,
    Giridhar

    Thanks Nesimi and Anupam for your answers.
    I have few more questions,
    - Does the DR system lie on the same host or IP address?
    - In the case of SOAP sender/receiver interfaces, we need to exchange the URL with the other application system... so do we need to make this change in the other application systems?
    - For IDoc/RFC, we need to change port configurations in the sender R/3 systems. Do we need to make these changes as well?
    - Do external partners change the FTP address to send to the new DR system?
    Sorry for putting questions like this; basically I want to understand how much effort there will be for the PI technical team and other application teams when we move production interfaces to the DR system.
    Thanks,
    Giridhar

  • Disaster plan

    Hi all ...
    I have 2 servers, one for the DB and the other for the APP (Citrix); I am using Oracle 9i on Windows 2000 Server.
    Suppose I wake up one morning and find my servers fully damaged, or gone because of a natural disaster or a fire...
    What procedures should I have put in place beforehand, so that I can bring up a new server?
    What is the right backup plan for this, and what things do I have to back up?
    And how can I rebuild the server, the DB, and Windows?

    Bit of a broad question; I don't think anybody will give you a complete answer for free. However, to get you started you probably want to consider the following:
    - Are you going to have identical hardware for recovery?
    - How long can you live without the server?
    - How much data loss can you cope with?
    - What are you doing with your backups? (there is little point in having a plan if your tapes go up in smoke with the servers)
    The last time I was doing this on a shoestring the plan was something like:
    - Get hold of some hardware.
    - Rebuild and configure the server using detailed build instructions.
    - Restore the database from the cold backup and apply any additional archivelogs that survived (rough sketch below).
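    To make that last step concrete, here is a rough sketch of rolling a restored cold backup forward in SQL*Plus, assuming the cold-backup copies of the datafiles and controlfiles are already back in their original locations and that this is run interactively as SYSDBA (paths and SID omitted):
        STARTUP MOUNT
        -- At the prompts, supply each surviving archived log (or AUTO), then
        -- type CANCEL when there are none left.
        RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL
        -- Recovery was incomplete, so the database must be opened with RESETLOGS.
        ALTER DATABASE OPEN RESETLOGS;
    Practice this on the spare hardware before you need it; detailed build instructions are only proven once a restore has actually been tested end to end.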

  • Disaster planning

    I need to set up a backup server to allow me to get up and running quickly if there is a server problem.
    Setup a 2nd Server in the Tree
    Install Groupwise system
    Backup Main Server to 2nd Server using DBCOPY
    If there is a problem, would I be able to simply add a secondary IP address (the main server's IP)
    and start GroupWise on the 2nd server?
    Is this practical?
    I'm not sure whether the DBCOPY process copies the config files for GW, WebAccess, etc.
    Are there any other files I need to copy or include in my daily backup?
    Ideally, I'd like to be able to fix the main server and go back to it
    Would like to TEST this without messing up the live system or backup server!
    GW 2012 + Webaccess + GWIA
    Would appreciate any help / pointers

    There are a few gotchas, especially with it being in the same tree.
    - Make sure your GW pieces all run on secondary IPs that you switch
    over, otherwise you will have an upset tree when eDir tries to talk to
    the failed server and finds it talking with itself.
    - On a failure you'd have to go change the GW objects to tell them
    where they are now on the new server (that's quick enough) but that
    means that you can't readily test other than switch it over for real.
    - You will have to install and configure the agents on the new server
    as well.
    - If you went with a different tree, you'd have to graft the objects
    in, including any new users, but if you could isolate it, you could
    actually test it while the main system is live.
    This solution can work between separate sites that Cluster Services
    can't handle, and does so even if the data is directly corrupted, but
    at a longer RPO & RTO.
    Disaster Recovery terms RPO & RTO defined at
    http://en.wikipedia.org/wiki/Recovery_point_objective
    Andy Konecny
    Knowledge Partner (voluntary SysOp)
    KonecnyConsulting.ca in Toronto
    Andy's Profile: http://forums.novell.com/member.php?userid=75037

  • Live Recording DISASTER - help, advice, moral support...

    I've been using my Powerbook and Firepods to do 16-channel live recordings for the past 4 months. The first 20 times or so were for friends' bands or just practice, all for free. Last night was the first paying gig, and it also happened to be the first time the recording didn't come out right. Scattered throughout the tracks for all 3 bands are periods of audio distortion/warping/f*ed up useless garbage. Sometimes it's only on 1 track, sometimes all of them. It always happens in a section; there do not seem to be any random momentary blips.
    This is very similar to a problem I had momentarily in the studio that I recently posted about. I bought a S/PDIF cable to physically connect the two Firepods for better syncing, and I'm using the aggregate device for a full 16 channels. I thought I had the problem solved; we finished tracking with no problems. My environment is trimmed down to only the necessary tracks, and as far as I can tell, nothing else was different from all of the previous successful attempts. My 160 GB FW hard drive was 50% full.
    Up to this point, the one night that really counted, I was 100% confident in my rig. Now I have no idea what to do, or what to tell the bands. Two of them also hired videographers for a DVD with our audio. I've worked hard to get this kind of gig, and then this. And ultimately, it's my fault. Yeah, so that's that. If you've made it this far, thanks for reading my venting.
    Powerbook G4 1.67   Mac OS X (10.4.4)   2 gigs RAM, 2 Presonus Firepods, LaCie 160 gb FW 800

    I do not have Final Cut Pro. I haven't messed around with the files much yet; we had a live sound gig tonight. (I tested recording, and it messed up again, so something must still be wrong.) One thing I tried on a kick drum track was EQ'ing out all of the highs, which got rid of most of the mess. But on a few vocal parts, nothing could be done short of overdubbing. I guess this could be my learning experience for salvaging recordings.
    We've thrown out the idea of an MD or DAT board mix before, and we'll definitely do it from now on. Are there better ways? I really love doing this, and I know the machine can do it; I just don't know what changed.

  • Macbook Pro 2011 upgrade plan, advice appreciated

    Hello!
    So to cut the short backstory shorter, I just thought I might as well upgrade my MacBook (http://www.everymac.com/systems/apple/macbook_pro/specs/macbook-pro-core-i7-2.7-13-early-2011-unibody-thunderbolt-specs.html) a bit more so I don't have to shell out on a new one in the near future. I would really appreciate a bit of guidance as I don't have loads of experience with this.
    So currently I have 16GB of RAM, but I'm now looking at upgrading to an SSD (http://www.amazon.co.uk/Samsung-500GB-2-5-inch-Basic-Solid/dp/B00E3W19MO/). Having done some research, this seems the best value for its performance level.
    I'd install this in place of the standard-issue 500GB HDD currently in my MacBook, and install that HDD in place of my optical disk drive using an adapter caddy thing (http://www.amazon.co.uk/caddy-MacBook-replaces-SuperDrive-enclosure/dp/B00A2VNUK4/).
    Then, to continue the game of musical drives, place the optical drive (or potentially a Blu-ray drive I purchase if I'm feeling lavish) in the external enclosure (above).
    Right, so onto the bit I'm less sure about... After that's done I'd then use Migration Assistant/Time Machine (from an external HDD - anyone have a rough idea how long a 200GB migration over USB 2.0 might take!?!) to put my OS X on the SSD using a Mavericks bootable USB. I would expand my 80GB Windows 7 partition on the HDD in the optical drive bay and be able to boot from that if I wished.
    Does this all check out?

    EdFr,
    you won’t be able to boot from a Boot Camp partition from a disk in the optical bay; you can only boot from a Boot Camp partition on the disk in the disk bay.
    Since you have an Early 2011 model, either a Thunderbolt connection or a FireWire 800 connection would give you faster transfer speeds than a USB 2.0 connection, if either of those ports are on your external disk enclosure. (A Thunderbolt connection would be fastest.)

  • Network infrastructure planning advice

    I need to set up a new network: "Network A". It needs to have a T1 to "Network B", a backup T1 to "Network B", and a T1 to "Network C". In the near-term future there will be a T1 to "Network C" and a T1 to "Network D".
    What is a good router configuration for "Network A"? How many routers? What models? What T1s to what routers?
    thanks

    In addition to your stated requirements, I would also want to know what type of connections you need (long or short connection times) and what the distance will be between your Network A and the other networks.
    Here are the two main WAN connection types:
    1. For short distances and long connection times, a leased line is far better.
    2. If you want shared bandwidth and more efficient packet switching, and for long distances (long connection times), frame relay is good.
    Both of these technologies provide connectivity at T1 (1.544 Mbps) speeds and above.
    Since you need a T1 backup, ISDN PRI provides T1-speed connectivity, so you can use it as the backup line. It is a circuit-switched network.

  • SQL-Tuning Tips

    Hi there,
    I need some general advice (I can't supply you with an execution plan) on tuning this statement, which runs in 9i CBO mode:
    SELECT
          TAB_C1.CLIENT_NR, TAB_C1.DD_NR, last_day(trunc(TAB_C1.BELEGDAT)),
          0, TAB_C2.VALUE_DIM ,
          nvl(SUM(DECODE(TAB_A.FOLGBU || TAB_A.ZUABNEUT,'0A',0,'1A',0,'0Z',1,'1Z',0) * TAB_C2.VALUE),0) ,
          nvl(SUM(DECODE(TAB_A.FOLGBU || TAB_A.ZUABNEUT,'0A',1,'1A',0,'0Z',0,'1Z',0) * TAB_C2.VALUE),0) ,
          nvl(SUM(DECODE(TAB_A.FOLGBU || TAB_A.ZUABNEUT,'0A',0,'1A',0,'0Z',0,'1Z',1) * TAB_C2.VALUE),0) ,
          nvl(SUM(DECODE(TAB_A.FOLGBU || TAB_A.ZUABNEUT,'0A',0,'1A',1,'0Z',0,'1Z',0) * TAB_C2.VALUE),0) ,
          0 ,max(TAB_C1.ROWSEQ * 1000) + TAB_C2.BLENDTYPE * 100 + TAB_C2.VALUE_DIM
      FROM TAB_C1,
           TAB_A,
           TAB_C2,
           TAB_C3,
           TAB_B
    WHERE TAB_C1.ACC_TYPE = 'N'
           AND TAB_C1.CANCEL = 0
           AND TAB_A.CLIENT_NR = TAB_C1.CLIENT_NR
           AND TAB_A.ACC_MODE = TAB_C1.ACC_MODE
           AND TAB_C3.CLIENT_NR = TAB_C1.CLIENT_NR
           AND TAB_C3.ACC_TYPE = TAB_C1.ACC_TYPE
           AND TAB_C3.ACC_NR = TAB_C1.ACC_NR
           AND TAB_C3.POS = TAB_C1.POS
           AND TAB_C3.ACC_DAT = TAB_C1.ACC_DAT
           AND TAB_C2.CLIENT_NR = TAB_C1.CLIENT_NR
           AND TAB_C2.ACC_TYPE = TAB_C1.ACC_TYPE
           AND TAB_C2.ACC_NR = TAB_C1.ACC_NR
           AND TAB_C2.POS = TAB_C1.POS
           AND TAB_C2.ACC_DAT = TAB_C1.ACC_DAT
           AND TAB_B.CLIENT_NR = TAB_C2.CLIENT_NR
           AND TAB_B.AMOUNT_KEY = TAB_C2.VALUE_DIM
           AND TAB_B.AMOUNT_TYPE = 0
    GROUP BY TAB_C1.CLIENT_NR,
              releasetype,
              releaseid,
              releasegrp,
              releaseseq,
              last_day(trunc(TAB_C1.BELEGDAT)),
              blendtype,
              VALUE_DIM;
    Table      Rows        PK
    TAB_A      300         CLIENT_NR, ACC_MODE
    TAB_B      600         CLIENT_NR, AMOUNT_KEY
    TAB_C1     200,000     CLIENT_NR, ACC_TYPE, ACC_NR, POS, ACC_DAT
    TAB_C2     1,000,000   CLIENT_NR, ACC_TYPE, ACC_NR, POS, ACC_DAT, COL_BP, VALUE_DIM, COL_BT
    TAB_C3     350,000     CLIENT_NR, ACC_TYPE, ACC_NR, POS, ACC_DAT, COL_RI, COL_RT, COL_RG, COL_RS, COL_RL
    In my opinion I should try to avoid the function calls (nvl(sum(...)) etc.), but I don't have an idea of how best to do this. Maybe you can give me some advice on solving this, or point out some other options to improve performance. The view is used to fill a table.
    Thank you in advance
    Kind regards
    Matthias

    I would be really surprised if changing anything in the select list would improve performance much. The Oracle built-in functions are for the most part about as fast as selecting the column without using a function.
    I would concentrate my efforts on the from and where parts, which is the part that causes all of the I/O, which is generally the most expensive and time consuming part of the query. Without looking too closely at your query, and without knowing anything about your data, I notice that tab_b and tab_c3 are not used in your select list, and only tab_b has a selective predicate against it. Are you sure you need these two tables?
    Can you add more selective predicates against some or all of the tables without changing the results of the query? For example, does tab_b.amount_type = 0 imply anything, perhaps about a date range or account range, in the other tables?
    You say that the restrictions
    TAB_C1.CANCEL = 0
    TAB_B.AMOUNT_TYPE = 0
    have very low selectivity, so that indexing one of these columns seems to be not very helpful. But you also have a predicate on TAB_C1.ACC_TYPE = 'N' AND TAB_C1.CANCEL = 0. Is that combination more selective?
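    A quick, hedged way to check that (column and table names taken from your posted query; run it against representative data):
        -- Rough selectivity check for the combined predicate on TAB_C1.
        SELECT COUNT(*)                                                  AS total_rows,
               COUNT(CASE WHEN acc_type = 'N' AND cancel = 0 THEN 1 END) AS matching_rows
          FROM tab_c1;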
    The list goes on, but as others have said, without more information we are only guessing.
    John

  • SROAUG meeting invite-Friday June 21, 2002

    SROAUG Meeting Invite - Friday, June 21, 2002
    Please attend the SROAUG (Southwest Oracle Applications Users Group) upcoming meeting:
    Location: Sheraton Gateway Hotel at LAX (Los Angeles International Airport)
    Time: 8:30 am to 12:30 pm
    General Presentation Track:
    1. Oracle Applications Disaster Planning and Recovery: Planning through Implementation
    Steve Rockey, VP of Managed Application Services, Data Systems Worldwide, Inc.
    2. The Invoice Gateway: An Alternative to the Invoice Workbench for Invoices and Credit Memos
    Paul D. Scott, Senior Principal Instructor, Instructor Center of Excellence (ICE) Team, Oracle University
    3. Maintain Assets Not Integration: Oracle's new Enterprise Asset Management Module (eAM)
    Andy Binsley, P.E. Director, eAM Business Development, Oracle Corporation
    Vendor Presentation Track:
    1. Mercury Technology Group
    2. California Manufacturing Technology Center
    For more details see: http://www.oaug.org/sroaug/confmeet/meet023.htm


  • Wake on LAN over VPN....

    Hello all...
    Our disaster plan involves me being able to do almost everything I can do in
    the office out of the office. One of the things that I cannot seem to do is
    use ConsoleOne or the ZenWorks Workstation Browser to wake up desktops in
    the office.
    Any ideas why I cannot do this?
    Thanks in advance....
    Delon E. Weuve
    Senior Network Engineer
    Office of Auditor of State
    State of Iowa
    USA

    What happens if you take the "NetGear" out of the way and connect directly
    to your Modem?
    If that works, you may need to configure "Port Forwarding" or "Port
    Triggering".
    I would start by taking the router out of the loop 1st before playing with
    the port settings on your router.
    No point in whacking your head against something until you know it might
    help.
    Craig Wilson - MCNE, MCSE, CCNA
    Novell Support Forums Volunteer Sysop
    Novell does not officially monitor these forums.
    Suggestions/Opinions/Statements made by me are solely my own.
    These thoughts may not be shared by either Novell or any rational human.
    "Delon Weuve" <[email protected]> wrote in message
    news:[email protected]..
    > Sorry. I realized I left out some information.
    >
    > By "out of the office" I mean my home, where I have a Netgear wireless
    > router.
    >
    > The connection is being made by a client VPN software package.
    >
    > I can do just about everything on the network while at home, but this is
    > one
    > of those more annoying things.
    >
    > Thanks.
    >
    >

  • SECURITY: Virus Scanners, Spyware, etc

    Mac users have always prided themselves on the fact that they hardly have as many security problems, no viruses, etc. Even the Mac expert told me I didn't really need anti-virus software and all that. OK, I can take that... I got the MacBook.
    But the question is: is that really wishful thinking? Or do I actually need some sort of additional security software, anti-virus and anti-spyware software, etc.? And if it is important to have such things: what are good programs to get?
    The only anti-spyware thing I've seen is MacScan, and I've seen a few anti-virus packages, and I don't know which is "best"...

    Tom,
    1. Apple recommends using antivirus software. (See above.)
    2. Apple ships antivirus as part of OS X Server. (See above.)
    These two items should give us all pause.
    First things first: Computer security is a process, not an end.
    Step 1: Decide on an appropriate level of paranoia for your system. (a) Since it's your system, you are responsible for its use and security. (b)
    The key factors should be the value of your data, the value of your system, and the liability that you create for yourself if you do not take reasonable precautions.
    Step 2: Stay aware of the threats and risks associated with using computers. Over time, threats and risks change, as do effective means to counter and mitigate them.
    Step 3: Take appropriate precautions.(c)
    Step 4: Lather, rinse, repeat.
    (a) Bad guys really are out to get control of your computer and steal your data.
    (b) A personal computer is a general-purpose computing device, much as an automobile is a general-purpose transportation device. The level of safely of either depends on regular maintenance, properly installed and inspected safety and security equipment, disaster planning and recovery practice, and responsible use. There is no such thing as a completely secure car or computer: In order to be useful, they have to be able to do inherently dangerous things.
    (c) In my practice, I have found the following list to be a decent baseline: At a minimum, you should make sure that your software is up-to-date, your firewall is properly configured, your critical data is backed up, you are reasonably protected from malware and don't spread it to others, your network has proper incoming and outgoing access controls, and you regularly read your logs.
    What you do (or don't do) on your system is your choice. This is a topic that deserves serious attention and thought.
    At the very least, we should be sure to put serious thought, research, and fact into prescriptions and recommendations for other users.
    -Wayne

  • Storage Replica versus Robocopy: Fight!

    Storage Replica versus Robocopy: Fight! I've used Robocopy for so many years that this blog post really caught my eye. Surely Robocopy could not be beaten doing file copies? Oh dear, it looks as though we have a new sheriff in town. This copy test compares both systems under various workloads. [Originally posted by Ned Pyle] Hi folks, Ned here again. While we designed Storage Replica in Windows Server 2016 for synchronous, zero data-loss protection, it also offers a tantalizing option: extreme data mover. Today I compare Storage Replica's performance with Robocopy and demonstrate using SR for more than just disaster planning peace of mind. Oh, and you may accidentally learn about Perfmon data collector sets. In this corner: Robocopy has been around for decades and is certainly the most advanced file copy utility shipped in Windows. Unlike the...
    This topic first appeared in the Spiceworks Community

    Hi, 
    Since you deleted the existing replication group at step 5, step 6 cannot affect the existing DFSR database.
    When you create a new replication group, it will do an initial sync between the shares UsersA-C on server1 and the new SAN-mounted drive on the replica site server.
    After step 9, I think two replication groups are needed between server1 and the replica site server, to replicate shares UsersA-C and shares UsersD-F. You could set the replica site server as the primary member in the replication group. It will be considered to be the authoritative member, and it wins out during the initial replication. This will overwrite the current replicated folder content on the non-primary member.
    You could try a command to set another server as primary:
    Dfsradmin Membership Set /RGName:<RG Name> /RFName:<RF Name> /MemName:<Member Name> /IsPrimary:True
    Best Regards,
    Mandy

  • 1Tb 'green' drives - fast enough?

    Hi
    I have a G5 2.3 Ghz Dualcore, Late 2005. I'm still booting from the original 250Gb hard drive. I have a 500Gb drive in the second bay, and a couple of external 250Gb drives. I've run out of space.
    I want to retire all these hard drives and replace them with 1Tb drives. Two internal, two external (rotated in a Voyager Q dock that I'm about to buy, connected with FW800).
    My boot drive will hold everything (OS, apps, my data). The other 3 drives will be SuperDuper clones of my boot drive (with one clone being stored off-site). (The second internal will have one small partition (50Gb) for Photoshop scratch.)
    I'm a photographer and mainly use my computer to work on large (250Mb+) Photoshop files
    Ok, preamble over.
    My question is what 1Tb drives should I buy? (I want to avoid Seagate. I'm happy to buy 4 different types of drive - in fact maybe that's best, to avoid any potential Seagate-style firmware problems.)
    A dealer I've used before is offering me a good price on the following. I think they are all 'green' drives:
    - Hitachi Deskstar 7K1000.B
    - Samsung EcoGreen F2
    - Western Digital Green
    Are green drives slow?
    For my boot drive should I be looking for something faster? And maybe something fast for the second internal, with the Photoshop scratch partition? But are faster drives noisy? How noisy?
    (I'm not interested in Raptors, which I guess give the ultimate speed.)
    How about the two drives that are going to be cycled in the Voyager Q dock for backup (SuperDuper's 'smart backup'). I guess they don't need to be particularly high performance?
    Maybe all these 1Tb drives perform quite similarly, and I shouldn't worry too much about my choice?
    Thanks for any tips
    Richard

    Verify backups - SuperDuper is darn reliable and well tested and has been through the mill more than once in tests.
    Run the latest Disk Warrior, of course. I like to run TechTool Pro (5.04) as well, which does look at files and more. So does DW's 2nd tab test, somewhat, but it doesn't give me the same fuzzy feeling.
    Good drive, good RAM, good backups, and "test to verify" - never trust a backup set you haven't used.
    I keep one nice pristine system partition with just Apple OS and updates, never used for anything except as "Mr EMERGENCY." And always keep a backup of last OS patch or update. Like 10.5.5 or the one from before Security Update 3.
    Most problems happen to the boot drive, data drives tend to go scot-free of most any problem.
    I make partition clones, but for the system, I also use sparse disk images as well. Image of Emergency, Last Version etc sitting on large volume somewhere.
    I also use Tri-Backup 5, which is handy and does more verifying and scripting. They also now have a Time Machine inspector that I don't have, but it looks useful and adds features.
    I couldn't get along doing backups without either PCI FW800 to add another channel, or SATA PCI to drive some more hot swap and RAID volumes: backups, scratch, etc.
    Redundancy is good. Use different backup sets that you alternate, and different methods. Daily, weekly, monthly, a set of each: sound good? That was the way I set it up at work. Some files were backed up to two output devices simultaneously.
    Every AM I have my system set to do updates, backups, scanning, and a weekly image, all automatically while I get coffee and watch the news or eat breakfast.
    First the backups, then a routine you like, and add some software suites. Synchronize Pro gets pretty good marks, but you have to 'buy' it every two years; still, it's good for business and automated backups. When you get to 1TB of data that is highly important, and having two or three backups isn't enough, and/or the data is updated and changed frequently or needs to be searchable and online at all times, not offline, then you can add a RAID 6 solution.
    Managing a large library of media files, video, photo, audio or anything that needs an index and database, RAID6 can make sense. I just wish Spotlight would grow wings or could throw in a better search engine that knew to wait and use idle time, and stay out of the way when editing.
    If you have never looked at MicroMat TechTool Pro, I think it is worth a glance or two. Same with Synchronize Pro. And always have a "spare" hard drive available that you can use if a drive fails, or has trouble and you can't devote time to figuring out why. You should always have a drive you can drop into place and still have your backups. A lot of the time (and time costs money and is annoying), it is easier to restore than to fix and figure out why, which can take most of a day (there are no quick fixes or magic bullets for many problems).
    New OS update? Clone your system, apply the update to your test/clone backup drive, and pull or leave the original as the new "backup." Always have a fallback disaster plan.

  • Portal Profile Server Recovery Question

    Hi All,
    I am currently implementing/rolling out to production 2x portal gateways
    and 2x portal servers(with 1 Master profile ldap on one of the portal servers).
    I am currently using iPlanet portal version iPS3.0sp3a.
    What I would like to know is: how are other clients/companies configuring a 2x gateway with 2x portal server design?
    What do other clients/companies do when the 1 master profile ldap or master profile
    server dies? The reason why I ask is because the master profile ldap is a single
    point of failure.
    Question: Are other companies creating a 2nd profile ldap that is dormant on the 2nd
    portal server? And if the Master profile server dies, then you can quickly restore/
    configure the 2nd portal server to be the new master profile server?
    I am just curious what other companies are doing to mitigate a potential disaster with
    the master profile ldap.
    Note: I know that iPSv3.0sp4 is due out soon, but I will be moving to production real
    soon and I need to have some sort of a disaster plan in place.
    Any suggestions and/or comments are most appreciated.
    Thanks in advance,
    Chris Wilt
    TransCanada

    SP4 is already available for download.
    The general high-availability setup would be to either have a Sun Cluster with the LDAP server and set up replication, so that if one server fails you have the other.
    An alternative option is to have replication and put iDAR in front of the LDAP servers.
    With the availability of SP4 you can have multiple profile servers, which avoids a single point of failure on the profile server.

  • Repair of motherboards( system units)

    Does anyone have something to this effect: a general data sheet for carrying out good motherboard (system unit) repair?
    This topic first appeared in the Spiceworks Community

