Best DLT for Xsan backup

I need to back up the 5+ TB on my Xsan. I know the best way of doing this is DLT tape backup. Can anyone recommend a DLT backup system/brand from personal experience?
Thank you in advance.
DKG

Right now, I am using an Overland Neo2000 with two LTO-3 drives for backing up my 5 TB Xsan.
AFAIK, this is a good solution, depending on the average size of your data files. We're getting about 120 MB/sec from our 3.4 TB Xsan volume that hosts big data files and about 80 MB/sec from our "office" Xsan volume.
A full backup runs about 17 hrs, which is not the fault of the LTO-3 drives, but is due to the fact that Xsan does not perform well with small office files. Maybe Xsan 2 will boost things in this regard.
Cheers,
Budy

Similar Messages

  • Best command for script backup?

    Hello,
    I always use robocopy for file backups, but with PowerShell and the newer technologies, is it still the easiest and best choice?

    Your initial question was "Best command for script backup?", and for that you received an answer. If you now need support with your script, you should start a new thread. Note also that you must test your robocopy command from a command line before trying to embed it in a PowerShell script.
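    As a minimal sketch of that advice (the paths, options, and log location are hypothetical placeholders, not from the thread), a robocopy command tested at the command line drops into PowerShell unchanged:

    # backup.ps1 -- mirror a folder tree with robocopy (sketch only)
    $src  = 'C:\Data'
    $dest = 'D:\Backup\Data'
    # /MIR mirrors the tree (copies changes, removes orphans); /R and /W limit retry stalls
    robocopy $src $dest /MIR /R:3 /W:5 /LOG:D:\Backup\robocopy.log
    # robocopy exit codes below 8 mean success (with varying detail); 8 and above mean failures
    if ($LASTEXITCODE -ge 8) { Write-Error "robocopy reported failures (code $LASTEXITCODE)" }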

  • Best practice for BI backup

    Hi,
    Can anyone suggest the best practice for backup/restore of the entire BI dashboard: reports, permissions, etc.?
    Ed,

    Hi,
    If you want to move the entire set of dashboards, reports, and permissions:
    Zip the *<OracleBIData>Web/catalog* folder and move it to the new environment. In the new environment, unzip this catalog, and in Instanceconfig.xml set the path to this new catalog.
    If you want to move only a few dashboards or reports, do it with Catalog Manager.
    Thank you.
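    For reference, the catalog path the reply mentions lives in Instanceconfig.xml; a minimal sketch of the relevant element (the path is a hypothetical example, and exact element placement varies by OBIEE version):

    <!-- Instanceconfig.xml: point Presentation Services at the migrated catalog -->
    <CatalogPath>/u01/OracleBIData/web/catalog/migrated_catalog</CatalogPath>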

  • WHAT IS BEST STRATEGY FOR RMAN BACKUP CONFIGURATION

    Hi all,
    My database size is 50 GB. I want to take a weekly full backup plus incremental backups, without a recovery catalog, using the following commands.
    Weekly full database backup:
    RUN {
      BACKUP AS COMPRESSED BACKUPSET
        INCREMENTAL LEVEL 0
        DEVICE TYPE DISK
        TAG 'weekly_database'
        FORMAT '/sw/weekly_database_%d_t%t_c%c_s%s_p%p'
        DATABASE;
    }
    I want to configure RMAN with the following strategy:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    CONFIGURE BACKUP OPTIMIZATION OFF;
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '\SW\AUTOCFILE_%F';
    with everything else left at the defaults, plus:
    SQL> alter system set control_file_record_keep_time=15 days;
    OS is AIX 6, and this is for two databases (10gR2 and 11gR2).
    What is the best configuration strategy for RMAN backup, and should I back up with or without a recovery catalog?

    For just two databases, there really wouldn't be a need for a recovery catalog. You can still restore/recover without a controlfile and without a recovery catalog.
    From this:
    afzal wrote:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    I am assuming you want to keep two weeks' worth of backups, therefore these:
    alter system set control_file_record_keep_time=15 days;
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days
    should be:
    RMAN> sql 'alter system set control_file_record_keep_time=22';
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 14 DAYS;
    22 would give you an extra layer of protection for instances when a problem occurs with a backup and you want to ensure that data doesn't get aged out.
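    To round the answer out, the daily half of the strategy would be a differential level 1 along these lines (a sketch modeled on the question's weekly script; the tag, format string, and file name are assumptions, not tested commands):

    # daily_level1.rman -- run via: rman target / cmdfile=daily_level1.rman
    RUN {
      BACKUP AS COMPRESSED BACKUPSET
        INCREMENTAL LEVEL 1
        DEVICE TYPE DISK
        TAG 'daily_database'
        FORMAT '/sw/daily_database_%d_t%t_c%c_s%s_p%p'
        DATABASE PLUS ARCHIVELOG;
      DELETE NOPROMPT OBSOLETE;
    }

    With CONTROLFILE AUTOBACKUP ON, each run also leaves a fresh controlfile autobackup, and DELETE NOPROMPT OBSOLETE enforces the 14-day recovery window configured above.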

  • Best practices for RMAN backup management for many databases

    Dear all,
    We have many 10g databases (>40) hosted on multiple Windows servers, which are backed up using RMAN.
    A year ago, all backups were implemented through Windows Scheduled Tasks using batch files.
    We have been busy (re)implementing / migrating these backups in Grid Control.
    I personally prefer to maintain the backup management in Grid Control, but a colleague now wants to go back to the batch files.
    What I am looking for here is advice on the management of RMAN backups for multiple databases: do you use Grid Control, a third-party backup management tool, or even a home-made solution?
    One of the discussion topics is the work involved if the central backup location changes.
    Well... any real-life advice on best practices / strategies for RMAN backup management for many databases will be appreciated!
    Thanks,
    Thierry

    Hi Thierry,
    Thierry H. wrote:
    Thanks for your reaction.
    So, I understand that you do not use Grid Control to manage the backups, and as a consequence you also have no 'direct' overview of the job schedules.
    One of my concerns is also to avoid too many backups starting at the same time, to prevent network / storage overload. Such an overview is available in Grid Control's Jobs screen.
    And, based on your strategy, do you recreate a 'one-time' Oracle scheduled job for every backup, or do your scripts create an Oracle job with multiple schedules?
    You're very welcome!
    Well, Grid Control is not an option for us, since each customer is on a separate infrastructure, with their own licensing. I have no real way (in contrast to your situation) to have a centralized point of control, but on the other hand that means I don't have to consider network/storage congestion like you have to.
    The script is run from a "permanent" job within the dba-scheduler, created like this:
    BEGIN
      dbms_scheduler.create_job(
        job_name        => 'BACKUP',
        job_type        => 'EXECUTABLE',
        job_action      => '/home/oracle/scripts/rman_backup.sh',
        start_date      => trunc(sysdate)+1+7/48,
        repeat_interval => 'trunc(sysdate)+1+7/48',
        enabled         => true,
        auto_drop       => false,
        comments        => 'execute backup script at 03:30');
    END;
    /
    The "master-script" then determines which level to use, based on the weekday from the OS. The actual job schedule (start date, run interval, etc.) is set together with the customer IT/IS dept, to avoid congestion on the backup resources.
    I have no overview of the backup status, run times, etc., but I have made monitoring scripts that alert me if/when a backup either fails or runs for too long. This, together with scheduled disaster/recovery tests, makes me sleep rather well at night.. ;-)
    I realize that there might be better ways of doing backup scheduling in your environment, since my requirements are so completely different from yours, but I guess that we all face the same challenge of unifying the environments as much as possible, to minimize the amount of actual work we have to do. :-)
    Good luck!
    //Johan
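    For readers wondering what such a "master-script" might look like, here is a minimal bash sketch (the weekday rule, paths, and RMAN commands are assumptions; Johan's actual script is not shown in the thread):

    #!/bin/sh
    # rman_backup.sh -- choose the backup level from the weekday (sketch only)
    # Sunday (date +%u prints 7) -> level 0 full; all other days -> differential level 1
    if [ "$(date +%u)" -eq 7 ]; then
      LEVEL=0
    else
      LEVEL=1
    fi
    rman target / <<EOF
    BACKUP AS COMPRESSED BACKUPSET INCREMENTAL LEVEL ${LEVEL} DATABASE PLUS ARCHIVELOG;
    DELETE NOPROMPT OBSOLETE;
    EXIT;
    EOF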

  • Best App For HD Backup?

    Hi everybody,
    Just wondering, what is the best program out there (preferably free) for simply backing up my Mac. I don't like Time Machine because I can only use it for one thing, and I need to carry my External HD around.
    Thanks in advance,
    Jonathan

    Hi Jonathan
    The Restore feature within Disk Utility is capable of copying a bootable clone of your Internal Hard Drive onto an External Hard Drive.
    There are also two other great third party backup applications that I have personally used:
    Carbon Copy Cloner > http://www.bombich.com/index.html
    SuperDuper > http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html
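    For what it's worth, the same Restore operation can also be scripted with asr, the command-line tool behind Disk Utility's Restore (a sketch; the volume names are hypothetical, and --erase wipes the target first):

    # clone the internal volume onto an external drive with Apple Software Restore
    sudo asr restore --source "/Volumes/Macintosh HD" --target "/Volumes/Backup Clone" --erase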
    Dennis

  • Best strategy for offsite backups

    I am using TM on an external hard drive (call it #1) with my iMac, however I want to store a backup offsite, in the event of a fire, etc.
    Can you use TM on one machine to make backups on multiple different external hard drives (not necessarily at the same time)? Is this the best way to back up offsite: using TM to make an extra external hard drive backup that I would store elsewhere?
    If I use TM, would I get another external hard drive (#2), then turn off TM, disconnect external hard drive #1, plug in #2, then turn TM on, then have it do a full back up, then turn TM off, then disconnect #2, reconnect #1, and turn TM back on?
    Let's say #2 has been off site for a month (while #1 has been running with TM on the computer) and I want to update #2. Is the best way to erase it, and do a full back up? Or, after a month, can I just plug it in and let TM do an incremental backup? Does it matter that TM has been doing backups with #1?
    Sorry for the confusing questions.
    Thanks for any advice.
    Lee

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    CarbonCopyCloner (http://www.bombich.com) is donationware; SuperDuper (http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html) has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup *to that disk*. Each is completely independent.
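    On OS X 10.7 or later, the disk swap can also be scripted with tmutil rather than clicked through in the Time Machine preference pane (a sketch; the volume name is hypothetical):

    # point Time Machine at whichever rotation drive is currently connected, then start a backup
    sudo tmutil setdestination "/Volumes/TM Offsite"
    tmutil startbackup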

  • Best option for getting backups offsite

    Hi everyone,
    I have two copies of DPM 2012 R2 and FireStreamer, and two sites - the main site in Detroit and a satellite in Houston. Almost all of the data is in Detroit, consisting of mainly documents, then Exchange, Lync, a little bit of RDS, and a little bit of SharePoint.
    Operating hours are 0730-2230 on weekdays.
    Currently I am backing up Detroit data to the Detroit DPM server, weekly spinning it off to one of (6) external 4 TB backup drives, aka FireStreamer 'tape', and taking it to the bank. I'm backing up a 3 TB SAN, and have 30 days of on-site storage.
    I have recently upgraded our MPLS link from Detroit to Houston to a 100 Mbps fiber circuit, and am re-evaluating my options. I would really like to leverage my fiber connection and second site for disaster recovery, with the idea of keeping it simple.
    Here are my options as I see them:
    Firing up a Houston DPM server, backing up Detroit to Houston, and Houston to Detroit. That way I have backup and disaster recovery in one fell swoop; I'm not sure about Detroit to Houston time- and bandwidth-wise.
    Backing up Detroit to Detroit, and backing up the Detroit DPM with a secondary Houston DPM server. (I tried this when I had a 6 Mbps MPLS link and never got it to sync.)
    Same as above, only de-duped from Detroit to Houston.
    Backup to Azure storage or an Azure VM. (I have 100 Mbps of internet in both locations.)
    I know I could take countless hours to research this, but I would appreciate anyone out there who can shed some quick 'big picture' light on what my practical options are at this point, and point me in a direction. I don't mind being embarrassed.
    :^) 
    Rob

    Markus,
    Thank you so much for your reply - it is much appreciated. The main answer (isn't it always?) is to do as much as you can with as little money as possible.
    After thinking this through, and talking it through with a member of the IT staff, we have come up with a tentative and very workable solution.
    We will use the Detroit DPM server to back up everything in Detroit, plus Houston across the 100 Mbps MPLS circuit, and then put a secondary DPM server in Houston.
    We have a spare 1U server in the Detroit IT room. We will put (2) 4 TB drives in it, set it up as the secondary DPM server, let the two servers synchronize while both are still in Detroit, and then, after the servers are synched, send the 1U server to Houston, where it will be the secondary DPM server at our other geographically separate location and will synch over the 100 Mbps MPLS circuit.
    (I actually tried to do this before the pipe was bumped up from 6 to 100 Mbps, and never got the secondary to sync across the 6 Mbps circuit.)
    The more I think about it, the more I like it. Simple, geographically separated, and no more driving external USB drives to and from the bank! No more manually swapping drives! No more manual mounting, managing, and repairing of FireStreamer and the external USB drives! It's all automatic, and should run without intervention.
    And as far as cost: we own the 100 Mbps pipe, the server, the OS license, and the DPM license, and we already have a Samsung 840 SSD for the OS; all we need is two 4 TB drives for Houston storage. Done for less than $400.
    If anyone has anything to add please do!
    Rob

  • Shell script for DB backup

    Hi,
    I have written some Java code for database backup, but there are some problems with it, so now I need to write a shell script for the DB backup.
    In the Java code I was running a command like this:
    /usr/local/bin/tar cvzf /export/home/monitor/FILE_20091005.tar.gz FILES/*20091005.*
    which compresses all the *20091005* files (MyISAM table files), but after compression the archive doesn't extract.
    So I need to write a shell script for that. Can anybody guide me on how to write that kind of script and put it in a cron job?
    thanks
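    A minimal sketch of such a script and its cron entry (the data directory, schedule, and integrity check are assumptions; the tar path and file pattern follow the question):

    #!/bin/sh
    # db_file_backup.sh -- archive today's MyISAM table files (sketch only)
    # NOTE: for a consistent copy of a live database, lock the tables or stop
    # the server before copying the table files.
    DATE=$(date +%Y%m%d)
    ARCHIVE="/export/home/monitor/FILE_${DATE}.tar.gz"
    cd /path/to/datadir || exit 1        # hypothetical parent directory of FILES/
    /usr/local/bin/tar cvzf "$ARCHIVE" FILES/*"${DATE}".*
    # test the archive before trusting it -- this catches the "doesn't extract" problem
    gzip -t "$ARCHIVE" || { echo "backup archive is corrupt" >&2; exit 1; }

    # crontab entry to run it nightly at 01:30:
    # 30 1 * * * /path/to/db_file_backup.sh >> /var/log/db_backup.log 2>&1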

    soundar wrote:
    Hi all,
    I have migrated a database from 8i to 10gR2. For backups in 8i, we used an RMAN shell script (scheduled using cron) to back up the database to tape (VERITAS backup).
    I am new to 10g. I checked out the options to back up the database using the Oracle Enterprise Manager DB console:
    http://www.oracle.com/technology/obe/10gr2_db_single/ha/rman/rman_otn.htm#t1d
    I am planning to take a test backup using the steps mentioned in the above URL. Could anyone suggest which is the best option for database backup: the Oracle Enterprise Manager DB console, or an RMAN shell script?
    Dear soundar,
    I wouldn't suggest you work with EM if you want to be a professional DBA. Start learning RMAN and use the CLI instead of the GUI.
    Those who live by the GUI, die by the GUI.

  • Best software for doing full backups to ext. hard disk, to be used to restore the system

    As a new PC owner, I wanted to find out what the best software is for doing full backups to an external hard disk, to be used to restore the system if the primary disk crashes. I'm aware that Windows, Acronis, and Paragon all have software that provides varying degrees of backup and restore functions.

    lasvideo,
    My favorites are ShadowProtect for doing an "image backup" of the boot drive, and Beyond Compare for backing up files on all other "data" arrays. ShadowProtect does not seem to have much market share compared with Ghost and Acronis, but I really love it.
    What I like about ShadowProtect:
    - Extremely fast (a complete image backup of 61 GB of files on my boot SSD RAID 0 array to a RAID data array takes less than 3 minutes)
    - Can do image backups across a LAN; restores from a boot CD-ROM or USB device
    - Allows for backups from one controller and restore of the OS on a different controller and/or RAID configuration (has HAL tools; Hardware Abstraction Layer, I think)
    - Can "mount" an image backup as a drive letter and see (and copy) all directories, files, etc. that are contained in the archive
    What I like about Beyond Compare:
    - Can choose what to back up and what to ignore very easily (for example, you can set it to ignore certain files or directory names and then save how you set things up as a "session")
    - Only needs to copy over files that are new or have changed since your last backup
    - Makes it very easy to selectively delete files from your backup
    Regards,
    Jim

  • Best practice for E-business suite 11i or R12 Application backup

    Hi,
    I'm taking an RMAN backup of the database. What would be the "best practice for E-Business Suite 11i or R12 Application backup" procedure?
    Right now I'm taking file-level backups. Please suggest alternatives, if any.
    Thanks

    Please review the following thread; it should be helpful:
    Recommended backup and recovery strategy for EBS
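    As a rough illustration of the file-level half (a sketch only: the paths are hypothetical, and adstpall.sh/adstrtal.sh are the usual R12 application-tier control scripts, so adjust for your release and topology):

    #!/bin/sh
    # ebs_apptier_backup.sh -- cold file-level backup of the EBS application tier
    # The database itself is assumed to be covered by the RMAN backup above.
    APPL_TOP=/u01/applmgr/apps_st/appl            # hypothetical path
    BACKUP_DIR=/backup/ebs
    DATE=$(date +%Y%m%d)
    "$ADMIN_SCRIPTS_HOME"/adstpall.sh apps/apps   # stop services for a consistent copy
    tar czf "$BACKUP_DIR/appl_top_${DATE}.tar.gz" "$APPL_TOP"
    "$ADMIN_SCRIPTS_HOME"/adstrtal.sh apps/apps   # restart the application tier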

  • Kernel: PANIC! -- best practice for backup and recovery when modifying system?

    I installed NVIDIA drivers on my OL6.6 system at home and something went bad with one of the libraries. On reboot, the kernel would panic and I couldn't get back into the system to fix anything. I ended up re-installing the OS to recover my system.
    What would be some best practices for backing up the system when making a change, and then recovering if this happens again?
    Would LVM snapshots be a good option? Can I restore a snapshot from a rescue boot?
    EX: File system snapshots with LVM | Ars Technica -- scroll down to the section discussing LVM.
    Any pointers to documentation would be welcome as well. I'm just not sure how to revert the kernel or the system when an install goes bad like this.
    Thanks for your attention.

    There is often a common misconception: a snapshot is not a backup. A snapshot and the original it was taken from initially share the same data blocks. An LVM snapshot is a general-purpose solution which can be used, for example, to quickly create a snapshot prior to a system upgrade; then, if you are satisfied with the result, you delete the snapshot.
    The advantage of a snapshot is that it can be used on a live filesystem or volume while changes are written to the snapshot volume. Hence it's called "copy on write" (COW), or copy on change if you want. This is necessary for system integrity: it gives you a consistent view of all data at a certain point in time while still allowing changes to happen, for example to perform a filesystem backup. A snapshot is no substitute for disaster recovery in case you lose your storage media. A snapshot only takes seconds, and initially does not copy or back up any data, unless data changes. It is therefore important to delete the snapshot once it is no longer required, in order to prevent duplication of data and restore filesystem performance.
    LVM was never a great thing under Linux and can cause serious I/O performance bottlenecks. If snapshot or COW technology suits your purpose, I suggest you look into Btrfs, which is a modern filesystem built into the latest Oracle UEK kernel. Btrfs employs the idea of subvolumes and is much more efficient than LVM, because it can operate on files or directories while LVM operates on the whole logical volume.
    Keep in mind, however, that you cannot use LVM or Btrfs for the boot partition, because the GRUB boot loader, which loads the Linux kernel, cannot deal with LVM or Btrfs before loading the Linux kernel (catch-22).
    I think the following is an interesting and fun to read introduction explaining basic concepts:
    http://events.linuxfoundation.org/sites/events/files/slides/Btrfs_1.pdf
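    A minimal sketch of the pre-change snapshot workflow described above (the volume group and logical volume names are hypothetical):

    # create a 5 GB snapshot of the root LV before touching the system
    lvcreate --size 5G --snapshot --name root_pre_change /dev/vg0/root

    # ...install the drivers / perform the upgrade...

    # satisfied? drop the snapshot to reclaim space and restore performance:
    lvremove /dev/vg0/root_pre_change

    # broken? merge the snapshot back to roll the LV back to its old state
    # (for an in-use root LV the merge completes on the next activation, e.g.
    # after booting a rescue environment and re-activating the volume group):
    lvconvert --merge /dev/vg0/root_pre_change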

  • Best setup for external drives and backup

    I'm using Aperture to organize several thousand photos. I've gotten good advice here before about how to get started on this. I'm not a professional and I'm not an experienced user of Aperture, so don't assume a lot of knowledge when you answer. I'm wondering what the best way to set up the file storage would be. I use a MacBook Pro, so obviously that's out for storage, and financially, purchasing a more powerful desktop is out for the time being. I was thinking of purchasing a mirror drive system, like this: http://www.newertech.com/products/gmax.php
    But then, there still remains the problem of backing up the photos in case of a fire or theft, etc. I have many of them on DVD, but not with all the metadata that I've added. Can I back the external drives up to a cloud-based storage system through the wireless on the MacBook?
    Or, is the answer none of the above? What recommendations do you folks have for managing this?

    Paula-
    Mirror drives are very much less than ideal for image backup. Mirroring occurs in real time, so errors, breaks, etc. simply get copied. With image work (unlike financial work, for instance) we do not need real-time backup; we just need regular, accurate backup. Just have 2-3 external drives and rotate them regularly, always keeping one drive off-site (it could be in your car or wherever). Back up manually rather than automatically, so that you can be reasonably certain that the backup is not backing up something that just broke.
    I suggest the below workflow. Note that most important is that original images from the camera card are copied to two locations before reformatting the card and before importing into Aperture or any other images management application.
    Original images never change, so I prefer to manually copy them to two locations asap after capture: one location is the computer drive and the other is an external backup HD that itself gets backed up off site regularly. That assures me that "the pic is in the can." Until two digital files exist on different media I do not consider the pic in the can.
    Then reformat the card in-camera, not before.
    The Masters then get imported into Aperture from the Mac internal drive by reference (i.e. "Storing Files: in their current location" on the Mac internal drive). After editing is complete (may take weeks or months), from within Aperture I relocate the referenced Masters to an external hard drive for long-term storage.
    I do use Time Machine routinely on the MBP, but only for the daily, volatile activity on the MBP. I prefer not to have TM be my backup-of-originals protocol. Instead, TM backs up the Mac internal drive on the TM schedule, and I back up original images asap based on my shooting schedule. Also, the TM drive is a different drive from the drives used for long-term archiving of original image files.
    TM does back up my Library, because the Library lives on the Mac internal drive, but I do not assume that TM works for Library backup. I back the Library up to Vaults (on the same drives I put archives of Masters on) independently of TM. IMO one should back up image files and Vaults manually, after verifying that what is being backed up is not broken, because an automatic backup will just back up a broken Library or whatever.
    Note that Masters need only be backed up once (but to multiple backup locations) and that backup should happen immediately after copying to the hard drive from the camera card, before involving Aperture or any other images management app.
    Sorry for the redundant verbosity above, but some of it was copied from previous posts. Also, I reinforce what Léonie said about DVDs: DVDs are way too slow, unreliable, etc. Instead, rotate multiple hard drives to achieve redundancy.
    HTH
    -Allen
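    For the manual copy-and-rotate step Allen describes, a minimal sketch (the drive and folder names are hypothetical):

    # copy one shoot's originals to whichever rotation drive is connected;
    # --ignore-existing keeps already-archived originals from being touched
    rsync -av --ignore-existing "$HOME/Pictures/Originals/shoot_001/" \
        "/Volumes/PhotoArchiveA/Originals/shoot_001/"
    # verify the copy before counting the pic as "in the can"
    diff -rq "$HOME/Pictures/Originals/shoot_001" "/Volumes/PhotoArchiveA/Originals/shoot_001"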

  • What is best solution for backup ?

    Our environment consists of a two-node RAC. We are trying to back up our database using RMAN with NetBackup.
    We also have other databases that need to be backed up.
    In the RAC case, what is the best solution for backing up the database?
    As I mentioned above, we are using RMAN with NetBackup for our RAC backups.
    Actually, RMAN is not a simple utility for me.
    Is there any other way to back up the database in our situation, without RMAN?
    I need some advice from all of you.
    Thanks in advance.

    Hi Justin
    There are many possible ways to backup your database. You must decide the one that suits your environment.
    Following is the list of options that you have.
    1. Take Online Backups
    (Note: the database must be running in ARCHIVELOG mode for online backups, and you must also back up the archived redo logs generated while tablespaces are in backup mode; a consolidated sketch appears at the end of this post.)
    Issue this command to freeze a tablespace:
    SQL> ALTER TABLESPACE tblspcname BEGIN BACKUP;
    Copy all the files belonging to this tablespace to your backup location using OS commands.
    Release the tablespace by using this command:
    SQL> ALTER TABLESPACE tblspcname END BACKUP;
    To find the data files belonging to a particular tablespace, you can issue this statement:
    SQL> SELECT file_name, tablespace_name FROM dba_data_files ORDER BY tablespace_name;
    2. If your DB size is not big, then you can take logical backups. Logical backups can be full or incremental. In 10g you can use filesets to split your logical backups into more than one file with specified sizes. (By logical backups I mean EXPORT.)
    To take an export, for example, issue this command:
    ORACLE_HOME\bin\exp file=fullpath+filename.dmp log=fullpath+logfilename.log FULL=Y userid=system/pwd@dbconnectstring
    To get the full list of export parameters, type:
    ORACLE_HOME\bin\exp help=y
    3. RMAN (Strongly recommended) but you ruled out its possibility so I won't elaborate on that.
    4. COLD BACKUP
    To perform this type of backup you will need to shut down your database by issuing this command:
    SQL> SHUTDOWN IMMEDIATE;
    (On RAC you will need to shut down all the instances before copying files to the backup location.)
    Use an OS copy command to copy the files to the backup location.
    (This method is not recommended, as it will flush your SGA, and your clients will complain about performance for the first few hours.)
    Let me know if you need more details.
    Hopefully this helps.
    Rgds
    Adnan
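    As promised under option 1, a minimal sketch of a scripted online backup (the paths, tablespace name, and log-switch step are illustrative assumptions; extend the same pattern to each tablespace):

    #!/bin/sh
    # hot_backup.sh -- online (hot) backup of one tablespace (sketch only)
    BACKUP_DIR=/backup/hot                 # hypothetical destination
    sqlplus -s "/ as sysdba" <<EOF
    ALTER TABLESPACE users BEGIN BACKUP;
    EOF
    # copy the tablespace's data files while it is in backup mode
    cp /u01/oradata/ORCL/users01.dbf "$BACKUP_DIR"/
    sqlplus -s "/ as sysdba" <<EOF
    ALTER TABLESPACE users END BACKUP;
    ALTER SYSTEM ARCHIVE LOG CURRENT;
    EOF
    # also copy the archived redo logs generated during the window -- without
    # them the copied data files cannot be recovered to a consistent state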

  • Best method for incremental replication to backup xserve?

    I was just trying to figure out the best method for routinely, incrementally replicating a primary Xserve. I have a secondary with the same hardware. I thought about using scheduled tasks with Carbon Copy Cloner over, say, a crossover cable to the secondary Xserve. However, wouldn't doing this backup continually wipe out the network settings on the secondary Mac? I could do the replication to a secondary drive and then just make that drive the boot drive in the event of data corruption on the master server.
    Where I'm at now is that the primary server runs RAID 1, so in the event of a single drive failure or a hardware failure, I could swap the drives into the backup server. However, I'd like some sort of protection against data corruption on the primary Xserve.
    Any general thoughts on a better way to do this? Or is there software that does this well?
    Thanks

    Our primary is an Xserve RAID. In addition, it has an external 2 TB drive that I CCC all the user accounts to every night. I then set up a Mac Mini Server as a replica.
    If the primary Xserve has a catastrophic failure, I simply connect the 2 TB drive to the Mac Mini and point the replicated user accounts to the cloned data on the 2 TB drive.
    I haven't tested it. But this scenario seemed like the best solution.
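    If you do go the scripted-replication route, one hedged approach is rsync with the host-specific configuration excluded, so the standby keeps its own network identity (a sketch: the exclude path is where macOS keeps network settings, but verify it on your build, and the target volume and the other excludes are illustrative):

    # clone the boot volume to a secondary drive, keeping this host's network
    # identity out of the copy (CCC can do much the same with a custom filter)
    sudo rsync -aHE --delete \
        --exclude '/Library/Preferences/SystemConfiguration/' \
        --exclude '/Volumes/' --exclude '/dev/' --exclude '/private/var/vm/' \
        / /Volumes/ReplicaDrive/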
