Best command for script backup?

Hello,
I have always used robocopy for file backups, but with PowerShell and the newer technologies, is it still the easiest and best choice?

Your initial question was "Best command for script backup?", and it received an answer. If you now need support with your script, you should start a new thread. Note also that you must test your robocopy command from a command line before trying to embed it in a PowerShell script.
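
For reference, robocopy generally still holds up well and embeds cleanly in PowerShell. A minimal sketch of wrapping it in a script and checking the exit code; the paths and switches are placeholders, not from this thread:

# Minimal sketch: run robocopy from PowerShell and check its exit code.
# All paths and switches are placeholders; adjust for your environment.
$source      = 'C:\Data'
$destination = 'D:\Backup\Data'

# /MIR mirrors the source tree; /R and /W limit retries; /LOG+ appends to a log file.
robocopy $source $destination /MIR /R:2 /W:5 /LOG+:C:\Logs\backup.log

# Robocopy exit codes below 8 indicate success (0 = nothing to copy, 1 = files copied);
# 8 and above indicate at least one failure.
if ($LASTEXITCODE -ge 8) {
    Write-Error "Robocopy reported failures (exit code $LASTEXITCODE)."
}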

Similar Messages

  • WHAT IS BEST STRATEGY FOR RMAN BACKUP CONFIGURATION

    Hi all,
    My database size is 50 GB. I want to take a weekly full backup and incremental backups, without a recovery catalog, using the following commands.
    Weekly full database backup:
    run
    {
    backup as compressed backupset
    incremental level=0
    device type disk
    tag "weekly_database"
    format '/sw/weekly_database_%d_t%t_c%c_s%s_p%p'
    database;
    }
    I want to configure RMAN with the following strategy:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    CONFIGURE BACKUP OPTIMIZATION OFF;
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO
    '\SW\AUTOCFILE_%F';
    and the rest is left at the defaults.
    sql> alter system set control_file_record_keep_time=15 days;
    The OS is AIX 6, and this is for two databases, 10gR2 and 11gR2.
    What is the best configuration strategy for RMAN backup, and should I back up with or without a recovery catalog?
    Please tell me.

    For just two databases, there really wouldn't be a need for a recovery catalog. You can still restore/recover without a controlfile and without a recovery catalog.
    From this:
    afzal wrote:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    I am assuming you want to keep two weeks' worth of backups, therefore these:
    alter system set control_file_record_keep_time=15 days;
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days
    should be:
    RMAN> sql 'alter system set control_file_record_keep_time=22';
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 14 DAYS;
    22 would give you that extra layer of protection for instances when a problem occurs with a backup and you want to ensure that data doesn't get aged out.

  • Best pratices for RMAN backup management for many databases

    Dear all,
    We have many 10g databases (>40) hosted on multiple Windows servers, which are backed up using RMAN.
    A year ago, all backups were implemented through Windows Scheduled Tasks using some batch files.
    We have been busy (re)implementing / migrating those backups in Grid Control.
    I personally prefer to maintain the backup management in Grid Control, but a colleague now wants to go back to the batch files.
    What I am looking for here is advice on the management of RMAN backups for multiple databases: do you use Grid Control, a third-party backup management tool, or even a home-made solution?
    One of the discussion topics is the work involved in case the central backup location changes.
    Well... any real-life advice on best practices / strategies for RMAN backup management for many databases will be appreciated!
    Thanks,
    Thierry

    Hi Thierry,
    Thierry H. wrote:
    Thanks for your reaction.
    So, I understand that you don't use Grid Control to manage the backups and, as a consequence, you also have no 'direct' overview of the job schedules.
    One of my concerns is also to avoid too many backups being started at the same time, to prevent network / storage overload. Such an overview is available in Grid Control's Jobs screen.
    And, based on your strategy, do you recreate a 'one-time' Oracle scheduled job for every backup, or do your scripts create an Oracle job with multiple schedules?
    You're very welcome!
    Well, Grid Control is not an option for us, since each customer is on a separate infrastructure, with their own licensing. I have no real way (in contrast to your situation) to have a centralized point of control, but on the other hand that means I don't have to consider network/storage congestion like you have to.
    The script is run from a "permanent" job within the dba-scheduler, created like this:
    dbms_scheduler.create_job(
            job_name        => 'BACKUP',
            job_type        => 'EXECUTABLE',
            job_action      => '/home/oracle/scripts/rman_backup.sh',
            start_date      => trunc(sysdate)+1+7/48,
            repeat_interval => 'trunc(sysdate)+1+7/48',
            enabled         => true,
            auto_drop       => false,
            comments        => 'execute backup script at 03:30');
    and then the "master-script" determines which level to use, based on the weekday from the OS. The actual job schedule (start date, run interval, etc.) is set together with the customer's IT/IS department, to avoid congestion on the backup resources.
    I have no overview of the backup status, run times, etc., but I have made monitoring scripts that alert me if/when a backup either fails or runs for too long. This, together with scheduled disaster/recovery tests, makes me sleep rather well at night. ;-)
    I realize there might be better ways of doing backup scheduling in your environment, since my requirements are so completely different from yours, but I guess we all face the same challenge of unifying our environments as much as possible, to minimize the amount of actual work we have to do. :-)
    Good luck!
    //Johan

  • Best practice for bi backup

    Hi,
    Can anyone suggest the best practice for backup/restore of all BI dashboards, reports, permissions, etc.?
    Ed,

    Hi,
    If you want to move all dashboards, reports, and permissions,
    zip the *<OracleBIData>/web/catalog* folder and move it to the new environment. In the new environment, unzip this catalog and point instanceconfig.xml at the path of the new catalog.
    If you want to move only a few dashboards or reports, do it with Catalog Manager.
    Thank you.
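    If the BI host runs Windows, the zip step can be scripted. A minimal sketch, assuming PowerShell 5+ for Compress-Archive; the OracleBIData path and destination are placeholders:
    # Minimal sketch: archive the web catalog for transfer to a new environment.
    # Both paths are placeholders; point them at your actual folders.
    $oracleBIData = 'D:\OracleBIData'
    $stamp = Get-Date -Format 'yyyyMMdd'
    Compress-Archive -Path (Join-Path $oracleBIData 'web\catalog') `
                     -DestinationPath "D:\Transfer\catalog_$stamp.zip"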

  • Best practice for scripts in UCCX

    General Question
    When I wrote scripts for Nortel Symposium, I always created a primary script for queuing calls to skill sets, so that the press-1 options, welcome prompts, day-of-week checks, etc. were all in separate scripts, while the core function, Select Resource, was in a single simple script by itself, which was pointed to by the initial menu scripts.
    In UCCX, do people use a single script per call flow (Welcome, Day of Week, Select Resource, etc. all in one script), or do you call sub-scripts/flows for the actual Select Resource? What is best practice here; what do other people do?
    The reason I had sub-scripts in Nortel was so that when the call was offered to the skill set, the admin person knew there were no additional queue offerings, so any time-to-answer figures were purely for that queue, and changes in other scripts did not affect other queues.

    Here are a few tips/best practices that may help you out:
    http://www.cisco.com/en/US/docs/voice_ip_comm/cust_contact/contact_center/crs/express_7_0/reference/guide/UCCX_Best_Practices.pdf

  • Best DLT for Xsan backup

    I need to back up the 5+ TB of my Xsan. I know the best way of doing this is DLT tape backup. Can anyone recommend a system/brand of DLT backup from personal experience?
    Thank you in advance.
    DKG

    Right now, I am using an Overland Neo2000 with two LTO-3 drives for backing up my 5 TB Xsan.
    AFAIK, this is a good solution, depending on the average size of your data files. We're getting about 120 MB/sec from our 3.4 TB Xsan volume that hosts big data files, and about 80 MB/sec from our "office" Xsan volume.
    A full backup runs about 17 hrs, which is not the fault of the LTO-3 drives, but is due to the fact that Xsan does not work too well for office data. Maybe Xsan 2 will boost things in this regard.
    Cheers,
    Budy

  • Best App For HD Backup?

    Hi everybody,
    Just wondering, what is the best program out there (preferably free) for simply backing up my Mac. I don't like Time Machine because I can only use it for one thing, and I need to carry my External HD around.
    Thanks in advance,
    Jonathan

    Hi Jonathan
    The Restore feature within Disk Utility is capable of copying a bootable clone of your Internal Hard Drive onto an External Hard Drive.
    There are also two other great third party backup applications that I have personally used:
    Carbon Copy Cloner > http://www.bombich.com/index.html
    SuperDuper > http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html
    Dennis

  • Powershell commands for VM backups in 2012 vs 2008

    I have a powershell routine that is used by task scheduler that will shut down one or more Hyper-V VMs from a list, create a backup of the VHD file(s), then restart the VM(s). It works great in 2008, but when I copied it over to a 2012 host, it didn't do anything. I was wondering if powershell commands from 2008 still work in 2012 or if they've been completely rewritten. If they are the same, then I'll assume "operator error" and keep trying. If they have changed, then I know that I'll be spending my week googling around for how to use the new commands. Thanks.

    Hello,
    You can use the PowerShell commands below on a Server 2012 host:
    $backuppolicy = New-WBPolicy
    Get-WBVirtualMachine | Add-WBVirtualMachine -Policy $backuppolicy
    $targetfolder = New-WBBackupTarget -NetworkPath '\\FileServerName\e$\BackupFolder' -Credential (Get-Credential)
    Add-WBBackupTarget -Policy $backuppolicy -Target $targetfolder
    Set-WBSchedule -Policy $backuppolicy -Schedule 12:00,09:00
    $backuppolicy   # display the policy for review before saving it
    Set-WBPolicy -Policy $backuppolicy
    Start-WBBackup -Policy $backuppolicy
    http://get-itlabs.com/ucretsiz-windows-hyper-v-server-2012-r2-backup/
    Ersin CAN - TAT MCSE - Private Cloud. www.get-itlabs.com
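    For the original question: Server 2012 introduced a built-in Hyper-V module, so 2008-era scripts (typically WMI-based) generally need rewriting against the new cmdlets rather than tweaking. A minimal sketch of the shut-down/copy/restart flow described above; the VM names and backup folder are placeholders:
    # Minimal sketch using the Hyper-V module that ships with Server 2012.
    Import-Module Hyper-V
    $vmNames   = 'VM1', 'VM2'     # placeholder VM names
    $backupDir = 'E:\VHDBackup'   # placeholder backup folder
    foreach ($name in $vmNames) {
        # Shut the guest down cleanly before copying its disks.
        Stop-VM -Name $name -Force
        # Copy every disk attached to the VM to the backup folder.
        Get-VMHardDiskDrive -VMName $name |
            ForEach-Object { Copy-Item -Path $_.Path -Destination $backupDir }
        Start-VM -Name $name
    }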

  • Best strategy for offsite backups

    I am using TM on an external hard drive (call it #1) with my iMac, however I want to store a backup offsite, in the event of a fire, etc.
    Can you use TM on one machine to make backups on multiple different external hard drives (not necessarily at the same time)? Is this the best way to back up offsite, by using TM to make an extra external hard drive backup that I would store elsewhere?
    If I use TM, would I get another external hard drive (#2), then turn off TM, disconnect external hard drive #1, plug in #2, then turn TM on, then have it do a full back up, then turn TM off, then disconnect #2, reconnect #1, and turn TM back on?
    Let's say #2 has been off site for a month (while #1 has been running with TM on the computer) and I want to update #2. Is the best way to erase it, and do a full back up? Or, after a month, can I just plug it in and let TM do an incremental backup? Does it matter that TM has been doing backups with #1?
    Sorry for the confusing questions.
    Thanks for any advice.
    Lee

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    CarbonCopyCloner (http://www.bombich.com) is donationware; SuperDuper (http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html) has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup *to that disk*. Each is completely independent.

  • Best option for getting backups offsite

    Hi everyone,
    I have two copies of DPM 2012 R2 and FireStreamer, and two sites - the main site in Detroit and a satellite in Houston. Almost all of the data is in Detroit, consisting of mainly documents, then Exchange, Lync, a little bit of RDS, and a little bit of SharePoint.
    Operating hours are 0730-2230 week days.
    Currently I am backing up Detroit data to the Detroit DPM server, weekly spinning it off to one of (6) external 4 Gb backup drives, aka FireStreamer 'tape', and taking it to the bank. I'm backing up a 3 Tb SAN, and have 30 days of
    on-site storage.
    I have recently upgraded our MPLS link from Detroit to Houston to a 100 Mb fiber circuit, and am re-evaluating my options. I would really like to leverage my fiber connection and second site for disaster recovery, with the idea of making it simple.
    Here are my options as I see them:
    Firing up a Houston DPM server, and backing up Detroit to Houston, and Houston to Detroit. That way I have backup and disaster recovery in one fell swoop; not sure about Detroit to Houston time- and bandwidth-wise.
    Backing up Detroit to Detroit, and backing up the Detroit DPM with a secondary Houston DPM server. (I tried this when I had a 6 Mb MPLS and never got it to sync.)
    Same as above, only de-dup from Detroit to Houston.
    Backup to Azure storage or Azure VM. (I have 100 Mb of internet in both locations.)
    I know I could take countless hours to research this, but I would appreciate anyone out there who can shed some quick  'big picture' light on what my practical options are at this point, and point me in a direction. I don't mind being embarrassed. 
    :^) 
    Rob

    Markus,
    Thank you so much for your reply - it is much appreciated. The main answer (isn't it always?) is to do as much as you can with as little money as possible.
    After thinking this through, and talking it through with a member of the IT staff, we have come up with a tentative and very workable solution.
    We will use the Detroit DPM server to back up everything in Detroit, and Houston across the 100 Mb MPLS circuit, and then put a secondary DPM server in Houston.
    We have a spare 1U server in the Detroit IT Room. We will put (2) 4 Tb drives in it, set it up as the secondary DPM server, let the two servers synchronize while both are still in Detroit, and then after the servers are synched, send the 1U server to Houston,
    where it will be the secondary DPM server at our other geographically separate location and will synch over the 100 Mb MPLS circuit.
    (I actually tried to do this before the pipe was bumped up from 6 to 100 Mb, and never got the secondary to sync across the 6 Mb circuit.)
    The more I think about it, the more I like it. Simple, geographically separated, and no more driving external USB drives to and from the bank! No more manually swapping drives! No more manual mounting, FireStreamer handling, managing, repairing, and so on with the external USB drives! It's all automatic, and should run without intervention.
    And as far as cost: we own the 100 Mb pipe, the server, the OS license, the DPM license, we already have a Samsung 840 SSD for the OS, all we need is two 4 Tb drives for Houston storage; done for less than $400.
    If anyone has anything to add please do!
    Rob

  • Shell script for DB backup

    Hi,
    I have written some Java code for database backup, but there are some problems with it, so now I need to write a shell script for the DB backup.
    What I was doing in the Java code was running a command like this:
    /usr/local/bin/tar cvzf /export/home/monitor/FILE_20091005.tar.gz FILES/*20091005.*
    which compresses all the *20091005* files (MyISAM table files),
    but after compression the file doesn't extract.
    So I have to write a shell script for that. Can anybody guide me on how to write that kind of script and put it in a cron job?
    thanks
    thanks

    soundar wrote:
    Hi all,
    I have migrated a database from 8i to 10gR2. For backups in 8i, we used an RMAN shell script (scheduled using cron) to back up the database to tape (Veritas backup).
    I am new to 10g. I checked out the options to back up the database using the Oracle Enterprise Manager DB console:
    http://www.oracle.com/technology/obe/10gr2_db_single/ha/rman/rman_otn.htm#t1d
    I am planning to take a test backup using the steps mentioned in the above URL. Could anyone suggest which is the best option for database backup, either the Oracle Enterprise Manager DB console or an RMAN shell script?
    Dear soundar,
    I wouldn't suggest you work with EM if you want to be a professional DBA. Start learning RMAN and use the CLI instead of the GUI.
    Those who live by the GUI, die by the GUI.

  • How to run exp command for backup in Oracle APEX 4.0

    Hi!
    I am using APEX 4.0 with Oracle R11i. For the daily backup I have to use the "CMD" procedure. Is it possible to run this "EXP" script through a button or anything else in Oracle APEX 4.0?
    Looking forward to your quick response.
    With best regards

    Why not instead run a dbms_scheduler job to handle your backup? Or better yet, just build a scheduled job to run every night and you don't need to have the user run ANYTHING?
    Thank you,
    Tony Miller
    Webster, TX
    Never Surrender Dreams!
    JMS

  • Best practice for taking Site collection Backup with more than 100GB

    Hi,
    I have site collection data is more than 100 GB. Can anyone please suggest me the best practice to take backup?
    Thanks in advance....
    Regards,
    Saya

    Hi,
    I think we can do this using a PowerShell script.
    First add the SharePoint snap-in in PowerShell:
    Add-PSSnapin Microsoft.SharePoint.PowerShell
    Web application backup & restore
    Backup-SPFarm -Directory \\WebAppBackup\Development  -BackupMethod Full -Item "Web application name"
    Site Collection backup & restore
    Backup-SPSite http://1632/sites/TestSite  -Path C:\Backup\TestSite1.bak
    Restore-SPSite http://1632/sites/TestSite2  -Path C:\Backup\TestSite1.bak -Force
    Regards
    manikandan
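    One further note for a site collection this size (an assumption beyond the reply above): Backup-SPSite normally sets the site collection read-only while it runs; if the farm's SQL Server is Enterprise edition, -UseSqlSnapshot backs up from a database snapshot instead, so users can keep working. A minimal sketch reusing the thread's URL, with a placeholder file name:
    # Minimal sketch: back up a large site collection from a SQL snapshot.
    # -UseSqlSnapshot requires SQL Server Enterprise edition.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $siteUrl    = 'http://1632/sites/TestSite'
    $backupFile = 'C:\Backup\TestSite_{0}.bak' -f (Get-Date -Format 'yyyyMMdd')
    Backup-SPSite $siteUrl -Path $backupFile -UseSqlSnapshot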

  • What is best solution for backup ?

    Our environment consists of RAC (2 nodes). We are trying to back up our database using RMAN with NetBackup.
    We also have other databases that need to be backed up.
    In the RAC case, what is the best solution for backing up the database?
    As I mentioned above, we are using RMAN with NetBackup to back up our RAC database.
    Actually, RMAN is not a simple utility for me.
    Is there any other way to back up the database in our situation, without RMAN?
    I need some advice from all of you.
    Thanks in advance.

    Hi Justin
    There are many possible ways to backup your database. You must decide the one that suits your environment.
    Following is the list of options that you have.
    1. Take Online Backups
    Issue this command to freeze tablespaces
    SQL> ALTER TABLESPACE tblspcname BEGIN BACKUP;
    Copy all the files belonging to this tablespace to your backup location using OS commands.
    Release the tablespace by using this command.
    SQL> ALTER TABLESPACE tblspcname END BACKUP;
    To find the data files belonging to a particular tablespace you can issue this statement
    SQL>SELECT file_name, tablespace_name FROM dba_data_files ORDER BY tablespace_name;
    2. If your DB size is not big, then you can take logical backups. Logical backups can be FULL or incremental. In 10g you can use filesets to split your logical backups into more than one file with specified sizes. (By logical backups I mean EXPORT.)
    To take export for example issue this command.
    ORACLE_HOME\bin\exp file=fullpath+filename.dmp log=fullpath+logfilename.log FULL=Y userid=system/pwd@dbconnectstring
    To get full list of export parameters type
    ORACLE_HOME\bin\exp help=y
    3. RMAN (Strongly recommended) but you ruled out its possibility so I won't elaborate on that.
    4. COLD BACKUP
    To perform this type of backup you will need to shutdown your database by issuing this command.
    SQL>SHUTDOWN IMMEDIATE;
    (On RAC you will need to shutdown all the instances before copying files to the backup location).
    Use OS copy command to copy files to backup location.
    (This method is not recommended as it will flush your SGA and your client will complain about performance for the first few hours).
    Let me know if you need more details.
    Hopefully this helps.
    Rgds
    Adnan

  • Getting error while running script for online backup

    Hi,
    I am running a script for online backup but ended up with the error below.
    *ERROR* [Backup Worker Thread] com.day.crx.core.backup.Backup Failed to create temporary directory
    Please help out in resolving this.
    Thanks in advance.
    Maheswar

    Hi Mahesh,
    If you are using the backup feature from the CRX console, I mean http://localhost:4502/crx/config/backup.jsp, I can say that we also had some problems with this functionality.
    First of all, you need to check the permissions, because in the source code there is a line which creates a File object using the path you specified for the repository backup:
    File targetDir = new File(req.getParameter("targetDir", listDir.getParentFile().getAbsolutePath()));
    You need to be sure that proper read/write access has been granted for this path.
    Another point is that a hotfix may already be available if you are using CQ 5.4. Please refer to the following link:
    http://dev.day.com/content/kb/home/Crx/CrxSystemAdministration/CRXOnlineBackup.html
    and also to this one:
    http://dev.day.com/content/docs/en/crx/current/release_notes/overview.html which mentions hotfix #34797, applied to the backup.jsp file.
    Regards,
    kasq
