Daily export backup (via datapump) of a 600GB production database

Hi guys,
I have a 600 GB, 10g database.
Currently I have a daily RMAN backup to tape.
Based on your experience, should I bother to perform a daily export of a production database of this size?
Do you think it's useful?
Thanks

Fran wrote:
It all depends: what do you want to do? Do you have enough space to save a backup and an export daily? In my opinion, one backup is enough to protect your database.

I'm sorry, I can't agree with that. I've been in situations where the first, second and THIRD backup sources were unavailable during a production restore. I had to go to Plan D and cross my fingers (and sweat a lot). That might be overkill, but you should never rely on just one backup method/source for production data.
Not only that: you can also use an export to import the data into another database, or into another schema inside the same database, if you ever need to address logical corruption. We also use the exports to refresh test databases.
We have many databases of a similar size, and we take exports weekly instead of daily. An export is an incomplete backup, obviously, though you can ensure consistency by giving Data Pump an SCN to use as a reference point. You can also run Data Pump in parallel, which does speed things up.
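For example, a minimal sketch of such a run (the directory object, file names and SCN value are assumptions, not from the original post):

SQL> SELECT current_scn FROM v$database;
$ expdp system/password DIRECTORY=dp_dir DUMPFILE=prod_full_%U.dmp LOGFILE=prod_full.log FULL=Y FLASHBACK_SCN=1234567 PARALLEL=4

The %U wildcard lets each parallel worker write its own dump file.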
If you use an export as part of your backup strategy, I'd make sure I had regular backups of the parameter file and the controlfile (both binary and trace). I'd also keep regular text files recording the details of the tablespaces and datafiles, which you could use to recreate the files if you ever needed to.
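A hedged sketch of those supporting backups (the paths are assumptions):

SQL> ALTER DATABASE BACKUP CONTROLFILE TO '/backup/control.bkp' REUSE;
SQL> ALTER DATABASE BACKUP CONTROLFILE TO TRACE AS '/backup/control_trace.sql' REUSE;
SQL> CREATE PFILE='/backup/init_prod.ora' FROM SPFILE;

The trace copy gives you a readable CREATE CONTROLFILE script, and a spool of DBA_DATA_FILES and DBA_TABLESPACES covers the text files mentioned above.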
Mark

Similar Messages

  • Daily export backup

    Hi,
I have production & DR databases of 350 GB, version 10.2.0.4, on the AIX 5.3 platform.
In the morning, after the daily batch completes, we take an export backup using the Data Pump utility (stopping the application services, so the application is unavailable to users), which takes 2 to 2.5 hours to complete.
Only after that do we start the application services and make the application available to users again.
We use this export dump daily for restoration on another development server.
Kindly help me: is there any option to avoid the export backup window, so that it does not affect business hours and I can still do the daily restoration on the development server?
Please help.
    Regards,

If you want a complete database refresh, you can use Streams.
If you only need certain schemas/objects, you can use MViews (materialized views); see the sketch below.
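A hedged sketch of the MView route (the table, schema and database link names are invented for illustration):

-- On production: allow fast refresh of a table
SQL> CREATE MATERIALIZED VIEW LOG ON app_owner.orders;
-- On the development server, over a database link back to production:
SQL> CREATE MATERIALIZED VIEW app_owner.orders_mv REFRESH FAST AS SELECT * FROM app_owner.orders@prod_link;
-- After the nightly batch, pull only the changes (no application outage):
SQL> EXEC DBMS_MVIEW.REFRESH('APP_OWNER.ORDERS_MV', 'F');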

  • Tablespace export import via datapump

    Friends ,
I want to export a particular tablespace using Data Pump (expdp) and import it into a new tablespace in a new database. Is this possible with Data Pump?
Please help.

    Maybe it's easier to use Transportable Tablespaces, see http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tspaces.htm#sthref1281
    It's also possible to use datapump, but it takes longer, see http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref71
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref254
    HTH
    Enrique
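A hedged sketch of the transportable-tablespace route from the first link (the tablespace, path and directory names are assumptions):

-- On the source database: make the tablespace read only, then export its metadata
SQL> ALTER TABLESPACE users_ts READ ONLY;
$ expdp system/password DIRECTORY=dp_dir DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=users_ts
-- Copy tts.dmp plus the tablespace's datafiles to the target host, then:
$ impdp system/password DIRECTORY=dp_dir DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u01/oradata/newdb/users01.dbf'

If the data has to land in a tablespace with a new name, the slower plain Data Pump route works:

$ expdp system/password DIRECTORY=dp_dir DUMPFILE=ts.dmp TABLESPACES=users_ts
$ impdp system/password DIRECTORY=dp_dir DUMPFILE=ts.dmp REMAP_TABLESPACE=users_ts:new_ts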

  • Getting ORA-31655 while trying to export Tablespace via Datapump

    Hi
    I am trying to export the tablespace GRAHAM_BMF_TS_01 as per below but I am getting the following error:
    ORA-31655: no data or metadata objects selected for job
    SQL> select SEGMENT_NAME from dba_segments where TABLESPACE_NAME='GRAHAM_BMF_TS_01';
    SEGMENT_NAME
    BMF_AGREEMENTS
    SQL> select count(*) from BMF_AGREEMENTS;
      COUNT(*)
        199999
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning and Data Mining options
    $ expdp directory=DATA_PUMP_DIR_CORRUPTION_TEST dumpfile=pre_corruption_dp.dmp logfile=pre_corruption_dp.log  TABLESPACES=GRAHAM_BMF_TS_01 CONTENT=ALL
    Export: Release 10.2.0.3.0 - 64bit Production on Tuesday, 21 June, 2011 12:45:59
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning and Data Mining options
    Starting "SYS"."SYS_EXPORT_TABLESPACE_01":  /******** AS SYSDBA directory=DATA_PUMP_DIR_CORRUPTION_TEST dumpfile=pre_corruption_dp.dmp logfile=pre_corruption_dp.log TABLESPACES=GRAHAM_BMF_TS_01 CONTENT=ALL
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLESPACE_01" completed with 1 error(s) at 12:46:14Any ideas what I am doing wrong?

    Not sure why this thread was locked, but I unlocked it.
    There may be segments in this tablespace, but they may not be owned by a schema that is exported using Data Pump. For example, if all of the storage in this tablespace is owned by the SYS schema, then there is nothing in this tablespace to be exported.
    Any idea what schema owns the segments in this tablespace?
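For example, a quick way to check (DBA_SEGMENTS carries the owner):

SQL> SELECT owner, segment_name, segment_type FROM dba_segments WHERE tablespace_name = 'GRAHAM_BMF_TS_01';

If everything comes back owned by SYS (or another schema Data Pump skips), that would explain the ORA-31655.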
    Dean

  • Export Backup takes 90% CPU Usage

    hi Gurus,
We have an export backup scheduled on the PROD database. We have noticed that the export backup consumes a lot of CPU.
Database: Release 9.2.0.6
CPU usage: around 90% to 95% during the backup activity
Backup duration: around 20 minutes
Script file:
@echo off
rem Build an MMDDYYYY date stamp from DATE /T. The FOR block must close
rem before %mm% etc. are used, because they expand when the block is parsed.
for /F "tokens=2-4 delims=/ " %%f in ('date /t') do (
set mm=%%f
set dd=%%g
set yyyy=%%h
)
set DT=%mm%%dd%%yyyy%
set ORACLE_SID=PRODUCT1
echo Oracle SID set to %ORACLE_SID%
exp SYSTEM/DIAL1NG file=E:\ExportBackup\PRODUCT1\PRODUCT1_EXP_FULL_%DT%.dmp log=E:\ExportBackup\PRODUCT1\PRODUCT1_EXP_FULL_%DT%.log full=y grants=y indexes=y statistics=none direct=y consistent=y buffer=5000000
goto END
:END
Please advise us whether this can be fine-tuned.
    Regards
    Shreeshail

    Google "restrict cpu usage windows"
    Lots of stuff out there.
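One concrete option on Windows (a hedged suggestion using the built-in start command; export.par is a hypothetical parameter file holding the exp options from the script above):

start /belownormal /wait cmd /c exp SYSTEM/password parfile=export.par

start /low lowers the priority further; the export takes longer but leaves more CPU headroom for the instance.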

  • How to ZIP Oracle Datapump export backup file

Hello All,
My customer is asking me to deliver the production data dump to the following path: \\138.90.17.56\OMNISAFE.
I don't really understand his requirement, and he also wants me to zip the export backup file. How do I do that? Do you know any Unix command to zip backup files?
thanks and regards
cherry

1013498 wrote:
Well, thanks for your reply. My Oracle version is 11.2.0.3.b, and since we have the compression option, can you please elaborate on how to use it?
It's in the documentation. See Data Pump Export.
Let us say my expdp file is abc.dmp. Should I give the command gzip abc.dmp, or something different?
Let me google that for you
One more question: what does the customer mean by "production data dump to the following path \\138.90.17.56\OMNISAFE"? Please explain.
How do we know what the customer means? Why don't you ask him?
That said, it looks like a URL with an IP address and a defined folder at that IP address. Again, if the customer wants you to send them a file, you need to work with said customer on the mechanics of accessing their system.
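For what it's worth, a hedged sketch of both routes (the directory object and schema name are assumptions; COMPRESSION=ALL requires the Advanced Compression option license, while METADATA_ONLY does not):

$ expdp system/password DIRECTORY=dp_dir DUMPFILE=abc.dmp SCHEMAS=app_owner COMPRESSION=ALL

Or compress an existing dump after the fact:

$ gzip abc.dmp
$ gunzip abc.dmp.gz   (restore it before running impdp)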
    All that said ....
    Learning how to look things up in the documentation is time well spent investing in your career.  To that end, you should drop everything else you are doing and do the following:
    Go to tahiti.oracle.com.
    Locate the link for your Oracle product and version, and click on it.
    You are now at the entire documentation set for your selected Oracle product and version.
    BOOKMARK THAT LOCATION
    Spend a few minutes just getting familiar with what is available here. Take special note of the "books" and "search" tabs. Under the "books" tab (for 10.x) or the "Master Book List" link (for 11.x) you will find the complete documentation library.
Spend a few minutes getting familiar with what kind of documentation is available there by simply browsing the titles under the "Books" tab.
    Open the Reference Manual and spend a few minutes looking through the table of contents to get familiar with what kind of information is available there.
    Do the same with the SQL Reference Manual.
    Do the same with the Utilities manual.
    You don't have to read the above in depth.  They are reference manuals.  Just get familiar with what is there to be referenced. Ninety percent of the questions asked on this forum can be answered in less than 5 minutes by simply searching one of the above manuals.
    Then set yourself a plan to dig deeper.
- Read a chapter a day from the Concepts Manual.
- Take a look in your alert log. One of the first things listed at startup is the initialization parameters with non-default values. Read up on each one of them (as listed in your alert log) in the Reference Manual.
- Take a look at your listener.ora, tnsnames.ora, and sqlnet.ora files. Go to the Network Administrator's manual and read up on everything you see in those files.
- When you have finished reading the Concepts Manual, do it again.
    Give a man a fish and he eats for a day. Teach a man to fish and he eats for a lifetime.

  • Schedule Hot backup via OEM

    Hello all,
We have a relatively small system running Oracle 10g on the Windows 2003 OS. I've scheduled a full offline (cold) backup of the whole DB to run once weekly via OEM. Every night we run a script to do a 'logical' backup by exporting the two main schemas in the DB out to .DMP files. However, this may no longer be enough to satisfy DB recovery requirements.
Is there an easy way to schedule a 'hot' backup via OEM that could replace the export scripts mentioned above, or would we have to write an RMAN script to do this? The requirement is really to allow the database to remain updatable while any daily backup is running. I realise that the DB needs to be in ARCHIVELOG mode for anything like this to work.
    Any advice would be greatly appreciated.
    Thanks,
    Shaun.

hi,
On the OEM main page, click on Targets; you should see your database there. Click on the database and then on the Maintenance tab.
From here you can schedule jobs, and you have the option to schedule an online RMAN backup of your database. However, to do an online backup your database needs to have archiving turned on. Is that the case? You currently carry out offline backups, which might indicate that archiving is turned off.
regards
Alan
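A minimal sketch of the RMAN route (an illustration, not a complete strategy). Enable archiving first:

SQL> shutdown immediate
SQL> startup mount
SQL> alter database archivelog;
SQL> alter database open;

Then an online backup is short:

$ rman target /
RMAN> backup database plus archivelog;
RMAN> delete noprompt obsolete;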

  • Export Backup Size is different

    Hi Gurus,
We are using Oracle 10g (10.2.0.1.0) on Solaris 10, and we take an export backup on a daily basis. When we take the export backup using 'exp', the .dmp file size comes to 30.3 GB, whereas when we use 'expdp' the .dmp file size is 26.1 GB. I've checked both log files and found the number of tables, along with their record counts, is the same, so I'm confused about this size difference. As a result, we are not in a position to implement 'expdp'. Can anybody tell me why the sizes differ, and whether we can rely on expdp or not?

user606947 wrote:
Can anybody tell me why the sizes differ, and whether we can rely on expdp or not?
Oracle recommends that you use the newer Data Pump Export and Import utilities, because they support all Oracle Database 10g features.
    Look at the following documentation:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_overview.htm#sthref14

  • Sequence behavior after importing via DataPump

    Hi Friends,
I'm running Oracle DB 11.2.0.3 on Windows 2008 R2 SP1 servers, and I faced strange sequence behavior after importing a schema via Data Pump.
The export is done this way:
EXPDP userid/password dumpfile= logfile= directory= remap_dumpfile=y (no news)
The import is done this way:
IMPDP userid/password dumpfile= logfile= directory= remap_schema=(old_one:new_one) remap_tablespace=(old_ones:new_ones, and so on...)
The import works fine. There are no errors, and the sequences are imported with no warnings.
The strange behavior is that the sequences seem to "reset". When we call a sequence, the NEXTVAL returned is lower than the values already stored in the database, and we get ORA-00001 a lot. The sequence should know that value. I don't have this problem when using exp/imp, just with Data Pump.
So when we create an order that should receive the value 100, as an example, because we already have 99 orders in the system, Oracle suggests a value lower than 100, or even the starting value (1).
We then wrote a script to check the CURRVAL of the sequences in the source schema and recreate the sequences in the newly imported schema using that value as the initial value.
Has anyone faced this problem before?
Any suggestions?
Tks a lot

    Richard
I've tried what you just said.
Adding the parameter consistent=y makes Oracle show a message like this at the beginning of the export:
"flashback_time=TO_TIMESTAMP('2013-09-03 12:18:12', 'YYYY-MM-DD HH24:MI:SS')"
It warns me: Legacy Parameter CONSISTENT=TRUE, and replaces it with flashback_time.
Really, I did not know about this behavior of this "old" parameter. I appreciate your help very much.
I was almost thinking it was a Data Pump bug or something.
Thanks a lot, Richard. I'll now update my scripts and run lots of tests.
If you have more advice on using this parameter, please share it with us.
    Cheers
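For reference, a hedged sketch of the resulting pair (the directory, schema and tablespace names are assumptions): pin the export to a single point in time with flashback_time, then remap on import as before.

EXPDP userid/password dumpfile=app.dmp logfile=app_exp.log directory=dp_dir schemas=old_one flashback_time=systimestamp
IMPDP userid/password dumpfile=app.dmp logfile=app_imp.log directory=dp_dir remap_schema=old_one:new_one remap_tablespace=old_ts:new_ts

With the export pinned this way, the table data and the sequence definitions reflect the same moment, so NEXTVAL on the imported side starts above the imported rows.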

  • Export backup error while using where clause

I am running Oracle9i on the Solaris platform. When I try to take an export backup of a table, it gives the error below:
    exp swtiob tables=NDC_ATMPROOF_HIST file=NDC_ATMPROOF_HIST.dmp log=NDC_ATMPROOF_HIST.log query="where PROOF_DATE >= '01-july-2010'" statistics=none
    LRM-00112: multiple values not allowed for parameter 'query'
    EXP-00019: failed to process parameters, type 'EXP HELP=Y' for help
    EXP-00000: Export terminated unsuccessfully

    You need to escape the stuff like this:
    $ exp scott/tiger tables=emp file=emp.dmp log=emp.log query=\"where HIREDATE\>\'09-JUN-1981\'\"
    Export: Release 9.2.0.8.0 - Production on Wed Jul 7 12:54:48 2010
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.8.0 - Production
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    . . exporting table                            EMP          6 rows exported
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
    $
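Alternatively (a hedged suggestion, not in the original reply), put the query in a parameter file so no shell escaping is needed at all:

$ cat emp.par
tables=emp
file=emp.dmp
log=emp.log
query="where HIREDATE > '09-JUN-1981'"

$ exp scott/tiger parfile=emp.par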

  • Export backup failing with ORA-1555snapshot too old: rollback segment ERROR

DB version = 8.1.7.4
OS = Solaris 5.10
I know this is outdated, but we are still supporting this version of the database.
We are facing an ORA-1555 error daily when we run the export backup of the database. The CONSISTENT parameter is set to N.
I need to know why export uses rollback segments during the export operation even though CONSISTENT is N.
Thanks,

    Hi,
Consistent=y just means the tables are consistent with each other. Even with consistent=n, Oracle still has to guarantee that each individual table is at least consistent with itself, which means it reads rollback data for any rows changed while that table is being exported. That's your issue.
    Regards,
    Harry

  • EXPORT Backup failing due to character set problem

Hi,
The export backup is failing due to a character set problem:
. . exporting table ravidlx
EXP-00008: ORACLE error 6552 encountered
ORA-06552: PL/SQL: Compilation unit analysis terminated
ORA-06553: PLS-553: character set name is not recognized
Please suggest how to set the character set.
    Regards,
    kk

kk001 wrote:
The export backup is failing due to a character set problem ... Please suggest how to set the character set.
    I don't know what you have.
    I don't know what you do.
    I don't know what you see.
It is really, Really, REALLY difficult to fix a problem that cannot be seen.
Use COPY & PASTE so we can see what you do and how Oracle responds.
Also do as below, so we know the complete Oracle version and OS name.
Post, via COPY & PASTE, the complete results of:
SELECT * FROM v$version;

  • Export backup terminated due to ORA-00600: internal error code, arguments:

Hi,
exp system/system full=y file=exp_bkp.dmp log=exp_bkp.log statistics=none
My export backup terminated unsuccessfully, and the database also went down, due to ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
OS version = RHEL4
DB version = 10.2.0.1.0
Export log (exp_bkp.log):
    About to export the entire database ...
    . exporting tablespace definitions
    . exporting profiles
    . exporting user definitions
    . exporting roles
    . exporting resource costs
    . exporting rollback segment definitions
    . exporting database links
    . exporting sequence numbers
    . exporting directory aliases
    . exporting context namespaces
    . exporting foreign function library names
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions
    . exporting system procedural objects and actions
    . exporting pre-schema procedural objects and actions
    . exporting cluster definitions
    . about to export SYSTEM's tables via Conventional Path ...
    . . exporting table DEF$_AQCALL 0 rows exported
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
    Alert Log
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/udump/prim_ora_3704.trc:
    ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:43:11 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/udump/prim_ora_3704.trc:
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:43:12 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/udump/prim_ora_3704.trc:
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:43:14 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/udump/prim_ora_3704.trc:
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [17147], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:44:14 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/bdump/prim_pmon_3648.trc:
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:44:16 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/bdump/prim_pmon_3648.trc:
    ORA-00600: internal error code, arguments: [17112], [0x4152199C], [], [], [], [], [], []
    Mon Jun 8 13:44:16 2009
    PMON: terminating instance due to error 472
    Mon Jun 8 13:44:16 2009
    Errors in file /u02/app/oracle/product/10.2.0/db_1/admin/PRIM/bdump/prim_psp0_3650.trc:
    ORA-00472: PMON process terminated with error
    Instance terminated by PMON, pid = 3648

    Hi,
    Trace files
/u02/app/oracle/product/10.2.0/db_1/admin/PRIM/bdump/prim_pmon_3648.trc
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORACLE_HOME = /u02/app/oracle/product/10.2.0/db_1
    System name:     Linux
    Node name:     test1.domain.com
    Release:     2.6.9-5.EL
    Version:     #1 Wed Jan 5 19:22:18 EST 2005
    Machine:     i686
    Instance name: PRIM
    Redo thread mounted by this instance: 1
    Oracle process number: 2
    Unix process pid: 3648, image: [email protected] (PMON)
    *** 2009-06-08 13:44:14.697
    *** SERVICE NAME:(SYS$BACKGROUND) 2009-06-08 13:44:14.695
    *** SESSION ID:(170.1) 2009-06-08 13:44:14.695
    ********** Internal heap ERROR 17112 addr=0x4152199c *********
    ***** Dump of memory around addr 0x4152199c:
    41520990 00001001 [....]
    415209A0 4151F99C 2001CFBC 00000000 415219A8 [..QA... ......RA]
    415209B0 D0000FED 00000000 415219B8 40EFEB10 [..........RA...@]
    415209C0 415219BC 00000025 00000000 0C7470F0 [..RA%........pt.]
    415209D0 00000000 46440007 454C544C 0000004E [......DFLTLEN...]
    415209E0 000A000C 0E000001 00000015 415209C4 [..............RA]
    415209F0 0C7470D8 41520A70 41520A1C 00000015 [.pt.p.RA..RA....]
    41520A00 415209E8 0C7470D8 41520A84 41520A1C [..RA.pt...RA..RA]
    41520A10 00000055 415209FC 0C747378 00000201 [U.....RAxst.....]
    41520A20 00000000 00000059 00008100 00030040 [....Y.......@...]
    41520A30 00000000 00000016 41548140 00000000 [[email protected]....]
    41520A40 04000008 00000000 40FFBEFC 00000000 [...........@....]
    41520A50 00000000 40F34AF0 40F56490 00000000 [[email protected].@....]
/u02/app/oracle/product/10.2.0/db_1/admin/PRIM/udump/prim_ora_3677.trc
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORACLE_HOME = /u02/app/oracle/product/10.2.0/db_1
    System name:     Linux
    Node name:     test1.domain.com
    Release:     2.6.9-5.EL
    Version:     #1 Wed Jan 5 19:22:18 EST 2005
    Machine:     i686
    Instance name: PRIM
    Redo thread mounted by this instance: 1
    Oracle process number: 15
    Unix process pid: 3677, image: [email protected] (TNS V1-V3)
    *** SERVICE NAME:() 2009-06-08 13:41:55.582
    *** SESSION ID:(159.3) 2009-06-08 13:41:55.582
    Thread 1 checkpoint: logseq 183, block 2, scn 473582
    cache-low rba: logseq 183, block 3
    on-disk rba: logseq 183, block 101, scn 473671
    start recovery at logseq 183, block 3, scn 0
    ----- Redo read statistics for thread 1 -----
    Read rate (ASYNC): 49Kb in 0.55s => 0.09 Mb/sec
    Total physical reads: 4096Kb
    Longest record: 0Kb, moves: 0/152 (0%)
    Longest LWN: 6Kb, moves: 0/24 (0%), moved: 0Mb
    Last redo scn: 0x0000.00073a46 (473670)
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 1
    Average hash chain = 41/41 = 1.0
    Max compares per lookup = 1
    Avg compares per lookup = 223/264 = 0.8
    *** 2009-06-08 13:41:56.180
    KCRA: start recovery claims for 41 data blocks
    *** 2009-06-08 13:41:57.051
    KCRA: blocks processed = 41/41, claimed = 41, eliminated = 0
    *** 2009-06-08 13:41:57.053
    Recovery of Online Redo Log: Thread 1 Group 1 Seq 183 Reading mem 0
    ----- Recovery Hash Table Statistics ---------
    Hash table buckets = 32768
    Longest hash chain = 1
    Average hash chain = 41/41 = 1.0
    Max compares per lookup = 1
    Avg compares per lookup = 264/264 = 1.0
    tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
    tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)

  • Problem Exporting Backups from Cisco Prime LMS 4.2 deployed as software appliance

    Hi,
I'm trying to back up a Cisco Prime LMS 4.2 deployed as a soft appliance. I have the backup stored on the destination disk://localdisk/backup/, but I can't export it via FTP to an external server. When I perform the transfer, only the folder is stored in the destination path; the files aren't included. I think I have to compress the files in the backup folder into a .tar file using the Linux shell, but I can't find the backup folder from that shell.
Is my procedure correct? If not, what is the procedure, and which commands export a backup to an external server via FTP?
    Thanks,

    Hi Dave,
If your goal is to upgrade the IOS of the devices via LMS, then manually download the IOS image from cisco.com
and use the FILE SYSTEM option to add the image to the Software Repository.
Then try to upgrade the IOS and see how it works.
    Thanks
    Afroz

  • RDS 2012 R2 - How do I lockdown access to Local Computer Management and Windows Backup via Group Policy

    Greetings all,
I need assistance with locking down access to Local Computer Management and Windows Backup via Group Policy for users that access the RDS service. I have followed this awesome guide - http://www.it.ltsoy.com/windows/lock-down-remote-desktop-services-server-2012/ - but it is missing two important resources that I would like to lock down.
Currently, I have successfully locked down Control Panel for users via Group Policy, but I cannot find any group policy or guide on how to restrict user access to Computer Management (different from Server Manager). Using the Win-X shortcut to open the administrative shortcuts near the Windows icon, I have locked down everything except Computer Management. Computer Management gives direct access to Disk Management, Shares etc., which are locked down for users. But Windows Server Backup is still accessible.
Can someone please guide me on how to restrict access to both Computer Management and Windows Server Backup?
    Thanks in advance.
    Terry.

    Prevent running of Windows Server Backup
    Computer Configuration\Policies\Windows Settings\Security Settings\File System
Right-click on File System - Add File - drill down to \System32\wbadmin.msc.
On the Database Security ACL that pops up, remove Creator Owner, remove Users, and check that Administrators have Full Access.
In the Object window, choose 'Propagate inheritable permissions to all...' (the default).
