Export dump (logical backup) running since yesterday

Hi to All,
We take a full export dump (logical backup) of our database, and until last week it was fine. On Sunday we changed the administrator password but forgot to change it in the scheduled scripts. Yesterday morning I ran the backup manually, saw that it was affecting our performance, and cancelled the task midway; I did not delete the partial files from the backup location. I then put the correct password into the scripts, and the scheduled script started at its appropriate time, but the incomplete backup still exists at the backup location, and the export that started at the scheduled time has been running since yesterday. It used to finish in 2 or 3 hours. Please give me some solution.
Thanks and Regards,
Mohd Khaja


Similar Messages

  • Can't export to movie / test movie since yesterday... even reinstalled!!

    Hello,
    I was working on a project for a few days... It all worked
    fine; I was able to export to a Flash movie, test the movie, etc.
    But since yesterday, I can't export to movie anymore... it
    asks me for the file name and then the settings window appears, as
    usual. When I click OK, the progress bar appears and either it
    freezes, or a Windows error message pops up saying Adobe
    Flash stopped responding...
    I even reinstalled Flash; it didn't work. I then deleted some
    layers, just to check if it would work, but then I got the message
    "Flash is out of memory"... and it froze again... (I never got that
    memory message again.)
    I'm using Flash CS3, Vista 64-bit, 4GB RAM...
    Please help

    I've had this problem a number of times. Usually it means you don't
    have enough memory. One way I get around it is to take your
    completed file, copy all the frames, and paste them into a new
    document. Do not save the new document, and then try to export the
    files. This should work.

  • Logical backup and recovery using export/import

    Hi,
    Can anyone provide me the clear steps and commands to take a logical backup and recover using the import and export method?
    I'm using an Oracle 9i database and Red Hat Linux server version 4.0.
    Thanks,
    vasanth

    user12864080 wrote:
    Hi,
    Can anyone provide me the clear steps and commands to take a logical backup and recover using the import and export method?
    I'm using an Oracle 9i database and Red Hat Linux server version 4.0.
    Thanks,
    vasanth
    Vasanth,
    Though you have already got links for using exp/imp, I would strongly suggest that when you mention backup/recovery, RMAN is the tool you should be using. Exp/imp was never considered a backup tool. You should read this paper to get the answer to "why RMAN?":
    http://www.evdbt.com/TD_Rman.pdf
    Aman....
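    For what it's worth, a minimal exp/imp round trip on 9i might look like the sketch below. This is a hypothetical example - the password, connect string, schema name, and paths are placeholders - and, as noted above, RMAN remains the right tool for real backup/recovery:

    # Hypothetical example: schema-level logical backup with the 9i exp utility
    exp system/manager@orcl OWNER=scott FILE=/backup/scott.dmp LOG=/backup/scott_exp.log
    # Restore the schema's objects later; IGNORE=Y skips "already exists" errors
    imp system/manager@orcl FROMUSER=scott TOUSER=scott FILE=/backup/scott.dmp LOG=/backup/scott_imp.log IGNORE=Y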

  • Logical standby error with export dump

    Oracle 10g. I have set up a logical standby, and when I run an export dump from the logical standby I get this error:
    EXP-00008: ORACLE error 16224 encountered
    ORA-16224: Database Guard is enabled
    EXP-00000: Export terminated unsuccessfully
    Can someone help me out with how it is possible to take an export dump or Data Pump export from a logical standby?
    Thanks

    16224, 00000, "Database Guard is enabled"
    // *Cause: Operation could not be performed because database guard is enabled
    // *Action: Verify operation is correct and disable database guard
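    As the oerr text suggests, the database guard has to be relaxed before the export can run. A minimal sketch, assuming SYSDBA access on the logical standby (relax the guard only for the duration of the export, then restore it):

    -- Run as SYSDBA on the logical standby
    ALTER DATABASE GUARD STANDBY;  -- or GUARD NONE; relaxes the guard
    -- run exp / expdp from the OS prompt here
    ALTER DATABASE GUARD ALL;      -- restore full protection afterwards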

  • I have had Adobe ExportPDF for a few months, but since yesterday I can't export my PDF documents to Word! Can you tell me how to do it again?

    I have had Adobe ExportPDF for a few months, but since yesterday I can't export my PDF documents to Word! Can you tell me how to do it again?

    Hi ceop,
    It seems you were able to use it earlier, and it is not clear why you are facing this issue now.
    Try it with another PDF file; there might be some issue with your PDF file.
    Regards,
    Florence

  • Export dump file and log file name as sysdate in script

    Hi to All,
    Can anybody help me make the logical backup export dump file name and log file name contain sysdate, so that we can uniquely identify them?
    Regards
    DXB_DBA

    On Windows it gets a bit hairy, as there really is no clean and nice way of doing it. There are a couple of options (see the sketch after this list):
    1. If you can rely on the date format not changing, you can use a static substring expression. For example, the following might work with a Finnish locale: echo %date:~3,2%%date:~6,2%%date:~9,4%. Similarly, when you know the date format you can tokenize the output of 'date /t' and discard the tokens you don't want.
    2. You can set the date format to your liking and then just use %date% in your script.
    3. You can run a "SELECT to_char(sysdate,..." into a file and then read that file in your script.
    4. Simon Sheppard also has a solution you could use as a basis. I have a slight issue with the approach, but that could just be me.
    5. Use gnuwin32 or similar ;)
    Also note that the %date% env var is set automatically from W2K onwards, so some of the solutions might not work with older versions.
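    To make option 1 concrete, here is a minimal batch sketch. It assumes a dd.mm.yyyy %date% format; the substring offsets and the exp parameters are placeholders you would adapt to your locale and database:

    @echo off
    rem Hypothetical example: build a yyyymmdd stamp from %date% (assumes dd.mm.yyyy)
    set STAMP=%date:~6,4%%date:~3,2%%date:~0,2%
    rem Use the stamp in the dump and log file names
    exp system/manager@orcl FULL=Y FILE=D:\backup\full_%STAMP%.dmp LOG=D:\backup\full_%STAMP%.log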

  • Logical backup of a schema without taking the data of particular table

    How can I take a logical backup of a schema without taking a backup of the data of one particular table? I need to give the dump to the client, but I do not want to give them the data of that one table.

    @Werner:
    The requirement is a "logical backup of a schema without taking the backup of data of a one particular table".
    I presume the requirement is to have all tables with data, and one table without data, in a single dump file.
    ROWS=N will not export any data for any table.
    :)
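    If Data Pump is available (10g onwards), an EXCLUDE filter does exactly this. A minimal sketch, with the schema and table names as placeholders (putting EXCLUDE in a parameter file avoids shell quoting problems):

    # nodata.par - hypothetical Data Pump parameter file
    SCHEMAS=scott
    DIRECTORY=dump_dir
    DUMPFILE=scott_nodata.dmp
    LOGFILE=scott_nodata.log
    EXCLUDE=TABLE_DATA:"IN ('SECRET_TABLE')"

    expdp system/manager@orcl PARFILE=nodata.par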

  • Export is not / Backup

    I found in the Oracle docs that export is a logical backup. My question is: can we use this as a backup? If it is used for a small database - aren't small databases critical too? And if I keep an export, what corruption or worst-case scenario might I face, and how long can I keep it for import?
    Since I can make large exports (more than 50 GB) and keep them for years, can I expect them to import correctly?
    Finally, should I rely on exp or expdp?

    Hi,
    >>If it is used for a small database - aren't small databases critical too?
    Well, if so, I think there is no problem in making logical backups, depending on the purpose of this database (production? development?), but they need to be validated. (A backup is only a backup when it is valid.) Otherwise, you need to know whether the users of this database can tolerate a loss of data in case of a media failure or corruption of some important database files, because in that case there is no way to recover this database: your only option is to create the database again and import your last valid logical backup. By the way, is this database running in ARCHIVELOG or NOARCHIVELOG mode?
    Cheers
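    To answer that last question on your own system, a quick check from any SQL*Plus session will do:

    -- Shows ARCHIVELOG or NOARCHIVELOG
    SELECT log_mode FROM v$database;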

  • What else in logical backups

    Hi, DBA professionals,
    If I take a logical backup of the full database, what else in the database needs a backup?
    Will this logical backup take only logical data, or physical data as well?
    If it takes logical data only, then what about ROWS=Y?
    And what is the difference between iSQL*Plus and OEM?
    Could you please clarify my doubts?
    Thanks a lot in advance.

    805547 wrote:
    Thanks a lot for guiding me. I am using a 10g R2 database. If I take complete, incremental, and cumulative backups of my database and then lose the whole database, is it possible to recover it with logical backups? If yes, what is the method?
    "Complete, incremental, cumulative backups" would be physical backups using RMAN. A "logical" backup would be an export (exp or expdp) dump. A logical backup can only be imported back into a working database, and is a point-in-time snapshot of the data. With the proper backup options, a proper physical backup can recover a database from a total loss, and to any point in time covered by the backups.
    If I lose only the SYSTEM datafile, what is the method to recover it using logical backups?
    There is none. That's why "logical" backups are only one part - and the least important part - of a complete backup/recovery strategy.
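    For comparison, the physical side of such a strategy might look like this minimal RMAN sketch - a weekly level 0 base backup plus daily level 1 incrementals; the bare "target /" connection is a placeholder for your environment:

    rman target /
    RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG;
    RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE PLUS ARCHIVELOG;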

  • Sharing some info about Export dumps

    Just to share some info about the exports happening in our system:
    I have a 20GB database but the dump size is 10GB, and it keeps growing. The data grows too, but the dump size has also been increasing significantly.
    Has anyone encountered something like this?
    Strange, but we live with it.
    Oracle 8i on Win2k

    Is this your production database? Did you schedule this export? During peak or off-peak business hours?
    Production. Yes. Off-peak hours.
    Apart from this export, what database backup procedure do you have in place?
    Full and incremental backups are all in place.
    You said you take an export regularly as part of the procedure. Is this strange behavior today, or did you observe it previously as well?
    I have been observing it since I joined this company. The DB was 14GB and the dump was around 7GB; now the DB is 20GB and the dump is around 10GB. I was just thinking how big the export would be for a TB database... :)
    Adding on: I had a 100GB DB generating an 8GB dump, which looked strange, but that 100GB DB did not have as many objects as the one I have now...
    Thanks

  • HOW TO IMPORT DATA MORE THAN ONCE FROM THE SAME EXPORT DUMP FILE?

    Before asking my question, I'd like to mention that I'm a French speaker,
    so my English is a little bit bad. Sorry for that.
    My problem is: IMPORT.
    How do I import data a SECOND TIME from an export dump file within Oracle?
    My export dump file was made successfully (full export), and then I
    tried to import the data for the first time.
    I got the following messages in my log file (I ADDED SOME COMMENTS):
    Warning: the objects were exported by L1, not by you
    . importing SYSTEM's objects into SYSTEM
    REM ************** CREATING TABLESPACES *****
    REM *********************************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "USER_DATA" DATAFILE 'E:\ORANT\DATABASE\USR1ORCL.ORA' SI"
    "ZE 3145728 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAX"
    "EXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "ROLLBACK_DATA" DATAFILE 'E:\ORANT\DATABASE\RBS1ORCL.ORA"
    "' SIZE 10485760 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS "
    "1 MAXEXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    etc........
    IMP-00017: following statement failed with ORACLE error 1119:
    "CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1048576000 "
    "DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAXEXTENTS 121 PCTIN"
    "CREASE 50) ONLINE PERMANENT"
    IMP-00003: ORACLE error 1119 encountered
    ORA-01119: error in creating database file 'E:\ORADATA\L1.DBF'
    ORA-09200: sfccf: error creating file
    OSD-04002: unable to open file
    O/S-Error: (OS 3) The system cannot find the path specified
    --->etc..........
    the drive E: with the folder E:\ORADATA didn't exist, but after
    all that i created it.
    see below, before my IMPORT statement
    REM ********************* CREATING USER *********
    REM ********************************************
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "L1" IDENTIFIED BY VALUES 'A6E0DAA6865E7627' DEFAULT TABLESPACE"
    " "L1" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'L1' does not exist
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "MLCO" IDENTIFIED BY VALUES '56AC6447B7D50467' DEFAULT TABLESPA"
    "CE "MLCO" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'MLCO' does not exist
    ETC.......
    REM ********************* GRANTING ROLES ***********
    REM ************************************************
    IMP-00017: following statement failed with ORACLE error 1917:
    "GRANT ALTER ANY TABLE to "L1" "
    IMP-00003: ORACLE error 1917 encountered
    ORA-01917: user or role 'L1' does not exist
    ETC.........
    IMP-00017: following statement failed with ORACLE error 1918:
    "ALTER USER "L1" DEFAULT ROLE ALL"
    IMP-00003: ORACLE error 1918 encountered
    ORA-01918: user 'L1' does not exist
    -- that is normal, since the creation of the
    tablespace failed !!
    REM******************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE ROLLBACK SEGMENT RB_TEMP STORAGE (INITIAL 10240 NEXT 10240 MINEXTENT"
    "S 2 MAXEXTENTS 121) TABLESPACE "SYSTEM""
    IMP-00015: following statement failed because
    . importing SCOTT's objects into SCOTT
    IMP-00015: following statement failed because the object already exists:
    "CREATE SEQUENCE "EVT_PROFILE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999"
    "999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE"
    ETC............
    importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00003: ORACLE error 1435 encountered
    ORA-01435: user does not exist
    REM *************** IMPORTING TABLES *******************
    REM ****************************************************
    . importing SYSTEM's objects into SYSTEM
    . . importing table "AN1999_BDAT" 243 rows imported
    . . importing table "BOPD" 112 rows imported
    . . importing table "BOINFO_AP" 49
    ETC................
    . . importing table "BO_WHF" 2 rows imported
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLE "DEF$_CALL" ("DEFERRED_TRAN_DB" VARCHAR2(128),
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ROLES" FOR "SYS"."DBA_ROLES""
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ERRORS" FOR "SYS"."DBA_ERRORS""
    IMP-00008: unrecognized statement in the export file:
    . importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00008: unrecognized statement in the export file:
    J
    Import terminated successfully with warnings.
    -------------------------------------
    So after analysing this log file, I created
    the appropriate drives and folders, since the
    import statement didn't see them:
    E:\ORADATA G:\ORDATA etc...
    And I started to IMPORT ONE MORE TIME, with:
    $ IMP73 sys/pssw Full=Y FILE=c:\temp\FOLD_1\data_1.dmp BUFFER=64000
    COMMIT=Y INDEXFILE=c:\temp\FOLD_1\BOO_idx.sql
    LOG=c:\temp\FOLD_1\BOO_log.LOG DESTROY=Y IGNORE=Y
    After that I could not see the users or the tables created,
    and the following message appeared in the log file:
    Warning: the objects were exported by L1, not by you
    . . skipping table "AN1999_BDAT"
    . . skipping table "ANPK"
    . . skipping table "BOAP"
    . . skipping table "BOO_D"
    ETC.....skipping all the tables
    . . skipping table "THIN_PER0"
    . . skipping table "UPDATE_TEMP"
    Import terminated successfully without warnings.
    and only 2 new tablespaces (originally 3) were
    created, without any data in them (I checked that in
    Oracle Storage Manager: the tablespaces exist
    with 0.002 used space; originally 60 M each!!).
    So, how do I import data (with the full import option) successfully
    MORE THAN ONE TIME from an export dump file,
    even if we have to overwrite tablespaces, tables and users?
    Thank you very much

    The Member Feedback forum is for suggestions and feedback for OTN Developer Services. This forum is not monitored by Oracle support or product teams and so Oracle product and technology related questions will not be answered. We recommend that you post this thread to the appropriate Database forum.
    The main URL is:
    http://forums.oracle.com/forums/index.jsp?cat=18

  • Export dump files.

    Hi all,
    I am using Enterprise Linux server version 5.1.19.6 (Red Hat) and Oracle Database 11g R2. I want
    to export dump files automatically, like we do in Windows XP with Scheduled Tasks.
    Is it possible in Linux? Please help me out. Thanks in advance.

    Mehwish wrote:
    Hi all,
    I am using Enterprise Linux server version 5.1.19.6 (Red Hat) and Oracle Database 11g R2. I want
    to export dump files automatically, like we do in Windows XP with Scheduled Tasks.
    Is it possible in Linux? Please help me out. Thanks in advance.
    Yes, it is possible; you can use a cron job. Refer to:
    http://adminschoice.com/crontab-quick-reference
    But for backup and recovery operations you should use RMAN (Recovery Manager), and through Enterprise Manager you can schedule a backup job.
    http://download.oracle.com/docs/cd/E10317_01/doc/backup.102/e05407/osb_rman_backup.htm
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10734/rcmrecov.htm
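    To illustrate the cron approach, a minimal sketch; the paths, credentials, and schedule are placeholders, and ORACLE_HOME/ORACLE_SID must be set inside the script because cron starts with a bare environment:

    #!/bin/sh
    # /home/oracle/nightly_expdp.sh - hypothetical wrapper for a nightly Data Pump export
    export ORACLE_HOME=/u01/app/oracle/product/11.2.0/dbhome_1
    export ORACLE_SID=orcl
    export PATH=$ORACLE_HOME/bin:$PATH
    expdp system/manager SCHEMAS=scott DIRECTORY=dump_dir DUMPFILE=scott_$(date +%Y%m%d).dmp LOGFILE=scott_$(date +%Y%m%d).log

    # crontab -e entry: run the script every night at 02:00
    0 2 * * * /home/oracle/nightly_expdp.sh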

  • Just erased my disk, went to install Mavericks and get: "Not available now" message since yesterday

    Just erased my disk, went to install Mavericks and got a "Not available now" message; it has been like this since yesterday. Can someone help me revive my dead iMac!!

    If you are selling or giving away the computer then follow these instructions:
    Follow these instructions step by step to prepare a Mac for sale:
    Step One - Back up your data:
           A. If you have any Virtual PCs shut them down. They cannot be in their "fast saved" state. They must
               be shut down from inside Windows.
           B. Clone to an external drive using Carbon Copy Cloner.
              1. Open Carbon Copy Cloner.
              2. Select the Source volume from the Select a source drop down menu on the left side.
              3. Select the Destination volume from the Select a destination drop down menu on the right side.
              4. Click on the Clone button. If you are prompted about creating a clone of the Recovery HD be sure
                  to opt for that.
              Destination means a freshly erased external backup drive. Source means the internal startup drive.
    Step Two - Prepare the machine for the new buyer:
              1. De-authorize the computer in iTunes! De-authorize both iTunes and Audible accounts.
              2. Remove any Open Firmware passwords or Firmware passwords.
              3. Turn the brightness full up and volume nearly so.
              4. Turn off File Vault, if enabled.
              5. Disable iCloud, if enabled: See What to do with iCloud before selling your computer.
    Step Three - Install a fresh OS:
         A. Snow Leopard and earlier versions of OS X
              1. Insert the original OS X install CD/DVD that came with your computer.
              2. Restart the computer while holding down the C key to boot from the CD/DVD.
              3. Select Disk Utility from the Utilities menu; repartition and reformat the internal hard drive.
                  Optionally, click on the Security button and set the Zero Data option to one-pass.
              4. Install OS X.
              5. Upon completion DO NOT restart the computer.
              6. Shutdown the computer.
         B. Lion and Mountain Lion (if pre-installed on the computer at purchase*)
             Note: You will need an active Internet connection. I suggest using Ethernet if possible
                        because it is three times faster than wireless.
              1. Restart the computer while holding down the COMMAND and R keys until the Mac OS X
                  Utilities window appears.
              2. Select Disk Utility from the Mac OS X Utilities window and click on the Continue button.
              3. After DU loads select your startup volume (usually Macintosh HD) from the left side list.
                  Click on the Erase tab in the DU main window.
              4. Set the format type to Mac OS Extended (Journaled.) Optionally, click on the Security button
                  and set the Zero Data option to one-pass.
              5. Click on the Erase button and wait until the process has completed.
              6. Quit DU and return to the Mac OS X Utilities window.
              7. Select Reinstall Lion/Mountain Lion and click on the Install button.
              8. Upon completion shutdown the computer.
    *If your computer came with Lion or Mountain Lion pre-installed then you are entitled to transfer your license once. If you purchased Lion or Mountain Lion from the App Store then you cannot transfer your license to another party. In the case of the latter you should install the original version of OS X that came with your computer. You need to repartition the hard drive as well as reformat it; this will assure that the Recovery HD partition is removed. See Step Three above. You may verify these requirements by reviewing your OS X Software License.

  • If exporting images for backup, to re-import into a clean Aperture or another program, is it best to use 72 dpi or 300 dpi, or does it matter?

    If exporting images for backup, to re-import into a clean Aperture or another program, is it better to use 72 dpi or 300 dpi, or does it matter? I want the best quality for any future unforeseen use.

    I am somewhat reluctant to answer your questions after Frank Caggiano's excellent advice, but I really do not like to leave the question open, for there will be many occasions when you will need to export images and to understand how it works.
    But please follow Frank's advice; right now you do not need to worry about pixels and dpi. That is exactly what I meant when I suggested that you keep a copy of your Aperture library and back it up with all your other data before you erase your disk for a clean reinstall.
    DPI revisited:
    So, is the dpi setting only for exporting to print?
    The dpi settings are necessary to define the size of a digital image, since pixels don't have any dimensions. And since you cannot print or display an image without knowing its width and height, you will need to specify dpi when you are printing or scanning.
    If I leave it at the default 72 dpi, will there be any problems getting quality prints in the future from JPEG versions exported with that setting?
    Not if you export at the original size - the maximum number of pixels available. That will ensure the maximum print quality.
    The dpi settings are required to export versions; versions are derived from the masters, and new image files are computed. When you export masters, you get a copy of the original file, which may already have a dpi setting.
    If I choose "export masters", will Aperture export my masters just as they are?
    Yes, and you may add IPTC data if you choose.
    Pardon my thick skull - I'm an old dog trying to learn new tricks in this digital world!
    No apologies necessary; we were all beginners once.
    Here is another example; maybe it helps a little:
    I exported a JPEG image with three different settings - export master, a version at 72 dpi, and a version at 300 dpi - and inspected the files in GraphicConverter.
    The master already had dpi settings, although I did not specify any on export; its size is 51.48 cm x 38.61 cm. The 72 dpi version has larger dimensions, but also 10 megapixels, and the 300 dpi version has smaller dimensions with the same number of pixels.
  • Getting Datapump Export Dump file to the local machine

    I apologize to everyone, as this is a duplicate post.
    Re: Getting Datapump Export Dump file to the local machine
    My initial thread (started yesterday) was in 'Database General' and didn't get much response today. Where do I post questions on the EXPORT/IMPORT utilities?
    Anyway, here is my problem:
    I want to take an export dump of the itemrep schema in the orcl database (on a remote machine). I have an Oracle server (10g Rel 2) running on my local Windows machine. I have created a user john with the necessary EXPORT/IMPORT privileges in my local db. Then I created a directory object, i.e. a folder named datapump on my local hard drive, and granted READ and WRITE privileges to john.
    So john, who is a user in my local machine's Oracle db, is going to run the expdp utility:
    expdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep directory=datapump logfile=itemrepexp.log
    The above command will fail because it will look for the itemrep schema inside my local db, not the remote db where itemrep is actually located. And you can't qualify the schema name with its db in the SCHEMAS parameter (like SCHEMAS=itemrep@orcl).
    Can anyone provide me a solution for this?

    I think you can initiate the Data Pump export utility from your client machine to export a schema in a remote database, but upon execution Oracle looks for the directory object in the remote database, not on your local machine.
    You're invoking expdp from a client (your local DB) to export data from a remote DB.
    So, with this method, you can create the dump files only on the remote server, not on the local machine.
    You can perform a direct import instead of an export by using the NETWORK_LINK option.
    Create a DB link from your local DB to the remote DB and verify the connection.
    Then initiate impdp from your local machine's DB using the parameter network_link=<db_link of the remote DB> to import the schema.
    The advantage of this option is that it eliminates dump file creation on the server side.
    There are no dump files during the import process; the data is imported directly into the target schema.
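    A minimal sketch of that approach, reusing the names from the question (the link name orcl_remote, the remote password, and the TNS alias 'orcl' are placeholders; DIRECTORY is still needed, but only for the log file):

    -- On the local DB, as john: create and test a link to the remote database
    CREATE DATABASE LINK orcl_remote CONNECT TO itemrep IDENTIFIED BY secret USING 'orcl';
    SELECT * FROM dual@orcl_remote;

    impdp john/jhendrix@my_local_db_alias SCHEMAS=itemrep NETWORK_LINK=orcl_remote DIRECTORY=datapump LOGFILE=itemrep_netimp.log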
