Attach Data Pump export job

Hi Guys,
I am using Oracle 10g Release 2 on Solaris.
I have a 1.5 TB database and I am running a Data Pump export of it; the Data Pump estimate is 500 GB.
After about 300 GB had been exported, the server crashed.
Will I be able to attach to the Data Pump export job after database startup and continue from the 300 GB point?
NB: I am using the FLASHBACK_TIME parameter for data consistency.
Please help!
Thanks.

Thanks for the reply...
I tried to attach to the job after the database startup, and here is what I get:
expdp \"/ as sysdba\" attach=SYS_EXPORT_FULL_01
Export: Release 10.2.0.2.0 - 64bit Production on Saturday, 30 July, 2011 17:50:31
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39068: invalid master table data in row with PROCESS_ORDER=-59
ORA-39150: bad flashback time
ORA-00907: missing right parenthesis
I guess I just have to restart the job, as I cannot attach to that one...
Thanks...
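For reference: once ORA-39068 reports invalid master table data, the job usually cannot be attached again and the export has to be restarted. A minimal cleanup sketch before restarting, assuming the job ran as SYS via "/ as sysdba" so the master table is SYS.SYS_EXPORT_FULL_01:
-- confirm the orphaned job and its state first
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs;
-- if the job shows NOT RUNNING and cannot be attached, dropping its
-- corrupt master table lets a fresh export reuse the job name
DROP TABLE sys.sys_export_full_01;
Note also that FLASHBACK_TIME expressions such as TO_TIMESTAMP(...) are best kept in a parfile; when passed on the command line, the shell can strip the quotes, which is a classic cause of the ORA-00907 seen above.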

Similar Messages

  • Datapump - export job problem

    Just started playing with this new feature of 10g. I created a new export job through Enterprise Manager Database Control. Now when I try to delete it, it gives me an error message, as follows:
    Error
    The specified job, job run or execution is still active. It must finish running, or be stopped before it can be deleted. Filter on status 'Active' to see active executions
    I have stopped this process successfully many times (I don't even remember how many) through Database Control, but when I try again to delete the run, it gives me the same error message.
    I logged on to SQL*Plus and confirmed that this process is still active, as it has an entry in the DBA_DATAPUMP_JOBS view. I dropped the corresponding master table, and the entry is gone from the view, but when I checked in Database Control the job execution is still there with a status of "Stop Pending".
    Can somebody help me with this, i.e. how can I delete that job from Database Control? If you need any other information to help me, I am more than willing to provide it.
    The job is owned by SYSTEM. My platform is Windows XP Professional.
    Any help is greatly appreciated, as I have been trying different things for the last 2 days with no success.
    Regards,

    Hi Bhargava,
    What do you get when you execute this block?
    set serveroutput on
    declare
      myhandle number;
    begin
      -- 'JOB_NAME' and 'JOB_OWNER' are placeholders for the real job name and owner
      myhandle := dbms_datapump.attach('JOB_NAME', 'JOB_OWNER');
      dbms_output.put_line(myhandle);
      dbms_datapump.detach(myhandle);
    end;
    /
    If this block executes without error and prints out a number, then you can try to stop the job with this block:
    declare
      myhandle number;
    begin
      myhandle := dbms_datapump.attach('JOB_NAME', 'JOB_OWNER');
      dbms_output.put_line(myhandle);
      -- immediate => 1 aborts the workers now; keep_master => 0 does not retain
      -- the master table; delay => 0 gives attached clients no grace period
      dbms_datapump.stop_job(myhandle, 1, 0, 0);
    end;
    /
    Here is an article with more information on the PL/SQL API to DBMS_DATAPUMP:
    http://www.devx.com/dbzone/Article/30355
    Here is the dbms_datapump documentation:
    http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm
    -Natalka
    http://toolkit.rdbms-insight.com

  • How to Restart a Datapump Export Job

    Hi experts,
    I have 10g on Windows.
    C:\Documents and Settings\jbates>set oracle_sid = ultradev
    C:\Documents and Settings\jbates>expdp system/ThePassword ATTACH=DP_EXPORT_ULTRADEV_SSU
    Export: Release 10.2.0.4.0 - 64bit Production on Tuesday, 04 August, 2009 10:04:41
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.KUPV$FT", line 438
    ORA-31638: cannot attach to job DP_EXPORT_ULTRADEV_SSU for user SYSTEM
    ORA-31632: master table "SYSTEM.DP_EXPORT_ULTRADEV_SSU" not found, invalid, or inaccessible
    ORA-00942: table or view does not exist
    When I run SELECT * FROM DBA_DATAPUMP_JOBS, the job does exist, it has no attached sessions, and the state is NOT RUNNING.
    How can I attach to and restart this job? I'm sure I'm missing something very simple.
    Thanks, John

    Probably you started the job as a different user than SYSTEM, and this is why you receive:
    ORA-31638: cannot attach to job DP_EXPORT_ULTRADEV_SSU for user SYSTEM
    ORA-31632: master table "SYSTEM.DP_EXPORT_ULTRADEV_SSU" not found, invalid, or inaccessible
    ORA-00942: table or view does not exist
    With kind regards
    Krystian Zieja
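    For anyone landing here with the same errors, a minimal sketch to see who really owns the job (columns as in DBA_DATAPUMP_JOBS):
    -- find the actual owner and state of the job
    SELECT owner_name, job_name, state, attached_sessions
    FROM   dba_datapump_jobs
    WHERE  job_name = 'DP_EXPORT_ULTRADEV_SSU';
    Once the real owner is known, attach as that user, or as SYSTEM with ATTACH=owner.DP_EXPORT_ULTRADEV_SSU; note that attaching to another user's export job requires the EXP_FULL_DATABASE role.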

  • Enterprise Manager Job for Scripting DataPump Export for Oracle Database Running On MS Windows Server 2008

    Greetings,
    I would like an example of an Enterprise Manager job that uses an OS script for MS Windows to run a Data Pump export of my Oracle 11g database (11.2.0.3) running on a Windows 2008 server. My OEM OMS is running on a Linux server with an Oracle 12c repository. I'd like to be able to set environment variables for the date and time, my export file name (including SID, export date and time, job name, and other information pertinent to the Data Pump export). Thus far, I have been unsuccessful with using the % delimiter around my variables. Also, I have put "cmd/c" as the "Interpreter", but I am not getting anywhere in a hurry :-(
    Thanks  Million!!!
    Mike

    1. Try to reach the server by IP address (bypassing name resolution).
    2. Disabling IPv6 is not a good idea.
    3. What is the server operating system, and what is the workstation operating system?
    4. Is this a new or a persistent problem?
    5. If the server and workstation have different SMB versions, set the higher one to the lower (see the Petri web site for the procedure).
    6. Uninstall the AV with its removal tool and test without AV.
    7. Use a network monitor to diagnose the network traffic.
    M.

  • Reg : Datapump export failing with ORA-31633 error

    Hi,
    I am trying to export one of our production databases (10.2.0.4.0) using Data Pump, and the backup is failing with the following error.
    [UBASE2]:/backup/exports/scripts>more /backup/exports/cron/cron_UBASE_log
    Export Start
    Sat Oct 10 10:32:00 GMT 2009
    Export: Release 10.2.0.4.0 - 64bit Production on Saturday, 10 October, 2009 10:32:00
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Real Application Clusters, OLAP, Data Mining
    and Real Application Testing options
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYSTEM.FULL_UBASE"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 871
    ORA-00955: name is already used by an existing object
    Export End
    Sat Oct 10 10:32:01 GMT 2009
    I tried to attach to the existing job in order to kill it, but I am getting the error below:
    [UBASE2]:/backup/exports/scripts>expdp attach=FULL_UBASE
    Export: Release 10.2.0.4.0 - 64bit Production on Saturday, 10 October, 2009 14:31:46
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Real Application Clusters, OLAP, Data Mining
    and Real Application Testing options
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.KUPV$FT", line 438
    ORA-31638: cannot attach to job FULL_UBASE for user SYS
    ORA-31632: master table "SYS.FULL_UBASE" not found, invalid, or inaccessible
    ORA-00942: table or view does not exist
    As this is a critical production database, please help me resolve this.
    Thanks in Advance.
    Raju A
    Oracle DBA.

    Hi,
    Have you verified whether your DBMS_AQ environment has been corrupted? If so, you'll have to drop and recreate a number of queue-related objects. Furthermore, ORA-00942 appears to be the main problem, as it is a case-sensitivity error. You should change the table names to UPPERCASE and then try again.
    Hope this helps.
    Regards,
    Naveed.
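    An aside beyond Naveed's reply: ORA-00955 while creating the master table often just means an object named FULL_UBASE is left over from an earlier failed run. A minimal sketch to check and clean up, assuming SYSTEM owns the leftover (as the first error message suggests):
    -- look for the object that is holding the name
    SELECT owner, object_name, object_type
    FROM   dba_objects
    WHERE  object_name = 'FULL_UBASE';
    -- if it is an old Data Pump master table, dropping it frees the name
    DROP TABLE system.full_ubase;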

  • Export Job credentials

    I created an export job in Oracle Enterprise Manager (listed on the Jobs Activity page off the main OEM database instance home). The job is apparently "owned" by the SYSTEM user, and I recently changed the password for that user. Now when the job runs, it gives me an ORA-01017: invalid username/password error.
    In the past, the only way I've found to get around this is to delete all instances of the export job and recreate it with the new password for the SYSTEM user. Is there any way to alter the password associated with the job so I don't have to rebuild it?
    Thanks in advance.

    I've tried changing it on the credentials tab for the Job but OEM says "This job does not require any credentials."
    Also, we do use RMAN, but also use datapump export to aid in rebuilding our development environment as well as to use in recovery if RMAN backups don't work.

  • Error while Datapump EXPORT

    Hi,
    I scheduled a Data Pump (expdp) job at the OS level, and it is failing with the error below. Can anybody help me with it?
    My OS : SunOS usa0300uz1078 5.10 Generic_144488-17 sun4u sparc SUNW,Sun-Fire-15000
    DB Version : 10.2.0.4.0
    ****** Beginning expdp of fndb1 on Mon Dec 12 05:45:00 EST 2011 ******
    ld.so.1: expdp: fatal: libclntsh.so.10.1: open failed: No such file or directory
    ****** Ending export of fndb1 on Mon Dec 12 05:45:00 EST 2011 ******
    COMPRESSING dump files
    ****** Ending of Compression Mon Dec 12 05:45:00 EST 2011 ******
    Thanks
    Raj

    Hi Raj,
    Please see:
    OS Command Job Fails Calling Sqlplus or expdp [ID 1259434.1]
    In short, jobs launched from cron do not inherit your interactive shell environment, so ORACLE_HOME, PATH and LD_LIBRARY_PATH have to be set inside the script for expdp to find libclntsh.so.10.1.
    Hope it helps,
    Regards,
    Helios

  • Where are device export jobs managed?

    I created a scheduled device export job from the CS/Device Management/Device summary page to run daily and create a CSV file. This ran fine for several months but then seemed to stop. I think we had an issue with the jrm process, long since resolved. During that time I created another scheduled export job. I think they are now conflicting with each other (they export to the same file name). I was hoping to delete one of them, but I am unable to determine where they are stored. Just as a test I created a third job and noted the job ID, but I can't find that one either. They don't seem to be listed in the RME job browser. Where are these stored, and how do I delete the extraneous jobs?
    Perhaps a related issue: when I go to the System View of CW, there is a panel named Job Information Status. It always contains only the string 'Loading....' in red (along with Log Space Usage and Critical Message Window). Thoughts?

    My guess is you have a lot of jobs on this system, and jrm is not returning fast enough. I find that Firefox is a bit more tolerant of delays than IE. If you can, try FF and see if the job browser loads. If so, purge some old jobs.
    Posted from my mobile device.

  • Interface Problems: DBA = Data Pump = Export Jobs (Job Name)

    Hello Folks,
    I need your help in troubleshooting an SQL Developer interface problem.
    DBA => Data Pump => Export Jobs (Job Name) => Data Pump Export => Job Scheduler (Step):
    -a- The Job Name and Job Description fields are not visible. Well, the fields are there, but each of them is just half a character wide; I can't see or enter anything in them.
    Import Wizard:
    -b- The Job Name field under the wizard's first "Type" step looks exactly the same as in the Export case.
    -c- I can't see any rows under the "Choose Input Files" section (I see just ~1 mm of the first row, and everything else is hidden).
    My env:
    -- Version 3.2.20.09, Build MAIN-09.87
    -- Windows 7 (64 bit)
    It could be related to the fact that I changed fonts in the Preferences. As I don't know what the default font is, I can't change it back and test (let me know what the default is and I will test it).
    PS
    -- I have tried disabling all extensions except DBA Navigator (11.2.0.09.87). It didn't help.
    -- There are no messages in the console if I run SQL Developer under cmd: "sqldeveloper\bin\sqldeveloper.exe"
    Any help is appreciated,
    Yury

    Hi Yury,
    a - I see those half-character-wide text boxes (in my case on Frequency) when the pop-up dialog is too small; do they go away when you make it bigger?
    b - On IMPORT, the name starts with IMPORT; if it is the half-character issue, have you tried making the dialog bigger?
    c - I think it is size again, but my dialog at its minimum size is already big enough.
    Have you tried a smaller font, or making the dialogs bigger (resizing from the corners)?
    I have a 3.2.1 version where I have not changed the fonts; Tools->Preferences->Code Editor->Fonts appears to be:
    Font Name: DialogInput
    Font size: 12
    Turloch
    -SQLDeveloper Team

  • Scheduling an Export Job Using Grid Control

    Hi,
    I have a requirement to schedule export jobs in Oracle Grid Control 10.2.0.5 (Windows). Is this possible, and has anyone done it? If so, please share the steps. The idea is to get alerts if the job fails.
    Thanks.
    GB

    Here are the easy steps (there might be slight differences, as I am posting this based on the 11g screens):
    1. On the Grid Control console, click on the relevant database.
    2. Click on the 'Data Movement' tab.
    3. Click on 'Export to Export Files'.
    4. Choose the export type (Database / Schema / Tables / Tablespace).
    5. Provide host credentials (OS username and password) and click 'Continue'. You will get a new screen called 'Export: Options'.
    6. Tick the checkbox called 'Generate Log File' and select the directory object where you want to create the dump files. (You need to create a directory object in the database if you don't have one; see the sketch after this list.)
    7. Choose the contents option as required, and click 'Next'. You will get a new page called 'Export: Files'.
    8. Select the directory object from the drop-down box, provide a name format for the file name, and click 'Next'.
    9. Provide the job name and description, choose the repeat options (daily, weekly, etc.), and click 'Next'.
    10. You will get a summary screen called 'Export: Schedule'. Review your job details and click 'Submit'.
    This is the solution, and it works well.
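    For step 6, a minimal sketch of creating such a directory object (the directory name and OS path here are hypothetical, not from the thread):
    -- run as a DBA; adjust the name and path to your environment
    CREATE DIRECTORY dp_exp_dir AS '/u01/app/oracle/exports';
    GRANT READ, WRITE ON DIRECTORY dp_exp_dir TO system;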

  • Parallel Sessions on Datapump Export  (10.2.0.4)

    Hi,
    We are using Oracle 10.2.0.4 on Solaris and I'm exporting a table using Datapump export.
    The export includes a query which selects from three tables based on relevant conditions. The parfile specifies 'parallel=4' and the dumpfile setting uses %U so that it creates an appropriate number of files.
    When I run the export using my own (DBA) account (i.e. expdp mr_dba parfile=exp_xyz.par), the export completes in 15 minutes and creates four dumpfiles. When I run the export as the schema owner using the exact same parfile (i.e. expdp schema_own parfile=exp_xyz.par), the export takes over two hours and creates only two dumpfiles.
    Could anyone suggest things I could look at to find out why there is such a difference in elapsed time? The exports have been run a number of times as both users, with the box under similar load, and the results are fairly consistent, i.e. 15 minutes for my user and two hours for the schema owner.
    The schema owner does have a different profile and a different Resource Consumer Group, but both my profile and the schema owner's have 'sessions_per_user' set to Unlimited. In Resource Manager, the Parallel_Degree_Limit_P1 value is set to 16 for my consumer group and is not set at all for the schema owner's consumer group.
    I have observed that when exporting as the schema owner, DBA_DATAPUMP_SESSIONS shows a DBMS_DATAPUMP session, a MASTER session and two WORKER sessions. When I run it under my user id, it shows these four sessions but also three EXTERNAL TABLE sessions. This suggests it is using a different approach, but I'm not sure what would cause this.
    Any advice would be very welcome. I haven't posted any specific information about the parameter file or the tables as I'm not sure what info people might require - so if you need specific details of anything please let me know.
    Many thanks.

    Sorry for the delay in responding - it took a couple of days for our security people to give me the go-ahead to make the changes (red tape is ridiculous here!)
    The tweak to the consumer groups in Resource Manager didn't seem to make much difference, and it continued to use the same plan (but it was worth trying). I then granted the EXP_FULL_DATABASE role, and it did indeed result in much better performance (and it created four dumpfiles instead of two).
    I'm still not sure why it makes such a difference; the export only exports a table from the user's schema, but it does query a table in someone else's schema to identify appropriate candidates. You would assume that, provided it can access all the necessary information, it would run at the optimum speed, but obviously the EXP_FULL_DATABASE role makes a considerable difference.
    Thanks again for both replies, much appreciated. Well done Dean for identifying the solution - great call.
    Edited by: user2480656 on 21-Aug-2012 08:35
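    For anyone else hitting this, a sketch of the grant that resolved it (SCHEMA_OWN is a placeholder for the schema owner, not a name from the thread):
    -- grant the role that enabled the parallel/external-table path
    GRANT EXP_FULL_DATABASE TO schema_own;
    -- confirm the grant took effect
    SELECT granted_role FROM dba_role_privs WHERE grantee = 'SCHEMA_OWN';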

  • Problem Exporting Job to a Production Repository

    We're running Data Integrator 11.5.
    I'm exporting jobs from my development repo to a production repo and am having some weirdness when starting jobs from the Designer client. No matter which repo I connect to, if I start a job, I see it running in the Monitor tab of both repositories, same name and everything. It's really only running in the repo I start it in, using the correct System Configuration settings, etc., but I can see it in both. I cannot view the log contents from within the non-starting repo, however. But if I cancel it in the non-starting repo, it will cancel the running job. Even if I go back into my dev repo and rename the job, when I start the job in the prod repo it will show the dev job, with its different name, running in the dev repo monitor (even though, in fact, that job is not running). In the prod repo monitor, I see the correct job name.
    I'm wondering if anyone else has had these kinds of issues and whether it's something I can fix, or whether I'm just going to have to live with it until we upgrade to DS 4.0 in a couple of months.

    Andy
    It has everything to do with the format of the card.
    These cards are formatted FAT16 or FAT32, and these older Windows formats have a limit on the number of objects they can have in the root directory of the volume. So don't copy the images from the folder on your desktop; just drag the folder to the card...
    Regards
    TD

  • Import/Export Job in management portal.

    I am trying to create an Import/Export job in the portal. I read this document:
    http://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/
    and I am unable to do this. The document says to download the import/export tool, but I am unable to do that; any help will be greatly appreciated.

    Hi,
    I would request you to check if you meet all these prerequisites:
    • You must have an active Azure subscription.
    • Your subscription must include a storage account with enough available space to store the files you are going to import.
    • You need at least one of the account keys for the storage account.
    • You need a computer (the "copy machine") with Windows 7, Windows Server 2008 R2, or a newer Windows operating system installed.
    • The .NET Framework 4 must be installed on the copy machine.
    • BitLocker must be enabled on the copy machine.
    • You will need one or more empty 3.5-inch SATA hard drives connected to the copy machine.
    • The files you plan to import must be accessible from the copy machine, whether they are on a network share or a local hard drive.
    Regards,
    Azam Khan

  • Import/Export Jobs Fail - ORA-01017 Invalid UserName/PW - Solved!

    Every time I try to run either an Import or Export Job from OEM it always fails with ORA-01017 Invalid UserName/PW.
    I have read numerous posts in this forum on related topics.
    Suggestions have been:
    1. Make sure the OS login is a member of the ORA_DBA group.
    2. Make sure the OS user has the ability to log on as a batch job.
    3. Make sure ORACLE_SID and ORACLE_HOME are set.
    4. Make sure to set up Preferred Credentials on the Preferences page.
    5. On and on and on.
    I am using Oracle Version 10.2.0.1.0 On Windows 2003 Server Enterprise SP1.
    When I installed the DB using Oracle Universal Installer, it asks what password you would like to assign to SYS, SYSTEM, etc. I used the password "AvT_$651#JK", which it accepted without problem. The installation completed, and I am able to log in to OEM using SYS/AvT_$651#JK (SYSDBA) or SYSTEM/AvT_$651#JK (Normal).
    I then proceeded to "import from export files" a schema from a previous "export to export files" that was created on another host using the same Oracle version 10.2.0.1.0 on Windows 2003 Server Enterprise SP1. This job always fails with "ORA-01017 Invalid UserName/PW" even though the username and PW are correct!
    It turns out the password itself is the problem; apparently some part of the process (RMAN?) does not like the special characters in the PW. Changing the PW to "testpw1" worked.
    Oracle Universal Installer should have prevented me from using a password that violates this policy.
    BG...

    Do you provide the username under the local security policy along with the domain name, like abc\uo.cok? If not, please enter the username along with the domain name and let me know the result. If possible, paste the Event Viewer log for this connection.

  • Which background process involves in datapump export/import?

    Hi guys,
    Could anyone please tell me which background processes are involved in Data Pump export and import activity? Any information is appreciated.
    /mR

    Data Pump export and import are done by foreground server processes (master and workers), not background processes.
    http://www.acs.ilstu.edu/docs/Oracle/server.101/b10825/dp_overview.htm#sthref22
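    A hedged way to watch those foreground sessions while a job is running (the join on SADDR is an assumption based on the columns both views expose):
    SELECT s.sid, s.program
    FROM   v$session s
           JOIN dba_datapump_sessions d ON s.saddr = d.saddr;
    On most platforms the master process shows up in PROGRAM as DM00 and the workers as DW01, DW02, and so on.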
