Export job

As part of my backup policy I want to create a Grid Control scheduled job that executes an export command.
I have created an OS-type job with this command on one line:
export ORACLE_HOME=/oracle/ora10g ; export ORACLE_SID=psep10g1 ; export NLS_LANG=spanish_spain ; $ORACLE_HOME/bin/exp userid=\'/ as sysdba\' file=/oracle/abd/copias/export_total_psep10g.dmp full=y statistics=none
The problem is the "userid" parameter. I'd like to log in with the oracle user's OS credentials on the local machine, which is why I use '/ as sysdba'; I don't want to use any password.
I have tried four methods, each producing an error:
1) userid='/ as sysdba' (LRM-00112: multiple values not allowed for parameter 'userid')
2) userid=\'/ as sysdba\' (sh: unexpected EOF while looking for matching `'')
3) userid="/ as sysdba" (LRM-00112: multiple values not allowed for parameter 'userid')
4) userid=\"/ as sysdba\" (sh: unexpected EOF while looking for matching `"')
Any help?
Thanks.

You say "as part of my backup policy": why is it that your policy does not allow you to schedule it as an EM "Export to Export Files" job?
BTW, if you must schedule it as an OS Command job, then put those commands in an export script and call the script from the job.
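Putting the command in a script sidesteps the layered quoting: the shell inside the script handles the quotes exactly once. A minimal sketch of what exp actually receives under each quoting style, using a hypothetical show_args function in place of exp (the function and its bracketed output format are illustrations, not part of any Oracle tool):

```shell
#!/bin/sh
# show_args stands in for exp: it prints each argument it receives
# on its own line, in brackets, so word-splitting becomes visible.
show_args() { printf '[%s]\n' "$@"; }

# With plain single quotes the shell strips them; exp then sees the
# bare value "/ as sysdba" and raises LRM-00112 on the spaces.
show_args userid='/ as sysdba'

# exp itself needs to receive literal double quotes around the value,
# so protect them from the shell (here with outer single quotes):
show_args userid='"/ as sysdba"'
```

Inside a script, the second form is all that is needed. The `\'` and `\"` variants in the original one-liner likely fail because the job mechanism and the spawned shell each consume one level of quoting, leaving an unbalanced quote for sh.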

Similar Messages

  • Where are device export jobs managed?

    I created a scheduled device export job from the CS/Device Management/Device summary page to run daily and create a csv file. This ran fine for several months, but then seemed to stop. I think we had an issue with the jrm process, long since resolved. During that time I created another scheduled export job. I think they are now conflicting with each other (export to the same file name). I was hoping to delete one of them, but am unable to determine where they are stored. Just for a test I created a third job, noted the jobID, but can't find that one either. They don't seem to be listed in the RME job browser. Where are these stored and how do I delete the extraneous jobs?
    Perhaps a related issue: when I go to the System View of CW, there is a panel named Job Information Status. It always contains only the string 'Loading....' in red (along with Log Space Usage and Critical Message Window). Thoughts?

    My guess is you have a lot of jobs on this system, and jrm is not returning fast enough.  I find that Firefox is a bit more tolerant of delays than IE.  If you can, try FF and see if the job browser loads.  If so, purge some old jobs.

  • Interface Problems: DBA => Data Pump => Export Jobs (Job Name)

    Hello Folks,
    I need your help in troubleshooting an SQL Developer interface problem.
    DBA => Data Pump => Export Jobs (Job Name) => Data Pump Export => Job Scheduler (Step):
    -a- The Job Name and Job Description fields are not visible. Well, the fields are there, but each of them is just 1/2 character wide; I can't see or enter anything in the fields.
    Import Wizard:
    -b- Job Name field under the first "Type" wizard's step looks exactly the same as in Export case.
    -c- Can't see any row under "Chose Input Files" section (I see just ~1 mm of the first row and everything else is hidden).
    My env:
    -- Version 3.2.20.09, Build MAIN-09.87
    -- Windows 7 (64 bit)
    It could be related to the fact that I changed fonts in the Preferences. As I don't know what the default font is, I can't change it back to the default and test (let me know what the default is and I will test it).
    PS
    -- I have tried disabling all extensions except DBA Navigator (11.2.0.09.87). It didn't help.
    -- There are no messages in the console if I run SQL Dev under cmd: "sqldeveloper\bin\sqldeveloper.exe"
    Any help is appreciated,
    Yury

    Hi Yury,
    a- I see those 1/2-character-size text boxes (in my case on Frequency) when the pop-up dialog is too small; do they go away when you make it bigger?
    b- On import, the name starts with IMPORT; if it is the half-character issue, have you tried making the dialog bigger?
    c- I think it is size again, but my dialog at minimum size is already big enough.
    Have you tried a smaller font, or making the dialogs bigger (resizing from the corners)?
    I have a 3.2.1 version where I have not changed the fonts; Tools->Preferences->Code Editor->Fonts shows:
    Font Name: DialogInput
    Font size: 12
    Turloch
    -SQLDeveloper Team

  • Scheduling an Export Job Using Grid Control

    Hi,
    I have a requirement to schedule export jobs in Oracle Grid Control 10.2.0.5 (Windows). Is this possible, and has anyone done this? If so, please share the steps. The idea is to get alerts if the job fails.
    Thanks.
    GB

    Here are the easy steps (there might be slight differences, as I am posting based on the 11g screens):
    1. On the Grid Control console, click on the relevant database.
    2. Click on the 'Data Movement' tab.
    3. Click on 'Export to Export Files'.
    4. Choose the export type (Database / Schema / Tables / Tablespace).
    5. Provide host credentials (OS username and password) and click 'Continue'. You will get a new screen called 'Export: Options'.
    6. Tick the checkbox called 'Generate Log File' and select the directory object where you want the dump files created. (You need to create a directory object in the database if you don't have one.)
    7. Choose the contents options as required, and click 'Next'. You will get a new page called 'Export: Files'.
    8. Select the directory object from the drop-down box, provide a name format for the file name, and click 'Next'.
    9. Provide the job name and description, choose the repeat options (daily, weekly, etc.), and click 'Next'.
    10. You will get a summary screen called 'Export: Schedule'. Review your job details and click 'Submit'.
    This is the solution, and it works well.

  • Problem Exporting Job to a Production Repository

    We're running Data Integrator 11.5.
    I'm exporting jobs from my development repo to a production repo and am having some weirdness when starting jobs from the Designer client.  No matter which repo I connect to, if I start a job, I see it running in the Monitor tab of both repositories, same name and everything.  It's really only running in the repo I start it in, using the correct System Configuration settings, etc., but I can see it in both.  I cannot view the log contents from within the non-starting repo, however.  But if I cancel it in the non-starting repo, it will cancel the running job.  Even if I go back into my dev repo and rename the jobs, when I start the job in the prod repo, it will show the dev job, with its different name, running in the dev repo monitor (even though, in fact, that job is not running).  In the prod repo monitor, I see the correct job name.
    I'm wondering if anyone else has had these kinds of issues and whether it's something I can fix, or whether I'm just going to have to live with it until we upgrade to DS 4.0 in a couple of months.

    Andy
    It has everything to do with the Format of the card.
    These cards are formatted FAT16 or FAT32, and these older Windows formats have a limit to the number of objects they can hold in the root directory of the volume. So don't copy the images from the folder on your desktop; just drag the folder to the card...
    Regards
    TD

  • Import/Export Job in management portal.

    I am trying to create an Import/Export job in the portal. I read this document:
    http://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/
    and I am unable to do this. The document says to download the import/export tool, but I am unable to do that. Any help will be greatly appreciated.

    Hi,
    I would request you to check if you meet all these prerequisites:
    • You must have an active Azure subscription.
    • Your subscription must include a storage account with enough available space to store the files you are going to import.
    • You need at least one of the account keys for the storage account.
    • You need a computer (the "copy machine") with Windows 7, Windows Server 2008 R2, or a newer Windows operating system installed.
    • The .NET Framework 4 must be installed on the copy machine.
    • BitLocker must be enabled on the copy machine.
    • You will need one or more empty 3.5-inch SATA hard drives connected to the copy machine.
    • The files you plan to import must be accessible from the copy machine, whether they are on a network share or a local hard drive.
    Regards,
    Azam Khan

  • Import/Export Jobs Fail - ORA-01017 Invalid UserName/PW - Solved!

    Every time I try to run either an Import or Export Job from OEM it always fails with ORA-01017 Invalid UserName/PW.
    I have read numerous posts in this forum on related topics.
    Suggestions have been:
    1. Make sure the OS login is a member of the ORA_DBA group.
    2. Make sure the OS user has the right to log on as a batch job.
    3. Make sure ORACLE_SID and ORACLE_HOME are set.
    4. Make sure to set up Preferred Credentials on the Preferences page.
    5. On and on and on.
    I am using Oracle Version 10.2.0.1.0 On Windows 2003 Server Enterprise SP1.
    When I installed the DB using Oracle Universal Installer it asks what Password you would like to assign to SYS, SYSTEM etc. I used the following password "AvT_$651#JK" which it accepted without problem. The installation completed and I am able to log into OEM using SYS/AvT_$651#JK (SYSDBA) or SYSTEM/AvT_$651#JK (Normal).
    I then proceeded to "Import from Export Files" a schema from a previous "Export to Export Files" run that was created on another host with the same Oracle Version 10.2.0.1.0 on Windows 2003 Server Enterprise SP1. This job always fails with "ORA-01017: invalid username/password", even though the username and password are correct!
    It turns out the password itself is the problem; apparently some part of the process (RMAN?) does not like the special characters in the password. Changing the password to "testpw1" worked.
    Oracle Universal Installer should have prevented me from using a password that violates this policy.
    BG...

    Do you provide the username under the local security policy along with the domain name, like
    abc\uo.cok? If not, please enter the username along with the domain name and let me know the result. If possible, paste the Event Viewer log for this connection.

  • Attach datapump export job

    Hi Guys,
    I am using Oracle 10g Release 2 on Solaris.
    I have a database that is 1.5 TB, and I am doing a Data Pump export of it for which the Data Pump estimate is 500 GB.
    After about 300 GB had been exported, the server crashed.
    Will I be able to attach to the Data Pump export job and continue from the 300 GB point after database startup?
    NB: I am using the flashback_time parameter for data consistency.
    Please help!
    Thanks.

    Thanks for the reply...
    I tried to attach the job after the database startup and here is what I get:
    expdp \"/ as sysdba\" attach=SYS_EXPORT_FULL_01
    Export: Release 10.2.0.2.0 - 64bit Production on Saturday, 30 July, 2011 17:50:31
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    ORA-39002: invalid operation
    ORA-39068: invalid master table data in row with PROCESS_ORDER=-59
    ORA-39150: bad flashback time
    ORA-00907: missing right parenthesis
    I guess I just have to restart the job, as I cannot attach to that one...
    Thanks...

  • Export job on EM Grid Control 10gR2

    As far as I understand, an Export job on EM Grid Control 10gR2 can be created only on the Database Maintenance page. Also, Create Like is not supported for this job type; it cannot be edited and cannot be put in the Job Library. Thus, its definition is not in the Grid Control repository.
    I think dealing with Export jobs is not very user friendly on EM Grid Control 10gR2. Am I right?
    Thank you.

    Thank you for your interest in my question.
    Every time I want to create an Export job I have to create it through the wizard. After that I can use neither the Create Like feature nor edit the job; I have to use the wizard to create another job from scratch.
    When I use RMAN script jobs I notice much better features.
    In addition, I do not know whether the job definition exists in the Grid Control repository.

  • ORA-00904 when exporting job queues

    Hi!
    I have an 8.1.6.3.7 database (although you only see 8.1.6.3.0 when you log in) which I'm trying to export.
    Everything is fine until the time comes to export job queues. I then get the following error messages:
    EXP-00008: ORACLE error 904 encountered
    ORA-00904: invalid column name
    EXP-00000: Export terminated unsuccessfully
    Now, I've just spent my morning looking through the discussion forums to see if anyone else had this error, but no luck.
    What I've done:
    - Reran catexp: no change
    - Ensured that I'm running the same version of the RDBMS and export (v8.1.6.3(.7) and 8.1.6.3.0): ok?
    Any ideas anyone?
    Thanks,
    Kevin

    Hi Prathmesh,
    in fact I saw this very thread before and I made sure that both solutions were applied. Moreover, as I said, patch 6991626 had already been applied earlier, precisely to fix this problem, and I had been able to successfully export other, albeit somewhat smaller, schemas (500 MB instead of 3 GB) in the last few months. This is why I was so puzzled to see that exact bug raise its ugly head again. As far as I can tell I didn't make any modification to the DB since that last patch in Nov. 2009. In fact the DB has been running pretty much untouched since then.
    I even tried yesterday to reinstall the patch; opatch does the operation gracefully, first rolling back the patch and then reapplying it, with only a warning about the patch being already present. However, the problem does not get fixed.
    Thanks a lot for your help,
    Chris

  • Datapump - export job problem

    Just started playing with this new feature of 10g. I created a new export job through Enterprise Manager Database Control. Now when I try to delete it, it gives me an error message as follows:
    Error
    The specified job, job run or execution is still active. It must finish running, or be stopped before it can be deleted. Filter on status 'Active' to see active executions
    I have stopped this process successfully so many times (I don't even remember how many) through Database Control, but when I try again to delete the run, it gives me the same error message.
    I logged on in SQL*Plus and checked that this process is still active, as it has an entry in the DBA_DATAPUMP_JOBS view. I dropped the corresponding master table and the entry is gone from the view, but when I checked Database Control, the job execution is still there with a status of "Stop Pending".
    Can somebody help me in this, I mean how can I delete that job from the database control. If you need any other information to help me, I am more than willing to provide the same.
    The job is owned by system. My platform is Windows XP Professional.
    Any help is greatly appreciated as I am doing different things for last 2 days with no success.
    Regards,

    Hi Bhargava,
    What do you get when you execute this block -
    set serveroutput on
    declare
      myhandle number;
    begin
      myhandle := dbms_datapump.attach('JOB_NAME','JOB_OWNER');
      dbms_output.put_line(myhandle);
      dbms_datapump.detach(myhandle);
    end;
    /
    If this block executes without error and prints out a number, then you can try to stop the job with this block:
    declare
      myhandle number;
    begin
      myhandle := dbms_datapump.attach('JOB_NAME','JOB_OWNER');
      dbms_output.put_line(myhandle);
      dbms_datapump.stop_job(myhandle, 1, 0, 0);
    end;
    /
    Here is an article with more information on the pl/sql API to dbms_datapump:
    http://www.devx.com/dbzone/Article/30355
    Here is the dbms_datapump documentation:
    http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm
    -Natalka
    http://toolkit.rdbms-insight.com

  • Export Job credentials

    I created an Export job in Oracle Enterprise Manager (listed in the Jobs Activity page off the main OEM database instance home). The job is apparently "owned" by the SYSTEM user, and I recently changed the password for that user. Now when the job runs, it gives me an "ORA-01017: invalid username/password" error.
    In the past, the only way I've found to get around this is to delete all instances of the export job and recreate it with the new password for the SYSTEM user. Is there any way to alter the password associated with the job so I don't have to rebuild it?
    Thanks in advance.

    I've tried changing it on the Credentials tab for the job, but OEM says "This job does not require any credentials."
    Also, we do use RMAN, but we also use a Data Pump export to aid in rebuilding our development environment, as well as for recovery if the RMAN backups don't work.

  • Error message when creating export Job via GRID

    Hello All,
    I am trying to do an export using Grid Control 10.2.0.4. The database host is a Windows 2003 server with an Oracle 11.1.0.6 database. After I create the export job, Grid Control generates the following error message and the job fails:
    Connect Failed: ORA-12557: TNS:protocol adapter not loadable (DBD ERROR: OCIServerAttach)
    Any ideas?
    Thanks,
    JY

    I think you may be hitting bug 6272323, a problem exporting 11g databases on Windows with 10.2 Grid Control. This is fixed in 10.2.0.5. I suggest you raise an SR and ask for a fix.

  • Scheduled Export Job

    Hi,
    We need to create an export dump file on a daily basis. However, the dump file should only be updated, not created multiple times.
    How can we create a scheduled (Data Pump) export job? We are running 10gR2 on Windows Server 2003.
    Thanks!

    Hello,
    When you say the file should be "updated", do you mean it should be overwritten?
    If yes, one solution could be to create the dump job as it should run every night, put it in a .cmd file (as you write, you are on a Windows server), and run this .cmd file as a Scheduled Task in the Windows operating system.
    Kind regards

  • Session parameter for a export job

    Dear all,
    10.2.0.1 on windows 2003.
    Is there any way I can set the session parameter ALTER SESSION SET EVENTS '10231 TRACE NAME CONTEXT FOREVER, LEVEL 10'; for an export job using exp done at the OS level?
    Kai

    Yes, via a LOGON trigger.

  • Scheduling Export Job - 10g R2

    Hi,
    We would like to have our export run on a daily basis. However, the export dump file should be overwritten, not newly created each time. Does somebody have a suggestion or a script that creates this job? This is independent of our regular backups.
    Thank you!

    In Unix, first create an executable shell script:
    #!/bin/ksh
    ORACLE_BASE=/u01/app/oracle
    ORACLE_SID=prod
    ORACLE_HOME=/u01/app/oracle/oracle/product/10.2.0/db_1
    export ORACLE_BASE ORACLE_SID ORACLE_HOME
    LANG=C;export LANG
    PATH=/u01/app/oracle/oracle/product/10.2.0/db_1/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/oracle/jdk1.6.0/bin:/home/oracle/bin:.
    export PATH
    CLASSPATH=/home/oracle/jdk1.6.0/lib:.
    export CLASSPATH
    /u01/app/oracle/oracle/product/10.2.0/db_1/bin/exp jj/jj tables=tab5 file=/u04/expdp_dir/tab5.dmp
    exit
    Now make the file executable:
    chmod 755 /u04/expdp_dir/expdb.sh
    Now create a job in the scheduler:
    begin
      dbms_scheduler.create_job
        (job_name        => 'daily_exp',
         job_type        => 'EXECUTABLE',
         job_action      => '/u04/expdp_dir/expdb.sh',
         repeat_interval => 'FREQ=DAILY;BYHOUR=19',
         enabled         => TRUE);
    end;
    /
    You will have to change the parameters as you wish.
    rgds,
    Jj
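Both scheduling questions above asked that the dump be overwritten rather than accumulate. exp will overwrite an existing file, but exporting to a temporary name and renaming on success protects the last good dump if the export dies midway. A sketch of that pattern (the paths are assumptions, and touch stands in for the real exp call):

```shell
#!/bin/sh
# Export to a temp file first, then atomically replace the previous
# dump, so a crashed export never clobbers the last good copy.
DUMP=/tmp/demo_exports/tab5.dmp
mkdir -p "${DUMP%/*}"

# The real job would run something like:
#   $ORACLE_HOME/bin/exp jj/jj tables=tab5 file="$DUMP.tmp"
touch "$DUMP.tmp"   # stand-in for a successful exp run

if [ -e "$DUMP.tmp" ]; then
    mv -f "$DUMP.tmp" "$DUMP"   # rename on the same filesystem is atomic
fi
```

Drop the script into cron or dbms_scheduler exactly as in the answer above; the stable file name stays the same every day, which is what "updated, not created multiple times" asks for.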
