Schedule export using cron?
Hi
I want to schedule the export of a table using a cron job.
I created the .sh file as below:
export ORACLE_BASE=/opt/oracle
export ORACLE_HOME=/opt/oracle/product/9.2.6/db_1
export ORACLE_SID=orcl
/opt/oracle/product/9.2.6/db_1/bin/exp system/manager file=/home/oracle/test.dmp log=/home/oracle/test.log tables=table_name
$crontab -l
43 16 * * * /home/oracle/export.sh > /home/oracle/export.log
But when this crontab job runs, the export is not performed and I can't find the export dump file or log file.
The test.log file does get created, but with nothing in it.
Can anyone let me know what is wrong with the shell script?
Regards,
Hi,
Can you please change the permissions of your export.sh file:
$ chmod 775 export.sh
See the link below for more details.
http://www.elated.com/articles/understanding-permissions/
Best regards,
Rafi.
http://rafioracledba.blogspot.com/
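Beyond permissions, one more thing worth checking in the original crontab entry: `> /home/oracle/export.log` captures only stdout, so any error messages (a missing ORACLE_HOME, exp not found) go to stderr and are lost, leaving an empty log. A minimal sketch of the difference (/tmp/demo.log is an illustrative path):

```shell
# Plain ">" captures stdout only; stderr (where failures are reported)
# is discarded, which can leave an apparently empty log file.
/bin/sh -c 'echo "export starting"; echo "exp: command not found" >&2' \
    > /tmp/demo.log          # stderr lost
/bin/sh -c 'echo "export starting"; echo "exp: command not found" >&2' \
    > /tmp/demo.log 2>&1     # stderr captured too
cat /tmp/demo.log
```

With `2>&1` appended to the crontab line, the actual failure message from cron's run of export.sh would land in export.log.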
Similar Messages
-
I have scheduled a cron job to take an RMAN backup of the database every day at 4 PM, but only an empty backup.log file gets generated and the script does not actually run. If I run the backup.sh script manually, the backup runs fine. The scripts and crontab details are as follows.
backup.sh script is as follows,
cd /u01/app/oracle/rmanbkp
rman target / <<EOF
run
allocate channel ch1 type disk format '/u01/app/oracle/rmanbkp/%d_%T_%s.bak';
backup database;
delete noprompt obsolete;
exit;
EOF
while the output of 'crontab -l' is
00 16 * * * /home/imsoracle/backup.sh > backup.log
Backup is not happening; what is wrong in my settings? Please advise.
Dear $ami Malik,
There is another option that you can use for RMAN scripting as shown below;
0 2 * * 2 export ORACLE_HOME=/oracle/product/10.2.0/db_1/ ;
/oracle/product/10.2.0/db_1/bin/rman cmdfile /db/optima/archive/OPTPROD/RMAN/backup_full.sh
log /db/optima/archive/OPTPROD/RMAN/backup_full.log
vals2:/home/oracle# cat /db/optima/archive/OPTPROD/RMAN/backup_full.sh
connect target sys/password@optprod
backup database format '/db/optima/archive/OPTPROD/backupset/DB_FULL_%d_%t_%s_%p';
backup archivelog all delete all input format '/db/optima/archive/OPTPROD/backupset/ARC_%d_%t_%s';
exit;
So you can construct such a backup strategy with the cmdfile and log options of Recovery Manager.
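Also worth noting: when a script runs fine by hand but produces only an empty log under cron, the usual culprit is cron's stripped-down environment, with no ORACLE_HOME, no ORACLE_SID, and a minimal PATH, so a bare `rman` is simply not found. A quick simulation of what the job actually sees (paths illustrative):

```shell
# Simulate cron's near-empty environment: only what you pass with
# env -i is visible, so variables exported in your login shell are not.
env -i PATH=/usr/bin:/bin /bin/sh -c \
    'echo "PATH inside the job: $PATH"; echo "ORACLE_HOME inside the job: [$ORACLE_HOME]"'
```

So backup.sh should export ORACLE_HOME/ORACLE_SID and extend PATH at the top, or call rman by absolute path as the crontab entry above does. The `> backup.log` redirection is also safer with an absolute path plus `2>&1`, since a relative backup.log lands in whatever directory cron starts the job in (normally $HOME).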
Hope That Helps.
Ogan -
Schedule Daily Export Using Datapump
Hi
I want to schedule my daily Data Pump full export using Oracle database features such as Enterprise Manager.
How can I do it?
Please tell me the complete solution, because I have tried it so many times but...
Thank You In Advance
Hi Hesam,
You would be better off using a shell script and adding it as a cron job at the required time.
cronjobs :
http://www.quest-pipelines.com/newsletter/cron.htm
There are some cases where the exp commands will not work from cron jobs.
Try the script below, adjusting the shebang to wherever your ksh shell is located:
#!/bin/ksh
# setup the environment
export ORACLE_SID=TEST
export ORAENV_ASK=NO;
. oraenv
FILE_STUB=/u02/backup/$ORACLE_SID/exp_${ORACLE_SID}_full
exp system/manager file=$FILE_STUB.dmp log=$FILE_STUB.log full=y direct=y
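A shell pitfall worth noting for the FILE_STUB line: the `_full` suffix needs to be brace-delimited, because without braces `$ORACLE_SID_full` is parsed as a single (unset) variable name, not as `$ORACLE_SID` followed by `_full`. A minimal demonstration:

```shell
# The shell takes the longest run of name characters after "$" as the
# variable name, so $ORACLE_SID_full means the (unset) ORACLE_SID_full.
ORACLE_SID=TEST
echo "unbraced: exp_$ORACLE_SID_full.dmp"    # -> unbraced: exp_.dmp
echo "braced:   exp_${ORACLE_SID}_full.dmp"  # -> braced:   exp_TEST_full.dmp
```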
or, alternatively, you can setup your environment in your crontab:
10 0 * * * (/bin/ksh "export ORACLE_SID=TEST; export ORAENV_ASK=NO; . $HOME/.profile; $HOME/your_export_script_goes_here.ksh") -
Using Cron to schedule an rsync with Leopard?
Firstly, just wondering if any changes have taken place from Tiger to 10.5.5 in writing and running cron command lines. I tried using cron to run rsync at a scheduled time, calling a command file on my desktop that works on its own, but it didn't work. Would I have to put the command file in a specific location, like Documents?
Any help would be awesome!
Thanks
Mark
50 15 * * * sh ~/Documents/mySOKdown.command
Yes it looks right, but that doesn't mean anything when dealing with cron.
First, I've seen systems where cron was running on a different timezone than what you are in. This is why I always try the
* * * * * date >/tmp/tmp.cron
which will tell me two things. First, it tells me cron is actually working, since I will only get a /tmp/tmp.cron file with the current time and date in it if cron is working. Second, it tells me whether cron is running in the same timezone that I'm working in.
Second, the cron environment does not include all the environment variables you normally see when you are using a terminal session. Here are the environment variables I see on my system:
SHELL=/bin/sh
USER=harris
PATH=/usr/bin:/bin
PWD=/Users/harris
SHLVL=1
HOME=/Users/harris
LOGNAME=harris
Make sure you do not depend on missing environment variables.
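To see exactly which variables cron provides on a given system, a throwaway crontab entry like the following (remove it after one run) dumps the job's environment to a file:

```
* * * * * /usr/bin/env > /tmp/cron.env 2>&1
```

Comparing /tmp/cron.env with `env` from a Terminal session shows exactly which variables your command file is missing.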
Again, I suggest you check to see if cron has reported any errors to you via /usr/bin/mail (NOTE: this is not Apple Mail). -
Hi,
I want to schedule some backups for certain tables at night... any ideas about using cron with exp?
I tried as follows:
crontab -e
I made the next line:
00 20 * * * exp userid=system/manager file=tt00.dmp tables=product
When I confirm, the schedule is OK... but it doesn't work.
Any help will be appreciated.
Thanks Robert,
I did it... here is the solution:
i made an script which contains:
ORACLE_SID=mySid
export ORACLE_SID
ORACLE_HOME=/oracle/product/8.1.7
export ORACLE_HOME
$ORACLE_HOME/bin/exp userid=bob/boby file=/home/user/bob/exporta.dmp tables=factventa,producto,cliente,acumventa
The script was saved as name.bat.
Then I used cron as follows:
crontab -e
and in the file i wrote:
00 23 * * * /home/user/bob/name.bat
That's all!!! -
EJB Persistence using cron job
We have EJB-based bean-managed persistence classes that run on the app server. I want to use the same classes from a cron job.
How do I do that???
I tried to execute these classes through the following steps:
Inside a standalone Java class's main method, create an instance of the PMF and get a persistence manager.
Start a thread and call the JDO object using the persistence manager.
I am getting a whole bunch of errors, like the system-server-config.xml file not being found. I am not able to create an InitialContext outside the app server.
Please give your suggestions. TIA
Why don't you use a J2EE-based scheduler?
http://java-source.net/open-source/job-schedulers/quartz
http://java-source.net/open-source/job-schedulers/jcrontab -
Schedule Export Processes with Custom Query
Hi,
Can I use Schedule Export Processes with a custom query over HZ_PARTIES and HZ_LOCATIONS, in order to export supplier data in a specific field order with join conditions?
I used Schedule Export Processes with "PARTY", selecting the proper fields, but the exported CSV is in a different field order than the one required by the external system.
Thanks,
Erick.
Can you elaborate on the issue here? Are you looking for a way to run "ad-hoc" SQL queries for the filters? I do not think that is possible; depending on your environment, you may be able to create a custom view object to match your requirements.
As for the ordering, I think you should be able to control it using the sequence, as the FAQs state:
What happens if I change the sequence number or header text?
Changing the sequence number changes the order of the attributes in the exported data file. Changing the header text enables you to give a more intuitive meaning to the attribute and the associated data.
Jani Rautiainen
Fusion Applications Developer Relations
https://blogs.oracle.com/fadevrel/ -
Scheduled export hangs very often
Hi friends,
I have a scheduled export running on a Windows DB server. I've created a batch file with export commands for about 5 databases and scheduled a job to run this batch file every night around 9 PM.
But the export hangs quite often, at least 1-2 nights a week, with the message below, and when I come in in the morning I have to hit Enter for it to proceed.
Batch file (I've replaced the username, pwd, etc. in the commands below; I cannot provide them on the forums):
exp system/pwd@db1 file=exp_db1_full.dmp log=exp_db1_full.log full=y buffer=400000
exp username/pwd@db2 file=db2_user.dmp log=db2_user.log buffer=400000
exp system/pwd@db3 file=exp_db3_full.dmp log=exp_db3_full.log full=y buffer=400000
exp username/pwd@db4 file=db4_user.dmp log=db4_user.log buffer=300000
exp username/pwd@db5 file=db5_user.dmp log=db5_user.log buffer=300000
D:\>exp username/pwd@db4 file=db4_user.dmp log=db4_user.log buffer=300000
Export: Release 10.2.0.3.0 - Production on Wed Jan 13 21:17:36 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
EXP-00028: failed to open dbname_user.dmp for write
Export file: EXPDAT.DMP >
It hangs mostly for the db3, db4, or db5 exports. Not consistent either; it just stops at this prompt. Please give me your suggestions...
Thanks a lot.
It could be a couple of issues:
1. The file you gave it may be maxing out size-wise, and since there is no other file to use, exp falls back to the default file, expdat.dmp, and prompts the user to confirm it is OK to use that file. To get around this, you could add a second filename to the exp commands. If this happens for more than one job, you are overwriting the expdat.dmp file from the previous job.
2. If for some reason it can't open that file, it would prompt for the expdat.dmp file.
So, by hitting <cr> when prompted, your exp job will write to the expdat.dmp file. It would be interesting to look at the size of the expdat.dmp file and the size of the file you thought the export was going to write to.
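One way around the prompt (a sketch, not from the original thread; filenames are illustrative): give exp a size cap and a spare output file via a parameter file, so it rolls over to the next named file instead of falling back to expdat.dmp and prompting:

```
# exp_db3.par -- used as: exp system/pwd@db3 parfile=exp_db3.par
file=exp_db3_full_1.dmp,exp_db3_full_2.dmp
filesize=2GB
log=exp_db3_full.log
full=y
buffer=400000
```

With distinct filenames per database, the jobs also stop competing over a shared expdat.dmp.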
Dean -
Creating Schedule lines in Scheduling agreement using LSMW
Hi,
I want to upload multiple schedule lines to a scheduling agreement using LSMW. I have done the recording and maintained multiple lines in the input file. However, during execution LSMW creates only one line and keeps overwriting that same line. Please let me know how I can create multiple schedule lines.
Regards
First, you should search for existing standard LSMW migration objects that possibly cover your scenario. These normally take care of complex header / multiple-item / multiple-subitem relations.
With a recording you can only load flat structures. Sometimes a "create new lines" / "enter first line" / "save transaction" logic can be constructed to load multiple items to a header (or even subitems to an item?) with a recording, but this can be pretty awkward.
Thomas -
Creating a Scheduled Task using PowerShell v4
So here is my question. I'm trying to create a scheduled task using PowerShell v4 on Server 2012 R2, and I want the task to run on the 1st day of every month, but the -Monthly -Day parameter is not available. I need to use PowerShell to create the scheduled task because the task will run under group Managed Service Account credentials, so it cannot be created via the Task Scheduler GUI.
Thanks in Advance
Here is a functioning task. It can easily be edited in PowerShell or in Notepad to alter the timing:
<?xml version="1.0" encoding="UTF-16"?>
<Task version="1.1" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
<RegistrationInfo>
<Author>W8TEST\user01</Author>
</RegistrationInfo>
<Triggers>
<CalendarTrigger>
<StartBoundary>2014-06-04T21:31:32.0459499</StartBoundary>
<Enabled>true</Enabled>
<ScheduleByMonth>
<DaysOfMonth>
<Day>1</Day>
<Day>12</Day>
<Day>24</Day>
</DaysOfMonth>
<Months>
<January />
<February />
<March />
<April />
<May />
<June />
<July />
<August />
<September />
<October />
<November />
<December />
</Months>
</ScheduleByMonth>
</CalendarTrigger>
</Triggers>
<Principals>
<Principal id="Author">
<RunLevel>LeastPrivilege</RunLevel>
<UserId>W8TEST\jvierra</UserId>
<LogonType>InteractiveToken</LogonType>
</Principal>
</Principals>
<Settings>
<DisallowStartIfOnBatteries>true</DisallowStartIfOnBatteries>
<StopIfGoingOnBatteries>true</StopIfGoingOnBatteries>
<IdleSettings>
<StopOnIdleEnd>true</StopOnIdleEnd>
<RestartOnIdle>false</RestartOnIdle>
</IdleSettings>
<Enabled>true</Enabled>
<Hidden>false</Hidden>
<RunOnlyIfIdle>false</RunOnlyIfIdle>
<WakeToRun>false</WakeToRun>
<ExecutionTimeLimit>P3D</ExecutionTimeLimit>
<Priority>7</Priority>
</Settings>
<Actions Context="Author">
<Exec>
<Command>notepad.exe</Command>
<Arguments>test.txt</Arguments>
<WorkingDirectory>c:\temp</WorkingDirectory>
</Exec>
</Actions>
</Task>
I have edited and reloaded the XML many, many times. It works very nicely.
¯\_(ツ)_/¯ -
INVALID_QUEUE_NAME : Error while scheduling message using qRFC
Hello SDNers
We are currently performing our PI 7.1 upgrade, and one of our scenarios uses a SOAP sender of type EOIO. We tried executing this scenario in the PI 7.1 XID environment and it worked fine without any errors, but in our PI 7.1 QA environment it gives the following errors:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
- <!-- Message Split According to Receiver List
-->
- <SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="">
<SAP:Category>XIServer</SAP:Category>
<SAP:Code area="INTERNAL">SCHEDULE_ERROR</SAP:Code>
<SAP:P1>XBQOC___*</SAP:P1>
<SAP:P2>INVALID_QUEUE_NAME</SAP:P2>
<SAP:P3 />
<SAP:P4 />
<SAP:AdditionalText />
<SAP:Stack>Error while scheduling message using qRFC (queue name = XBQOC___*, exception = INVALID_QUEUE_NAME)</SAP:Stack>
<SAP:Retry>M</SAP:Retry>
</SAP:Error>
Can you please explain what could be causing the issue we are facing?
Thanks.
Kiran
Edited by: Kiran Sakhardande on Oct 22, 2008 4:08 PM
Hi Kiran,
Have you gone through the following link?
INVALID_QUEUE_NAME
Regards
Sridhar Goli -
SCHEDULER using the dbms_job.submit package
I want to run a procedure from the scheduler using dbms_job.
How do I use this procedure with the scheduler?
Please help me.
Message was edited by:
Nilesh Hole
Hi,
For 10g and up you should be using dbms_scheduler for scheduling. Some examples are here
http://download.oracle.com/docs/cd/B28359_01/server.111/b28310/scheduse.htm#i1033533
and the homepage is at
http://www.oracle.com/technology/products/database/scheduler/index.html
Here's an example that runs daily at 2am
begin
  dbms_scheduler.create_job('myjob',
    job_type=>'plsql_block',
    job_action=>'dbms_lock.sleep(10);',
    start_date=>null,
    repeat_interval=>'freq=daily;byhour=2;byminute=0',
    enabled=>true, auto_drop=>true);
end;
/
Hope this helps,
Ravi. -
Schema Export using DBMS_DATAPUMP is extremely slow
Hi,
I created a procedure that duplicates a schema within a given database by first exporting the schema to a dump file using DBMS_DATAPUMP and then importing the same file (I can't use a network link because it fails most of the time).
My problem is that a regular schema Data Pump export takes about 1.5 minutes, whereas the export using dbms_datapump takes about 10 times longer: something in the range of 14 minutes.
here is the code of the procedure that duplicates the schema:
CREATE OR REPLACE PROCEDURE MOR_DBA.copy_schema3 (
source_schema in varchar2,
destination_schema in varchar2,
include_data in number default 0,
new_password in varchar2 default null,
new_tablespace in varchar2 default null
) as
h number;
js varchar2(9); -- COMPLETED or STOPPED
q varchar2(1) := chr(39);
v_old_tablespace varchar2(30);
v_table_name varchar2(30);
BEGIN
/* open a new schema level export job */
h := dbms_datapump.open ('EXPORT', 'SCHEMA');
/* attach a file to the operation */
DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.NEXTVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
/* restrict to the schema we want to copy */
dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
/* apply the data filter if we don't want to copy the data */
IF include_data = 0 THEN
dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
END IF;
/* start the job */
dbms_datapump.start_job(h);
/* wait for the job to finish */
dbms_datapump.wait_for_job(h, js);
/* detach the job handle and free the resources */
dbms_datapump.detach(h);
/* open a new schema level import job */
h := dbms_datapump.open ('IMPORT', 'SCHEMA');
/* attach a file to the operation */
DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
/* restrict to the schema we want to copy */
dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
/* remap the importing schema name to the schema we want to create */
dbms_datapump.metadata_remap(h,'REMAP_SCHEMA',source_schema,destination_schema);
/* remap the tablespace if needed */
IF new_tablespace IS NOT NULL THEN
select default_tablespace
into v_old_tablespace
from dba_users
where username=source_schema;
dbms_datapump.metadata_remap(h,'REMAP_TABLESPACE', v_old_tablespace, new_tablespace);
END IF;
/* apply the data filter if we don't want to copy the data */
IF include_data = 0 THEN
dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
END IF;
/* start the job */
dbms_datapump.start_job(h);
/* wait for the job to finish */
dbms_datapump.wait_for_job(h, js);
/* detach the job handle and free the resources */
dbms_datapump.detach(h);
/* change the password as the new user has the same password hash as the old user,
which means the new user can't login! */
execute immediate 'alter user '||destination_schema||' identified by '||NVL(new_password, destination_schema);
/* finally, remove the dump file */
utl_file.fremove('LOCAL_DATAPUMP_DIR','COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL|| '.DMP');
/*EXCEPTION
WHEN OTHERS THEN --CLEAN UP IF SOMETHING GOES WRONG
SELECT t.table_name
INTO v_table_name
FROM user_tables t, user_datapump_jobs j
WHERE t.table_name=j.job_name
AND j.state='NOT RUNNING';
execute immediate 'DROP TABLE ' || v_table_name || ' PURGE';
RAISE;*/
end copy_schema3;
/
The import part of the procedure takes about 2 minutes, which is the same time a regular Data Pump import takes on the same schema.
Even if I disable the import completely, the export alone still takes about 14 minutes.
Does anyone know why the export using dbms_datapump takes so long for exporting?
thanks.
Hi,
I did a tkprof on the DM trace file and this is what I found:
Trace file: D:\Oracle\diag\rdbms\instanceid\instanceid\trace\instanceid_dm00_8004.trc
Sort options: prsela execpu fchela
count = number of times OCI procedure was executed
cpu = cpu time in seconds executing
elapsed = elapsed time in seconds executing
disk = number of physical reads of buffers from disk
query = number of buffers gotten for consistent read
current = number of buffers gotten in current mode (usually for update)
rows = number of rows processed by the fetch or execute call
SQL ID: bjf05cwcj5s6p
Plan Hash: 0
BEGIN :1 := sys.kupc$que_int.receive(:2); END;
call count cpu elapsed disk query current rows
Parse 3 0.00 0.00 0 0 0 0
Execute 229 1.26 939.00 10 2445 0 66
Fetch 0 0.00 0.00 0 0 0 0
total 232 1.26 939.00 10 2445 0 66
Misses in library cache during parse: 0
Optimizer mode: ALL_ROWS
Parsing user id: SYS (recursive depth: 2)
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
wait for unread message on broadcast channel
949 1.01 936.39
********************************************************************************
What does "wait for unread message on broadcast channel" mean, and why did it take 939 seconds (more than 15 minutes)? -
IMovies - Export Using Quicktime - can't find the finished file - help please!
Please help. I exported several iMovie projects to a designated folder on my Mac using Share > Export using QuickTime, and then I uploaded them to YouTube.
I did this successfully several times; now when I export a project, it takes about 45 minutes, and then I can't find the file anymore.
I have read several threads which suggest looking in the iMovie Projects folder, locating your project, and right-clicking to "Show Package Contents", but the files are not there.
It looks as if the export goes through successfully, with no errors, but this is so annoying, as I am wasting hours sharing and then can't find the result. Please help, anyone.
Thanks
Hi Gee,
I originally spent about 10 hours uploading 3 clips directly from iMovie to a private YouTube channel, and there was no sign of my 3 uploads. So I read threads that said to export from iMovie to QuickTime and then load the files to YouTube that way. I did that for 3 short, 10-minute videos and it worked fine. It changed the files to .mov and it worked.
Then perhaps I changed something (not sure what), but now when I use Export using QuickTime, it spends an hour or so going through the motions and completes, but when I go to the designated folder to retrieve the file, it can't be found. I have read threads and this seems to be a regular occurrence, but I haven't seen any replies that solved the issue.
Hope that's clearer. -
Can I export using a single dump directory to multiple locations in oracle
I'm trying to do a full database export using the expdp utility in Oracle 10g. I have a single dump directory that is mapped to a particular file location, say /export/dump. I don't want the entire dump stored in that one path; instead I want it distributed among multiple files. I know this can be done using the FILESIZE parameter, which splits the contents across multiple files according to the size specified.
My problem comes here: I want to export my data to multiple locations with paths different from the one above, say /first/dump. Now my question is, should I create multiple dump directories for each location before exporting, or can I omit the directory attribute in expdp and specify the complete path in the FILE parameter itself?
No. expdp needs a server-side logical DIRECTORY object. If you don't specify a directory, it will use the default expdp path, which is usually under /rdbms/log; it is defined by the DATA_PUMP_DIR parameter.
You will have to specify the directory attribute if you want your dump file to go to a specific location, and you cannot give an OS directory path in the file name with expdp (unlike conventional exp).
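For what it's worth, expdp does accept a directory object (not an OS path) as a prefix on each file in DUMPFILE, which, together with FILESIZE, can spread the pieces across locations. An untested sketch; the directory objects dp_dir1 and dp_dir2 are hypothetical and must already exist with write permission granted:

```
# expdp parameter file, run as e.g.: expdp system parfile=exp_full.par
full=y
filesize=4G
dumpfile=dp_dir1:exp_full_a_%U.dmp,dp_dir2:exp_full_b_%U.dmp
logfile=dp_dir1:exp_full.log
```

The %U substitution numbers the files as each one hits the FILESIZE cap.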