Doubt in SUBMIT with job
Hi all,
I am confused about one scenario while using the SUBMIT statement.
In my requirement I have two report programs, say "X" and "Y".
X calls report Y with some inputs, and X is scheduled as a job.
Y calls a standard SAP program with some inputs, and that output is what is actually needed.
My doubt is: if I schedule X as a job, will the output from the standard SAP program be available in the spool, or should I schedule a job for Y as well?
If I need to schedule a job for both X and Y, should I use the same job name?
Please help!
Hi people,
Thanks for your replies.
But I checked, and it is not working; I am not getting the spool output.
For your reference, here is some sample code:
<b>ZCALLMARK</b>
DATA: jobname          LIKE tbtcjob-jobname VALUE 'hemajob4',
      jobcount         LIKE tbtcjob-jobcount VALUE 1,
      start_time       LIKE sy-uzeit,
      params           LIKE pri_params,
      w_prog(30),
      v_commit         TYPE c,
      variant_name(14) TYPE c,
      stringlen        TYPE i VALUE 0,
      temp_string(18)  TYPE c,
      msg_string(50)   TYPE c.

MOVE 'ZCTE0010_MARK' TO w_prog.

PERFORM open_job USING jobname.
PERFORM submit_job.
PERFORM job_close.
Form open_job
FORM open_job USING jobname.
  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = jobname
    IMPORTING
      jobcount = jobcount
    EXCEPTIONS
      others   = 8.
  IF sy-subrc NE 0.
    WRITE: /1 'Error Opening Job ', jobname.
  ENDIF.
ENDFORM.                    " open_job
Form submit_job
FORM submit_job.
  SUBMIT (w_prog) TO SAP-SPOOL
    WITH crun  = 'TEST4000'
    WITH cdate = '20070701'
    WITH rb1   = 'X'
    SPOOL PARAMETERS params
    WITHOUT SPOOL DYNPRO
    USER sy-uname VIA JOB jobname NUMBER jobcount
    AND RETURN.
  IF sy-subrc > 4.
    WRITE: /1 'Error Submitting Job ', jobname.
  ENDIF.
ENDFORM.                    " submit_job
Form job_close
FORM job_close.
  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount             = jobcount
      jobname              = jobname
      strtimmed            = 'X'   " start immediately
    EXCEPTIONS
      cant_start_immediate = 1.
  IF sy-subrc NE 0.
    WRITE: /1 'Error Closing Job ', jobname.
  ENDIF.
ENDFORM.                    " job_close
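One thing worth checking in the code above: `params` is declared but never filled, so `SPOOL PARAMETERS params` passes empty print parameters, which can suppress spool output. A minimal sketch of filling it with `GET_PRINT_PARAMETERS` before `PERFORM submit_job` (the output device `'LOCL'` is an assumption; replace it with a real one) might look like:

```abap
DATA: valid TYPE c.

* Hedged sketch: determine valid print parameters without a dialog
* and store them in params before the SUBMIT ... TO SAP-SPOOL.
CALL FUNCTION 'GET_PRINT_PARAMETERS'
  EXPORTING
    destination    = 'LOCL'      " assumed output device
    immediately    = ' '
    no_dialog      = 'X'
  IMPORTING
    out_parameters = params
    valid          = valid
  EXCEPTIONS
    others         = 4.
IF sy-subrc NE 0 OR valid NE 'X'.
  WRITE: /1 'Could not determine print parameters'.
ENDIF.
```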
<b>ZCTE0010_MARK</b>
SELECTION-SCREEN BEGIN OF BLOCK b1 WITH FRAME TITLE text-000.
PARAMETERS: crun  TYPE kala-kalaid  OBLIGATORY,
            cdate TYPE kala-kaladat OBLIGATORY.
PARAMETERS: rb1 RADIOBUTTON GROUP rb USER-COMMAND rbut,
            rb2 RADIOBUTTON GROUP rb.
SELECTION-SCREEN END OF BLOCK b1.

SUBMIT saprck23
  USING SELECTION-SET 'SAP&15'
  WITH p_buper  = '10'
  WITH p_gjahr  = '2007'
  WITH kaladat  = cdate
  WITH kalaid   = crun
  WITH p_ckvo   = 'X'
  WITH p_test   = ' '
  WITH p_listau = 'X'
  WITH p_batch  = ' '.
Please refer to this and tell me where I am going wrong!
Similar Messages
-
Hi All,
There is a requirement in which I have to pass an internal table from the main program to another program, do some processing on that table (i.e. modify the values of some fields), and then pass the changed table back to the main program.
I have to use a job, i.e. use the SUBMIT statement with job creation.
How can I do it? EXPORT/IMPORT to memory does not work across a job.
Please help.
Thanks in advance.

Hi,
Refer to this link..
How to pass an internal table to "submit job" -
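For reference, ABAP memory (`EXPORT ... TO MEMORY`) is not shared with a background job, because the job runs in a separate work process. A common workaround, sketched here under the assumption of an example table `lt_data` and freely chosen cluster keys (`zz` and `'MY_JOB_DATA'` are placeholders), is to serialize the table through the INDX database cluster:

```abap
* In the calling program, before SUBMIT ... VIA JOB:
DATA: lt_data TYPE TABLE OF mara.          " assumed example table
EXPORT lt_data = lt_data
  TO DATABASE indx(zz) ID 'MY_JOB_DATA'.   " area/ID are assumed keys

* In the submitted program, read it back:
IMPORT lt_data = lt_data
  FROM DATABASE indx(zz) ID 'MY_JOB_DATA'.
IF sy-subrc = 0.
  DELETE FROM DATABASE indx(zz) ID 'MY_JOB_DATA'.  " clean up after reading
ENDIF.
```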
Submit Multiple Job Definitions/Job Chains with same Time window/Submit frame in mass
Hi,
We have a requirement to submit multiple job definitions/job chains which are part of a common time window/submit frame/queue.
For example, we have over 50 different jobs/job chains which run Monday to Friday, every 2 hours, on the same queue "XXX_Queue". Instead of submitting each job/job chain manually, we would like to know if we could use a script to achieve this, since we have a couple of other jobs which fall under the same scenario.
We are on version M33.104. Please let me know if anyone has any scripts or an alternate way of submitting multiple jobs/job chains in mass.
Thanks in advance!
Nidhi.

Hi Nidhish,
Here is some code to set some stuff on a job:
//Get the partition; for GLOBAL this is not necessary, as GLOBAL is the default
Partition part = jcsSession.getPartitionByName("GLOBAL");
//Get the job definition
JobDefinition jobdef = jcsSession.getJobDefinitionByName(part, "System_Info");
//Get the submit frame
SubmitFrame sf = jcsSession.getSubmitFrameByName(part, "SF_Every_Year");
//Get the time window
TimeWindow tw = jcsSession.getTimeWindowByName(part, "System_Week_WorkingHours");
//Set the start time
DateTimeZone dtz = new DateTimeZone(2015, 10, 18, 15, 0, 0, 0);
//Get the queue
Queue SystemQ = jcsSession.getQueueByName(part, "System");
//Create the job
Job infoJob = jobdef.prepare();
//Attach the queue to the job
infoJob.setQueue(SystemQ);
//Attach submit frame, time window, and start time
infoJob.setSubmitFrame(sf);
infoJob.setTimeWindow(tw);
infoJob.setRequestedStartTime(dtz);
//Print the job id of the job
jcsOut.println(infoJob.getJobId());
//Submit the job
jcsSession.persist();
Regards,
HP -
Problem Submit Via Job in BADI
Hello All
I am using SUBMIT VIA JOB in the BADI "work order update", but no job is created, even though sy-subrc is 0.
Here is the code
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname          = name
  IMPORTING
    jobcount         = number
  EXCEPTIONS
    cant_create_job  = 1
    invalid_job_data = 2
    jobname_missing  = 3
    others           = 4.
IF sy-subrc = 0.
  SUBMIT z_idoc_create_process_order AND RETURN
    VIA JOB name NUMBER number
    WITH p_aufnr  = it_header1-aufnr
    WITH p_werks  = it_header1-werks
    WITH p_autyp  = c_autyp
    WITH p_auart  = it_header1-auart
    WITH p_dispo  = it_header1-dispo
    WITH p_opt    = c_opt
    WITH p_mestyp = c_mestyp.
  IF sy-subrc = 0.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount             = number
        jobname              = name
        strtimmed            = 'X'
      EXCEPTIONS
        cant_start_immediate = 1
        invalid_startdate    = 2
        jobname_missing      = 3
        job_close_failed     = 4
        job_nosteps          = 5
        job_notex            = 6
        lock_failed          = 7
        others               = 8.
    IF sy-subrc <> 0.
    ENDIF.
  ENDIF.
ENDIF.
Any reason why job is not created?
Thanks in advance.
Regards,
Vinit

Hi guys,
I tried this
SUBMIT z_idoc_create_process_order
  USER creator
  USING SELECTION-SET lv_variant
  TO SAP-SPOOL
  SPOOL PARAMETERS print_parameters
  WITHOUT SPOOL DYNPRO
  WITH p_aufnr  EQ it_header1-aufnr
  WITH p_werks  EQ it_header1-werks
  WITH p_autyp  EQ c_autyp
  WITH p_auart  EQ it_header1-auart
  WITH p_dispo  EQ it_header1-dispo
  WITH p_opt    EQ c_opt
  WITH p_mestyp EQ c_mestyp
  VIA JOB name NUMBER number
  AND RETURN.
Now the job is getting created, but my variant has no values.
How do I pass values to the variant? The values below are not getting transferred:
WITH p_aufnr EQ it_header1-aufnr
WITH p_werks EQ it_header1-werks
WITH p_autyp EQ c_autyp
WITH p_auart EQ it_header1-auart
WITH p_dispo EQ it_header1-dispo
WITH p_opt EQ c_opt
WITH p_mestyp EQ c_mestyp -
Hi,
We have a query regarding the submission of jobs in the background. If we use JOB_OPEN/JOB_CLOSE in our executable program to submit a job to run the program with a transaction code, and also schedule this job in the background using SM36, what will be the effect?

Hi Pankaj,
1. If the program only displays some data and does not update anything, then nothing will go wrong.
2. Both programs will run independently of each other, in the background.
regards,
amit m. -
Getting an unusual error message in Compressor 4.1 when I try to submit a job
I'm running Mavericks and have Compressor 4.1 on my Mac Pro along with FCP 10.1. When I submit a job to Compressor, I add the Dolby Digital and MPEG-2 6.2 Mbps/90 settings. When I hit Start Batch I get this error message:
/var/folders/k9/f6fyk4sj4f3_rj2wlrlwx9hr0000gn/T/46BDF064-B30F-4BF1-8D9C-D22DE91 8342B.export-archive
I've tried uninstalling and re-installing Compressor, but to no avail. What is this error message referring to, and how do I rectify it?
Thank you
Drew

Hi Drew, if you haven't resolved this: try this to see if the issue is a transient object access error from submitting directly to Compressor. Do this to isolate any content error defects before making your distribution (the older two-stage workflow):
1. In FCPX 10.1, make sure the project's Primary Storyline is completely rendered: select all and render the selection (Ctrl+R, or Ctrl+Shift+R).
2. In FCPX 10.1, watch the background tasks window (Cmd+9) and wait for all rendering to complete (make sure there are no object errors).
3. In FCPX 10.1, export a master via File > Share > Master File (Cmd+E) as ProRes 4xx and save it as ~/Movies/Drews_final_master.mov.
4. In Compressor.app v4.1, create a new batch, import the master ~/Movies/Drews_final_master.mov, add your setting, and submit it. (Don't use FCPX's Send to Compressor just yet, until you resolve this issue.)
This process will avoid the transient file storage areas that seem to be used in the FCPX-to-Compressor handoff.
Post your results for others to see
Warwick
Hong Kong -
Is this the correct syntax to submit a job using DBMS_JOB.SUBMIT?
Hello,
Is this the correct syntax to submit a job?
DECLARE
  v_job_number NUMBER;
  v_job_command VARCHAR2(1000) := 'PREPARE_ORACLE_TEXT_SEARCH;';
  v_interval VARCHAR2(1000) := 'trunc(SYSDATE)+1+7/24';
BEGIN
  DBMS_JOB.SUBMIT(v_job_number, v_job_command, sysdate, v_interval, false);
  COMMIT;
END;
Thanks
Doug

DECLARE
  v_job_number NUMBER;
  v_job_command VARCHAR2(1000) := 'BEGIN PREPARE_ORACLE_TEXT_SEARCH; END;';
  v_interval VARCHAR2(1000) := 'trunc(SYSDATE)+1+7/24';
BEGIN
  DBMS_JOB.SUBMIT(v_job_number, v_job_command, sysdate, v_interval, false);
  COMMIT;
END;
About your error:
PLS-00201: identifier 'PREPARE_ORACLE_TEXT_SEARCH'
must be declared
ORA-06550: line 1, column 96:
PL/SQL: Statement ignored
The problem is that the job cannot find the procedure (maybe owned by another user). The user who runs the job is not the same as the owner of the package.
Bye, Aron

You forgot the semicolon after END.
But we don't need a BEGIN-END block here, so it's OK:
v_job_command VARCHAR2(1000) := 'PREPARE_ORACLE_TEXT_SEARCH;';
As you rightly mentioned, it is probably a problem with the owner, or a typo in the name of the procedure.
Regards
Dmytro Dekhtyaryuk
Message was edited by:
dekhtyar -
Submit remote job to HDInsight cluster using its IP address.
Hi there,
I am using HDInsight and trying to submit jobs to it. I am able to submit jobs using the API provided by Azure; this works fine. I am also able to submit a job on the remote machine by opening it in a VM session.
I am now trying to submit a job to the HDInsight cluster from my machine using the IP address of the remote machine, but I am not able to submit any job; it throws an error.
Please help me with this.
Regards,
Athiram S

Hi Sudhir,
Thanks for looking into this.
We can submit a job to a Hadoop cluster using the IP address by the following method:
1) Configure certain XML files (core-site.xml, hdfs-site.xml, yarn-site.xml) on the cluster machine (namenode) with the IP address of the machine. I also make similar changes in the configuration files on my machine under "..\\etc\\hadoopcluster_IPAddress".
2) Now execute the command pig --config "..\\etc\\hadoopcluster_IPAddress" on my machine (which is connected to the namenode machine of the cluster through LAN). The MapReduce job then gets executed on the remote machine.
I am trying a similar approach for submitting the job to the HDInsight cluster. I use the headnode IP address, modified the configuration files, and used the same command as above, but I am wondering why it is not working.
Jobs submit successfully to my own cluster machine, but job submission to the HDInsight cluster fails.
Please help me on this issue.
Regards,
Athiram S -
Unable to submit a job when the variable is a composite one
Hi
I was trying to submit a job with dbms_job.submit:
DBMS_JOB.SUBMIT(l_job,
'gitm_goaml_job_proc('||''''||p_wrk_gidamlrp||''''||'),
SYSDATE,
'SYSDATE + (10/(24*60*60))');
Here p_wrk_gidamlrp is a composite-type variable which I have created in a package spec.
When I try to compile the code, I receive an error that '|| has not been used properly'. But if I replace p_wrk_gidamlrp with a simple data type variable, things work fine.
Can you please let me know the reason for this error?
Thanks
-vinod
Edited by: 965358 on Oct 15, 2012 4:07 AM

Post details of your datatype etc., otherwise we don't know what you've got in there and how we can help.
Also make sure you include your database version and use {noformat}{noformat} tags as described in the FAQ: {message:id=9360002} -
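The underlying issue is that the `what` argument must be a single string of executable PL/SQL, and a composite (record) variable cannot be concatenated into a string with `||`; only scalars can. A hedged sketch of one workaround, passing the record's scalar fields individually (the component names `field1`/`field2` are hypothetical placeholders):

```sql
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  -- Concatenate the record's scalar components, not the record itself;
  -- field1/field2 are placeholder names for the actual components.
  DBMS_JOB.SUBMIT(
    job       => l_job,
    what      => 'gitm_goaml_job_proc(''' || p_wrk_gidamlrp.field1 ||
                 ''', ''' || p_wrk_gidamlrp.field2 || ''');',
    next_date => SYSDATE,
    interval  => 'SYSDATE + (10/(24*60*60))');
  COMMIT;
END;
```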
Trying to submit a job for Compressor
I'm having trouble submitting my sequence once it's been imported into Compressor.
After performing the necessary steps and then clicking on the Submit button, the window displays the name and priority, but for the cluster it displays "no value". So I click on Submit again, and a warning appears stating: "Unable to submit to queue".
Please restart or verify your Compressor installation is correct.
I've tried to restart which didn't work either.
I've also tried going into Qmaster under System Preferences, but that seems to be fine. Please help me!

I have been experiencing the same issue with Compressor, unable to submit a job to be compressed.
This issue started recently, after the Pro App update.
I am using Compressor 3.5, there are no current updates available in "Software Update".
I have not tried reinstalling Final Cut Studio yet, I would like to avoid that if possible. I'm sure there is a software update fix coming soon... In the mean time, I am unable to use compressor. -
Submit background job in APEX -- count on all_jobs return shadow jobs.
Hi, I am trying to submit a job in APEX. The setup is as below:
On Submit - After Computation and Validations
Run Process: Once Per Page Visit
Process:
DECLARE
  v_instance_cnt NUMBER;
  job_no NUMBER;
BEGIN
  SELECT COUNT(0)
    INTO v_instance_cnt
    FROM user_jobs
   WHERE what LIKE 'pagl_refresh.master_refresh%';
  IF NVL(v_instance_cnt,0) = 0 THEN
    DBMS_JOB.SUBMIT(job_no, 'pagl_refresh.master_refresh('''||:G_BSYS_USER_NAME||''');');
    :P3_MESSAGE := 'Job has been submitted. Number is '||TO_CHAR(job_no);
  ELSE
    :P3_MESSAGE := 'The refresh is in progress. Please wait ... ('||to_char(v_instance_cnt)||')';
  END IF;
END;
Now, if I run the process, the :P3_MESSAGE message returns "The refresh is in progress. Please wait ... (5)", because the count is 5 instead of the expected 0.
If I run SELECT count(*) FROM dba_jobs WHERE lower(what) LIKE 'pagl_refresh.master_refresh%'; in SQL*Plus, it returns 0. Same result from all_jobs as well.
My suspicion is that it returns job counts including jobs that have already been removed. Yet how can APEX see these? Does APEX use some special way to look into the job queue?
Please help
Thanks

From the looks of it, the job is being submitted and run, although I would check the elapsed time to see if it's anywhere close to the 20-30 minutes you anticipate. Assuming not, I would suggest that the problem is in one of the following areas:
1. The way in which you are passing in the arguments is not conforming to the expected input format or values and it's therefore not executing as expected.
2. Your process implicitly relies on the state of your APEX application in some manner, which is not being reproduced within the procedure when the job is submitted.
In the former case, I would check the procedure's specification against the page item types being passed in; you might have to explicitly convert some of your arguments into the appropriate type.
In the latter case, well... bearing in mind that we don't know what your procedure looks like, and it's therefore kind of difficult to diagnose the problem, you'll possibly need to pass your session information into the procedure as additional parameters and re-create your session from within the code.
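It is also worth remembering that `user_jobs` only shows jobs owned by the *current* schema, and an APEX process runs as the application's parsing schema, which may differ from the user you tested with in SQL*Plus. A quick way to check which schema actually owns the queued jobs (assuming you have access to the DBA views) is:

```sql
-- schema_user shows which schema submitted each matching job
SELECT job, schema_user, what, broken
  FROM dba_jobs
 WHERE LOWER(what) LIKE 'pagl_refresh.master_refresh%';
```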
We run a Job in a 10g (10g Enterprise Edition Release 10.2.0.4.0) DB using this code in SQL*Plus.
SQL> EDIT
Wrote file afiedt.buf
1 DECLARE
2 v_JobNum NUMBER;
3 BEGIN
4 DBMS_JOB.SUBMIT(v_JobNum,'chs_job_test;',sysdate, NULL);
5 DBMS_OUTPUT.PUT_LINE('Job # = ' || v_jobnum);
6 commit;
7* END;
8 /
PL/SQL procedure successfully completed.
SQL> set serverout on
SQL> /
Job # = 462

The job runs successfully, i.e. the procedure chs_job_test is executed OK. The problem is that when we query the USER_JOBS view, we don't see any record for job 462.
Why is this?
Where can we get information on the jobs submitted?
If an error occurred when running the job, from which data dictionary view can we get this?

If you submit a job without specifying the NEXT_DATE and INTERVAL, Oracle treats it as a "single execution" job. It is removed from the job queue once it is completed.
Hemant K Chitale -
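For comparison, here is a sketch of a repeating submission; with a non-NULL interval the job stays visible in USER_JOBS between runs (the hourly interval is just an example value):

```sql
DECLARE
  v_JobNum NUMBER;
BEGIN
  -- Passing an interval keeps the job in the queue after each run
  DBMS_JOB.SUBMIT(v_JobNum, 'chs_job_test;', SYSDATE, 'SYSDATE + 1/24');
  COMMIT;
END;
```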
How to submit a job into a job group
Using job events/actions, we occasionally need a scheduled job to submit another job into the current schedule.
The issue here is that the submitted jobs go straight into the highest-level view. We will need to do this over 400 times in one night, and do not wish to flood the operators' view, so we are simply trying to find a way to submit a job so that it shows up within an already existing group. Any ideas?

It's just a thought, but you could:
Job Group A
- Jobaa: run
- Jobab: echo a blank file to a shared directory, for example: /C "echo ''> \\USNPKETLP03\Infa_Shared\File_Inbound\zcmtyr_eu.csv"
Job Group B
- Jobba: run, dependent on zcmtyr_eu.csv being present. Use the checkbox "rerun each time the dependencies are met".
- Jobbb: delete the echo file.
Cannot submit the job "PSP: Create Distribution Lines"
Hello,
I am new to Labor Distribution, and when I try to submit the job "PSP: Create Distribution Lines", I cannot pass any value for the parameter "Source Type"; it does not have any valid values (it is a table-validated value set).
Before trying to run this job, I ran "PSP: Import Payroll Transactions from HRMS".
Could someone please guide me if I am missing any step in between?
Please advise.
Thanks,
Sagar

Duplicate thread (please post only once):
Cannot submit the job "PSP: Create Distribution Lines"
Re: Cannot submit the job "PSP: Create Distribution Lines"
Thanks,
Hussein -
Create billing failue with job
hi
I have a question.
After creating the sales order, I changed the plant and shipping point of one item, then created the delivery and posted goods issue with a job. Everything was correct up to this step.
But when I created billing with a job, the invoice was missing, although I could create it manually.
Can anyone tell me the reason?
Thanks.

Hi Kevin Zhou,
I have a few questions for you. Was a delivery created for this item? I ask because this item may not be in the same delivery document as the other items of the sales order. Also, is the billing date of the item within the billing-date range the billing job runs for? For example, if your item has billing date 01/05/2009 and the billing job runs for billing dates 01/06/2009 to 01/07/2009, then this item is not taken into consideration.
Please check all the fields of the variant used for the billing job and see if your item is filtered out by any of them.
Thanks,
Mariano.