Extra job getting created in dynamic handling of jobs
I have the code below, and I notice an extra job being created in SM37. Any reasons or clues, please?
bdcjob will have A/P_ACCOUNTS_BDC
adrjob will have A/P_ACCOUNTS_ADDRESS
The name of the batch input session is A/P.
The third, extra job that is coming up is 'A/P', and I did not open any job by that name.
Thanks for your help.
Kiran
Edited by: kiran dasari on Jul 9, 2010 12:58 AM
I tried changing the tags; that did not help.
Since there are two JOB_OPEN statements, I expect to see only two jobs in SM37, but I am seeing three, as mentioned. That is the problem: I wish to know how and from where the third job is getting created.
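One way to narrow down where the unexpected third job comes from is to look at the job header table directly. The following is a diagnostic sketch only, not part of the original program; it assumes the standard table TBTCO with its scheduler fields SDLUNAME, SDLDATE and SDLTIME:

```abap
* Diagnostic sketch (hypothetical report): list all jobs scheduled
* today under the current user, with their creation times, to see
* exactly when the unexpected 'A/P' job appears relative to the two
* jobs opened explicitly by the program.
REPORT zjob_trace.

DATA lt_jobs TYPE STANDARD TABLE OF tbtco.
FIELD-SYMBOLS <job> TYPE tbtco.

SELECT * FROM tbtco INTO TABLE lt_jobs
  WHERE sdluname = sy-uname
    AND sdldate  = sy-datum.

LOOP AT lt_jobs ASSIGNING <job>.
  WRITE: / <job>-jobname, <job>-jobcount, <job>-sdltime, <job>-status.
ENDLOOP.
```

Comparing the creation time of the third job with the two explicit JOB_OPEN calls should show which statement triggers it.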
Let me try again and paste the code in the tags:
DATA: bdcjob  TYPE tbtcjob-jobname,
      bdcnum  TYPE tbtcjob-jobcount,
      adrjob  TYPE tbtcjob-jobname,
      adrnum  TYPE tbtcjob-jobcount,
      params  LIKE pri_params,
      l_valid TYPE c.

CHECK fileonly IS INITIAL.

MOVE jobname TO bdcjob.
adrjob = 'A/P_ACCOUNTS_ADDRESS'.

IF NOT logtable[] IS INITIAL.
  IF NOT testrun IS INITIAL.
*   If it is a test run: no batch input session
    SUBMIT rfbikr00 AND RETURN
           USER sy-uname
           WITH ds_name EQ file_o
           WITH fl_check EQ 'X'.                  "X = no batch input
  ELSE.
*   Create a session which will be processed by a job
    SUBMIT rfbikr00 AND RETURN
           USER sy-uname
           WITH ds_name EQ file_o
           WITH fl_check EQ ' '.

*   Open the BDC job
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname          = bdcjob
      IMPORTING
        jobcount         = bdcnum
      EXCEPTIONS
        cant_create_job  = 1
        invalid_job_data = 2
        jobname_missing  = 3
        OTHERS           = 4.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ELSE.
*     Submit RSBDCSUB to process the session in background mode
      SUBMIT rsbdcsub
             VIA JOB bdcjob NUMBER bdcnum
             WITH von     = sy-datum
             WITH bis     = sy-datum
             WITH z_verab = 'X'
             WITH logall  = 'X'
             AND RETURN.
      IF sy-subrc EQ 0.
*       Export data to a memory ID. This data will be used by the
*       program that updates the address and e-mail ID
        EXPORT t_zzupdate TO SHARED BUFFER indx(st) ID 'MEM1'.

*       Get print parameters
        CALL FUNCTION 'GET_PRINT_PARAMETERS'
          EXPORTING
            no_dialog      = 'X'
          IMPORTING
            valid          = l_valid
            out_parameters = params.

*       Open a second job to trigger the program which updates
*       addresses and e-mail IDs
        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname          = adrjob
          IMPORTING
            jobcount         = adrnum
          EXCEPTIONS
            cant_create_job  = 1
            invalid_job_data = 2
            jobname_missing  = 3
            OTHERS           = 4.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                  WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
*         Submit the program that updates e-mail IDs and long addresses
          SUBMIT zfpa_praa_address_update
                 VIA JOB adrjob NUMBER adrnum
                 TO SAP-SPOOL WITHOUT SPOOL DYNPRO
                 SPOOL PARAMETERS params
                 AND RETURN.
          IF sy-subrc EQ 0.
*           First close the dependent job (the address update job).
*           The dependency is expressed via pred_jobcount and
*           pred_jobname
            CALL FUNCTION 'JOB_CLOSE'
              EXPORTING
                jobcount             = adrnum
                jobname              = adrjob
                pred_jobcount        = bdcnum
                pred_jobname         = bdcjob
              EXCEPTIONS
                cant_start_immediate = 1
                invalid_startdate    = 2
                jobname_missing      = 3
                job_close_failed     = 4
                job_nosteps          = 5
                job_notex            = 6
                lock_failed          = 7
                invalid_target       = 8
                OTHERS               = 9.
            IF sy-subrc <> 0.
              MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                      WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDIF.

*     Close the main job (the BDC job)
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobcount             = bdcnum
          jobname              = bdcjob
          strtimmed            = 'X'
        EXCEPTIONS
          cant_start_immediate = 1
          invalid_startdate    = 2
          jobname_missing      = 3
          job_close_failed     = 4
          job_nosteps          = 5
          job_notex            = 6
          lock_failed          = 7
          invalid_target       = 8
          OTHERS               = 9.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
    ENDIF.
  ENDIF.
ELSEIF NOT t_zzupdate[] IS INITIAL.
* some other process
ENDIF.
Thanks,
Kiran
Similar Messages
-
Error in creating IO file handles for job (number 3152513)
Hi All -
I am using Tidal 5.3.1.307. And the Windows agent that is running these jobs is at 3.0.2.05.
Basically, the error in the subject was received when starting a particular job after it had been cancelled, and with a couple of other different jobs a few days before. These jobs have run successfully in the past.
This particular job was running for 500+ minutes when it should run at an estimated 40 minutes. At that time it would not allow for a re-start of the job, it just stayed in a launched status.
Trying to figure out what causes this error.
Error in creating IO file handles for job 3152513
Note: that said, when we saw two instances of this process running at the same time, we noticed some blocking on the DB side of things.
Trying to figure out if this is a known tidal issue or a coding issue or both.
Another side note, after cancelling the 2nd rerun attempt the following error was encountered: Error activating job, Duplicate.
When we did receive the Error creating IO file, the job did actually restart, but Tidal actually lost hooks into it and the query was still running as an orphan on the db server.
Thanks All!

The server to reboot is the agent server. You can try stopping the agent and then manually deleting the file. That may work. While the agent is running, the agent process may keep the file locked, so rebooting may not be sufficient.
The numerical folders are found as sub-directories off of the services directory I mentioned. I think the numbers correspond to the job type, so one number corresponds to standard jobs, another to FTP jobs. I'd just look in the numbered directories until you find a filename matching the job number.
The extensions don't really matter since you will want to delete all files that match your job number. There should only be one or two files that you need to delete and they should all be in the same numbered sub-directory.
As to the root cause of the problem, I can't really say since it doesn't happen very often. My recollection is that it is either caused by a job blowing up spectacularly (e.g. a memory leak in the program being launched by Tidal) or someone doing something atypical with the client. -
Backend PO not getting created after EHP6 upgrade
Hi Friends,
Recently there was an SAP patch upgrade of ECC 6.0, and since then, when creating a shopping cart the PO is created but it is not transferred to ECC. We are using the extended classic scenario with SAP SRM 7.0. I have checked almost everything, including debugging, and am still clueless; I have also checked the forums, but the issue is not resolved. Any hints on how to proceed?
Regards,
Ramesh.

When we create a shopping cart, the system creates a PO in the back end based on a BAPI.
Check whether the BAPI is running properly. Since you have upgraded to EHP6, make sure the required configuration has been done. Try in the development server whether POs are created in the back end.
You should check all the configuration in the back-end systems, i.e. on both the ECC and SRM sides.
Run the batch again and check whether the background job is getting created or not. -
I have recently noticed that I am getting strange files, and I am not sure where they are coming from. When opened using vi, the files show the text MUTX and that is all. Their names start with 0x, followed by a rather long string of hexadecimal digits, for example 0x6c6d48142. The files are being placed directly on my main hard drive. They are 44 bytes in size, but many, many of them are being created daily. I have found little to no documentation explaining what MUTX may be, except that it may be related to some type of "leak", which I do not understand. Has anyone seen this, or this type of thing, before? Thank you for your help
AnkitV wrote:
Hi
I am facing the below peculiar problem.
Our prod database D1 is installed on host H1 and PL/SQL jobs run on server S1 accessing D1 and create files on S1 only.
Recently I got same jobs scheduled to run on S2 accessing D2 database (test) installed on H2 serve,r but files are getting created on H2 instead of S2.
Directory object is EXTERNAL_TABLE_DIR on both S1 and S2.
Actually files should be created on S2 only as jobs are being run there only.
Can you please tell why are the files getting created on H2 given that jobs are run from S2 and what can be done to rectify this ?
Thanks a lot

Hi,
The job will create the file only on the server where the database is installed, not on any other server. -
SM:EXEC SERVICES job getting failed in Solution manager
HI All,
The SM:EXEC SERVICES job is failing in Solution Manager, and the job log is the following:
01-31-2012 19:09:21 Job started
01-31-2012 19:09:21 Step 001 started (program RDSMOPBACK_AUTOSESSIONS_ALL, variant , user ID BASISUSER)
01-31-2012 19:09:21 MemSize Begin = 796480 Bytes
01-31-2012 19:09:21 *******************************
01-31-2012 19:09:21 MemSize Begin = 1919120 Bytes
01-31-2012 19:09:23 Solution <000000106120000> "SID..." is being edited (Operations)
01-31-2012 19:09:23 <000000106120000> "SID..."(Operations)
01-31-2012 19:09:23 "Create_Periodic_Services" uses = 0 bytes
01-31-2012 19:09:23 "Create_Periodic_Services" uses = 690 us time
01-31-2012 19:09:23 Trying to perform session EC2000000001313
01-31-2012 19:09:24 MODEL_KEY: SID Installation Number: 6 MODEL_MODE: E MODEL_VERSNR 00001 MODEL_CLASS: EWA Data Model
01-31-2012 19:09:30 Definition nicht vorhanden
01-31-2012 19:09:33 Definition nicht vorhanden
01-31-2012 19:09:45 Internal session terminated with a runtime error (see ST22)
01-31-2012 19:09:45 Job cancelled
St22 Dump:
The exception 'CX_DSVAS_API_CONTEXT_INSTANCE' was raised, but it was not caught
anywhere along
the call hierarchy.
Since exceptions represent error situations and this error was not
adequately responded to, the running ABAP program 'SAPLDSVAS_PROC' has to be
terminated.
Please suggest how to proceed further and provide your valuable input.
Thanks
Nekkalapu

Hi Siva,
I have made the changes according to the note, but I still have the same problem.
Please suggest any other ways to resolve this issue .
Advance thanks
Regards
Nekkalapu -
SM:SELFDIAGNOSIS background job getting failed in Solution manager
Hi All,
The SM:SELFDIAGNOSIS background job is failing on the Solution Manager server:
Job Log:
2-22-2012 16:44:02 Job started 00 516 S
2-22-2012 16:44:02 Step 001 started (program RDSWP_SELF_DIAGNOSIS, variant &0000000000324, user ID BASISUSER) 00 550 S
2-22-2012 16:45:07 Internal session terminated with a runtime error (see ST22) 00 671 A
2-22-2012 16:45:07 Job cancelled 00 518 A
ST22: Dump:
Runtime Errors ITAB_DUPLICATE_KEY
Date and Time 02-22-2012 16:45:07
Short text
A row with the same key already exists.
What happened?
Error in the ABAP Application Program
The current ABAP program "CL_DSWP_SD_DIAGNOSE_CONSISTENTCP" had to be
terminated because it has
come across a statement that unfortunately cannot be executed.
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
An entry was to be entered into the table
"\CLASS=CL_DSWP_SD_DIAGNOSE_CONSISTENT\METHOD=CHECK_USERS_BP\DATA=LT_USCP"
(which should have
had a unique table key (UNIQUE KEY)).
However, there already existed a line with an identical key.
The insert-operation could have ocurred as a result of an INSERT- or
MOVE command, or in conjunction with a SELECT ... INTO.
The statement "INSERT INITIAL LINE ..." cannot be used to insert several
initial lines into a table with a unique key.
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"ITAB_DUPLICATE_KEY" " "
"CL_DSWP_SD_DIAGNOSE_CONSISTENTCP" or "CL_DSWP_SD_DIAGNOSE_CONSISTENTCM00M"
"CHECK_USERS_BP"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
Kindly suggest how to resolve the above issue.
Thanks
Nekkalapu

Hi,
Details:
SAP EHP 1 for SAP Solution Manager 7.0
SAP_ABA 701 0006 SAPKA70106
SAP_BASIS 701 0006 SAPKB70106
PI_BASIS 701 0006 SAPK-70106INPIBASI
ST-PI 2008_1_700 0002 SAPKITLRD2
CRMUIF 500 0004 SAPK-50004INCRMUIF
SAP_BW 701 0006 SAPKW70106
SAP_AP 700 0019 SAPKNA7019
BBPCRM 500 0016 SAPKU50016
BI_CONT 704 0007 SAPK-70407INBICONT
CPRXRPM 400 0016 SAPK-40016INCPRXRP
ST 400 0024 SAPKITL434
ST-A/PI 01M_CRM570 0000 -
ST-ICO 150_700 0009 SAPK-15079INSTPL
ST-SER 701_2010_1 0002 SAPKITLOS2
Thanks
Nekkalapu -
Background job getting failed every day
Hi,
These background jobs are failing every day, but they ran fine in the initial period. Please help me out with this issue.
SAP_REORG_UPDATERECORDS
SAP_WP_CACHE_RELOAD_FULL
SLCA_LCK_SYNCHOWNERS
Regards
Naanas.

If this is R/3, then none of the above jobs should exist in the system. If they were scheduled by default along with the other standard jobs, they should be removed:
Note 1034532 - Changes for standard jobs
Note 931436 - SLCA_LCK_SYNCHOWNERS standard job terminates -
Spool not getting created in batch job
Hello experts,
We have a requirement to print BA00, BA01 and LD00 from different applications.
We have used the same code for BA00, BA01, and LD00.
Normally the code works fine, except in a few cases where the batch job is created but the spool is not.
DATA: job_name         TYPE tbtcjob-jobname,
      number           TYPE tbtcjob-jobcount,
      print_parameters TYPE pri_params,
      wv_pripar        TYPE pri_params,
      wv_arcpar        TYPE arc_params,
      wv_val           TYPE c VALUE 'X',
      wv_false         TYPE c VALUE 'X',
      gv_mandt         TYPE tsp01-rqclient,
      gv_user          TYPE tsp01-rqowner.

CLEAR: gv_mandt, gv_user, number.
gv_mandt = sy-mandt.
gv_user  = sy-uname.
job_name = 'DELIVERY_PRINTING'.

CLEAR range1.
range1-sign   = 'I'.
range1-option = 'EQ'.
range1-low    = 'LD00'.
CLEAR range1-high.
APPEND range1 TO range.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname          = job_name
  IMPORTING
    jobcount         = number
  EXCEPTIONS
    cant_create_job  = 1
    invalid_job_data = 2
    jobname_missing  = 3
    OTHERS           = 4.
IF sy-subrc = 0.
  SUBMIT sd70av2a WITH rg_kschl IN range
                  WITH rg_vbeln IN deli_tab
                  TO SAP-SPOOL
                  SPOOL PARAMETERS print_parameters
                  WITHOUT SPOOL DYNPRO
                  VIA JOB job_name NUMBER number
                  AND RETURN.
  IF sy-subrc = 0.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount             = number
        jobname              = job_name
        strtimmed            = 'X'
      EXCEPTIONS
        cant_start_immediate = 1
        invalid_startdate    = 2
        jobname_missing      = 3
        job_close_failed     = 4
        job_nosteps          = 5
        job_notex            = 6
        lock_failed          = 7
        OTHERS               = 8.
    IF sy-subrc <> 0.
    ENDIF.
  ENDIF.

  WAIT UP TO 10 SECONDS.

* Give the data to the printer
* CLEAR gv_rqident.
  SELECT SINGLE listident INTO gv_listident FROM tbtcp
    WHERE jobname  = job_name
      AND jobcount = number.
  IF sy-subrc = 0.
    MOVE gv_listident TO gv_rqident.
    CALL FUNCTION 'RSPO_OUTPUT_SPOOL_REQUEST'
      EXPORTING
        spool_request_id = gv_rqident.
  ENDIF.
ENDIF.
The same code is for all the output BA00, BA01 and LD00.
The issue is that for a few of the batch jobs created in the above code, the spool is not getting created.
Thanks and Regards,
Paritosh Pandey

Hi,
The code starting from WAIT UP TO 10 SECONDS not only looks atrocious and is error-prone (what happens if no free background work process is available for 10 seconds, or if the job runs for more than 10 seconds?), it also seems utterly unnecessary. Is there any reason the immediate spool output cannot be handled by simply setting PRINT_PARAMETERS-PRIMM = 'X'?
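For what it's worth, a minimal sketch of that suggestion might look like the following. This is an assumption-laden sketch, not the poster's actual code: the output device 'LP01' is a placeholder, and the parameter names are those of the standard GET_PRINT_PARAMETERS interface:

```abap
* Sketch: obtain valid print parameters without a dialog and request
* immediate output (fills PRI_PARAMS-PRIMM), so the spool is printed
* as soon as the background job creates it - no WAIT or manual
* RSPO_OUTPUT_SPOOL_REQUEST call should then be needed.
DATA: ls_params TYPE pri_params,
      lv_valid  TYPE c.

CALL FUNCTION 'GET_PRINT_PARAMETERS'
  EXPORTING
    destination    = 'LP01'   " placeholder output device
    immediately    = 'X'      " print immediately (PRIMM = 'X')
    no_dialog      = 'X'
  IMPORTING
    out_parameters = ls_params
    valid          = lv_valid.

IF lv_valid = 'X'.
  SUBMIT sd70av2a TO SAP-SPOOL
         SPOOL PARAMETERS ls_params
         WITHOUT SPOOL DYNPRO
         VIA JOB job_name NUMBER number
         AND RETURN.
ENDIF.
```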
cheers,
Janis
Edit in:
OK, I just read "Document in spool but not printed"... Do not use an output device that relies on frontend access methods to do printing from background processing. The frontend is not available during background processing, period. To my knowledge there is no easy, clean solution to this problem other than defining and assigning, per user, output devices that do not rely on frontend printing. What if the dialog user has logged off by the time the batch job finishes? May the spool remain unprinted?
Has SAP Basis refused to define output devices (I have a hard time believing this...)? Well, tell them to stop being silly, and if they do not cooperate, tell the owner of the requirement that it cannot be implemented due to an uncooperative Basis team... and that the users will have to keep going to SP01 and manually start the output!
Message was edited by: Jānis B -
Duplicate deliveries getting created by batch job against STO
Hi Experts,
I am facing one issue where duplicate deliveries are getting created by batch job against Intercopmany STO.
Scenario is PO having one line item with ordered qty of 8000kg.
Through batch job, two deliveries got created on same day for PO line item.
One delivery got created for 8000kg and another delivery has got created for 7000Kg. So user has deleted second delivery of 7000kg.
Next day again the delivery got created for 8000kg for the same PO line item through batch job.
I am wondering how the duplicate deliveries are getting created by batch job for PO even though it has no open items.
All deliveries got created through batch job only as cross checked the user name in delivery.
Kindly help to fix the issue.

Hi Amit
I assume you are talking about outbound deliveries. In this case it would be worth checking the customer master record for the receiving plant. In the sales area data there is a shipping tab which contains several settings used to control delivery creation for customers.
It is possible to control how the system behaves when you have a stock shortage and restrict the number of partial deliveries. This might help you control this situation and might be the cause.
Regards
Robyn -
How to set/get value in dynamically created components?
I need to create dynamically form based on definition written in database.
I created an empty panelGreed in my jsf page
<h:panelGrid columns="2" id="parseg"
binding="#{ParsegBean.uiKatparam}">
</h:panelGrid>

I can't use dataTable, because my form contains various component types (SelectOneMenu, OutputText, and InputText).
In my bean I create components dynamically:
private UIPanel uiKatparam = null;
Iterator componentIt = myComponentList.iterator();
while(componentIt.hasNext()){
MyComponent myComponent = (MyComponent) componentIt.next();
HtmlOutputText prompt = new HtmlOutputText();
prompt.setValue(myComponent.getPrompt());
prompt.setId(myComponent.getPromptId());
uiKatparam.getChildren().add(prompt);
switch (myComponent.type) {
case 1: //InputText
HtmlInputText iText = new HtmlInputText();
iText.setId(myComponent.getId());
iText.setValue(myComponent.getDefaultValue());
uiKatparam.getChildren().add(iText);
break;
case 2: //SelectOneMenu
HtmlSelectOneMenu selectOneMenu = new HtmlSelectOneMenu();
// ...
uiKatparam.getChildren().add(selectOneMenu);
break;
default:
break;
}When I try to get values in my Action:
Iterator it = myBean.getUiKatparam().getChildren().iterator();
while (it.hasNext()) {
Object ob = it.next();
if (ob.getClass().getName().matches(".*HtmlInputText")) {
HtmlInputText t = (HtmlInputText) ob;
String id = t.getId();
String value = (String) t.getValue();
// ...
// ...
}

and the value is still equal to the initial value.
How can I get the value entered into my dynamically created InputText?
Michal

I solved my problem by creating a UIData with dynamically added and dynamically rendered components (in each row a different component is rendered).
-
Hi,
I need to run an SQL script every night across all of my database targets. My problem is that the data contained in the script will change every day, so I need some way of creating a dynamic job. Any ideas on this?
Could I create/submit the job using emcli called from a korn shell cron job for example?
Any other suggestions?

If you execute a script from the filesystem, it needs to be available on the Agent-side host.
It would therefore be better to include the script itself in the job specification; you can then decide to store the job in the Job Library. The script is then stored in the library in a central location, and you can execute it on any host target you like.
Regards
Rob
http://oemgc.wordpress.com -
Create a Job for a transaction dynamically through ABAP program.
Hello Experts,
Can a job be created for a transaction dynamically? Say, for example, I have a parameter on the selection screen which takes the name of a transaction, and I then have to create a job to run that transaction.
Please provide sample code.
Regards,
Mansi.

Hi,
Yes, you can call a transaction like that: take the transaction code in that parameter, then use it in your program with a statement like
CALL TRANSACTION (variable), in a background process.
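A rough sketch of that suggestion, as a hypothetical wrapper report (note that a transaction requiring dialog screens will not run in a plain background step, and an authority check on the transaction code is advisable before calling it):

```abap
* Hypothetical wrapper report: the transaction code is supplied on
* the selection screen and called dynamically - CALL TRANSACTION
* accepts a variable as well as a literal. Schedule this report as
* the background job step.
REPORT zcall_tcode_dynamic.

PARAMETERS p_tcode TYPE sy-tcode OBLIGATORY.

START-OF-SELECTION.
* Dynamic transaction call
  CALL TRANSACTION p_tcode.
```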
regards
twinkal -
Spool list is not getting created for background job
I am creating a background job using JOB_OPEN, then submitting my Z report with a SUBMIT statement, and then closing the job using JOB_CLOSE. The job gets created in SM37 and finishes, but it does not create a spool list showing the output.
Any idea how to do this?
Thanks in advance.

DATA: lv_jobname      TYPE tbtcjob-jobname,
      lv_jobcount     TYPE tbtcjob-jobcount,
      lv_variant      TYPE variant,
      wa_var_desc     TYPE varid,
      wa_var_text     TYPE varit,
      it_var_text     TYPE TABLE OF varit,
      it_var_contents TYPE TABLE OF rsparams.

REFRESH: it_var_contents, it_var_text.
CLEAR: wa_var_desc, wa_var_text.

CALL FUNCTION 'RS_REFRESH_FROM_SELECTOPTIONS'
  EXPORTING
    curr_report     = sy-cprog
  TABLES
    selection_table = it_var_contents
  EXCEPTIONS
    not_found       = 1
    no_report       = 2
    OTHERS          = 3.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

CONCATENATE sy-datum sy-timlo INTO lv_variant.

wa_var_desc-mandt      = sy-mandt.
wa_var_desc-report     = sy-cprog.
wa_var_desc-variant    = lv_variant.
wa_var_desc-transport  = 'F'.
wa_var_desc-environmnt = 'B'.
wa_var_desc-version    = '1'.
wa_var_desc-protected  = 'X'.

wa_var_text-mandt   = sy-mandt.
wa_var_text-langu   = sy-langu.
wa_var_text-report  = sy-cprog.
wa_var_text-variant = lv_variant.

lv_jobname = lv_variant.

CONCATENATE 'Batch Job Variant -'(006)
            sy-uname INTO wa_var_text-vtext.
APPEND wa_var_text TO it_var_text.

* Create the variant for the background job
CALL FUNCTION 'RS_CREATE_VARIANT'
  EXPORTING
    curr_report               = sy-cprog
    curr_variant              = lv_variant
    vari_desc                 = wa_var_desc
  TABLES
    vari_contents             = it_var_contents
    vari_text                 = it_var_text
  EXCEPTIONS
    illegal_report_or_variant = 1
    illegal_variantname       = 2
    not_authorized            = 3
    not_executed              = 4
    report_not_existent       = 5
    report_not_supplied       = 6
    variant_exists            = 7
    variant_locked            = 8
    OTHERS                    = 9.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

* Open the job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname          = lv_jobname
  IMPORTING
    jobcount         = lv_jobcount
  EXCEPTIONS
    cant_create_job  = 1
    invalid_job_data = 2
    jobname_missing  = 3
    OTHERS           = 4.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

* Submit the job in background mode
CALL FUNCTION 'JOB_SUBMIT'
  EXPORTING
    authcknam               = sy-uname
    jobcount                = lv_jobcount
    jobname                 = lv_jobname
    report                  = sy-repid
    variant                 = lv_variant
  EXCEPTIONS
    bad_priparams           = 1
    bad_xpgflags            = 2
    invalid_jobdata         = 3
    jobname_missing         = 4
    job_notex               = 5
    job_submit_failed       = 6
    lock_failed             = 7
    program_missing         = 8
    prog_abap_and_extpg_set = 9
    OTHERS                  = 10.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

* Close the job
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount             = lv_jobcount
    jobname              = lv_jobname
    strtimmed            = 'X'
  EXCEPTIONS
    cant_start_immediate = 1
    invalid_startdate    = 2
    jobname_missing      = 3
    job_close_failed     = 4
    job_nosteps          = 5
    job_notex            = 6
    lock_failed          = 7
    invalid_target       = 8
    OTHERS               = 9.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

Hope this will be helpful.. -
How can I create events dynamically for a Group/List?
Hey,
At the moment I'm programming a little application where I want to add elements to my center pane.
Simplified, I have:
- Center pane: here the elements shall appear on right-click in the bottom pane. This pane is used as a kind of playground where you can drag/drop and connect items from the bottom pane.
- Bottom pane: here I have about 180 elements which are quite similar. This is something like a menu of items you can use. I realised them as Java classes extended from a parent class with different calculations.
What I want:
I want to create an event handler for EACH of the "menu" elements dynamically. Just something like:
for (int i = 0; i < basic_menu_list.size(); i++) {
    final Element el = hbox_bottom.getChildren().get(i);
    hbox_bottom.getChildren().get(i).setOnMouseClicked(new EventHandler<MouseEvent>() {
        public void handle(MouseEvent event) {
            if (event.isSecondaryButtonDown()) {
                playground.add(el);
                redrawPlayground();
            }
        }
    });
}

But as I expected, this doesn't work...
Now my question:
How can I solve this problem? Is there any option to listen to all elements of a group without hard-coding every single listener?
Thanks for your help,
Martin

Hello User,
Why did you expect that it wouldn't work?
You have an example with a transition applied to a bunch of circles in Getting Started with JavaFX (http://download.oracle.com/javafx/2.0/get_started/jfxpub-get_started.htm)
Here a basic example class...
import javafx.application.Application;
import javafx.event.EventHandler;
import javafx.scene.Node;
import javafx.scene.Scene;
import javafx.scene.control.TextBox;
import javafx.scene.input.MouseEvent;
import javafx.scene.layout.Pane;
import javafx.scene.layout.VBox;
import javafx.stage.Stage;
public class HelloWorld extends Application {
//~ ----------------------------------------------------------------------------------------------------------------
//~ Methods
//~ ----------------------------------------------------------------------------------------------------------------
public static void main(String[] args) {
// Entry point
Application.launch(args);
@Override
public void start(Stage mainStage) throws Exception {
Pane pane = new Pane();
Scene scene = new Scene(pane, 200, 200);
VBox vBox = new VBox();
TextBox input1 = new TextBox();
TextBox input2 = new TextBox();
vBox.getChildren().addAll(input1, input2);
for (Node input : vBox.getChildren()) {
input.setOnMouseClicked(new EventHandler<MouseEvent>() {
public void handle(MouseEvent event) {
System.out.println("test click");
pane.getChildren().add(vBox);
mainStage.setScene(scene);
mainStage.setVisible(true);
}Niculaiu -
Creating a Dynamic Node for a Dynamic Graphic - Tutorial
Hi everyone,
I'm sharing my first tutorial, hope it'll be helpful for you.
In the Layout tab, it is possible to create the UI element "Business Graphic". It is a very simple tool that only requires a context node with a category attribute (that is, the values that will appear on the x axis) and one or more series (each one with a color, generally a numerical value). It is possible to add a label to each series for better understanding of the graphic.
Although it is a very powerful tool, there are some problems when we must create a graphic with N series and different labels. If the graphic uses an ALV or an internal table to fetch data, at runtime we can have more or fewer series. The definition of a static node with generic series is not enough in this case. That's why I'd like to present this little tutorial on creating a dynamic node that feeds a dynamic graphic:
In the WDDOINIT or Event Handler method that starts the application (once the data is available) we should create the dynamic node in the following way:
DATA: lr_node_info TYPE REF TO if_wd_context_node_info,
lt_attributes TYPE cl_abap_structdescr=>component_table,
attribute LIKE LINE OF lt_attributes,
struct_type TYPE REF TO cl_abap_structdescr,
lo_dyn_node TYPE REF TO if_wd_context_node.
* Let's suppouse we can LOOP at the data table, so we can fetch the
* category data type (for example company name, month, year, ...)
* and each series with a numeric type
attribute-name = 'CATEGORY'.
attribute-type ?= cl_abap_datadescr=>describe_by_name( 'STRING' ).
INSERT attribute INTO TABLE lt_attributes.
attribute-name = 'SERIE1'.
attribute-type ?= cl_abap_datadescr=>describe_by_name( 'I' ).
INSERT attribute INTO TABLE lt_attributes.
attribute-name = 'SERIE2'.
attribute-type ?= cl_abap_datadescr=>describe_by_name( 'I' ).
INSERT attribute INTO TABLE lt_attributes.
* Once we have all the attributs for the node, we create a formal structure
struct_type = cl_abap_structdescr=>create( lt_attributes ).
* Now we can get the context information to add a new node
lr_node_info = wd_context->get_node_info( ).
* Create the node
lr_node_info = lr_node_info->add_new_child_node(
name = 'GRPH_DYN'
is_mandatory = abap_false
is_multiple = abap_true
static_element_rtti = struct_type
is_static = abap_false ).
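The attribute list above is hardcoded for two series. When the number of series is only known at runtime (the whole point of a dynamic node), it can be built in a loop before creating the structure. This is just a sketch, reusing the variables declared above; lv_series_count is a hypothetical variable holding the number of series you derived from your data:

```abap
* Sketch: build the attribute list for N series, N known only at runtime
DATA: lv_series_count TYPE i VALUE 5,       "assumption: derived from your data
      lv_index        TYPE n LENGTH 2.
attribute-name  = 'CATEGORY'.
attribute-type ?= cl_abap_datadescr=>describe_by_name( 'STRING' ).
INSERT attribute INTO TABLE lt_attributes.
DO lv_series_count TIMES.
  lv_index = sy-index.
* Attribute names become SERIE01, SERIE02, ...
  CONCATENATE 'SERIE' lv_index INTO attribute-name.
  attribute-type ?= cl_abap_datadescr=>describe_by_name( 'I' ).
  INSERT attribute INTO TABLE lt_attributes.
ENDDO.
struct_type = cl_abap_structdescr=>create( lt_attributes ).
```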
* Now we should populate the node. I'll create a hardcoded table,
* simulating the internal table that you should already have
TYPES: BEGIN OF tw_alv,
category TYPE string,
serie1 TYPE i,
serie2 TYPE i,
END OF tw_alv.
TYPES: tt_alv TYPE STANDARD TABLE OF tw_alv.
DATA: lw_alv TYPE tw_alv,
lt_alv TYPE tt_alv.
lw_alv-category = 'Alfa'.
lw_alv-serie1 = 3.
lw_alv-serie2 = 8.
APPEND lw_alv TO lt_alv.
lw_alv-category = 'Beta'.
lw_alv-serie1 = 4.
lw_alv-serie2 = 4.
APPEND lw_alv TO lt_alv.
lw_alv-category = 'Gamma'.
lw_alv-serie1 = 1.
lw_alv-serie2 = 3.
APPEND lw_alv TO lt_alv.
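The hardcoded table above relies on a static TYPES definition, which again fixes the number of series at design time. For a truly dynamic case, the internal table itself can be created from the same RTTI structure, so it always matches the node. A sketch, reusing struct_type from above and filling one row generically:

```abap
* Sketch: create the internal table from the RTTI structure at runtime
DATA: table_type TYPE REF TO cl_abap_tabledescr,
      lr_data    TYPE REF TO data.
FIELD-SYMBOLS: <lt_data> TYPE STANDARD TABLE,
               <lw_data> TYPE any,
               <lv_comp> TYPE any.
table_type = cl_abap_tabledescr=>create( struct_type ).
CREATE DATA lr_data TYPE HANDLE table_type.
ASSIGN lr_data->* TO <lt_data>.
* Fill one row generically, component by component
APPEND INITIAL LINE TO <lt_data> ASSIGNING <lw_data>.
ASSIGN COMPONENT 'CATEGORY' OF STRUCTURE <lw_data> TO <lv_comp>.
<lv_comp> = 'Alfa'.
ASSIGN COMPONENT 'SERIE1' OF STRUCTURE <lw_data> TO <lv_comp>.
<lv_comp> = 3.
* ... then bind <lt_data> to the node with bind_table, as shown next
```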
* Now let's call the recently created node and bind the lt_alv table.
* Get node from context
lo_dyn_node = wd_context->get_child_node( name = 'GRPH_DYN' ).
* Bind the internal table to the dynamic node
lo_dyn_node->bind_table( new_items = lt_alv ).
* It's always good to check whether the table was successfully bound.
* I refresh the lt_alv table and read the values back from the node as a check
REFRESH lt_alv.
lo_dyn_node->get_static_attributes_table( IMPORTING table = lt_alv ).
There are other ways of passing the node structure to add_new_child_node, but it only worked for me with STATIC_ELEMENT_RTTI. The node is now created and holds the data required for the graphic. Next, in WDDOMODIFYVIEW (or in a method to which you have passed the target view as a parameter), we create the graphic, bind the category and the series, and show it on the screen.
DATA: lr_graph TYPE REF TO cl_wd_business_graphics,
lr_cat TYPE REF TO cl_wd_category,
lr_series1 TYPE REF TO cl_wd_simple_series,
lr_series2 TYPE REF TO cl_wd_simple_series,
lr_container TYPE REF TO cl_wd_uielement_container,
lr_flow TYPE REF TO cl_wd_flow_data.
* Get the root element from the Layout or the specific
* container you have created for the graphic
lr_container ?= view->get_element( 'ROOTUIELEMENTCONTAINER' ).
* Create a line business graphic
lr_graph = cl_wd_business_graphics=>new_business_graphics(
bind_series_source = 'GRPH_DYN'
chart_type = cl_wd_business_graphics=>e_chart_type-lines
height = 340
width = 750
id = 'GRAPH' ).
* Create the flow data for the new UI Element business graphic
lr_flow = cl_wd_flow_data=>new_flow_data( element = lr_graph ).
* Set graph in the root container from the Layout tab
lr_container->add_child( lr_graph ).
* Bind the category from the dynamic node to the dynamic graphic
lr_cat = cl_wd_category=>new_category(
view = view
bind_description = 'GRPH_DYN.CATEGORY'
tooltip = 'Company Name' ).
lr_graph->set_category( lr_cat ).
* Bind the two series from the dynamic node to the dynamic graphic
lr_series1 = cl_wd_simple_series=>new_simple_series(
bind_value = 'GRPH_DYN.SERIE1'
label = 'Sales'
view = view
tooltip = 'Average Sales' ).
lr_graph->add_series( lr_series1 ).
lr_series2 = cl_wd_simple_series=>new_simple_series(
bind_value = 'GRPH_DYN.SERIE2'
label = 'Purchases'
view = view
tooltip = 'Average Purchases' ).
lr_graph->add_series( lr_series2 ).
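The two series above are hardcoded, but since the node attributes follow a naming pattern, the series can also be created in a loop when their number is only known at runtime. A sketch under the same assumptions as before (lv_series_count is hypothetical; labels and tooltips would come from your own data):

```abap
* Sketch: add N series to the graphic in a loop
DATA: lr_series       TYPE REF TO cl_wd_simple_series,
      lv_series_count TYPE i VALUE 5,   "assumption: same count used for the node
      lv_index        TYPE n LENGTH 2,
      lv_bind         TYPE string,
      lv_id           TYPE string.
DO lv_series_count TIMES.
  lv_index = sy-index.
* Binding path and element id per series, e.g. GRPH_DYN.SERIE01
  CONCATENATE 'GRPH_DYN.SERIE' lv_index INTO lv_bind.
  CONCATENATE 'SERIES' lv_index INTO lv_id.
  lr_series = cl_wd_simple_series=>new_simple_series(
    view       = view
    id         = lv_id
    bind_value = lv_bind ).
  lr_graph->add_series( lr_series ).
ENDDO.
```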
Finally we have created our business graphic. Test the application and you'll see something like the attached image.
Hope you'll find it useful.
Daniel Monteros.