A question about Job Creation

Hi Expert,
Using JOB_OPEN, JOB_SUBMIT, and JOB_CLOSE, we can schedule a job.
When the first two FMs are executed, the job is still in 'scheduled' status. Only when the FM JOB_CLOSE is executed does the job move to 'released' status.
In my application, I need to check the job status. If the job is in 'scheduled' status, I assume that the job creation was not successful; otherwise, the job creation was successful.
But the issue is: if the job is in 'scheduled' status, how can I know whether all three FMs were executed in the job creation, or only JOB_OPEN and JOB_SUBMIT were executed while JOB_CLOSE has not started yet?
In the latter case, the application should wait some time and check the job again later.
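For reference, a minimal ABAP sketch of the sequence described above (the job name, the report name RSTEST00 and the variable names are only illustrative, and the exception handling is reduced to OTHERS):

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_MY_JOB',
      lv_jobcount TYPE tbtcjob-jobcount,
      lv_released TYPE c LENGTH 1.

* Step 1: open the job -> job is created in 'scheduled' status
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.

* Step 2: add a step to the job -> still 'scheduled'
CALL FUNCTION 'JOB_SUBMIT'
  EXPORTING
    authcknam = sy-uname
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    report    = 'RSTEST00'          "illustrative report name
  EXCEPTIONS
    OTHERS    = 1.

* Step 3: close the job -> only now does it become 'released'
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount         = lv_jobcount
    jobname          = lv_jobname
    strtimmed        = 'X'          "start immediately
  IMPORTING
    job_was_released = lv_released
  EXCEPTIONS
    OTHERS           = 1.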
Thanks for your support
Best Regards, Johnney.

Hi,
Suppose there are two APIs.
In API1, there are four steps:
1. Call FM JOB_OPEN
2. Call FM JOB_SUBMIT
3. Save the jobname and jobcount in the DB
4. Call FM JOB_CLOSE
In API2, there are two steps:
1. Get the jobname and jobcount from the DB
2. Call FM BP_JOBLIST_STATUS_GET to get the job status
API1 and API2 are executed asynchronously. Consider the following case:
After step 3 is executed (step 4 has not yet started), API2 starts to run. The job status that API2 gets at this point is 'Scheduled'.
This is not correct, because the job creation is not yet finished. API2 should wait some time and check again later.
So my question is: how can API2 know that the job creation is not finished?
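To illustrate the "wait and check later" behaviour, here is a rough sketch of what API2 could do. It is only an assumption-based illustration: it uses BP_JOB_READ instead of BP_JOBLIST_STATUS_GET, the retry limit and wait time are arbitrary, and the opcode '35' (read the job header only) should be checked against the constants in include LBTCHDEF on your release:

DATA: lv_jobname  TYPE tbtcjob-jobname,
      lv_jobcount TYPE tbtcjob-jobcount,
      ls_jobhead  TYPE tbtcjob.

* lv_jobname / lv_jobcount are assumed to have been read from the DB (step 1 of API2)
DO 10 TIMES.                              "arbitrary retry limit
  CALL FUNCTION 'BP_JOB_READ'
    EXPORTING
      job_read_jobcount = lv_jobcount
      job_read_jobname  = lv_jobname
      job_read_opcode   = '35'            "read job header only (assumption, see LBTCHDEF)
    IMPORTING
      job_read_jobhead  = ls_jobhead
    EXCEPTIONS
      OTHERS            = 1.

  IF sy-subrc = 0 AND ls_jobhead-status <> 'P'.
    "No longer 'P' (scheduled), e.g. 'S' (released): JOB_CLOSE has run, creation finished
    EXIT.
  ENDIF.

  "Still 'scheduled': JOB_CLOSE may simply not have run yet, so wait and retry
  WAIT UP TO 5 SECONDS.
ENDDO.

If the job is still 'scheduled' after the retries are exhausted, API2 can treat the creation as failed; otherwise the status returned by the read is the real job status.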
Thanks & Best Regards, Johnney.

Similar Messages

  • A question about Job scheduling in cluster

    hi all
    I have a WebLogic cluster and want to use the built-in CommonJ support to do some scheduling work. The PDF version of the document "Timer and Work Manager API (CommonJ) Programmer's Guide" says something like this on page 7: "The Timer Listener class must be present in the server system classpath". Does it mean that I should not put it in WEB-INF/classes? Instead, should I jar it and put the jar somewhere inside wls_home/server/lib or ext?
    thanks a lot :-]

    hi mchellap
    here is another question about timers in the cluster:
    1) I implemented a serializable TimerListener which I want to make cluster aware
    2) I put the JNDI entry "timer/MyTimer" in web.xml, which maps to commonj.timers.TimerManager
    3) I created a data source on the cluster in the console, with the tables created in the DB
    after the cluster is started, the job is to print out "new Date()" in the console every 40 seconds, and it worked very well
    I am expecting something in the DB table, but there is nothing, not even an exception. Anything wrong here?
    thanks a lot

  • A question about job

    Hi Expert,
    I have a question about the job count in the system. If I create a lot of jobs at the same time, for example 10,000 or even more, what will happen? Is this possible? Does the system have any limit on the number of jobs at the same time?
    Thanks in advance,
    Best Regards, Johnney.

    Hi,
    Please check this link:
    Max jobs
    Hope it helps.

  • Three questions about the creation of secondary indexes in ODS design mode

    1. When we right-click the Indexes folder in ODS design and select Create to create the index folder 010, a small window pops up with a checkbox called "Unique". Do we have to check this checkbox to create folder 010?
    2. If we would like to include 3 InfoObjects in the secondary indexes, how many folders do we need to create? Do we need to create 3 folders (010, 020, and 030) and place the 3 InfoObjects into the 3 folders respectively, or create only one folder 010 and place all 3 InfoObjects into that one folder?
    3. In SAP documentation titled "BW Performance Tuning" about the Indices, it says "If the (uncompressed) F fact table is small, it is usually faster to drop the secondary (bitmap) indices before the load and build them up after the load"
    Someone here says a secondary index is only for an ODS, not for cubes, but the above statement refers to the F fact table, which is related to cubes, so how to explain that? Also, the above statement seems related to the variants in a process chain: when you bring in a variant to load data to a cube, the drop index and generate index variants are created automatically. Does it refer to this? And what is a bitmap index?
    Thanks in advance and we will continue to give you reward points!

    Hi Kevin,
    Cubes have secondary indexes. The indexes that we find in the Manage screen of the cube are the secondary indexes. Please check the following link for more information.
    http://help.sap.com/saphelp_erp2004/helpdata/en/80/1a6473e07211d2acb80000e829fbfe/frameset.htm
    Also, to know about Unique indexes, use the following link
    http://help.sap.com/saphelp_erp2004/helpdata/en/9b/c743f5b40711d194f900a0c929b3c3/frameset.htm
    "The indexes displayed are the secondary indexes of the F and E fact tables for the InfoCube. The primary indexes and those defined by the user are not displayed. " (SAP help).
    This should answer your questions.
    Regards,
    Praveen.

  • Questions about job scheduling

    I am a newbie in Oracle, and I have several questions about the scheduler.
    1. If I have just one job class and there are many concurrent jobs inside it, how exactly does Oracle schedule those jobs with the same priority? (Will they each run for several time slices at a time, or be executed until completed, one by one?)
    2. How exactly does Oracle schedule jobs with different priorities?
    3. If I have two job classes, how will Oracle schedule the jobs in these two classes? (Is each of them assigned some time slices based on the CPU allocation?)
    I have checked many tutorials from Oracle, but I have not found any detailed explanations or examples showing how Oracle actually does this. I am currently doing some QoS management in an application based on Oracle, so I am in great need of this information and the details inside the scheduler.
    Thanks a lot.

    Hi,
    We have a situation here. We have an Info Warehouse against which batch jobs are scheduled (around 200) using a 3rd-party scheduling tool. The window for these jobs is 9pm-8am EST. Until recently, all the batch jobs completed within the SLA window of 11 hours every day. The data has been growing in the database and the batch jobs have started taking longer to complete, hence missing the SLA.
    We are thinking of using Database Resource Manager in order to put the batch jobs in a resource group and bind it to the highest priority during the SLA window. Points to ponder:
    Let's assume that we have put the batch jobs in a high-priority resource group with 80% CPU allocation and it comes into effect at 9pm every day (that's when the jobs start executing). And we have put all other users in a different resource group with low priority (say 10%).
    The situation here is: what if a user logs into the system at, say, 8:45pm and initiates execution of a huge reporting query against the dataset? Assuming that the query takes about 60 minutes (under idle database conditions), initially it would acquire all the CPU it needs (as there is no load on the DB at this time), and the jobs are scheduled to start at 9pm.
    Q: Will the priority for the jobs, set per the resource directives of the jobs' resource group, take precedence over the CPU already held by the low-priority user?
    -- If Yes, what happens to the User session?
    -- If No, what are the possible scenarios?
    Q: Also, is it feasible to issue a KILL ALL SESSIONS just before the scheduled batch jobs, so that the jobs have a clear path for execution (taking into account that we have already prioritized the jobs into a high-priority resource group)? That way we are sure that there is no user connection right before the batch jobs start, and once they start, the resource allocation is done based on the resource group priority and the CPU assigned to each group.
    Q: Or is it more feasible/beneficial to have all other user sessions switch to a lower priority after a defined interval, so that the CPU is given to the jobs?
    Q: Is there a difference between the Database Resource Manager in 9i/10g/11i?
    Any detailed explanation would be appreciated.
    Thanks
    Gaurav

  • Questions about PO creation / update / cancel interfaces

    Hi there,
    An E-Business Suite 11.5.10.2 customer wants to use the 11i PO creation / update / cancel interfaces in order to manage Purchase Orders created from a 3rd party system.
    I've been through the 'Oracle Purchasing Open Interfaces' guide (115mfgapi.pdf), but it's still unclear to me whether:
    1/ Is it possible or not to add a new PO line to an existing PO through the interfaces? If yes, how does one do that: which interface? Sample code?
    2/ Is it possible or not to change the distribution (= accounting information) of an existing PO line? If yes, ...
    3/ Is it possible or not to reject the full PO document if a single line fails during the PO creation? Same for a batch of POs: reject the full batch if a single PO fails?
    4/ Is it possible or not to cancel a PO line even if the shipment already occurred? And even if the payment already occurred? Are there some controls here?
    Thanks for your help,
    Kind regards - Hugues

    1. Answer is YES. You can add a line to an existing standard purchase order by choosing UPDATE. This can be done through the Purchase Documents Open Interface.
    I cannot give an example; it is quite long to write. But you should use the Open Interface User's Guide to find the necessary columns. Just insert a new row with status UPDATE and try to import.
    2. I think you can NOT.
    3. Answer is YES. You can do it. Use the "Cancel PO API". The function name is PO_Document_Control_PUB.control_document().
    For technical details read the Open Interface UG.
    You must handle exceptions to avoid problems if some lines cannot be cancelled.
    4. Answer is YES. You can do it if there is still an expected quantity to receive.
    Just from the UG :)
    1. Purchase Order Change APIs.
    The APIs enable you to do the following:
    - Record Acceptance/Rejection in Oracle Purchasing
    - Update quantity, price, and promise date on standard purchase orders or releases in Oracle Purchasing
    So only qty, promised date, price.
    Read next:
    Line Level Validation and Update: This logic occurs when LINE_NUM is not null and SHIPMENT_NUM is null.
    1. No update occurs if the line status is FINALLY CLOSED or CANCELLED.
    2. The new quantity or price value must be positive.
    3. If updating quantity, the new quantity must be greater than or equal to the greater of total quantity_received of all shipments and total quantity_billed of all shipments for this line. After the update takes place, the new quantity will be prorated at the shipment level and for each shipment the quantity is prorated at the distribution level if applicable.
    4. If updating price, no update occurs if a receipt has been created against one of the line’s shipments and it's been accrued upon receipt. No update occurs if an invoice has been created against one of the line’s shipments. After a price update takes place, price changes are rolled down to the shipment level for standard POs. No price update occurs for a release if the Price Override flag on the blanket purchase agreement line is No.
    Summary:
    You can update qty if there is still a quantity to receive (Expected) and if the PO is not yet cancelled.
    EXAMPLE:
    Usage Example:
    set serveroutput on
    -- After the API completes, do not forget to commit if the result is 1
    -- and rollback if the result is 0.
    DECLARE
      l_result     NUMBER;
      l_api_errors PO_API_ERRORS_REC_TYPE;
    BEGIN
      -- This needs to be changed according to your environment setup.
      FND_GLOBAL.apps_initialize ( user_id => 1318, resp_id => 50578, resp_appl_id => 201 );
      -- Record an acceptance of Y for PO 1261.
      l_result := PO_CHANGE_API1_S.record_acceptance(
        x_po_number              => 1261,
        x_release_number         => null,
        x_revision_number        => 0,
        x_action                 => 'NEW',
        x_action_date            => null,
        x_employee_id            => 588,
        x_accepted_flag          => 'Y',
        x_acceptance_lookup_code => 'On Schedule',
        x_note                   => 'All valid',
        x_interface_type         => 'APITEST',
        x_transaction_id         => null,
        version                  => '1.0');
      IF (l_result <> 1) THEN
        -- Handle the errors in the PO_INTERFACE_ERRORS table.
        NULL;
      END IF;
      -- Change the quantity to 5 on line 1, shipment 1 of PO 1263.
      l_result := PO_CHANGE_API1_S.update_po (
        x_po_number           => 1263,
        x_release_number      => 1,
        x_revision_number     => 1,
        x_line_number         => 1,
        x_shipment_number     => 1,
        new_quantity          => 5,
        new_price             => NULL,
        new_promised_date     => NULL,
        launch_approvals_flag => 'Y',
        update_source         => NULL,
        version               => '1.0',
        x_override_date       => NULL,
        x_api_errors          => l_api_errors,
        p_buyer_name          => null );
      IF (l_result <> 1) THEN
        -- Display the errors returned by the API.
        FOR i IN 1..l_api_errors.message_text.COUNT LOOP
          dbms_output.put_line ( l_api_errors.message_text(i) );
        END LOOP;
      END IF;
    END;
    /

  • Question about user creation.

    So I've created a user using the Wiki article https://wiki.archlinux.org/index.php/Users_and_groups here.
    I've created a user, gave it a password, and added it to the wheel group, and it works fine. I can log in and it starts X and everything; however, there is no file structure within the /home/username directory. I always see advice on the Arch Wiki and other ones about editing X rc files and whatnot, and they're referring to ~/.xinitrc or something similar.
    However, my X files are still located in root's home. Is this something I should be taking into consideration and changing? I don't want to get further down the road and then have something be set up wrong. As of right now, the only folders in my user folder are Desktop and Downloads, and I created those. I'm just wondering if there is something I'm supposed to do to transfer my X server and window manager configuration from root to the user level. If running these from root is an okay thing to do I'm fine with that, I'm just under the impression I'm not supposed to do that.

    jasonwryan wrote: Don't run X as root. Follow the xinitrc wiki article and move all the files you need to run X as your user to their ~.
    Alright. Thanks for the link. Should I follow that as a general rule of thumb (although I can't see myself really installing anything else as root in the future)? Right now my display manager and window manager are both installed under root as well, and a couple of other things I believe.

  • Question about: Job Pricing & Basic Pay Range Restriction...

    Hi Experts,
    I did several settings to achieve a Basic Pay Range Restriction (warning), which uses the Job Pricing concept:
    1-Define Pay Grades And Levels
        In this step, I built a record describing a payment range for certain evaluation points.
    2-Create a record of infotype 1050 (Evaluation Results) for a certain position (let it be 10000001), in which I evaluate this position as 20 points.
    3-Assign 10000001 to an employee.
    But when I enter the basic pay amount in infotype 0008 for this employee, the range check does not happen as expected. Are there any steps I missed?
    It's urgent.
    Thanks in adv.
    Br,Kee

    In line with that, check IT1005:
    http://help.sap.com/saphelp_47x200/helpdata/en/66/9bb8923aff11d189370000e829fbbd/frameset.htm

  • Questions about job postings...

    Hi All:
    Do we have a dedicated job posting forum here?
    I require candidates for a CRM manager post....
    Where do I post this requirement?
    Any help would be appreciated.
    I am sorry if this mail is misplaced.
    Thanks and Regards,
    Hetal Shah

    sorry....you can send me the resumes at [email protected]
    thanks again..
    Hetal Shah

  • Question about job overview

    Hello Gurus,
    For RSM37, I found some job records with me as the user, for example "SAP_CCMS_MONI_BATCH_DP" and "BI_PROCESS_COMPRESS", but I did not do anything on this system, so I wonder why those jobs are attached to me, and what does this user column really refer to?
    Many thanks

    Hi,
    Check this help link, it will give you some idea:
    http://help.sap.com/saphelp_nw04/helpdata/en/c3/0dec3b6e011341e10000000a114084/content.htm
    Hope this helps.
    thanks,
    rahul

  • Workload management question (about task creation)

    Hi everybody! When I run the "ps -efo taskid" command I see a lot of tasks. My questions are: "Are these tasks created by default by Solaris?", "What is the criteria by which Solaris creates these default tasks?" and "Shouldn't there be just one default task per zone?"
    Thanks a lot!

    Hi Satish,
    You may find relevant information using following links.
    http://help.sap.com/saphelp_scm2007/helpdata/en/f2/5e89c05cf44fee814174c0af27c061/frameset.htm
    http://help.sap.com/saphelp_scm2007/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm
    Regards,
    Harshil Desai

  • Question about job offer

    Hi,
    Is it allowed to post a job offer here?
    Regards,
    Daniel

    No, Daniel, that is discouraged.
    http://www.sun.com/termsofuse.jsp
    http://developers.sun.com/aboutsdn/index.html
    http://www.sun.com/privacy/index.html

  • A question about item "type and release" of source system creation

    Hello expert,
    I have a question about the item "type and release" in source system creation.
    As we know, when we create a web service source system, a pop-up is displayed which includes three items: "logical system", "source system" and "type and release".
    For the item "type and release", when we press the "F4" button, three default selections appear:
    "ORA 115     Oracle Applications 11i
    TLF 205     Tealeaf 2.05B
    XPD 020     SAP xPD".
    Can someone tell me when and how I should use these three selections?
    I also attempted to enter some other values besides the three default selections, and it seems that I can input anything freely.
    Thank you and Best Regards,
    Maggie

    Hello DMK,
    Thank you very much for your answer. It is very helpful for me.
    Can I ask you further about it?
    I understand that it is a semantic description item.
    You said the default selections are set by our Basis people. Would you tell me how we can create a new value, other than the default ones, for the item "type and release"? Only by entering the value in the item directly? But we cannot see the new value we created ourselves when we press the "F4" button the next time; is that OK? Or do we have to ask the Basis people to define a new value, just like the default selections, before we use it?
    Also, if possible, could you describe in more detail what you meant by "This becomes important when you are troubleshooting certain issues especially when RFC connection problems."
    Thank you and Best Regards,
    Maggie

  • Follow-up question about forms and SharePoint Online

    I asked a question about life after InfoPath earlier, and got a good answer:
    http://social.technet.microsoft.com/Forums/sharepoint/en-US/fb23b3d9-8a09-4267-aab5-09929f6a3082/life-after-infopath-seeking-advice
    After looking at all of the limitations of SharePoint Online, I'm wondering how developers are dealing with the limitations. Let's say you are asked to develop something that has complex logic, including fetching data from external web services, dynamically displaying parts of a process to people depending on role, and ending up with a printable document. In our on-premises environment, InfoPath is well suited to this task, with some code behind for some things. Or, if not using InfoPath, we would use application pages and workflow.
    Neither of those are available in SharePoint Online, so what would you do?

    Some things, such as the conditional display of content, can be done via JavaScript. More advanced items, such as integrating external web services, would likely require a SharePoint "app". A SharePoint app is essentially a link to a separate site that is running an asp.net web app (or PHP, or whatever). This asp.net site can do anything it needs with any web services, or conditional formatting, or anything. Because it's registered as a SharePoint app, it can also call back into the SharePoint site and work with data. So, a SharePoint app could present the user with a robust form that simply sends the data back to a SharePoint list. The SharePoint app can also be surfaced on the SharePoint site itself in an iframe, so the user won't know that the form is hosted by another server.
    By the way, the ideas behind the app model permeate the entire SharePoint environment: instead of having the SharePoint server itself run all kinds of custom business logic, that workload is handled by other servers, so the SharePoint servers can be focused on running the core bits of SharePoint. InfoPath puts a large load on the servers, so it's out. XSLT list views also put a load on the server, so they're also out. SSRS is an amazingly fantastic tool, but is not supported in the cloud (and there's no alternative). Timer jobs, event handlers, workflow, and many other things have been re-architected to take the load off the SharePoint servers.
    Mike G.

  • Some questions about the integration between BIEE and EBS

    Hi, dear,
    I'm a newbie to BIEE. These days I have had a look at the BIEE architecture and the BIEE components. In the next project there is some work on BIEE development based on the EBS application. I have some questions about the integration:
    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both BIEE 10g and 11g be integrated with EBS R12?
    2) In the BIEE Administration Tool, the first step is to create physical tables. If the source application is EBS, is it still needed to create the physical tables?
    3) If the physical table creation is needed, how do we complete the data transfer from the EBS source tables to the BIEE physical tables? Which ETL tool do most developers prefer: Warehouse Builder or Oracle Data Integrator?
    4) During the data transfer phase, if there is a very large volume of data to transfer, how do we keep the data complete? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and the users open a BIEE report when 50% is completed, can they see the new 50% of the data on the reports? Is there some transaction control in the ETL phase?
    Could anyone give me some guidance? I would also appreciate any other information you can give.
    Thanks in advance.

    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both BIEE 10g and 11g be integrated with EBS R12?
    You should consider OBI Applications here, which uses OBIEE as the reporting tool with different pre-built modules. Both 10g and 11g come with different versions of BI Apps, which support sources like Siebel CRM, EBS, PeopleSoft, JD Edwards etc.
    2) In the BIEE Administration Tool, the first step is to create physical tables. If the source application is EBS, is it still needed to create the physical tables?
    This is independent of any source. This is OBIEE modeling to create the RPD with all the layers. If you build it from scratch, you will have to create all the layers; if BI Apps is used, you get a pre-built RPD along with the other pre-built components.
    3) If the physical table creation is needed, how do we complete the data transfer from the EBS source tables to the BIEE physical tables? Which ETL tool do most developers prefer: Warehouse Builder or Oracle Data Integrator?
    BI Apps comes with pre-built ETL mappings, mainly for use with Informatica. Only BI Apps 7.9.5.2 comes with ODI, but Oracle plans to have only ODI for further releases.
    4) During the data transfer phase, if there is a very large volume of data to transfer, how do we keep the data complete? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and the users open a BIEE report when 50% is completed, can they see the new 50% of the data on the reports? Is there some transaction control in the ETL phase?
    Users will still see the old data, because it is good practice to turn on the cache and purge it after every load.
    Refer to http://www.oracle.com/us/solutions/ent-performance-bi/bi-applications-066544.html and many more docs on Google.
    Hope this helps
