Stop BPM-Process

Hi all,
how can I stop a process instance in SXMB_MONI? Unfortunately the process is in an endless loop.
Regards
Mathias

Hi,
if it's on a development system - transaction SWWL
Regards,
michal
<a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions"><b>XI / PI FAQ - Frequently Asked Questions</b></a>

Similar Messages

  • How to Stop a BPM Process

    Hi,
    I have a scenario in which the BPM process went into an endless loop. There was a logical flaw, but the point is that there was no way to kill this process, which was running through BPM. We tried RWB and BPM Engine Monitoring, but with no success.
    In MONI all we see is a clock symbol next to the PE icon.
    It would be really helpful if someone could tell me how to kill a process running in the BPM engine.
    Both MONI and RWB tell me the message is Successful and can therefore not be stopped, but internally the message is going through the BPM workflow and never comes out.
    Thanks
    Ashish

    Hi Ashish,
    Log on to the ABAP stack of XI.
    Execute transaction SWWL to delete the workflow items.
    Regards,
    Sridhar

  • BPM Process stop to work

    When I start a BPM process, at runtime the process stops working, and in the default trace I found this error:
    2011 09 26 15:24:11:785--+0200--Error--com.sap.glx.core.kernel.mmtx.PrimaryTransaction--
    com.sap.BPM.core_svc.000064--BC-BMT-BPM-SRV--com.sap.glx.core.svc--F65520AD5DBD13F60000000000004958--29830050000000004----com.sap.glx.core.kernel.mmtx.PrimaryTransaction--SAP_BPM_Service--0----217287D5E83F11E0AF57000001C72BA2--217287d5e83f11e0af57000001c72ba2--217287d5e83f11e0af57000001c72ba2--0--Galaxy 2079 / Follower Worker--Plain----
    afterCompletion(int):Unexpected error during deferred transaction commit: java.lang.IllegalMonitorStateException: Attempted to release a write lock, but the ticket-local lock count is zero (danger, inconsistency detected!)
         at com.sap.glx.core.dock.impl.GalaxyCowLock$LockImpl$WriteLockImpl.unlock(GalaxyCowLock.java:378)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.localImageRelease(AbstractTransactionBase.java:798)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.clean(AbstractTransactionBase.java:240)
         at com.sap.glx.core.kernel.mmtx.PrimaryTransaction.clean(PrimaryTransaction.java:86)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.internalComplete(AbstractTransaction.java:392)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.complete(AbstractTransaction.java:345)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.rollback(AbstractTransaction.java:305)
         at com.sap.glx.core.kernel.mmtx.PrimaryTransaction$PersistentCommit.afterCompletion(PrimaryTransaction.java:401)
         at com.sap.engine.services.ts.jta.impl.SynchronizationWrapper.afterCompletion(SynchronizationWrapper.java:48)
         at com.sap.engine.services.ts.jta.impl2.TXR_TransactionImpl.rollback_internal(TXR_TransactionImpl.java:1286)
         at com.sap.engine.services.ts.jta.impl2.TXR_TransactionImpl.rollback(TXR_TransactionImpl.java:1048)
         at com.sap.engine.services.ts.jta.impl2.TXR_TransactionManagerImpl.rollback(TXR_TransactionManagerImpl.java:509)
         at com.sap.engine.services.ts.jta.impl2.TXR_UserTransaction.rollback(TXR_UserTransaction.java:170)
         at com.sap.glx.core.resource.impl.j2ee.J2EETransactionManagerFactory$JTATransactionManagerImpl.rollback(J2EETransactionManagerFactory.java:158)
         at com.sap.glx.core.kernel.mmtx.PrimaryTransaction.inRollback(PrimaryTransaction.java:274)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.rollback(AbstractTransaction.java:279)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.rollback(AbstractTransactionBase.java:778)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.do_prepare(AbstractTransaction.java:202)
         at com.sap.glx.core.kernel.mmtx.AbstractTransaction.commit(AbstractTransaction.java:81)
         at com.sap.glx.core.kernel.execution.LeaderWorkerPool$Follower.run(LeaderWorkerPool.java:129)
         at com.sap.glx.core.resource.impl.common.WorkWrapper.run(WorkWrapper.java:58)
         at com.sap.glx.core.resource.impl.j2ee.J2EEResourceImpl$Sessionizer.run(J2EEResourceImpl.java:231)
         at com.sap.glx.core.resource.impl.j2ee.ServiceUserManager$ServiceUserImpersonator$1.run(ServiceUserManager.java:150)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:337)
         at com.sap.glx.core.resource.impl.j2ee.ServiceUserManager$ServiceUserImpersonator.run(ServiceUserManager.java:147)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:182)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:299)
    (In another post I will insert the rest of the default trace.)
    Does anyone know how I can resolve this?

    2011 09 26 15:34:27:382--+0200--Error--com.sap.glx.core.kernel.mmtx.AbstractTransactionBase--
    com.sap.BPM.core_svc.000127--BC-BMT-BPM-SRV--com.sap.glx.core.svc--F65520AD5DBD14090000000000004958--29830050000000004----com.sap.glx.core.kernel.mmtx.AbstractTransactionBase--SAP_BPM_Service--0----217287D5E83F11E0AF57000001C72BA2--217287d5e83f11e0af57000001c72ba2--217287d5e83f11e0af57000001c72ba2--0--Galaxy 4198 / Follower Worker / Script [ita18.ferrero.com/fce_ita18_bu_bp_etl_lib/ETL_DB_MappingParallel_Run_POOL/4a8be0fffb1191332ea3088e50c48cf2/BLACK_HOLE]--Plain----
    getLocalImage0(DockObject, boolean, boolean, boolean, boolean, boolean, long):Someone attempted to stain an object, which has already been deleted (OID = 3d806fd7-e844-11e0-a45d-000001c72ba2, separation = 3e58308c-e83f-11e0-b267-000001c72ba2, class = com.sap.glx.adapter.BPMNAdapter:Instance_0_TaskRun_1ae30a59b281399e4cf102c2ac458587). Blame this code: java.lang.NullPointerException: Object has been terminated (OID = 3d806fd7-e844-11e0-a45d-000001c72ba2, separation = 3e58308c-e83f-11e0-b267-000001c72ba2, class = com.sap.glx.adapter.BPMNAdapter:Instance_0_TaskRun_1ae30a59b281399e4cf102c2ac458587)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.getLocalImage0(AbstractTransactionBase.java:560)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.getLocalImage(AbstractTransactionBase.java:499)
         at com.sap.glx.core.kernel.mmtx.AbstractTransactionBase.getWritableImage(AbstractTransactionBase.java:612)
         at com.sap.glx.core.dock.impl.DockObjectImpl.bind(DockObjectImpl.java:386)
         at com.sap.glx.core.dock.impl.DockObjectImpl.bindChecked(DockObjectImpl.java:402)
         at com.sap.glx.process.adapter.bpmn.impl.BPMNAdapter.deleteChildInstances(BPMNAdapter.java:2245)
         at com.sap.glx.process.adapter.bpmn.impl.BPMNInstanceHandler.onDestruction(BPMNInstanceHandler.java:157)
         at com.sap.glx.core.dock.impl.DockObjectImpl.deletion(DockObjectImpl.java:226)
         at com.sap.glx.core.dock.impl.DockObjectImpl.delete(DockObjectImpl.java:563)
         at com.sap.glx.process.adapter.bpmn.impl.BPMNAdapter.deleteFirstChildOfFrameIfInstanceOrTask(BPMNAdapter.java:2947)
         at com.sap.glx.process.adapter.bpmn.impl.BPMNFrameHandler.onDestruction(BPMNFrameHandler.java:69)
         at com.sap.glx.core.dock.impl.DockObjectImpl.deletion(DockObjectImpl.java:226)
         at com.sap.glx.core.dock.impl.DockObjectImpl.delete(DockObjectImpl.java:563)
         at com.sap.glx.core.kernel.trigger.config.Script$DeleteInstance.execute(Script.java:298)
         at com.sap.glx.core.kernel.trigger.config.Script.execute(Script.java:798)
         at com.sap.glx.core.kernel.execution.transition.ScriptTransition.execute(ScriptTransition.java:78)
         at com.sap.glx.core.kernel.execution.transition.Transition.commence(Transition.java:138)
         at com.sap.glx.core.kernel.execution.LeaderWorkerPool$Follower.run(LeaderWorkerPool.java:127)
         at com.sap.glx.core.resource.impl.common.WorkWrapper.run(WorkWrapper.java:58)
         at com.sap.glx.core.resource.impl.j2ee.J2EEResourceImpl$Sessionizer.run(J2EEResourceImpl.java:231)
         at com.sap.glx.core.resource.impl.j2ee.ServiceUserManager$ServiceUserImpersonator$1.run(ServiceUserManager.java:150)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:337)
         at com.sap.glx.core.resource.impl.j2ee.ServiceUserManager$ServiceUserImpersonator.run(ServiceUserManager.java:147)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:182)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:299)

  • Stop a running BPM process

    Hello everybody,
    due to a loop whose end condition is never reached, we have a process running and sending lots of messages.
    How can we stop the process from running?
    Thanks Mario

    Hi,
    take a look at this blog by Michal on how a BPM process can be stopped.
    /people/michal.krawczyk2/blog/2006/06/27/xi-who-said-he-cannot-be-stopped-bpm-jim--sp17
    It's available from SP17.
    Regards,
    Bhavesh

  • BPM Process chain takes long time to process

    We have BI7, Netweaver 2004s on Oracle and SUN Solaris
    There is a process chain (BPM) which pulls data from the CRM system into BW. The scheduled time to run this chain is 0034 hrs, and it should ideally complete before / around 0830 hrs. <b>The problem is that on every alternate day this chain behaves normally and gets completed well before 0830 hrs, but on the other days it fails…</b>
    There are almost 40 chains running daily; some are event-triggered (dependent on each other) and some run in parallel. In this (BPM) process chain there are usually 5 requests, with 3 delta and 2 full uploads (master data). The delta uploads finish in 30 minutes without any issues and with very few records transferred. The first full upload runs from 0034 hrs to approximately 0130 hrs and the second from 0130 hrs to 0230 hrs. If the first upload gets delayed, the people who initiate these chains stop the second full upload and continue it after all the process chains are completed. This entire BPM process chain sometimes takes 17-18 hrs to complete!
    There are no other loads in CRM or BW when these process chains are running.
    CRM has background jobs to push IDocs to BW; they run every 2 minutes and complete successfully.
    Yesterday this chain completed successfully (well within the stipulated time) with over 33,00,000 records transferred, but at other times it has failed to transfer even 12,00,000 records!
    I am attaching a zip file; please refer to the “21 to 26 Analysis screen shot.doc” from the zip file.
    Within the zip file I am also attaching “Normal timings of daily process chains.xls” – the name explains it.
    Also within the zip file, please refer to “BPM Infoprovider and data source screen shot.doc”: the infopackage (page 2) which was used in the process chain is not displayed later on page 6, BUT THE CHAIN GOT SUCCESSFULLY COMPLETED.
    We have analyzed:--
    1)     The PSA data for BPM process chain for past few days
    2)     The info providers for BPM process chain for past few days
    3)     The ODS entries for BPM process chain for past few days
    4)     The point of failure of BPM process chain for past few days
    5)     The overall performance of all the process chains for past few days
    6)     The number of requests in BW for this process chain
    7)     The load on CRM system for past few days when this process chain ran on BW system
    As per our analysis, there are a couple of things which can be fixed in the BW system:
    1)     The partner profile (transaction WE20) defined for the partner LS/BP3CLNT475 specifies, for both message types RSSEND and RSINFO, collect IDocs and pack size = 1. Since pack size = 1 generates one tRFC call per IDoc, it should be changed to 10 so that fewer tRFCs are generated, which means less overhead for the BW server and an increase in performance.
    2)     In the definition of the destination for the concerned RFC in BW (SM59), the “Technical Settings” tab says the “Load balancing” option = “No”. We are planning to change it to “Yes”.
    But we believe that, although these changes will bring some increase in performance, they are not the root cause of the abnormal behavior of this chain, as the chain runs successfully every alternate day with approximately the same amount of load.
    I was not able to attach the many screenshots or the other information I gathered during my analysis. Please advise how I can attach these files.
    Best Regards,

    Hi,
    Normally, index creation or deletion can take a long time if your database statistics are not updated properly. So check the statistics after your data load is completed and the index generation is done, and then create/update the database statistics.
    Then try to recheck...
    Regards,
    Satya

  • How to stop message processing based on validation?

    Hello experts,
    I have a requirement to stop message processing in the graphical mapping based on validation results. Here is the scenario - messages are translated using graphical mapping and sent to the target system. An RFC lookup will be done to ECC to determine if the data in the message is good. If the lookup returns a negative result, message processing should be stopped right there.
    I guess we can throw an exception from the mapping to force a failure and stop further processing, but that will cause the message to show up as failed in SXMB_MONI and cause alert emails to be sent out in PROD. Another option would be to suppress creation of the root node itself, but I think the message will then fail in the subsequent "call adapter" step if the target schema has a minimum occurrence of 1 for the root node (as in the case of IDocs).
    Is it possible to do it without using BPM?
    Thanks,
    Michelle

    Hi Michelle,
    Is your requirement to stop message processing without sending an alert?
    If yes, then you can have an alert rule that does not trigger alerts on this failure (and raise an exception based on the result of the RFC lookup).
    If your requirement is that the message must not fail, then you have to go the ccBPM route.
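    Just to illustrate the exception option, here is a rough sketch of such a UDF (the function name, input names and the "E" result code are placeholders, not your actual lookup interface):

         // UDF with two inputs: the RFC lookup result and the value to pass through.
         // Container is supplied by the mapping runtime (com.sap.aii.mappingtool.tf*.rt.Container).
         public String failOnNegativeLookup(String lookupResult, String value, Container container) {
             // If ECC reports the data as bad, abort the mapping by throwing a RuntimeException;
             // the message then fails in SXMB_MONI instead of reaching the receiver.
             if ("E".equals(lookupResult)) {
                 throw new RuntimeException("Validation failed in ECC lookup - stopping message processing");
             }
             // Otherwise pass the value through unchanged.
             return value;
         }

    You would map the UDF output to a target field that is always created, so the check is executed for every message.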
    Best Regards,
    Ravikanth Talagana

  • Fault message in BPM process

    I have a BPM process with a sync send step. I have a default fault message structure and my sync interface (a proxy) is using this as the fault message format. In case of a fault in the proxy I get the fault message returned. In my BPM I have an exception handler installed for the exception. What I would like to do in my process is send the fault message to another step, like a send step.
    I am not looking to understand sync/async bridges or how to set up a BPM process. I am looking for help on how I can work with the fault message in my BPM process other than only letting it trigger the exception routine.

    Hi Frank,
    in the view of your BP, an exception on the proxy side is no reason to stop processing. The message will be put to your interface and it is your decision to put the data into a container and to send it. For using a "global" container for every kind of message, have a look at my weblog <a href="/people/udo.martens/blog/2006/04/28/bpm-container-for-any-message-type">Container for any Message Type</a>. If your send step is not successful and you want to catch that: define an exception branch for your block, where you send a meaningful message to report the error. You can reference the exception in the original send step; at runtime the process jumps into the exception branch and does what you defined there.
    Regards,
    Udo

  • Sequential processing of messages with bpm process

    hi,
    I have a BPM process that I want to use to process my messages sequentially. For this I initially played around with creating my own queue, but that didn't work. Then I moved to using the collect pattern and then processing the messages. This works only for around 150 messages; the others stay stuck in the queue, and for the BPM process the queue is in status STOP. If I drop in another file, the queue gets cleared and the new messages are placed in it, but it still stays in STOP and I need to delete the queue and start again. Is there a solution for this, so that the queue collects the messages and starts again when the BPM process is finished?
    I have tried to avoid using the collect pattern and worked with my BPM process to let it use a specific queue, but that also didn't work. I can process 1 message and then the rest stays in the queue. Any help would be welcome.

    Hey
    is there any specific reason for using BPM?
    You can use EOIO (Exactly Once In Order) to transfer your files sequentially.
    Thanks,
    ahmad

  • MDM web dynpro in BPM process

    This is my first development in BPM, so please be charitable and understanding....
    I have following error in my process:
    Error: Build stopped due to an error: com.sap.glx.paradigmInterface.bpmn.compiler.BPMNCompilerException: [BPM.rt_c_bpmn.000024] I'm sorry, I'm afraid I can't do that: The data object 'UIResponse' does not have any type assigned to it
    I have the MDM Web Dynpro Item Detail component embedded in my process in the human activity. I suppose that the problem is maybe in the Web Dynpro part, but I have no idea where - I was following the instructions given in the e-book "How to integrate MDM with BPM" until now :-(. Do you have any idea what can be the reason for such an error? I tried to find something on Google, but no results for that...
    Thanks,

    Yes, I followed the instructions in "How to Integrate Master Data Management (MDM) and Business Process Management (BPM)", chapter number 5 - "Passing MDM Semantic Data between BPM Process Steps using wrapper application". In my example I need to have BPMStatus visible in the context - I don't know why yet, but it is empty now. The previous problem which I described above is already solved. I am not sure where exactly the problem was; nothing helped, so I decided to create a new process from scratch. Now I am able to build the project (no errors occur), but my container is empty. I dragged and dropped it as described on page 58. The only difference is that I made the connection with BPM status.
    Besides, I discovered a strange thing in this documentation on page number 61 - specify event handler. I use CE version 7.2 and I have different windows than shown there - I have methods, events and event handlers in separate tabs. So I created the event and the event handler in those separate tabs, and then I should be able to copy them to the Interface Controller, as described in the instructions. But in the interface controller I don't have an "event handler" tab. Do you know why that is? Is it enough to have the event handler in the Component Controller? Maybe this is the reason why I don't see my container and cannot complete the task?
    Regards and thanks for your replies,

  • How to kill a BPM Process

    Hi Guys,
    I want to know how to kill a BPM process.
    any help would be appreciated
    Thanks,
    Srini

    Srini,
    Refer to this - Re: How to stop infinite loop?
    raj.

  • Correlationid in ccBPM in stopped bpm's

    Hi,
    We use PI ccBPM to coordinate between several async services.
    Each service is called asynchronously and a deadline (timeout) is opened while waiting for it to call back asynchronously. The receiver step is configured with a correlation id so that the correct BPM context is restored when the call returns. If the deadline is met before the callback receiver step has been called, we want to report an error and stop the BPM process.
    The process works fine when there are no timeouts. However, we found that if a process times out and we exit the BPM instance, but later a callback arrives with a correlation id for that closed process, the whole BPM queue (for ALL existing and future instances) is stuck until the queue is handled manually.
    What can be done to avoid this result?
    10x

    Try disabling the 'Create New Transaction' flag from the receiver step.
    Also, please check the OSS note [1042379 - BPE-HT: Deleting messages that are no longer used|https://websmp230.sap-ag.de/sap(bD1wdCZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=1042379] for removing the unused messages.

  • Oracle BPM Process Data mart

    I am required to create audit reports on BPM workflows.
    I am new to this and need some guidance on configuring the BPM Process Data Mart. What are the prerequisites for configuring it and what are the steps to do it?
    Also, I need some inputs on the BAM database. What is the frequency of data upload? Is it a data update or an insert in BAM?

    Hi,
    You might want to check out the Administration and Configuration Guides on http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/index.html.
    I suspect you might find the BAM and Data Mart portions of this documentation a bit terse, so I've added steps below that provide more detail. I wrote this for ALBPM 6.0, but believe it will still work for Oracle BPM 10g. It was created from an earlier ALBPM 5.7 document Support wrote called "ALBPM 5_7 Configuring and Troubleshooting the BAM and DataMart Updater.pdf".
    You can define how often you want the contents in both databases updated (actually inserted) and how long you want to persist the contents of the BAM database during the configuration.
    Here's the contents of the document:
    1. Introduction
    The use of BAM (Business Activity Monitoring) and Data Mart (or Warehouse) information is becoming more and more widespread in today’s BPM project implementations for the obvious benefits they bring to the management and tuning of processes.
    BAM is basically composed of a collection of measurements of the current process load and execution times. This gives us an idea of how the business is doing at this moment (in a pseudo real-time fashion).
    Data Mart, on the other hand, is a historical view of the processes load and execution times. And this gives us an idea of how the business has developed since the moment the projects have been put in place.
    In this document we are not going to describe exhaustively all configuration aspects of the BAM and Data Mart Updater, but rather we will quickly move from one configuration step to another paying more attention to subjects that have presented some difficulties in real-life projects.
    2. Creating the Service Endpoints
    The databases for BAM and for Data Mart first have to be defined in the External Resources section of the BPM Process Administrator.
    In this following example the service endpoint ‘BAMJ2EEWL’ is being defined. This definition is going to be used later as BAM storage. At this point nothing is created.
    Add an External Resource with the name ‘BAMJ2EEWL’ and, as we use Oracle, select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMBAM’. This user, and its database, will be created later
    ·     the password for the user
    Scroll down to the bottom of the page and click <Save>.
    In addition to the standard JDBC connection that is going to be used by the Updater Service, a remote JDBC configuration needs to be added, as the Engine runs in a WebLogic J2EE container. This Data Source is needed to grant the Engine access to the BAM tables through the J2EE connection pool instead of through a dedicated JDBC connection. The following is an example of how to set this up.
    Add an External Resource with the name ‘BAMremote’ and select the Oracle driver, then click <Next>
    On the following screen, specify:
    ·     the Lookup Name that will be used subsequently in WebLogic - here I have given it the name ‘XAbamDS’
    Then click <Save>.
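    (To clarify what the Lookup Name is for: the Engine resolves this pooled Data Source through JNDI at runtime. Purely as an illustration of that mechanism - this is not ALBPM's actual code - the lookup is conceptually:

         // Illustration only: resolving the pooled Data Source by its JNDI lookup name.
         import javax.naming.InitialContext;
         import javax.sql.DataSource;
         import java.sql.Connection;

         public class BamDataSourceLookup {
             public static Connection getBamConnection() throws Exception {
                 InitialContext ctx = new InitialContext();
                 // "XAbamDS" is the Lookup Name given to the 'BAMremote' external resource above
                 DataSource ds = (DataSource) ctx.lookup("XAbamDS");
                 return ds.getConnection();
             }
         }

    This is why the Lookup Name entered here must match the JNDI name of the WebLogic data source created in section 6.)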
    In the next example the definition ‘DWHJ2EEWL’ is created to be used later as Data Mart storage. If you are not going to use a Data Mart storage you can skip this step.
    Add an External Resource with the name ‘DWHJ2EEWL’ and select the Oracle driver, then click <Next>:
    On the following screen, specify:
    ·     the hostname – here I have used ‘localhost’ as I am just setting this up to work on my laptop
    ·     the port for the Oracle service
    ·     the SID – here I have used Oracle Express so the SID is ‘XE’
    ·     the new user to create / use in Oracle for this database – here I have specified ‘BPMDWH’. This user, and its database, will be created later
    ·     the password for the user
    3. Configuring BAM Updater Service
    Once the service endpoint has been created the next step is to enable the BAM update, select the service endpoint to be used as BAM storage and configure update frequency and others. Here the “Updater Database Configuration” is the standard JDBC we configured earlier and the “Runtime Database Configuration” is the Remote JDBC as we are using the J2EE Engine.
    So, here’s the example of how to set up the BAM Updater service….
    Go into ‘Process Monitoring’ and select the ‘BAM’ tab and enter the relevant information (using the names created earlier – use the drop down list to select):
    Note that here, to allow me to quickly test BAM reporting, I have set the update frequency to 1 minute. This would not be the production setting.
    Once the data is input, click <Save>.
    We now have to create the schema and related tables. For this we will open the “Manage Database” page that has appeared at the bottom of the BAM screen (you may have to re-select that Tab) and select to create the database and the data structure. The user required to perform this operation is the DB system administrator:
    Text showing the successful creation of the database and data structures should appear.
    Once we are done with the schema creation, we can move to the Process Data Mart configuration screen to set up the Common Updater Service parameters. Notice that the service has not been started yet… We will get to that point later.
    4. Configuring Process Data Mart Updater Service
    In the case that Data Mart information is not going to be used, the “Enable Automatic Update” checkbox must be left off and the “Runtime Database Configuration” empty for this service. Additionally, the rest of this section can be skipped.
    In the case it is going to be used, the detail level, snapshot time and the time of update should be configured; in addition to enabling the updater and choosing the storage configuration. An example is shown below:
    Still in ‘Process Monitoring’, select the ‘Process Data Mart’ tab and enter the name created earlier (use the drop down list to select).
    Also, un-tick the Generate O3 Cubes (see later notes):
    Then click <Save>.
    Once those properties have been configured, the database and the data structure have to be created. This is performed on the “Manage Database” page, for which the link has appeared at the bottom of the page (as with BAM). Even though this page is identical to the one shown above (for the BAM configuration), it has been opened from the link in the “Process Data Mart” page, and this makes it different.
    Text showing the successful creation of the database and data structures should appear.
    5. Configuring Common Updater Service Parameters
    In the “Process Data Mart” tab of the Process Monitoring section -along with the parameters that are specific to the Data Mart - we will find some parameters that are common to all services. These parameters are:
    • Log directory: location of the log file
    • Messages logged from Data Store Updater: severity level of the Updater logs
    • Language
    • Generate Performance Metrics: enables performance metrics generation
    • Generate Workload Metrics: enables workload metrics generation
    • Generate O3 Cubes: enables O3 Cubes generation
    In this document we are not going to describe in detail each parameter. But we will mention a few caveats:
    a. the Log directory must be specified in order for the logs to be generated
    b. the Messages logged from Data Store Updater, which indicates the level
    of the logs, should be DEBUG for troubleshooting and WARNING otherwise
    c. Performance and Workload Metrics need to be on for the typical BAM usage and, even when either metric might not be used on the initial project releases, it is recommended to leave them on in case they turn out to be useful in the future
    d. the Generation of O3 Cubes must be off if this service is not used, otherwise the Data Mart Updater service might not work properly.
    The only change required on this screen was to de-select the ‘Generate O3 Cubes’ option, as shown in the last section.
    6. Set up the WebLogic configuration
    We need to set up the JDBC data source specified above, so go to Services / JDBC / Data Sources.
    Click on <Lock and Edit> and then <New> to add a New data source.
    Specify:
    ·     the Name – use the name you set up in the Process Administrator
    ·     the JNDI Name – again use the name you set up in the Process Administrator
    ·     the Database Type – Oracle
    ·     use the default Oracle Database Driver
    Then click <Next>
    On the next screen, click <Next>
    On the next screen specify:
    ·     the Database Name – this is the SID – for me that is XE
    ·     the Host Name – as I am running on my laptop, I’ve just specified ‘localhost’
    ·     the Database User Name and Password – this is the BAM database user specified in the Process Administrator
    Then click <Next>
    On the next screen, you can test the configuration to make sure you have got it right, then click <Next>
    On the next screen, select your server as the target server and click <Finish>:
    Finally, click <Activate Changes>.
    7. The Last Step: Starting Up and Shutting Down the Updater Service
    ALBPM distribution is different depending on the Operating System. In the case of the Updater Service:
    -     For Unix-like Operating Systems the service is started or stopped with the albpmwarehouse.sh shell script. The command in this case is going to look like this:
    $ALBPM_HOME/bin$ ./albpmwarehouse.sh start
    -     For Windows Operating Systems the service is installed or uninstalled as a Windows Service with the albpmwarehouse.bat batch file. The command will look like:
    %ALBPM_HOME%\bin> albpmwarehouse.bat install
    After installing the service, it has to be started or stopped from the Microsoft Management Console. Note also that Windows will automatically start the installed service when the computer starts. In either case the location of the script is ALBPM_HOME/bin, where ALBPM_HOME is the ALBPM installation directory. An example would be:
    C:\bea\albpm6.0\j2eewl\bin\albpmwarehouse.bat
    8. Finally: Running BAM dashboards to show it is Working
    Now we have finally got the BAM service running, we can run dashboards from within Workspace and see the results:
    9. General BAM and Data Mart Caveats
    a. The basic difference between these two collections of measurements is that BAM keeps track of the current process load and execution times while Data Mart contains a historical view of those same measurements. This is why BAM information is collected frequently (every minute) and cleared out every several hours (or every day) and why Data Mart is updated infrequently (once a day) and grows indefinitely. Moreover, BAM measurements can be thought of as a minute-by-minute sequence of Engine Events snapshots, while Data Mart measurements will be a daily sequence of Engine Events snapshots.
    b. BAM and Data Mart table schemas are very similar but they are not the same. Thus, it is important not to use a schema created with the Manage Database for BAM as Data Mart storage or vice-versa. If these schemas are exchanged by mistake, the updater service will run anyway but no data will be added to the tables and there will be errors in the log indicating that the schema is incorrect or that some tables could not be found.
    c. BAM and Data Mart Information and Services are independent from one another. Any of them can be configured and running without the other one. The information is extracted directly from the Engine Database (PPROCINSTEVENT table is the main source of info) for both of them.
    d. So far there has not been a mention of engines, projects or processes in any of the BAM or Data Mart configurations. This is because the metrics of all projects published under the current Process Administrator (or, more precisely, FDI Directory) are going to be collected.
    e. It is also important to note that only activities for which events are generated are going to be measured (and therefore, shown in the metrics). The project default is to generate events only for Interactive activities. This can be changed for any particular activity and for the whole process (where the activity setting, when specified, overrides the process setting). Unfortunately, there is no project setting for events generation so far; thus, remember to edit the level of event generation for every new process that is added to the project.
    f. BAM and Data Mart metrics are usually enriched with Business Variables. These variables are a special type of External Variables. An External Variable is a process variable with the scope of an Instance and whose value is stored on a separate column in the Engine Instances table. This allows the creation of views and filters based on this variable. A Business Variable, then, shares all the properties of an External Variable plus the fact that its value is collected in all BAM and Data Mart measurements (in some cases the value is shown as it is for a particular instance and in others the value is aggregated).
    The caveat here is that there is a maximum number of 256 Business Variables per FDI. Therefore, when publishing several projects into a single FDI directory it is recommendable to reuse business variables. This is achieved by mapping similar Business Variables of different projects with a unique real Variable (on the variable mapping performed at publish time).
    g. Configuring the Updater Service Log
    In section 5. Configuring Common Updater Service Parameters we have seen that there are two common Updater properties related to logging. These properties are “Log directory” and “Messages logged from Data Store Updater”, and they specify the location and level of these two files:
    - dwupdater.log: which is the log for the Data Mart updater service
    - bam-dwupdater.log: which is the log for the BAM updater service
    In addition to these two properties, there is a configuration file called ‘WarehouseService.conf’ that allows us to modify these other properties:
    - wrapper.console.loglevel: level for the updater service log
    - wrapper.logfile.loglevel: level for the updater service log
    - wrapper.java.additional.n: additional argument to the service JVM
    - wrapper.logfile.maxsize: maximum size of the updater service log files
    - wrapper.logfile.maxfiles: maximum number of updater service log files
    - wrapper.logfile: updater service log file name (the default value is dwupdater-service.log)
    9.1. Updater Service Log Configuration Caveats
    a. The first three parameters listed above have to be modified when increasing the log level to DEBUG (since the default is WARNING). The loglevel parameters have to be set to DEBUG and a java.additional.n (where n is the next integer after the ones already used) has to be set to -ea to enable asserts, since without this option no DEBUG message is going to be generated.
    b. Of the other arguments, maxfiles might need to be increased to hold a few more days of data when the log level is set to DEBUG (with the default value up to two days are stored).
    c. The updater service has to be stopped, uninstalled, installed and then started for any of these changes to take effect.
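    For example, the relevant WarehouseService.conf entries for DEBUG-level troubleshooting might look like the fragment below (illustrative values only; the additional-property index depends on what is already present in your file):

         wrapper.console.loglevel=DEBUG
         wrapper.logfile.loglevel=DEBUG
         # next free index for an additional JVM argument; -ea enables asserts,
         # without which no DEBUG messages are generated
         wrapper.java.additional.4=-ea
         # keep a few more days of logs while troubleshooting (by default about two days are stored)
         wrapper.logfile.maxfiles=7
         wrapper.logfile=dwupdater-service.log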
    Hope this helps,
    Dan

  • NWDI, CBS, BPM process build fails

    Hello.
    We have a BPM process which we are trying to build on our DEVINF server. The DEVINF server is running on NW 7.01 SP7. We updated the JDK to 1.6. All DCs build fine, but the BPM process DC fails with an error:
    Error: java.lang.OutOfMemoryError: PermGen space
         at java.lang.Class.getName0(Native Method)
         at java.lang.Class.getName(Class.java:552)
         at java.lang.Throwable.toString(Throwable.java:342)
         at org.apache.tools.ant.BuildException.<init>(BuildException.java:88)
         at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1225)
         at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
         at com.sap.tc.buildplugin.techdev.ant.util.AntRunner.run(AntRunner.java:114)
         at com.sap.tc.buildplugin.DefaultAntBuildAction.execute(DefaultAntBuildAction.java:57)
         at com.sap.tc.buildplugin.DefaultPlugin.handleBuildStepSequence(DefaultPlugin.java:195)
         at com.sap.tc.buildplugin.DefaultPlugin.performBuild(DefaultPlugin.java:167)
         at com.sap.tc.buildplugin.DefaultPluginV3Delegate$BuildRequestHandler.handle(DefaultPluginV3Delegate.java:66)
         at com.sap.tc.buildplugin.DefaultPluginV3Delegate.requestV3(DefaultPluginV3Delegate.java:48)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at com.sap.tc.buildtool.v2.impl.PluginHandler2.maybeInvoke(PluginHandler2.java:403)
         at com.sap.tc.buildtool.v2.impl.PluginHandler2.request(PluginHandler2.java:149)
         at com.sap.tc.buildtool.v2.impl.PluginHandler2.build(PluginHandler2.java:87)
         at com.sap.tc.buildtool.PluginHandler2Wrapper.execute(PluginHandler2Wrapper.java:59)
         at com.sap.tc.devconf.internal.DCProxyMake.make(DCProxyMake.java:318)
         at com.sap.tc.devconf.internal.DCProxy.make(DCProxy.java:1432)
         at com.sap.tc.devconf.internal.DCProxy.make(DCProxy.java:1414)
         at com.sap.tc.buildcontroller.CBSBuildController.build(CBSBuildController.java:713)
         at com.sap.tc.buildcontroller.CBSBuildController.execCommand(CBSBuildController.java:478)
         at com.sap.tc.buildcontroller.CBSBuildController.evalCmdLine(CBSBuildController.java:401)
         at com.sap.tc.buildcontroller.CBSBuildController.run(CBSBuildController.java:278)
         at com.sap.tc.buildcontroller.CBSBuildController.mainLoop(CBSBuildController.java:187)
         at com.sap.tc.buildcontroller.CBSBuildController.main(CBSBuildController.java:143)
    Error: Build stopped due to an error: PermGen space
    I should say that the process builds fine on the local workstation.
    We tried to change the parameters
    -XX:PermSize=1024m
    -XX:MaxPermSize=1024m
    but it doesn't help.
    Should we open an OSS ticket or not?

    Hi Jun Wu,
    I was getting exactly the same error message for all activation requests, and after selecting the option you mentioned I was able to activate/build again.
    CE 7.3 SP08
    Thumbs up!
    Regards, Roberto Viana

  • How to localize HumanTask names and process names of oracle BPM process ?

    Does anybody know how to localize HumanTask names and process names of an Oracle BPM process?

    Oracle Apex is an API, if that helps you understand / visualize it. You do not start or stop an Apex process.
    When an Apex session starts, it starts calling the API.
    You can, however, start / stop the listener. It may be OHS, ApexListener and the J2EE container running it, OC4J, or any other "server" that you are using.
    The built-in EPG is again something like an API; you cannot start / stop it, but you can disable/enable it with the DBMS_XDB.SETHTTPPORT API.
    Regards,

  • Dynamic Filename in BPM process (SOAP with attachm. and PayloadSwapBean)

    Hello together
    I have the following BPM process:
    1. IDoc=>WebServiceRequest
    2. WebServiceResponse (payload) => IDoc
    3. WebServiceResponse (attachment) => File
    XI receives an IDoc and maps it to a web service request. The web service is called by XI and we receive the WebServiceResponse including a PDF attachment.
    The challenge is to store the PDF attachment with a dynamic filename taken from the payload of the WebServiceResponse.
    We use the PayloadSwapBean to swap the main payload with the PDF attachment. But then we are no longer able to access the required information in the original WebService XML response via variable substitution.
    Is there a solution in the standard, or do we have to use a custom adapter module?
    Thx
    manuku

    Hi Jayasimha,
    We can do this with the "Adapter-Specific Message Attributes" of your communication channels.
    1. If you want to keep the output filename the same as the input filename, there is no need to use a UDF; the 'adapter-specific parameters' in both the sender and receiver file adapters will do that.
    In case you want to get the filename inside your mapping, you have to create a user-defined function
    which returns the filename and map it to one of your XML tags. Point 2 gives the solution for that:
    2. If you want to generate an output file taking some input from the payload, then you have to use a UDF. There you have to populate the name.
    Pretty much... if you set an attribute on the sender side, for example, you can use a UDF to access that particular attribute and use it in the mapping. In another example, where no attributes are sent from the sender, you can still set a particular attribute, say a filename derived from the payload, using a UDF, and enable the receiver adapter to use it. That's where UDFs come in - either to get or to set particular adapter-specific message attributes.
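    As a rough sketch only (the function and input names are placeholders), a UDF that sets the receiver file name from a payload field via the adapter-specific message attributes could look like this:

         // UDF with one input: the payload field holding the desired file name.
         // DynamicConfiguration, DynamicConfigurationKey and StreamTransformationConstants
         // come from com.sap.aii.mapping.api; Container is provided by the mapping runtime.
         public String setFileName(String fileNameFromPayload, Container container) {
             DynamicConfiguration conf = (DynamicConfiguration) container
                 .getTransformationParameters()
                 .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
             // Adapter-specific attribute "FileName" in the File adapter namespace.
             DynamicConfigurationKey key = DynamicConfigurationKey
                 .create("http://sap.com/xi/XI/System/File", "FileName");
             conf.put(key, fileNameFromPayload);
             // Return the value so it can also be mapped to an XML tag if needed.
             return fileNameFromPayload;
         }

    Remember to also enable the adapter-specific message attribute / file name option on the receiver file adapter so the channel actually uses the value set in the mapping.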
    This blog will also be very helpful and addresses your query:
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Regards,
    Vinod.
